How do computers interpret binary code for text?

AI Thread Summary
Computers read binary by interpreting sequences of 1s and 0s as digital signals, which represent on and off states. Input devices like keyboards convert physical actions into binary codes, such as ASCII or Unicode, which the CPU understands. The "correct format" refers to how these binary values are organized into bytes (8-bit, 16-bit, etc.) for processing. The CPU operates on these binary values using Boolean logic, where it evaluates conditions based on high and low voltage signals. Data from hard drives is read through changes in magnetization, which are converted into binary signals that the CPU processes. The entire operation of translating user input into actions involves multiple assembly instructions and interactions with the operating system, highlighting the complexity behind seemingly simple tasks like typing a character. Understanding these processes requires knowledge of digital logic and computer architecture.
Steven Ellet
If I am too vague, please let me know.
How do computers read binary?
I know that the 1s and 0s represent on and off.
If I spell out "hello" in binary using lamps (0 = off, 1 = on) and try to feed that to my PC, I'm going to fail, because my PC has no way of handling that input. Despite that, my PC can handle my hard drive. How?
 
Steven Ellet said:
If I am too vague, please let me know.
How do computers read binary?
I know that the 1s and 0s represent on and off.
If I spell out "hello" in binary using lamps (0 = off, 1 = on) and try to feed that to my PC, I'm going to fail, because my PC has no way of handling that input. Despite that, my PC can handle my hard drive. How?

Your PC has digital interfaces that convert the digital signals into codes with the correct format for it to use.

The lamp could be the LED transmitter of a remote audio device connected to the S/PDIF input of a computer for digital audio with a light-link cable. The interface would take the off/on light signals and convert them to a format the CPU could use.
https://en.wikipedia.org/wiki/S/PDIF
 
Letters are represented by numbers called character codes. When you type a letter on the keyboard, the hardware converts the button press to a keyboard scancode, which is then converted to a character code such as ASCII or Unicode, depending on where in the world you are. There are standards defining keyboards for the US, Europe, and Asia, which differ from country to country. (There is a small C sketch after the links below.)

http://www.computerhope.com/issues/ch001632.htm

http://homepage.cs.uri.edu/book/binary_data/binary_data.htm

https://en.wikipedia.org/wiki/ASCII

https://en.wikipedia.org/wiki/Unicode
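
As a small illustration (mine, not from the links above): in C, a character literal is already stored as its character code, so you can print the letter and the number side by side.

Code:
#include <stdio.h>

int main(void)
{
    char c = 'C';   /* the character literal 'C' is stored as its ASCII code */

    printf("'%c' = %d decimal\n", c, c);   /* prints: 'C' = 67 decimal */
    printf("'%c' = 0x%X hex\n", c, c);     /* prints: 'C' = 0x43 hex   */
    return 0;
}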
 
nsaspook said:
convert the digital signals into codes with the correct format
What is the "correct format"? I was under the impression that all computing (at least on my PC) boiled down to 1s and 0s.
 
The correct format means the keyboard letter is converted to an 8-bit number in ASCII or a 16-bit number for Unicode or...
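
For example (a sketch of my own, using C just for concreteness): the same letter as an 8-bit ASCII code and as a 16-bit UTF-16 code unit.

Code:
#include <stdio.h>
#include <stdint.h>

int main(void)
{
    unsigned char ascii = 0x43;    /* 'C' as an 8-bit ASCII code     */
    uint16_t      utf16 = 0x0043;  /* the same letter as one 16-bit
                                      UTF-16 code unit               */

    printf("ASCII:  %zu byte,  value 0x%02X\n", sizeof ascii, (unsigned)ascii);
    printf("UTF-16: %zu bytes, value 0x%04X\n", sizeof utf16, (unsigned)utf16);
    return 0;
}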
 
Steven Ellet said:
What is the "correct format"? I was under the impression that all computing (at least on my PC) boiled down to 1s and 0s.

The correct format is how the 1s and 0s are arranged into the different byte-sized codes (8-bit, 16-bit, etc.) the devices in the computer use. If the CPU is a 64-bit machine, it normally needs data translated into the correct word size (usually several bytes wide) to run effectively, and a 32-bit device needs the data sized correctly to operate on it effectively.
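
A quick C sketch of the "different sized codes" idea (my illustration, nothing specific to your PC): the same value stored in 8-, 16-, 32-, and 64-bit containers, the widest being a full word on a 64-bit CPU.

Code:
#include <stdio.h>
#include <stdint.h>

int main(void)
{
    uint8_t  b8  = 67;   /* one byte                                 */
    uint16_t b16 = 67;   /* two bytes                                */
    uint32_t b32 = 67;   /* four bytes                               */
    uint64_t b64 = 67;   /* eight bytes: a full word on a 64-bit CPU */

    printf("%zu %zu %zu %zu\n",
           sizeof b8, sizeof b16, sizeof b32, sizeof b64);   /* 1 2 4 8 */
    return 0;
}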
 
nsaspook said:
how the 1s and 0s are arranged
So it remains fundamentally ones and zeros. Having said that, the CPU still needs to understand the ones and zeros regardless of how they are "arranged".
 
Steven Ellet said:
So it remains fundamentally ones and zeros. Having said that, the CPU still needs to understand the ones and zeros regardless of how they are "arranged".

Correct, but ones and zeros are just the Boolean logic levels the computer needs for its procedural operations, which use Boolean expressions. The actual electrical levels for one and zero could be just about anything with two stable states.
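
A sketch of that idea in C (the integers 1 and 0 here are just stand-ins; in hardware the two stable states could be two voltages, two magnetization directions, or your lamp being on or off):

Code:
#include <stdio.h>

int main(void)
{
    int on = 1, off = 0;   /* two labels for two stable states */

    printf("on AND off = %d\n", on & off);   /* 0 */
    printf("on OR  off = %d\n", on | off);   /* 1 */
    printf("NOT on     = %d\n", !on);        /* 0 */
    return 0;
}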
 
nsaspook said:
Correct, but ones and zeros are just the Boolean logic levels the computer needs for its procedural operations, which use Boolean expressions. The actual electrical levels for one and zero could be just about anything with two stable states.

Boolean logic means asking: does x = y, TRUE or FALSE?
How does the PC read x and y?
 
  • #10
Steven Ellet said:
Boolean logic means asking: does x = y, TRUE or FALSE?
How does the PC read x and y?

The x and y could be just about any binary code, or even single bits in one byte of data, where the TRUE or FALSE is the result of a Boolean equality operator applied to x and y.

"How does the PC read x and y" is a very broad question, as the path from the platter and heads of a regular hard drive to the logic gates inside the CPU involves many changes in electrical levels and data formats along the way.

For applications programming you're normally not interested in all that detail. The symbolic logic of your programming language and OS hides all of it from you.
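
To make the equality operator concrete, a minimal C sketch (my own example): comparing two whole bytes, and testing a single bit inside one of them.

Code:
#include <stdio.h>

int main(void)
{
    unsigned char x = 0x43;   /* 01000011, the ASCII code for 'C' */
    unsigned char y = 0x43;

    /* equality of whole bytes: the result is 1 (TRUE) or 0 (FALSE) */
    printf("x == y ? %d\n", x == y);              /* 1 */

    /* testing a single bit inside the byte: is bit 6 set? */
    printf("bit 6 of x = %d\n", (x >> 6) & 1);    /* 1 */
    return 0;
}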
 
  • #11
It would be helpful if you read some of the links I posted in post #3.

The second link talks about the various conversions that @nsaspook referred to in his post.
 
  • #12
Steven Ellet said:
Boolean logic means asking: does x = y, TRUE or FALSE?
How does the PC read x and y?

As has been said, it depends on the format. Digital logic is *typically* designed so that a high voltage on a wire (say, 5V) is interpreted as a 1 and zero volts on the line is interpreted as a 0. There are many, many variations to this, and you usually need a clock (which is a wire that periodically switches from high to low voltage and back again) to tell the computer when to sample the wire to see if a 1 or 0 is being communicated.

You asked about hard drives. VERY simply, the hard drive has a magnetized read head that moves over a platter that looks a lot like a CD. At each spot, if the platter is magnetized the read head will spit out a bit of current. If it is not magnetized, the read head will spit out a different amount of current. These currents are interpreted by circuits in the hard drive controller to be 0 and 1 and then they are sent to the CPU.
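
Here is a toy model in C of the clocked-sampling idea (the clock and data arrays are made up for illustration; a real interface sees voltages, not arrays). On each rising clock edge we sample the data line and shift the bit into a byte.

Code:
#include <stdio.h>

int main(void)
{
    int clk[16]  = {0,1,0,1,0,1,0,1,0,1,0,1,0,1,0,1};
    int data[16] = {0,0,1,1,0,0,0,0,0,0,0,0,1,1,1,1};  /* each bit held
                                                           across one edge */
    unsigned char byte = 0;

    for (int t = 1; t < 16; t++) {
        if (clk[t - 1] == 0 && clk[t] == 1)              /* rising edge */
            byte = (unsigned char)((byte << 1) | data[t]);
    }
    printf("sampled byte = 0x%02X\n", (unsigned)byte);   /* 0x43 = 'C'  */
    return 0;
}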
 
  • #13
It might help to read up a little on assembly language. It is the next layer up from binary.

The 1s and 0s are interpreted in a specific way to produce ultra-simple commands, from which everything more complex is built.
A particular code (often written in hex) will be read as a command like JMP, then another code provides a memory address.
Another code might be ADD, followed by a code pointing at another address.
This means, basically: go to this address, take the value you find there, and add it to the value at this other address.

This is (sort of) how a processor builds up all of its complex operations.

It'll do dozens or hundreds of these to accomplish the simplest tasks, like opening a file.
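
As a sketch of that fetch/decode/execute idea, here is a toy "CPU" in C. The opcodes and the three-byte instruction format are invented for illustration; real instruction sets are far richer, but the principle is the same: the machine reads a number, and that number tells it what to do next.

Code:
#include <stdio.h>

enum { OP_HALT = 0, OP_LOAD = 1, OP_ADD = 2 };   /* made-up opcodes */

int main(void)
{
    unsigned char mem[16] = {
        OP_LOAD, 10, 0,   /* load mem[10] into the accumulator */
        OP_ADD,  11, 0,   /* add  mem[11] to the accumulator   */
        OP_HALT,  0, 0,
        0,
        40, 2,            /* data: mem[10] = 40, mem[11] = 2   */
        0, 0, 0, 0
    };
    int pc = 0, acc = 0;

    for (;;) {
        unsigned char op   = mem[pc];          /* fetch            */
        unsigned char addr = mem[pc + 1];
        if (op == OP_HALT) break;              /* decode           */
        if (op == OP_LOAD) acc = mem[addr];    /* execute          */
        if (op == OP_ADD)  acc += mem[addr];
        pc += 3;                               /* next instruction */
    }
    printf("acc = %d\n", acc);                 /* prints: acc = 42 */
    return 0;
}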
 
  • #14
analogdesign said:
You asked about hard drives. ... the hard drive has a magnetized read head that moves over a platter that looks a lot like a CD.
The read head is not magnetized itself; it senses changes in magnetization direction that represent the bits on a disk as they pass near the head (small sketch after the link). The orientation of the magnetization was changed to perpendicular a few years ago to increase density. Wiki article:

http://en.wikipedia.org/wiki/Disk_read-and-write_head
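
A toy model in C of that "changes represent the bits" idea (the encoding here, transition = 1 and no transition = 0, is a simplification for illustration):

Code:
#include <stdio.h>

int main(void)
{
    /* magnetization direction of nine successive cells on the platter */
    int mag[9] = {0, 0, 1, 1, 1, 1, 1, 0, 1};

    for (int i = 1; i < 9; i++)
        printf("%d", mag[i] != mag[i - 1]);   /* change = 1, no change = 0 */
    printf("\n");                             /* prints 01000011, i.e. 'C' */
    return 0;
}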
 
  • #15
jedishrfu said:
Letters are represented by numbers called character codes. When you type a letter on the keyboard, the hardware converts the button press to a keyboard scancode, which is then converted to a character code such as ASCII or Unicode, depending on where in the world you are. There are standards defining keyboards for the US, Europe, and Asia, which differ from country to country.

http://www.computerhope.com/issues/ch001632.htm

http://homepage.cs.uri.edu/book/binary_data/binary_data.htm

https://en.wikipedia.org/wiki/ASCII

https://en.wikipedia.org/wiki/Unicode

The following is my understanding of what a PC goes through to type "C" in a text document, based on your second link.
C = 01000011 = 67
Input: 'C'
Computer reaction:

Step 1: Program gets 'C' -> look up: 'C' = 01000011
Step 2: 01000011 -> check database -> 01000011 = 67 -> return to program
Step 3: 67 -> check database -> 67 = display 'C' -> done

If I am wrong, please provide an updated diagram.
 
  • #16
Steven Ellet said:
The following is my understanding of what a PC goes through to type "C" in a text document, based on your second link.
C = 01000011 = 67
Input: 'C'
Computer reaction:

Step 1: Program gets 'C' -> look up: 'C' = 01000011
Step 2: 01000011 -> check database -> 01000011 = 67 -> return to program
Step 3: 67 -> check database -> 67 = display 'C' -> done

If I am wrong, please provide an updated diagram.
It's much more involved than this. Just to write a single character to a file takes something on the order of 100 to 200 assembly instructions. In addition to translating a bit pattern (01000011) to a character ('C'), the program has to gather up information about the file to be written to and call into the operating system to actually carry out the write operation.
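
At the application level, all of that hides behind a one-line library call. A minimal C sketch (the file name "out.txt" is just for illustration; the buffering and the call into the OS happen underneath fputc and fclose):

Code:
#include <stdio.h>

int main(void)
{
    FILE *fp = fopen("out.txt", "w");
    if (fp == NULL)
        return 1;        /* could not open the file */

    fputc('C', fp);      /* hands the byte 0x43 to the C library */
    fclose(fp);          /* flushes the buffer; the OS performs
                            the actual write, many layers deep   */
    return 0;
}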
 
  • #18
Steven Ellet said:
The following is my understanding of what a PC goes through to type "C" in a text document, based on your second link.
C = 01000011 = 67
Input: 'C'

Your questions explain why a large percentage of people who do understand the tiny details, and who know that every action in a digital computer system is a precisely programmed and designed set of actions translated from our thoughts, believe a computer is unlikely to ever have true intelligence.
 
  • #19
nsaspook said:
Your questions explain why a large percentage of people who do understand the tiny details, and who know that every action in a digital computer system is a precisely programmed and designed set of actions translated from our thoughts, believe a computer is unlikely to ever have true intelligence.
I tried to diagram this sentence, but was unable to do so. My diagram ended up looking like a Texas road map.
 
  • #20
Mark44 said:
I tried to diagram this sentence, but was unable to do so. My diagram ended up looking like a Texas road map.

Real Texas road maps show a straight line for a few hundred miles then a sharp left to a bar.
 
  • #21
Your CPU has no concept of letters; it's your program that tells the CPU how to interpret the letters from the binary. For example, "hello" in C is not the same binary as "hello" in Pascal (most of it is the same, but the first and last bytes differ: C ends the string with a zero byte, while classic Pascal puts a length byte in front).

Most of the characters adhere to this standard: http://www.asciitable.com/ or this one https://en.wikipedia.org/wiki/UTF-8.
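
To make that concrete, here is a C sketch; the Pascal-style layout is built by hand, since the classic Pascal convention stores a length byte in front of the text instead of a zero byte after it.

Code:
#include <stdio.h>

int main(void)
{
    /* C convention: the bytes of "hello" followed by a terminating 0 */
    char c_style[] = "hello";                         /* h e l l o \0 */

    /* classic Pascal convention, built by hand: length byte first    */
    unsigned char pascal_style[6] = {5, 'h', 'e', 'l', 'l', 'o'};

    printf("C string, last byte:       %d\n", c_style[5]);       /* 0 */
    printf("Pascal string, first byte: %d\n", pascal_style[0]);  /* 5 */
    return 0;
}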
 