What is the word size of a processor?

  1. Sep 4, 2011 #1
    I know the word size of a processor is how many bits it can 'process' at a time, but I'd like some elaboration on that.
     
  3. Sep 4, 2011 #2

    phinds

    Gold Member
    2016 Award

    I don't know what elaboration it needs. That's it. It is the organizational structure size of the processor. It's how big the registers are. It's how wide the ALU is. It is, as you said, the # of bits a processor can process at one time.

    It is NOT, by the way, the fundamental MEMORY structure size. That's the byte. These days folks think "byte = 8 bits" but that's not correct. There are mainframe processors where the byte size = the word size = 32 bits. On PCs the byte IS 8 bits, but it's not defined as 8 bits. The problem, to the extent that one can consider it a problem, of having a byte size smaller than a word size is that it then requires multiple memory fetches to get a word for processing. That's why mainframes (think FAST) often have byte size = word size.
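    (A rough way to poke at some of this from software is the little C sketch below. It is only a sketch: none of these sizes is formally guaranteed to equal the register or ALU width, they are just common indicators, and CHAR_BIT shows that C itself only promises a byte of at least 8 bits, which echoes the point that "byte" is not defined as 8 bits.)

    [code]
    #include <limits.h>   /* CHAR_BIT: bits per byte on this implementation */
    #include <stdio.h>

    int main(void)
    {
        /* CHAR_BIT is 8 on PCs, but the C standard only guarantees >= 8. */
        printf("bits per byte (CHAR_BIT): %d\n", CHAR_BIT);

        /* Common indicators of a machine's word size, though none of them
           is guaranteed to equal the register/ALU width. */
        printf("sizeof(int)    = %zu bytes\n", sizeof(int));
        printf("sizeof(long)   = %zu bytes\n", sizeof(long));
        printf("sizeof(void *) = %zu bytes\n", sizeof(void *));
        return 0;
    }
    [/code]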
     
  4. Sep 4, 2011 #3
    I thought a word is 16 bits and has nothing to do with how many bits the processor can process at one time.

    My knowledge is from my grandfather's days, but I really thought a word was defined as 16 bits, just like a byte is 8 bits, and never changes!!! Today when they talk about a 64-bit bus, that's 4 words wide!!!! Correct me if I am wrong, as I am an analog guy who has overstepped into the digital world.
     
  5. Sep 4, 2011 #4
    For the ALU, register, and the processor itself, how is the word size physically expressed?
     
  6. Sep 4, 2011 #5

    phinds

    Gold Member
    2016 Award

    That is incorrect. My post is correct. I've been doing and teaching this stuff since 1962.
     
  7. Sep 4, 2011 #6

    phinds

    Gold Member
    2016 Award

    As a number. For the Z80 and early Intel processors (8080) it was 8 bits.

    For mini-computers in the 1970's it was 16 bits.

    For some mainframes it has been various values. I've seen 32, 64, 66 and I vaguely recall one that I think had 80 bits. Very weird.

    When computers have a byte size less than the word size then, as I said, it requires multiple fetches to get a word to process. Actual physical fetching is often augmented by various schemes to avoid too much slowdown, and pipelines are often used for the same reason. Only when the instruction path changes (a jump instruction) does the pipeline have to be dumped and restarted for instruction fetches. Data fetches are more likely to be random, so they often can't avoid the slowdown.
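    (To make the multiple-fetch idea concrete, here is a toy C sketch, not how any particular CPU actually wires it up: a 32-bit word assembled from four separate 8-bit fetches, low byte at the low address.)

    [code]
    #include <stdint.h>
    #include <stdio.h>

    /* Pretend memory with 8-bit bytes; the processor's word is 32 bits,
       so fetching one word takes four separate byte accesses. */
    static const uint8_t memory[] = { 0x78, 0x56, 0x34, 0x12 };

    static uint32_t fetch_word(const uint8_t *mem, unsigned addr)
    {
        /* Four fetches, assembled little-endian. */
        return (uint32_t)mem[addr]
             | (uint32_t)mem[addr + 1] << 8
             | (uint32_t)mem[addr + 2] << 16
             | (uint32_t)mem[addr + 3] << 24;
    }

    int main(void)
    {
        printf("assembled word: 0x%08X\n", (unsigned)fetch_word(memory, 0)); /* 0x12345678 */
        return 0;
    }
    [/code]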
     
  8. Sep 4, 2011 #7
    I really enjoyed reading your post, phinds. Thanks.
    I'll remember most of the terminology but I won't really understand it without further elaboration.
     
  9. Sep 5, 2011 #8

    sophiecentaur

    Science Advisor
    Gold Member

    So what's a 'nibble'?
     
  10. Sep 5, 2011 #9

    phinds

    Gold Member
    2016 Award

    As I recall, a nibble is 4 bits --- a chunk size chosen because the value it holds can be expressed with one hex "digit".

    Don't know if that was ever "formally" defined, but I've seen it used a lot as meaning 4 bits. The name, obviously, is chosen because it is part of a byte, not a whole "bite".
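    (A quick C illustration of splitting a byte into its two nibbles, each of which prints as a single hex digit:)

    [code]
    #include <stdio.h>

    int main(void)
    {
        unsigned char b = 0xA7;                 /* one byte */
        unsigned high = (b >> 4) & 0x0F;        /* upper nibble */
        unsigned low  = b & 0x0F;               /* lower nibble */

        /* Each nibble prints as exactly one hex digit. */
        printf("byte 0x%02X -> nibbles 0x%X and 0x%X\n", (unsigned)b, high, low);
        return 0;
    }
    [/code]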
     
  11. Sep 5, 2011 #10

    phinds

    Gold Member
    2016 Award

    Hm ... I'm not clear on what further elaboration you need here. Can you be more specific? What exactly is it that doesn't quite sink in? Can you even pinpoint it? (I know I sometimes have a hard time figuring out just why something won't stick with me.)
     
  12. Sep 5, 2011 #11
    I want to know how it is that each word is read individually.
     
  13. Sep 5, 2011 #12

    sophiecentaur

    Science Advisor
    Gold Member

    That's what I would have said, too - just checking that we are both on the same hymn sheet. But I have never heard of a Byte as being anything other than 8 bits. 'Words' have always, to me, contained a certain number of Bytes - e.g. "a four Byte word", with 32 bits. I haven't ever seen a Byte being specifically (re?)defined at the beginning of any text, which would be necessary if it could ever be taken as other than 8 bits.

    I still have a 'facts' booklet (July 1966) for an Elliott 803 computer which used 19 and 39 bit words, with 5 hole punched paper tape and 35mm sound film as the bulk storage medium. Moreover, the instructions were specified in Octal. There is no mention of the word "Byte" in the whole booklet. A real blast from the past. The department soon got into DEC mini computers and then the Byte appeared over my horizon. Amazing that this long word architecture went away, only to come back again.
     
  14. Sep 5, 2011 #13

    sophiecentaur

    Science Advisor
    Gold Member

    Most off-the-shelf memory basically uses 8-bit storage. For the processor to use it, it is necessary to take the data out in four, eight, or some other number of dollops. This is done by what I think they call a memory controller, which, I guess, assembles data from a number of different locations. This presents the processor with already-assembled words, saving it a lot of time.

    This really isn't the place to go into the details of computer design - you need a big book and a hot towel for this, I think! Beware all the jargon and acronyms. :smile:
     
  15. Sep 5, 2011 #14

    phinds

    Gold Member
    2016 Award

    I agree w/ all of that, BUT ... mainframes, at least the early ones, did not use 8-bit chunks. The byte was the same as the word ... generally 32 bits, but sometimes more. Mini-computers also did not use 8-bit chunks; the byte was the same as the word (16 bits). I don't know what modern mainframes use.

    I can't put my hands on a formal definition of "byte" but I recall from my early days, it WAS specifically defined as "the smallest addressable chunk of memory" and over time, as that quantity settled in on 8 bits for computers that most of the world is familiar with, "byte" came to take on the de facto meaning of 8 bits and I have had folks argue w/ me vehemently that it IS ONLY 8 BITS.

    treehouse, let me add this about the architecture:

    A computer has what's called a CPU (central processing unit), an ALU (arithmetic/logic unit) and an MAR (memory address register), among other things. The MAR is filled with a memory address and the fetch electronics gets the contents of that address. If the byte size is smaller than the word size, then multiple fetches are performed. This fills up a register in the CPU with the contents of memory. If that is an instruction, it goes to the instruction decoding register, which has a whole ton of logic on its output to decipher the instruction and do stuff. If it's a data value, it goes to the ALU (or possibly to a specific register if so directed) where it can be used in subsequent data manipulations.
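    (If it helps, here is a deliberately toy fetch/decode/execute loop in C. The opcodes are invented purely for illustration and don't correspond to any real instruction set.)

    [code]
    #include <stdint.h>
    #include <stdio.h>

    /* Invented one-byte instruction set, for illustration only:
       0x01 n  -> load n into the accumulator
       0x02 n  -> add n to the accumulator
       0x00    -> halt */
    int main(void)
    {
        uint8_t memory[] = { 0x01, 5, 0x02, 7, 0x00 };
        unsigned mar = 0;        /* memory address register */
        uint8_t  acc = 0;        /* accumulator (stand-in for the ALU's result) */

        for (;;) {
            uint8_t opcode = memory[mar++];      /* fetch the instruction byte */
            if (opcode == 0x00) break;           /* halt */
            uint8_t operand = memory[mar++];     /* second fetch: the data byte */
            switch (opcode) {                    /* decode + execute */
            case 0x01: acc = operand;  break;
            case 0x02: acc += operand; break;
            }
        }
        printf("accumulator = %u\n", (unsigned)acc);   /* prints 12 */
        return 0;
    }
    [/code]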

    It's not really as complicated as it might sound, and it's tons of fun to learn, but as sophiecentaur said, you DO need a big book and a hot towel (I would have said headache pills) because even though it's fun, it does get a bit weird until you get it all down in your head.
     
  16. Sep 5, 2011 #15

    sophiecentaur

    Science Advisor
    Gold Member

    Ah. I see we are using the Anglo Saxon version of ye worde Byte. lol
    The 803 was the nearest I came to actual computer hardware in those days. I don't think they had even heard of the word.
     
  17. Sep 5, 2011 #16

    phinds

    Gold Member
    2016 Award

    Actually, as I recall, the word byte really was NOT used much at all in the early days, as it was really only of interest to us specialists. I don't think programmers, for example, used it. I designed computer hardware so was heavily into hardware architecture. I think it was only with the advent of personal computers that it started being used and because they all used an 8-bit byte, people started early on in the PC era to use the word as synonymous with 8 bits.
     
  18. Sep 5, 2011 #17
    What architecture makes it such that the word size is all that is processed at a time?
     
  19. Sep 5, 2011 #18

    rbj


    there are a few things correct about what phinds sez here and a few things that are not. if we're referring to a generic processor, the registers inside may be of different widths. some registers may be twice as wide as others.

    it also matters what data one is referring to. some DSP and RISC chips have opcode word size that is larger than the data word size.

    the most consistent definition that is accurate in a wide variety of situations is that the word size of a processor is the width of the ALU (arithmetic logic unit). this is not always the number of bits that the processor processes "at a time". some embedded chips may have an internal ALU word of 32 bits, but have a data bus width of 8 bits, so the chip has to access four 8-bit words (at different times at the nanosecond scale) before it can process them.
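    (a small C sketch of the same idea from the other direction: if the ALU is only 8 bits wide, a 32-bit add has to be done as four byte-wide operations with the carry propagated by hand, which is one reason ALU width is a reasonable measure of what gets processed "at a time". purely illustrative, not any particular chip.)

    [code]
    #include <stdint.h>
    #include <stdio.h>

    /* Add two 32-bit values the way an 8-bit ALU would have to:
       one byte at a time, propagating the carry by hand. */
    static uint32_t add32_with_8bit_alu(uint32_t a, uint32_t b)
    {
        uint32_t result = 0;
        unsigned carry = 0;
        for (int i = 0; i < 4; i++) {
            unsigned byte_a = (a >> (8 * i)) & 0xFF;
            unsigned byte_b = (b >> (8 * i)) & 0xFF;
            unsigned sum = byte_a + byte_b + carry;   /* one 8-bit ALU operation */
            carry = sum >> 8;                         /* carry out to the next byte */
            result |= (uint32_t)(sum & 0xFF) << (8 * i);
        }
        return result;
    }

    int main(void)
    {
        /* 0x0001FFFF + 1 = 0x00020000, done in four byte-wide steps */
        printf("0x%08X\n", (unsigned)add32_with_8bit_alu(0x0001FFFFu, 1u));
        return 0;
    }
    [/code]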

    i, personally, like it if a chip has ALU width, data bus width, address bus width, and opcode width all of the same width. i really hate fiddling with efficiently stuffing 48-bit opcodes into 32-bit wide memory spaces.

    phinds, your knowledge is anachronistic. nowadays, nearly a half century on from 1962, if you buy or spec any chip, hard drive, thumb drive, blank CD, or whatever storage device, and the memory capacity is expressed in "bytes" (often they spec it in bits), then after any 8-to-6 (or whatever) coding scheme, the number of bits of storage available to the user is always 8 times the number of bytes in the spec. that is always the case, and if phinds thinks differently, his knowledge of the usage of the word is a few decades out of date.

    to his (or her) credit:

    from wikipedia:
    note the term "ubiquitous acceptance".
     
  20. Sep 5, 2011 #19

    sophiecentaur

    Science Advisor
    Gold Member

    I don't think you are going to get a suitable answer here, to be honest. Some of us are taking trips down memory lane and others of us are in the modern day. What you need is to look all this up (there must be thousands of books that are good enough to make a start with - or even Wiki). The whole business of computer architecture is far too complicated for a 'question and answer' method of learning. I / we have no idea how much you actually know, so any answers can't be tailored to your needs, I fear. You could waste a lot of your own time trying to do it this way.
    Specific, rather than open-ended, questions are better suited to this sort of forum.
     
  21. Sep 5, 2011 #20

    phinds

    Gold Member
    2016 Award

    rbj, you're correct of course, I just didn't even want to bring in DSPs and RISC-type processors. Then you have the modified Harvard architecture, where the data memory and the instruction memory are not on the same address bus. All this stuff is way too complicated for the OP.

    I agree that 8 bits is the de facto std for "byte" (and folks now would think it wrong to use it any other way) but as the wiki points out, that's not really the definition.
     