Book Recommendation? - How computers / processors work

AI Thread Summary
The discussion centers on finding book recommendations for understanding the relationship between simple circuits, processors, and high-level programming. The individual seeking advice has a background in basic programming and electronics but feels lost in the complexities of how these elements connect. Recommendations include introductory texts on microcontrollers and assembly programming, which help bridge the gap between high-level code and hardware. A specific suggestion is "Computer Organization and Design, Fourth Edition: The Hardware/Software Interface," which provides insights into the hardware-software relationship. Additionally, studying basic microcontrollers like PIC is advised to gain foundational knowledge before diving into more advanced computer architecture literature.
zhermes

I'm familiar with basic, high-level programming (e.g. C) and the principles behind compilation, operating systems, etc. I'm also familiar with the basics of simple electronics, e.g. circuits, transistors, simple logic gates, etc.

The space between these levels is a complete mystery, a magical black box.

Does anyone have a book recommendation to fill in this (massive) gap?

I'd like to understand how simple circuits are built into processors, processors into computers, and perhaps how the high-level software code is translated into lower-level hardware-minded code.

I'm a physics graduate student, so I think I can handle a good amount of technical language and detail -- but perhaps not a full EE or CE level of it.

Thanks for your recommendations!
 


I'd recommend a good intro-to-microcontrollers or intro-to-assembly-programming textbook. The abstraction starts disappearing when you have to figure out which register or memory location you have to shove bits into in order to perform various tasks--and when you end up banging your head against the wall trying to debug an innocuous-looking, tiny program.
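For a concrete taste of what that looks like, here's a minimal C sketch of the register-level style of code such a book walks you through. The addresses and bit positions are invented for illustration (the chip's datasheet or vendor header defines the real memory map), but the pattern of writing bits to a fixed address to make hardware do something is the real thing:

Code:
#include <stdint.h>

/* Hypothetical memory-mapped I/O registers. The addresses and bit
   layout here are made up for this example; a real part's datasheet
   (or vendor header file) defines the actual map. */
#define GPIO_DIR (*(volatile uint8_t *)0x4000u) /* direction: 1 = output */
#define GPIO_OUT (*(volatile uint8_t *)0x4001u) /* output latch */

#define LED_PIN (1u << 3) /* pretend an LED hangs off bit 3 */

int main(void)
{
    GPIO_DIR |= LED_PIN; /* configure the LED pin as an output */

    for (;;) {
        GPIO_OUT ^= LED_PIN; /* flip the bit and the LED toggles */
        for (volatile uint32_t i = 0; i < 100000u; i++) {
            /* crude busy-wait delay */
        }
    }
}

There's no operating system or driver layer in between: that volatile write is the entire hardware interface, which is exactly where the black box starts to open up.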

Unfortunately, I have to leave these recommendations generic, since I don't have any specific ones. (When I was learning, I used Harman's The Motorola MC68332 Microcontroller: Product Design, Assembly Language Programming and Interfacing. It's nearly 20 years old, but the 332 and its descendants live on today, powering a great number of embedded electronic devices.)
 


Studying basic microcontrollers like the PIC will give you an initial insight into the topic. Afterwards, pick up any computer architecture book for more information.
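For the other half of the question, how high-level code becomes machine-level code, a quick experiment is to compile a trivial C function and read the assembly the compiler produces. A sketch (the commented assembly is illustrative of what an x86-64 compiler might emit; exact output depends on compiler, target, and optimization flags):

Code:
/* add.c -- try: gcc -O1 -S add.c, then read the generated add.s */
int add_scaled(int a, int b)
{
    return a + 4 * b;
}

/* An x86-64 compiler might reduce the whole function to something
   like this (illustrative, not exact output):

       add_scaled:
           lea  eax, [rdi + rsi*4]   ; a arrives in edi, b in esi;
           ret                       ; one instruction computes a + b*4
*/

Watching one C expression collapse into a single instruction (or balloon into dozens for anything less trivial) is the translation step in question; a computer architecture book then covers how the processor executes those instructions using the logic gates already familiar from basic electronics.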
 