"Okay, this makes a bit more sense (no pun intended)."

Ah, ok. See below.
Yes, but the "wiring" is a lot more complicated, at least in all modern computers (where "modern" means "since about 1970 or so").
As far as "translating" things to and from bits is concerned, though, that happens very close to your typing and seeing things on your screen. Your keyboard generates a sequence of bits for each key you type and sends them to the computer, and your monitor translates sequences of bits into what appears on your screen. Everything in between is bits, and you can think about them abstractly without even having to know the details of how they are physically represented in things like transistors, capacitors, and currents in wires.

In fact, it's virtually impossible for a single human mind to grasp what's going on in a modern computer without thinking about the bits abstractly. Things at the level of transistors and capacitors and currents in wires are far too complicated to comprehend at that level while also comprehending how they connect to the keystrokes you type and the things you see on your screen. You have to abstract away the actual physical hardware and focus on the bits (and indeed on things at even higher levels of abstraction) if you want to understand what's going on at the level of keystrokes and images on screens.
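To make that bit-level view concrete, here's a minimal sketch in Python. The keyboard and monitor functions are purely illustrative stand-ins, not any real hardware interface; the point is just that everything "in between" can be reasoned about as sequences of bits:

```python
# A sketch of the keyboard-to-screen path viewed purely at the
# bit level, ignoring transistors and voltages entirely. These
# function names are illustrative, not any real hardware API.

def keyboard(key: str) -> str:
    """Pretend keyboard: each keypress becomes a sequence of bits."""
    return format(ord(key), '08b')

def monitor(bits: str) -> str:
    """Pretend monitor: turns a sequence of bits back into a symbol."""
    return chr(int(bits, 2))

bits = keyboard('Q')
print(bits)           # '01010001' -- what travels "in between"
print(monitor(bits))  # 'Q' -- what you see on the screen
```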
The specific sequences of bits generated by the keyboard are somehow related to the ASCII and Unicode stuff that I've been reading about in my Intro to Computer Science text? Those are the standardized codes for each symbol or something, right?
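In other words, each symbol gets a standardized number. A minimal Python sketch of that idea, using the built-in ord(), chr(), and str.encode() (assuming the text is Unicode encoded as UTF-8):

```python
# A minimal sketch of "a standardized code for each symbol".
# ASCII assigns the numbers 0-127; Unicode extends the same
# idea to essentially every symbol in every writing system.

print(ord('A'))             # 65 -- the code point for 'A'
print(chr(65))              # 'A' -- and back again

# Beyond ASCII, code points get larger, and an encoding such as
# UTF-8 decides which actual bytes (bit patterns) represent them:
print(ord('é'))             # 233
print('é'.encode('utf-8'))  # b'\xc3\xa9' -- two bytes under UTF-8
```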
The bits, represented by voltages and currents in the wiring, somehow reach the monitor, and the hardware is designed so that the electricity makes the display material emit light in the appropriate places on the screen, or something of this nature?
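The monitor side can be pictured the same way: somewhere in memory there is a block of bits saying which spots on the screen should light up. Here's a toy sketch of that idea; real displays involve color channels, refresh timing, and GPUs, so this is only the core mapping from bits to lit positions, not how any actual monitor is driven:

```python
# A toy "framebuffer": a grid of bits where 1 means a lit spot
# and 0 means a dark one. Real display hardware is far more
# involved, but the bits-to-pixels mapping looks like this.

WIDTH, HEIGHT = 8, 5
framebuffer = [[0] * WIDTH for _ in range(HEIGHT)]

# Set some bits -- say, a rough letter 'T'.
for x in range(WIDTH):
    framebuffer[0][x] = 1
for y in range(HEIGHT):
    framebuffer[y][WIDTH // 2] = 1

# "Scan out" the bits: a 1 becomes a lit spot, a 0 stays dark.
for row in framebuffer:
    print(''.join('█' if bit else '·' for bit in row))
```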