A hardware interrupt (for example, one raised by a device that a driver manages) suspends the currently executing thread, and the program state (registers, user-mode stack pointer, and program counter) is saved - usually to the interrupt stack.
The function that then runs is called an interrupt handler (or interrupt service routine). These live in microcontroller firmware, device drivers, and parts of the OS kernel.
There are two classic ways for CPUs to receive input. One is to allow hardware signals to "interrupt" the operating system, which causes execution to jump over to an "interrupt subroutine" for a moment, then return to the operating system. The interrupt subroutine will typically do something small and quick, such as placing input characters in a buffer.
The alternative to interrupt-driven input is polling. With a microprocessor, it may be possible to "turn off" most interrupts and program the operating system differently, so that it runs a loop which periodically polls each input pin to see its state. In a polling design, the logic that would have been in the interrupt service routine sits right inside the OS loop and executes immediately after each pin is polled.
Polling is rarely used. We had to use it in telecom, sometimes, when handling really fast bit streams in fiber optics communications. That's because we had to make sure that nothing could "interrupt" the timely handling of bit pushing. The risk of an interrupt-driven system is that interrupts are prioritized, so if a high-priority interrupt fires too frequently, the lower-priority interrupts will never execute (a starvation problem). Polling gives you a way to guarantee that all input pins are treated with approximately the same priority, simply by making sure that whatever is done at each pin does not take too long before you move on to the next pin.
So...think of a classroom where anyone is allowed to speak. Some people will interrupt a lot, and quieter folks will get little chance to say anything. Then think of a round-robin arrangement, where everyone in turn is given a chance to speak (for, say, a maximum of one minute). The difference is easy to grasp in that scenario, but in my experience, students have had trouble carrying the same distinction over to microprocessors.