How Does a Pocket Calculator Convert Binary to Decimal?

  • Thread starter: seetherulez
  • Tags: Calculators

Discussion Overview

The discussion revolves around how pocket calculators convert binary data into decimal format. Participants explore the underlying technology of calculators, comparing simpler models to more advanced scientific and graphing calculators, and consider the role of microprocessors and software in this conversion process.

Discussion Character

  • Exploratory
  • Technical explanation
  • Conceptual clarification

Main Points Raised

  • One participant inquires about the mechanisms used by calculators to convert binary to decimal, suggesting that calculators function similarly to computers with microprocessors.
  • Another participant explains that low-end calculators may contain a simple integrated circuit with hardcoded logic for binary-decimal conversion, while more expensive models likely have a microprocessor and ROM for more complex operations.
  • A later reply emphasizes that the complexity of the calculator determines its components, noting that cheaper models may consist of basic logic gates, whereas scientific calculators have more advanced computing capabilities.
  • Another contribution clarifies that even simple calculators have a CPU integrated with other components, and the software is typically fixed in ROM, indicating a fundamental similarity to computers.
  • Participants discuss the historical context of calculators leading to the development of modern computers, highlighting the evolution of technology in this area.

Areas of Agreement / Disagreement

Participants generally agree that the complexity of a calculator influences its design and functionality, with some suggesting that higher-end calculators are essentially computers. However, there is no consensus on the specifics of how binary to decimal conversion is implemented across different models.

Contextual Notes

Some limitations include the lack of detailed technical specifications for various calculator models, and the discussion does not resolve the exact methods used for binary-decimal conversion in all types of calculators.

Who May Find This Useful

This discussion may be of interest to individuals curious about the technology behind calculators, those studying computer architecture, or anyone interested in the historical development of computing devices.

seetherulez
Hello, I'm new here, so please help me out.
Question: what does a pocket calculator use to convert binary into decimal data?
I've always been curious about this.
A calculator is nothing but a simple computer, right? Therefore its core component is a microprocessor, which takes instructions and processes data in binary, so what is doing all the rest?
Obviously it needs some kind of programming language like BASIC to convert the computer's logic into decimals, and it needs some kind of interrupt handler for the keypad and the LCD screen. In a desktop computer this is done by the BIOS, so my question is: does a calculator
have a BIOS containing actual software for all this, or is it hardwired into the CPU, or what?
 
Welcome to PhysicsForums!

Some calculators are just highly-integrated computers. A bit of history: modern computers were made possible by the calculator boom of the 70s. It paved the way for the general microprocessor, and ramped up their frequencies. When this collapsed (how many calculators do scientific / engineering types need?) the processors (and some of the other associated ICs) were well-positioned to enable personal computers (not just 'the' PC) to come into being.

In any case, when you take apart your low-end calculators (the ones that only have a single-line display and 5 or 6 buttons in addition to the numeric pad) you'll find an epoxy blob and not much else. That blob probably contains a little sliver of silicon with the display driver, keypad interpreter, and a little bit of logic which can perform the basic math functions on the numbers you punch in, and not much more (why bother wasting space / money / time on any more?) The conversion from binary to decimal isn't particularly difficult, and is probably hardcoded into that little sliver.
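To make "hardcoded into that little sliver" concrete, here is a minimal sketch of one gate-friendly way such conversion is commonly done in hardwired logic: the "double dabble" (shift-and-add-3) algorithm, which produces BCD digits for a display driver without needing any division hardware. The function name and widths are our own illustrative choices, not taken from any real calculator IC.

```c
#include <stdint.h>

/* Illustrative "double dabble" conversion: turn an 8-bit binary value
   into three packed BCD digits (hundreds, tens, ones) using only
   shifts, compares, and adds -- the kind of operation that is cheap
   to lay out as fixed logic on a tiny die. */
static uint32_t double_dabble8(uint8_t value)
{
    uint32_t scratch = value;  /* BCD digits accumulate above bit 7 */
    for (int i = 0; i < 8; i++) {
        /* Before each shift, add 3 to any BCD digit that is >= 5,
           so the shift carries correctly into the next digit. */
        if (((scratch >> 8)  & 0xF) >= 5) scratch += 3u << 8;   /* ones */
        if (((scratch >> 12) & 0xF) >= 5) scratch += 3u << 12;  /* tens */
        if (((scratch >> 16) & 0xF) >= 5) scratch += 3u << 16;  /* hundreds */
        scratch <<= 1;
    }
    return (scratch >> 8) & 0xFFF;  /* three packed BCD digits */
}
```

For example, an input of 255 comes out as the packed BCD value 0x255, one nibble per decimal digit, ready to feed a seven-segment decoder.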

Your more expensive scientific calculator probably contains something closer to a microprocessor, along with a ROM (Read Only Memory) containing the program necessary for the calculator to carry out its operations. Yes, there's probably a routine which takes care of the decimal-binary conversion.
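Such a ROM routine, in software rather than fixed logic, typically boils down to repeated division by 10. A hypothetical sketch (the function name is ours, not from any actual firmware):

```c
#include <string.h>

/* Illustrative software binary-to-decimal routine of the sort a
   calculator ROM might hold: divide by 10 repeatedly to peel off
   decimal digits (least significant first), then reverse them into
   display order as an ASCII string. */
static void to_decimal_string(unsigned value, char *out)
{
    char tmp[11];  /* enough digits for a 32-bit unsigned value */
    int n = 0;
    do {
        tmp[n++] = '0' + (char)(value % 10);  /* next decimal digit */
        value /= 10;
    } while (value != 0);
    for (int i = 0; i < n; i++)  /* reverse into most-significant-first order */
        out[i] = tmp[n - 1 - i];
    out[n] = '\0';
}
```

Calling it with 1234 fills the buffer with the string "1234"; a real ROM would then map each character to segment patterns for the display.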

Your high-end graphing calculators frequently use an off-the-shelf microprocessor (fun fact: the TI-89 uses a slightly updated version of the 68000 used in the original Macintosh), have RAM, and can accept user programs, or firmware updating. The firmware will have some routine or other to convert the binary data to decimal (or hexadecimal, or whatever) for display purposes.

The BIOS in your computer doesn't do much aside from initializing your system (and, well, providing a program that lets you change a bunch of these parameters). I believe interrupts are usually handled by the CPU itself: an interrupt is signaled on the system bus, masking determines whether or not it gets serviced, the CPU sets aside what it's doing to run the handler, then resumes where it left off afterwards.

In the above discussion, the CPU is often treated as if it were some stand-alone unit. Some microprocessors also have on-board flash memory, which lets you reduce your count by one or two ICs (you don't need an external chip to store your programs / firmware on). However, there usually isn't very much of it, since it makes the chip more complicated and larger, which in turn makes it more susceptible to manufacturing defects and more expensive.
 
OK, so basically it depends on how expensive the calculator is: a cheap calc like you get from someone in a gift basket is just a couple of logic gates, but a scientific calc actually has the basics of a computer, right?
Thanks for the reply, btw.
 
A simple calculator still has a CPU; it's just integrated onto the same chip as the keypad and screen interfaces. The software is probably also burned into a ROM on the same chip so it can't be changed.
But fundamentally it has the same components as a computer, except possibly lacking any writable memory.
The first CPUs (the Intel 4004) were built to be used in calculators, replacing earlier designs based on plain logic gates.

As MATLABdude said, at the higher end there is really no difference between a calculator and a computer.
 
