How long until transistors become completely obsolete?

  • Thread starter: Equilibrium
  • Tags: Transistors
AI Thread Summary
Transistors remain integral to both digital and analog applications, with advances like insulated gate bipolar transistors extending their capabilities. While quantum computing holds potential for future developments, the fundamental principles of transistor action remain relevant and widely used in the semiconductor industry. New materials such as graphene and diamond substrates are being explored, but the core concept of the transistor is unlikely to become obsolete; the technology is expected to adapt rather than disappear.
Equilibrium
Devices today are getting smaller and smaller, which makes circuit layouts more complicated and harder to design. And since microelectronics is ultimately limited by the size of atoms, are people today developing a replacement for the transistor, just as the transistor once replaced the vacuum tube? Or is BJT/FET transistor theory still widely used by EEs in the semiconductor industry?
 
Transistors become completely obsolete? Please avoid predicting the future. Some of us still use vacuum tube audio amplifiers.

Bobbywhy
 
Transistors are typically used in two modes: digital (as switches) or analog. While quantum computers may exist in the future, potentially replacing the digital/data-switching functions, the analog side is still valuable, and there is not much on the horizon that I know of. Of course, I write this from the power electronics field, where we have 200 A "switches" on a single chip; I cannot see these being replaced by quantum devices. I would like to say "never replaced", but the only good use for "never" is "never say never"...
 
To me it seems unlikely that transistors are going to become entirely obsolete. It seems much more likely that the technology will evolve and adapt.
 
Will there be any need for a quantum computer to run a Coke machine when a transistor-based circuit can be built for a few dollars?

When, then, do you think transistors will become obsolete?
 
What would you replace transistors with? What device would you use either to switch currents (transistors used in digital circuits) or to act as a continuous valve for current (transistors used in analog circuits)?
 
Even quantum computers are expected to run on conventional electronics (-> transistors); quantum computing is just an additional feature available to them. You don't want to calculate 2 + 3 with quantum logic, and many operations of a computer are as simple as that.
 
Equilibrium said:
Devices today are getting smaller and smaller, which makes circuit layouts more complicated and harder to design. And since microelectronics is ultimately limited by the size of atoms, are people today developing a replacement for the transistor, just as the transistor once replaced the vacuum tube? Or is BJT/FET transistor theory still widely used by EEs in the semiconductor industry?

Transistors are still produced and used in the millions, but improved versions are already available.

Insulated gate bipolar transistors (IGBTs) are available which have a very high input impedance and operate at high voltages. They also have protective diodes built into the package, which makes them easier to use in circuits: they can drive inductive loads without external protective diodes.
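A rough back-of-the-envelope sketch of why those built-in diodes matter (the component values here are purely hypothetical, not from this thread): interrupting current through an inductive load forces v = L * di/dt across whatever is switching it, and without a clamp diode that spike is enormous.

Code:
# Illustrative sketch only; the component values are made up.
# Switching off an inductive load forces v = L * di/dt across
# the switch, which the built-in freewheeling diode clamps.

def inductive_spike_v(inductance_h, delta_i_a, delta_t_s):
    """Estimate the voltage spike when an inductor's current is interrupted."""
    return inductance_h * delta_i_a / delta_t_s

# Interrupting 2 A through a 10 mH relay coil in about 1 microsecond:
print(inductive_spike_v(10e-3, 2.0, 1e-6))  # 20000 V if nothing clamps it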

So, they are starting to blur the distinction between transistors and integrated circuits. Maybe this is the future of transistors.
 
People are working on modifying how transistors are made (see for example research on single-electron transistors or graphene-based devices) but the concept of transistor action is not going away in the foreseeable future.

The integrated electronics field is absolutely dominated by CMOS FETs. So yes, that theory is used quite a bit.
 
  • #10
"Transistor" was once a trademark like "Coke" but has become a general term.

Its origin is practical:
Question: What is a Transistor?

Answer: At its core, a transistor is an electronic component used in a circuit to control a large amount of current or voltage with a small amount of voltage or current.
It does so by sandwiching one semiconductor between two other semiconductors. Because the current is transferred across a material that normally has high resistance (i.e. a resistor), it was [called] a "transfer-resistor" or transistor.
http://physics.about.com/od/electroniccomponents/f/transistor.htm
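As a toy illustration of that "small current controls a large current" idea (the sketch and its numbers are mine, not from the quoted article), a bipolar transistor in its forward-active region passes roughly beta times its base current:

Code:
# Illustrative first-order BJT model, not from the article above.
# In the forward-active region, collector current is roughly
# beta * base current, where beta (h_FE) is the DC current gain.

def collector_current(i_base_a, beta=100.0):
    """Return collector current (A) for a given base current (A)."""
    return beta * i_base_a

# A 50 microamp base current controls about 5 mA of collector current:
print(collector_current(50e-6))  # 0.005 A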

So I'd wager that as they come up with new ways to control current, the name will stay the same.

Back when I was following new developments there was talk of sapphire or diamond substrate instead of silicon.

IBM seems to be working on a liquid approach!
http://www.computerworld.com/s/article/9237823/IBM_moves_toward_post_silicon_transistor

To an unsophisticated end user like me they're all "valves".
 
  • #11
jim hardy said:
Back when I was following new developments there was talk of sapphire or diamond substrate instead of silicon.

Both of these are used today, but they are highly niche products.

Silicon-on-sapphire is a variant of silicon-on-insulator (SOI). It basically insulates the active semiconductor from the bulk. So, charged particles interacting in the bulk do not upset the circuits in the epitaxial layer of silicon above the sapphire layer. In industry SOI is typically made with a buried silicon dioxide layer instead of sapphire these days. Some sapphire processes still exist for defense or space applications.

Diamond is kind of the extreme version of sapphire. It is extremely radiation hard and can be used as a detector for high-energy particles. As far as I know diamond is only used in high-energy physics and maybe in some extreme environment applications like oil exploration.

While a sapphire IC can have its readout integrated in the epitaxial layer, as far as I know diamond detectors are passive and need a readout IC (usually implemented in silicon).

Just an FYI in case you are interested.
 
  • #12
carlgrace said:
Just an FYI in case you are interested.

Thanks, Carl. Yes, I am curious to a fault. Will do some late-night perusing. I didn't know they'd done anything beyond research.

carlgrace said:
As far as I know diamond is only used in high-energy physics and maybe in some extreme environment applications like oil exploration.

I'd guess that hardened stuff finds use in the weapons world.
I once conversed with an old-timer at TI who opened my eyes to that field.
I come from the power industry, where ten rads per hour is high. He was more accustomed to megarads per microsecond...
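For scale (my arithmetic, not a figure from anyone here), those two dose rates are about fourteen orders of magnitude apart:

Code:
# Pure unit conversion comparing the two dose rates mentioned above.

power_plant = 10 / 3600    # 10 rad/hr expressed in rad/s (~2.8e-3 rad/s)
weapons_env = 1e6 / 1e-6   # 1 Mrad/us expressed in rad/s (1e12 rad/s)

print(weapons_env / power_plant)  # ~3.6e14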
 
  • #13
jim hardy said:
Thanks, Carl. Yes, I am curious to a fault. Will do some late-night perusing. I didn't know they'd done anything beyond research.

Here's an interesting review of diamond detectors for HEP.

http://www.ifh.de/~akg/phys/tapper.pdf

jim hardy said:
I'd guess that hardened stuff finds use in weapons world.
I once conversed with an old timer at TI who opened my eyes to that field.
I come from power industry where ten rads per hour is high . He was more accustomed to megarads per microsecond ...

You could be right. I work in an unclassified area where we are typically more worried about total dose than dynamic effects.

Interestingly enough, deep submicron CMOS is solving a lot of industry's total dose problems by virtue of the thin gate oxides. Total dose of radiation is bad for CMOS devices because charge gets trapped in the oxide and changes the threshold of the device. In deep submicron the oxide is so thin there is a significant DC gate current due to tunneling. This is usually a big headache but in the case of total dose it's good because it drains off these charges that are stuck in the oxide.
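As a first-order illustration (the charge density below is an assumption of mine, not a measurement): the shift scales like dVth ~= -Q_ot / C_ox, and since the oxide capacitance per unit area grows as the oxide thins, the same trapped charge moves the threshold less in a deep-submicron device, even before tunneling drains it off.

Code:
# First-order illustration only; the trapped-charge density is made up.
# Trapped oxide charge shifts the MOS threshold by dVth ~= -Q_ot / C_ox,
# and C_ox (per unit area) grows as the gate oxide gets thinner.

EPS_OX = 3.45e-11  # permittivity of SiO2, F/m

def threshold_shift_v(q_trapped_c_per_m2, t_ox_m):
    """First-order threshold shift (V) from trapped oxide charge."""
    c_ox = EPS_OX / t_ox_m  # oxide capacitance per unit area, F/m^2
    return -q_trapped_c_per_m2 / c_ox

# Same trapped charge, an older 10 nm oxide vs. a 2 nm deep-submicron oxide:
print(threshold_shift_v(1e-3, 10e-9))  # about -0.29 V
print(threshold_shift_v(1e-3, 2e-9))   # about -0.06 V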

Fascinating stuff.
 
  • #14
Silicon carbide (SiC) and gallium nitride (GaN) are other options as well. And then there are fancy things like a pure diamond switch. The material is different, but the basic principle is the same.

carlgrace said:
I work in an unclassified area where we are typically more worried about total dose than dynamic effects.
What order of magnitude of irradiation do you have?
 
  • #15
Thank you, Carl, I'll get to that article.

The power plant is largely low-tech 1960s stuff.
But I bet you'd have enjoyed one experiment I ran -

I wanted to be sure a typical CMOS microprocessor based PLC would be okay after the hundred R dose it might get in an accident.

So I took my TI99 home computer to our health physics guys.
They irradiated it to 1000 rads for me.
It ran fine during and after the test, so I slept well.

carlgrace said:
Total dose of radiation is bad for CMOS devices because charge gets trapped in the oxide and changes the threshold of the device. In deep submicron the oxide is so thin there is a significant DC gate current due to tunneling. This is usually a big headache but in the case of total dose it's good because it drains off these charges that are stuck in the oxide.

Might that be the "annealing" effect my friend spoke of?
(This was all twenty-plus years ago - excuse fuzzy memories)


old jim
 
  • #16
mfb said:
What order of magnitude of irradiation do you have?

The sensors I'm working on now are on the order of 1-10 Mrad/year.

People in my group (not me) have worked on sensors with closer to 60 or 70 Mrad total dose. These devices are working in a really bad neighborhood. :)
 
  • #17
jim hardy said:
Might that be the "annealing" effect my friend spoke of?
(This was all twenty plus years ago - excuse fuzzy memories)


old jim

Hi Jim,

What I was talking about is not annealing. Annealing in silicon is heating it up to repair lattice damage (interstitials and dangling bonds and the like). Annealing *is* used to repair mechanical lattice damage due to ionizing radiation. That could be what your friend was talking about. The oxide charging is a different beast, and it can only be drained through tunneling as far as I know.

The sensor I'm working on now is actually heated in an oven every few hours or days to repair damage to the bulk. The gate charging is repaired by leakage current.
 
  • #18
jim hardy said:
Power plant is largely low tech 1960's stuff.
But I bet you'd have enjoyed one experiment I ran -

I wanted to be sure a typical CMOS microprocessor based PLC would be okay after the hundred R dose it might get in an accident.

So I took my TI99 home computer to our health physics guys.
They irradiated it to 1000 Rads for me .
It ran fine during and after the test, so I slept well.

That is an interesting test. Having it run during the test lets you look for "single-event upsets", where a particle changes the value in a flip-flop or RAM cell.

We're planning a test in the next 6 months or so to have a chip operate in the core of a reactor to check its tolerance to thermal neutrons. I'm not working on the test (I'd love to visit the reactor) but hearing about the challenging instrumentation is fun.
 
  • #19
carlgrace said:
The sensors I'm working on now are order 1 - 10 Mrad / year.

People in my group (not me) have worked on sensors with closer to 60 or 70 Mrad total dose. These devices are working in a really bad neighborhood. :)
That is a range I am interested in, too. Not for sensors, however.

carlgrace said:
The sensor I'm working on now is actually heated in an oven every few hours or days to repair damage to the bulk. The gate charging is repaired by leakage current.
Don't you get problems with reverse annealing?

Hmm, this is getting off-topic.
 
  • #20
mfb said:
That is a range I am interested in, too. Not for sensors, however.

Don't you get problems with reverse annealing?


Hmm, this is getting off-topic.

Indeed.

I'm not involved in the long-term application of the sensors but I do know they are cooled to mitigate depletion voltage issues due to reverse annealing. I think that may be the lifetime limit of the sensor but I'm not 100% sure.

In case you were referring to the neutron test, we are using heavily doped substrates so we don't expect any significant type inversion (based on the expected neutron flux).
 
  • #21
mfb said:
That is a range I am interested in, too. Not for sensors, however.

Don't you get problems with reverse annealing?


Hmm, this is getting off-topic.


It's a topic of interest to power plant instrument guys who are struggling with obsolescence.

To do a proper evaluation of suitability for replacement, one ought to be aware of a device's threshold of damage and the environment into which it is being placed...

I once tested a Rosemount I-to-P converter to destruction. It was full of DMOS circuits.
At 10,000 rads it showed a slight calibration shift.
A second 10 krads killed it.
So we didn't put it where it was planned to go, in a 10 R/hr field.

Aside from a lot of reading twenty-five years ago, that and the TI99 test are the sum total of my experience.

You two seem knowledgeable, might you start another thread and see who chimes in?
It could help the industry.


old jim
 
  • #22
I've been waiting for optical transistors since the '70s.

http://hsic.web.cs.illinois.edu/wp-content/uploads/2009/08/tranf1.gif
 