
How long until transistors become completely obsolete?

  1. Mar 26, 2013 #1
    Devices today are getting smaller and smaller, which makes circuit layouts more complicated and harder to design. And since microelectronics is ultimately limited by the size of atoms, are people today developing a replacement for the transistor, just as the transistor once replaced the vacuum tube? Or is BJT/FET transistor theory still widely used by EEs in the semiconductor industry?
     
  3. Mar 26, 2013 #2

    Bobbywhy

    User Avatar
    Gold Member

    Transistors become completely obsolete? Please avoid predicting the future. Some of us still use vacuum tube audio amplifiers.

    Bobbywhy
     
  4. Mar 26, 2013 #3
    Transistors are typically used in two modes - digital (as switches) or analog. While quantum computers may someday replace the digital/data-switching functions, the analog side is still valuable, and there is not much on the horizon that I know of. Of course, I write this from the power electronics field - we have 200 A "switches" on a single chip - I cannot see those being replaced by quantum devices. I would like to say never, but the only good use for "never" is "never say never"......
     
  5. Mar 26, 2013 #4
    To me it seems unlikely that transistors are going to become entirely obsolete. It seems much more likely that the technology will evolve and adapt.
     
  6. Mar 26, 2013 #5
    Will there be any need for a quantum computer to run a coke machine when a transistor-based circuit that does the job can be built for a few dollars?

    When, then, do you think transistors will become obsolete?
     
  7. Mar 26, 2013 #6

    rbj

    User Avatar

    what would you replace transistors with? what device would you use to either switch currents (transistors used in digital circuits) or act as a continuous valve to current (transistors used in analog circuits)?
     
  8. Mar 26, 2013 #7

    mfb

    User Avatar
    2016 Award

    Staff: Mentor

    Even quantum computers are expected to run on conventional electronics (i.e., transistors); quantum computing is just an additional feature available to them. You don't want to calculate 2+3 with quantum logic, and many operations of a computer are exactly that simple.
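    As an illustration of how simple such operations are in conventional logic, here is a hedged sketch in Python (the gate and adder functions are invented for illustration, not any real hardware description): addition reduces entirely to NAND gates, each of which is just four transistors in CMOS.

```python
# Illustrative sketch: classical addition built from NAND gates alone.
# In CMOS, each NAND gate is four transistors.

def nand(a, b):
    return 0 if (a and b) else 1

def xor(a, b):
    # the standard 4-NAND XOR construction
    n = nand(a, b)
    return nand(nand(a, n), nand(b, n))

def full_adder(a, b, cin):
    s1 = xor(a, b)
    total = xor(s1, cin)
    # carry-out = (a AND b) OR (s1 AND cin), expressed with NANDs
    cout = nand(nand(a, b), nand(s1, cin))
    return total, cout

def add(x, y, width=8):
    # ripple-carry adder over `width` bits
    carry, out = 0, 0
    for i in range(width):
        s, carry = full_adder((x >> i) & 1, (y >> i) & 1, carry)
        out |= s << i
    return out

print(add(2, 3))  # 5, computed entirely with simulated NAND gates
```

    Real ALUs use faster carry schemes, but the point stands: 2+3 is a handful of transistor switches, and no quantum logic is wanted for it.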
     
  9. Mar 26, 2013 #8

    vk6kro

    User Avatar
    Science Advisor

    Transistors are still produced and used by the millions, but improved versions are already available.

    Insulated-gate bipolar transistors (IGBTs) are available that have a very high input impedance and operate at high voltages. They also have protective diodes built into the package, which makes them easier to use in circuits.
    They can drive inductive loads without external protective diodes.

    So, they are starting to blur the distinction between transistors and integrated circuits. Maybe this is the future of transistors.
     
  10. Mar 27, 2013 #9
    People are working on modifying how transistors are made (see for example research on single-electron transistors or graphene-based devices) but the concept of transistor action is not going away in the foreseeable future.

    The integrated electronics field is absolutely dominated by CMOS FETs. So yes, that theory is used quite a bit.
     
  11. Mar 27, 2013 #10

    jim hardy

    User Avatar
    Science Advisor
    Gold Member
    2016 Award

    "Transistor" was once a trademark like "Coke" but has become a general term.

    Its origin is practical:
    http://physics.about.com/od/electroniccomponents/f/transistor.htm

    So i'd wager as they come up with new ways to control current the name will stay the same.

    Back when I was following new developments there was talk of sapphire or diamond substrate instead of silicon.

    IBM seems to be working on a liquid approach!
    http://www.computerworld.com/s/article/9237823/IBM_moves_toward_post_silicon_transistor

    To an unsophisticated end user like me they're all "valves".
     
  12. Mar 27, 2013 #11
    Both of these are used today, but they are highly niche products.

    Silicon-on-sapphire is a variant of silicon-on-insulator (SOI). It basically insulates the active semiconductor from the bulk. So, charged particles interacting in the bulk do not upset the circuits in the epitaxial layer of silicon above the sapphire layer. In industry SOI is typically made with a buried silicon dioxide layer instead of sapphire these days. Some sapphire processes still exist for defense or space applications.

    Diamond is kind of the extreme version of sapphire. It is extremely radiation hard and can be used as a detector for high-energy particles. As far as I know diamond is only used in high-energy physics and maybe in some extreme environment applications like oil exploration.

    While a sapphire IC can have its readout integrated in the epitaxial layer, as far as I know diamond detectors are passive and need a readout IC (usually implemented in silicon).

    Just an FYI in case you are interested.
     
  13. Mar 27, 2013 #12

    jim hardy

    User Avatar
    Science Advisor
    Gold Member
    2016 Award

    Thanks, Carl. Yes, I am curious to a fault. Will do some late night perusing. I didn't know they'd done anything beyond research.

    I'd guess that hardened stuff finds use in weapons world.
    I once conversed with an old timer at TI who opened my eyes to that field.
    I come from the power industry, where ten rads per hour is high. He was more accustomed to megarads per microsecond...
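    For scale, here is a rough back-of-envelope comparison of those two regimes (rounded, illustrative numbers only):

```python
# Rough orders of magnitude only: comparing the two dose-rate regimes
power_plant = 10 / 3600    # 10 rad/hour, converted to rad/s
weapons = 1e6 / 1e-6       # 1 Mrad per microsecond, in rad/s

ratio = weapons / power_plant
print(f"{ratio:.0e}")  # 4e+14: roughly fourteen orders of magnitude apart
```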
     
  14. Mar 27, 2013 #13
    Here's an interesting review of diamond detectors for HEP.

    http://www.ifh.de/~akg/phys/tapper.pdf

    You could be right. I work in an unclassified area where we are typically more worried about total dose than dynamic effects.

    Interestingly enough, deep submicron CMOS is solving a lot of industry's total dose problems by virtue of the thin gate oxides. Total dose of radiation is bad for CMOS devices because charge gets trapped in the oxide and changes the threshold of the device. In deep submicron the oxide is so thin there is a significant DC gate current due to tunneling. This is usually a big headache but in the case of total dose it's good because it drains off these charges that are stuck in the oxide.
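    As a hedged back-of-envelope sketch of that scaling (all numbers below are assumed for illustration, not measured data): if the trapped charge grows with the oxide volume, the threshold shift goes roughly as the square of the oxide thickness, so a thin deep-submicron gate oxide shifts far less than an older thick one, even before tunneling drains the charge away.

```python
# Illustrative model only: delta_Vth = q * N_areal * t_ox / eps_ox,
# with the areal trapped charge assumed proportional to oxide volume.
# The trap density below is an assumption for the sketch, not a measurement.

Q_E = 1.602e-19           # electron charge, C
EPS_OX = 3.9 * 8.854e-14  # SiO2 permittivity, F/cm

def delta_vth(t_ox_nm, trap_density_cm3=1e17):
    """Rough threshold-voltage shift (V) from oxide-trapped charge."""
    t_cm = t_ox_nm * 1e-7
    n_areal = trap_density_cm3 * t_cm       # trapped charge scales with oxide volume
    return Q_E * n_areal * t_cm / EPS_OX    # net shift goes as t_ox**2

# a 2 nm oxide vs a 100 nm oxide: shift drops by (100/2)**2 = 2500x
print(round(delta_vth(100) / delta_vth(2)))  # 2500
```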

    Fascinating stuff.
     
  15. Mar 27, 2013 #14

    mfb

    User Avatar
    2016 Award

    Staff: Mentor

    Silicon carbide (SiC) and gallium nitride (GaN) are other options as well. And then there are fancy things like a pure diamond switch. The material is different, but the basic principle is the same.

    What order of magnitude of irradiation do you have?
     
  16. Mar 27, 2013 #15

    jim hardy

    User Avatar
    Science Advisor
    Gold Member
    2016 Award

    Thank you Carl i'll get to that article.

    The power plant is largely low-tech 1960s stuff.
    But I bet you'd have enjoyed one experiment I ran -

    I wanted to be sure a typical CMOS microprocessor based PLC would be okay after the hundred R dose it might get in an accident.

    So I took my TI99 home computer to our health physics guys.
    They irradiated it to 1000 rads for me.
    It ran fine during and after the test, so I slept well.

    Might that be the "annealing" effect my friend spoke of?
    (This was all twenty plus years ago - excuse fuzzy memories)


    old jim
     
  17. Mar 27, 2013 #16
    The sensors I'm working on now see on the order of 1-10 Mrad/year.

    People in my group (not me) have worked on sensors with closer to 60 or 70 Mrad total dose. These devices are working in a really bad neighborhood. :)
     
  18. Mar 27, 2013 #17
    Hi Jim,

    What I was talking about is not annealing. Annealing in silicon means heating it up to repair lattice damage (interstitials, dangling bonds, and the like). Annealing *is* used to repair mechanical lattice damage due to ionizing radiation; that could be what your friend was talking about. The oxide charging is a different beast, and it can only be drained through tunneling as far as I know.

    The sensor I'm working on now is actually heated in an oven every few hours or days to repair damage to the bulk. The gate charging is repaired by leakage current.
     
  19. Mar 27, 2013 #18
    That is an interesting test. Having it run during the test checks for "single event upsets", where a particle changes the value in a flip-flop or RAM cell.

    We're planning a test in the next 6 months or so to have a chip operate in the core of a reactor to check its tolerance to thermal neutrons. I'm not working on the test (I'd love to visit the reactor), but hearing about the challenging instrumentation is fun.
     
  20. Mar 27, 2013 #19

    mfb

    User Avatar
    2016 Award

    Staff: Mentor

    That is a range I am interested in, too. Not for sensors, however.

    Don't you get problems with reverse annealing?


    Hmm, this is getting off-topic.
     
  21. Mar 27, 2013 #20
    Indeed.

    I'm not involved in the long-term application of the sensors but I do know they are cooled to mitigate depletion voltage issues due to reverse annealing. I think that may be the lifetime limit of the sensor but I'm not 100% sure.

    In case you were referring to the neutron test, we are using heavily doped substrates so we don't expect any significant type inversion (based on the expected neutron flux).
     