
Wired article: Physicists will soon rule Silicon Valley

  1. Jan 17, 2017 #1


    Education Advisor

    Hi everyone. I came across the following article on Wired about physicists (or I should say, physics graduates) playing an increasingly prominent role in Silicon Valley, specifically in the field of machine learning/data science.


    I read the article, and even though a few statements in it are dubious to me -- for example, it states that computer scientists don't necessarily learn linear algebra, which made my eyes roll (how can anyone graduate from a computer science program without learning linear algebra?) -- overall, I'm intrigued by the trend of physics graduates being highly valued in tech in the burgeoning area of big data/data science, in particular in neural networks and deep learning.

    I was wondering if anyone else wants to step in with comments.
  3. Jan 18, 2017 #2
    I read the article and the author seems a bit naive to me. I call it the "Michio Kaku effect," whereby some physicists think that just because they're the only person that knows what "isospin" is at a cocktail party, then that means they are the foremost authority on all branches of human thought.

    I am currently a graduate student in computer science and have recently taken a machine learning class and have studied AI, neural networks and neuroscience for many years. I am also a part-time independent student of the physical sciences. That said, apart from the fact that being a good mathematician might help in both pursuits, I really don't see the crossover. Constructing algorithms, coding, or designing deep-learning back-propagation networks doesn't seem to me to have a lot in common with gravitational wave, dark matter, and QFT research. They are different animals.

    So, I'm not sure why Wired published that article. Physicists are not going to take over Silicon Valley any more than the plumbers union is going to take over the carpenters union.
  4. Jan 18, 2017 #3


    Education Advisor

    Back when I was a graduate student in statistics, I was made aware of the connection between the computational methods developed in statistical physics and the computations required for the application of Bayesian methods for machine learning. One of my professors had written a summary of this over 20 years ago. FWIW, here is a link to a graduate class he taught back in 2011 at my alma mater, which provides a link to that summary.
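    For readers unfamiliar with the connection: Bayesian inference requires sampling from high-dimensional posterior distributions, and the standard tools for doing so -- Markov chain Monte Carlo methods such as the Metropolis algorithm -- came directly out of statistical physics. A minimal sketch of the idea (the target distribution and step size here are illustrative choices, not taken from the linked course):

    ```python
    import math
    import random

    def metropolis(log_p, x0, n_samples, step=0.5, seed=0):
        """Sample a 1-D distribution given its log-density, using the
        Metropolis algorithm (originally from statistical physics)."""
        rng = random.Random(seed)
        x, samples = x0, []
        for _ in range(n_samples):
            proposal = x + rng.gauss(0.0, step)
            # Accept with probability min(1, p(proposal)/p(x)).
            if math.log(rng.random() + 1e-300) < log_p(proposal) - log_p(x):
                x = proposal
            samples.append(x)
        return samples

    # Example: sample a standard normal "posterior" from its log-density.
    samples = metropolis(lambda x: -0.5 * x * x, x0=0.0, n_samples=20000)
    mean = sum(samples) / len(samples)
    ```

    The same accept/reject machinery that physicists use to sample Boltzmann distributions samples a Bayesian posterior here; only the log-density changes.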

  5. Jan 18, 2017 #4


    Education Advisor

    I think the problem here isn't so much the article (although I think it's a bit loaded), but rather that machine learning itself has two faces. I do think a physicist (or anyone who deals with data) has a leg up on a computer scientist when it comes to applying models to real-life data. There are certain skills acquired and polished in research that translate really well into data science. Computer scientists don't necessarily get exposure to analyzing experimental data and doing validation. Most CS folks are concerned with constructing algorithms, run times, distributions, and design. I've found that the CS people are rather exceptional at scaling code. However, I haven't found many who compare to the scientists I've hired, who are rather exceptional at building models and following the breadcrumbs. With that said, I haven't found any scientist who is remotely as good as the CS people at building resilient code.

    Naturally, I assume there are exceptions.
  6. Jan 18, 2017 #5
    I found it interesting, as I learned more advanced physics, how much it clicked with stuff I already knew.

    Special relativity was super easy for me to understand and even visualize. I can imagine the stretching out of space easily because I can project that in OpenGL. Then I saw the transformation represented as a matrix for the first time and it really clicked. OpenGL uses the exact same type of matrix math to do its projections, and furthermore, the camera is "the reference frame" and you can have many cameras that can transform into each other.
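    The parallel can be made concrete: a Lorentz boost, like an OpenGL view transform, is just a 4x4 matrix applied to 4-component vectors. A sketch in pure Python (no OpenGL dependency; the velocity value is arbitrary):

    ```python
    import math

    def lorentz_boost_x(beta):
        """4x4 Lorentz boost along x for velocity beta = v/c, acting on
        (ct, x, y, z) -- structurally like an OpenGL 4x4 view matrix."""
        gamma = 1.0 / math.sqrt(1.0 - beta * beta)
        return [
            [gamma,         -gamma * beta, 0.0, 0.0],
            [-gamma * beta,  gamma,        0.0, 0.0],
            [0.0,            0.0,          1.0, 0.0],
            [0.0,            0.0,          0.0, 1.0],
        ]

    def apply(m, v):
        """Multiply a 4x4 matrix by a 4-vector."""
        return [sum(m[i][j] * v[j] for j in range(4)) for i in range(4)]

    # Boosting into a moving frame and back is the identity, just as an
    # OpenGL camera transform composed with its inverse is.
    event = [1.0, 2.0, 3.0, 4.0]
    there = apply(lorentz_boost_x(0.6), event)
    back = apply(lorentz_boost_x(-0.6), there)
    ```

    Changing frames really is "switching cameras": compose the inverse of one boost with another.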

    I also did my graduation paper on fluid simulations. I used a programming structure I'd first heard about when programming a Doom clone as a child: a binary space partition (BSP) tree -- though for subdividing 3D space I used the related octree. I had to make the cells small to avoid artifacts. When I learned physics later, I realized I'd basically made a quantized field. So while I don't understand any of the math of quantum field theory, I can at least visualize it and understand through it why things like entanglement have to be there.
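    For anyone curious, the structure described above is roughly this: an octree node recursively splits a cubic region of space into eight octants, and the small leaf cells are what discretize ("quantize") the simulated field. A minimal sketch (the bounds and depth limit are illustrative):

    ```python
    class OctreeNode:
        """Minimal octree: recursively splits a cubic region into 8 octants.
        Leaf cells play the role of the discretized field cells."""

        def __init__(self, center, half_size, depth):
            self.center, self.half_size, self.depth = center, half_size, depth
            self.children = None  # populated on subdivision

        def subdivide(self, max_depth):
            if self.depth >= max_depth:
                return
            cx, cy, cz = self.center
            h = self.half_size / 2.0
            # One child per octant: all sign combinations of (dx, dy, dz).
            self.children = [
                OctreeNode((cx + dx * h, cy + dy * h, cz + dz * h), h, self.depth + 1)
                for dx in (-1, 1) for dy in (-1, 1) for dz in (-1, 1)
            ]
            for child in self.children:
                child.subdivide(max_depth)

        def leaf_count(self):
            if not self.children:
                return 1
            return sum(c.leaf_count() for c in self.children)

    root = OctreeNode((0.0, 0.0, 0.0), 1.0, 0)
    root.subdivide(max_depth=2)  # 8^2 = 64 leaf cells
    ```

    Making the cells smaller (a deeper `max_depth`) reduces artifacts at the cost of memory, which is the trade-off the poster mentions.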

    Also, I took linear algebra. Most everyone in my department took several years of math, we all knew each other. We definitely took stats, discrete and linear algebra, and calculus. Unless the generic math requirements are less now? It hasn't even been a decade.

    A physicist is just never going to be as good as a programmer. Over a physicist's entire career they'll write as much code as I wrote in my first year of 9-to-5 work. Why am I a better coder than most physicists? Because I have the experience equivalent of ten of their lifetimes. And they won't catch up either, because I'll maintain my pace until I retire; if a physicist did that, well, they wouldn't be a physicist anymore, they'd be a programmer. They're reading physics papers; they don't have the time or need to perfect modern event-driven, object-oriented, resource-pooled, parallel programming.
  7. Jan 18, 2017 #6
    It's not clear to me what "resilient software" means in this context, but I think the goal should be "error-free software." I know many will laugh at that idea. Unfortunately, there is a profit-driven culture in software production which says just ship it, and patch it later as needed. As many have pointed out, if we designed airplanes the way some companies design software, they would be falling out of the sky on a regular basis. In fact, people have died due to software errors. Meanwhile, the programmers keep releasing their error-laden productions. Since even error-laden software can be useful, they get away with it.

    Fortunately there are some who have been fighting this trend towards bloated and buggy software. I admire Prof. Niklaus Wirth. In his earlier university years he studied electrical engineering at ETH. Perhaps this influenced his insistence on approaching software as a serious engineering discipline. I think his books, and also his essay, A Plea For Lean Software, are extremely valuable. He is not only famous for his work on Algol, Pascal, Modula, and Oberon, but lately he has done fascinating work on computer hardware, including chip design. I think he is an excellent role model for someone wanting to become a first-rate computer scientist.

    Last edited: Jan 18, 2017
  8. Jan 19, 2017 #7


    Education Advisor

    Resilient coding is my term for defensive coding. There are a few billion people in this world; the odds that my coders can predict how all of them plan to abuse the software or websites we make are relatively low.
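    A small example of what defensive coding looks like in practice: validate untrusted input at the boundary instead of assuming callers behave. (The field name and the numeric limit below are made up for illustration.)

    ```python
    def parse_quantity(raw):
        """Defensively parse a user-supplied quantity field.
        Rejects missing, non-numeric, and out-of-range values instead of
        letting them propagate into the rest of the system."""
        if raw is None:
            raise ValueError("quantity is required")
        text = str(raw).strip()
        if not text.lstrip("-").isdigit():
            raise ValueError("quantity must be an integer")
        value = int(text)
        if not (1 <= value <= 10_000):  # arbitrary business limit
            raise ValueError("quantity out of range")
        return value
    ```

    The point is that every assumption about the input ("it's present", "it's a number", "it's a sane size") is checked explicitly, so abuse fails loudly at the edge rather than corrupting state deeper in.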
  9. Jan 19, 2017 #8


    Education Advisor

    At the risk of going off topic from my original post, I think there has long been a recognition in the academic side of computer science of the importance of building more reliable software -- the entire subfield of software engineering was built on this, with the importance of modular software design, reliability checks, etc. (Prof. Wirth is certainly among the most vocal proponents of this approach, but he is far from alone).

    The issue, as you rightly state, is the corporate culture within the software industry, where speed and profit have taken precedence over quality. The flip side of this is the response of the consumers/clients of software. To a large extent, error-laden productions are released because the consumers, for the most part, have not been demanding enough to insist upon quality (perhaps because new versions of code get produced so frequently, why bother complaining about quality?). There are certain areas where, due to safety or security concerns, reliability is paramount (I'm thinking of software intended for use in nuclear power plants, air traffic control, medical devices, etc.), but these tend to be the exceptions, unfortunately.