I am not a physicist. If you wanted to be mean, you could call me a wannabe. I have worked in computer networking in Manhattan for 20 years and hold Cisco certification. I know a thing or two, but my interests are much broader, and more topical, than many of yours. It appears there are many students here.

Just to show off: my girlfriend's father, who has sadly passed on, worked on the Manhattan Project at Los Alamos. In life he called himself an astrophysicist -- though he supposedly taught the physics of mining at a college in New Mexico, where he and his wife (my girlfriend's parents) lived. He met Feynman, and was fast friends with Hans Bethe. Once, Bethe fell ill during a party at their house and slept it off in my girlfriend's old room... strange. But he was ill. When I visited Los Alamos (yes, I know I haven't said a word about what I'm reading yet), he introduced me to an odd physicist (they all live there, keep their secrets, and pal around together) whose name I forget. He literally had a barn on his property (lots of space out there), where every Sunday he held "Critical Mass." A bit cuckoo. But he told me something, and not being a scientist, all I recall is this: critical mass is the mass necessary to sustain a fast-neutron fission chain reaction. Too little, and there's no runaway avalanche of neutrons splitting other nuclei apart exponentially -- the reaction fizzles out. At or above it, the chain reaction runs away, and the U-235 (or whatever the isotope is) releases its energy in a big flash, with much deadly radiation. The important thing is: he had calculated that amount for the first A-bomb -- the quantity of uranium you needed to shove together to get a true fission explosion. So it was critical, of course. I have lots of stories about Los Alamos, but that's another thing.

I am new here, so forgive me; I need to explain a little. (Btw, I am a member of IEEE and AAAS -- also NCTE, since I teach English!) I'm reading John Pierce's book on Information Theory, Symbols, Signals and Noise. It's an oldie but a goodie.
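For fun, the gist of that "Critical Mass" lesson fits in a toy calculation -- nothing like a real criticality computation, which involves geometry, density, and neutron cross-sections. The idea is just this: if each generation of fissions multiplies the free-neutron count by some effective factor k, then k < 1 fizzles and k > 1 runs away. The function name and the numbers below are purely my own illustration:

```python
def neutron_population(k: float, generations: int) -> float:
    """Toy model: each fission generation multiplies the
    free-neutron count by the effective factor k."""
    return k ** generations

# Subcritical (k < 1): the chain fizzles out.
print(neutron_population(0.9, 50))   # ~0.005 -- nothing happens

# Supercritical (k > 1): a runaway avalanche.
print(neutron_population(1.1, 50))   # ~117 -- and climbing fast
```

Critical mass is just the amount of material at which k crosses 1.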
I am trying to write a topical exposition on Information Theory -- the basics, but including what I know from a life in computing: binary, coding, enciphering/deciphering. I'd love to get into network theory too, but it doesn't really belong there. Hope you were at least entertained.

You guys and gals should know, and be proud, that (statistically) physics majors are among the smartest people on the planet. We graduate 20,000 or so medical doctors from our higher-ed institutions each year, but only 2,000 or so physics Ph.D.s. I don't think it has anything to do with money; when one is smart enough for physics, one is at least somewhat more interested in how the universe works than in dollars. It's because it's so damned hard! So I wish I had math ability at your level, but I wasn't trained that way (my degree is an M.S. in Education), and I couldn't handle it even if I could go back. For some reason, although I'm an expert in TCP/IP and network engineering generally, I turned my back on it after 9/11 and went to school to become an English teacher (there's a story there).

Most of what I know is from bugging my two cousins (both Ph.D. physicists doing interesting things -- well, they were; they're retired now) and from reading voraciously: anything non-fiction, all the sciences. But physics is the Queen, so of course Feynman and Einstein and Witten, and now Claude Elwood Shannon, are heroes of mine. One cousin taught calculus, btw, at a local college. He was the only guy in a 50-mile radius (we lived up in Hartsdale and Scarsdale, in Westchester) who could do it! No kidding. Stories, stories.

People, you're gifted. Don't squander it. Work hard as hell. Know what you must, backwards and forwards. Learn calculus so well you will never have trouble with it again in your whole life -- not to mention other higher mathematics, or chemistry, biology, electrical engineering, whatever! This nation needs you guys and gals who are students. This world does.
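Since I brought up Pierce and Shannon: the central quantity in the book, entropy, is easy enough to compute that even a networking guy can sketch it. This is just my own minimal illustration (the function name and sample strings are mine, not Pierce's) of bits per symbol for a message's symbol frequencies:

```python
import math
from collections import Counter

def entropy_bits(message: str) -> float:
    """Shannon entropy of the message's symbol distribution,
    in bits per symbol: H = -sum(p * log2(p))."""
    counts = Counter(message)
    n = len(message)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

# Two equally likely symbols carry exactly one bit each,
# like a fair coin flip:
print(entropy_bits("0101010101"))  # → 1.0

# A message with no surprise carries no information:
print(entropy_bits("aaaaaaaaaa"))  # → 0 bits
```

That "bits per symbol" number is why binary coding and compression fall out of the theory so naturally -- which is roughly where my exposition is headed.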
Exercise your exceptional noodles; learn as much as you can now. It gets much harder later on (you know physics is usually a young person's game). Learn, and do something astounding, big or small. (I wrote the very first desktop publishing application, in 1984, in Pascal, on the Mac when it had just come out... then Aldus (now Adobe) creamed me, and the app was sold to Letraset. Story there too, but I'm glad I was lucky enough to contribute.) Remember, it's a long run. Life is freaking long! There's more to know than any one person could, well, know. And technology and the sciences are advancing exponentially. Moore's law has been broken: we're gaining CPU power faster than the old doubling-every-eighteen-months now (see, don't get older). Maybe Kurzweil is right, the singularity is near! :) (lol)
Daniel