How Are Electricity and Magnetism Used in Computer Science?

AI Thread Summary
Understanding electromagnetism (E&M) is essential for computer science majors, particularly for those involved in hardware, computer architecture, and communications. While software-focused roles may have limited direct applications of E&M, knowledge of electrical circuits is crucial for comprehending computer structures. E&M principles underpin various aspects of technology, including communication systems and embedded systems like Arduino. Applications such as computer graphics, especially techniques like ray tracing, also relate to E&M through optics. A foundational understanding of E&M aids collaboration with engineers, particularly in fields like aerospace and computer-aided design. Although introductory physics courses provide only a basic overview, they serve to ensure a well-rounded education in the sciences, which is valuable for all computer scientists.
Kot
I am a computer science major and we are required to take two semesters of physics. The first semester is based on mechanics and the second semester is focused on electricity and magnetism. I was wondering how these topics are used in computer science, and in which specific areas? I have tried looking on the internet for how electricity and magnetism are used in computer science but couldn't find anything. I found that mechanics is useful for video game development, but how is E&M used in computer science?
 
I tried to list some applications, but now I see it's better to say you can't be a computer scientist without knowing at least the basics of electromagnetism. And if you're going to be a good computer scientist who is at least partly involved in computer architecture or communications, then you should know electromagnetism to a good extent.
 
Shyan said:
I tried to list some applications, but now I see it's better to say you can't be a computer scientist without knowing at least the basics of electromagnetism. And if you're going to be a good computer scientist who is at least partly involved in computer architecture or communications, then you should know electromagnetism to a good extent.

I have searched extensively online and only found one thread on Stack Overflow relating to this topic. The responses in that thread mostly talked about classical mechanics. Could you list some examples and applications of E&M in computer science?
 
If you're going to work only on software, I guess there is not much application.
If you're going to work on computational techniques or any part of theoretical computer science, again I guess there is not much application.
But when we get to hardware, things are different. Computers are a bunch of electric circuits. Understanding every little part of a computer's structure requires knowledge of basic electromagnetism.
Almost any kind of communication, human-human, human-computer, device-device, involves EM waves. And if you're going to work on such things, you should know electromagnetism well.
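
Just to make that concrete, here is a tiny Python sketch (the numbers are my own, purely for illustration) that computes the wavelengths of a few familiar radio links from lambda = c / f, which is roughly where an intro E&M course leaves off with waves:

Code:
# Illustrative only: wavelengths of a few common radio links from lambda = c / f.
c = 3.0e8  # speed of light in vacuum, m/s

links = {
    "Wi-Fi (2.4 GHz)": 2.4e9,
    "Wi-Fi (5 GHz)": 5.0e9,
    "FM radio (100 MHz)": 100e6,
}

for name, freq in links.items():
    wavelength = c / freq
    print(f"{name}: wavelength = {wavelength:.3f} m")
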
I don't know, maybe there are other applications too.
Also, you're not going to learn the whole of electromagnetism in that course. It's just an introduction. I think knowing a little about one of the most applicable parts of science is good in itself, regardless of any applications.
 
It sounds to me like your institution just wants you to have some background in physics, for the purposes of a well-rounded education. So you take the basic two-semester 'university' style physics course, which has E&M in it. Probably the same course all the pre-med majors take.

Specific example: any kind of computer graphics that uses techniques like ray tracing possibly touches on some E&M, in the sense that the field of optics is built on E&M, since light is an E&M wave.
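
If you want to see what that looks like in practice, here is a minimal Python sketch (toy numbers of my own, not from any real renderer) of a ray hitting a sphere and reflecting off it; the reflection rule is the same one geometric optics inherits from E&M:

Code:
import numpy as np

def normalize(v):
    return v / np.linalg.norm(v)

def intersect_sphere(origin, direction, center, radius):
    """Return the distance t along the ray to the nearest hit, or None."""
    oc = origin - center
    b = 2.0 * np.dot(direction, oc)
    c = np.dot(oc, oc) - radius**2
    disc = b**2 - 4.0 * c  # direction is unit length, so the quadratic's a = 1
    if disc < 0:
        return None
    t = (-b - np.sqrt(disc)) / 2.0
    return t if t > 0 else None

def reflect(direction, normal):
    """Law of reflection: r = d - 2(d.n)n."""
    return direction - 2.0 * np.dot(direction, normal) * normal

# Example: a ray from the origin toward a unit sphere sitting out along z
origin = np.array([0.0, 0.0, 0.0])
direction = normalize(np.array([0.0, 0.0, 1.0]))
center = np.array([0.0, 0.5, 5.0])
t = intersect_sphere(origin, direction, center, 1.0)
if t is not None:
    hit = origin + t * direction
    normal = normalize(hit - center)
    print("hit point:", hit, "reflected ray:", reflect(direction, normal))
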
 
I think it would be most useful when dealing with embedded systems and the like. Even a basic knowledge of E&M would be useful when working with tools such as, say, an Arduino.
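
For instance, the classic first Arduino exercise really boils down to Ohm's law. A rough sketch with assumed, typical values (a 5 V pin, a red LED, about 15 mA):

Code:
# Hypothetical numbers for the usual "blink an LED" setup: pick a
# current-limiting resistor for an LED on a 5 V pin using Ohm's law.
supply_v = 5.0          # Arduino Uno pin voltage (assumed)
led_forward_v = 2.0     # typical red LED forward drop (assumed)
target_current = 0.015  # 15 mA, a safe LED current (assumed)

resistance = (supply_v - led_forward_v) / target_current
print(f"Use roughly a {resistance:.0f} ohm resistor (a standard 220 ohm part is close enough)")
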

Also, I expect that computer scientists often work closely with engineers. Knowing some scientific jargon is probably helpful when trying to solve engineering problems with software. Think NASA, computer-aided design and simulation, embedded computers in aircraft, and so on.
 
It's an ABET requirement.

At a bare minimum, if you want to know about hardware at the transistor level and below, you need some basic E&M knowledge.

Most CS majors don't bother with hardware much; if they do, it's at the register level or above.

An intro E&M class teaches about idealized conductors, insulators, and Ohmic resistors, though. It does not cover semiconductors, which are a couple of orders of magnitude harder to understand, IMO.
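
To give a feel for the kind of idealized problem such a course does cover, here is a short sketch (values made up) of a capacitor charging through a resistor; the same time-constant idea comes up when people explain why signal lines and logic gates can't switch instantly:

Code:
# Idealized intro-E&M problem: charging a capacitor through a resistor,
# V(t) = V0 * (1 - exp(-t / (R*C))).  All values are assumed for illustration.
import math

V0 = 5.0   # supply voltage, volts (assumed)
R = 1e3    # resistance, ohms (assumed)
C = 1e-9   # capacitance, farads (assumed)
tau = R * C  # time constant, here 1 microsecond

for t in [0.5 * tau, tau, 3 * tau, 5 * tau]:
    v = V0 * (1 - math.exp(-t / tau))
    print(f"t = {t * 1e9:.0f} ns: V = {v:.2f} V")
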
 
You could develop an X-Men computer game with Magneto using his E-M powers :)
 