How Are Electricity and Magnetism Used in Computer Science?

SUMMARY

Electricity and magnetism (E&M) are essential for computer science, particularly in hardware design and communication systems. A foundational understanding of E&M is crucial for computer scientists involved in computer architecture, embedded systems, and any work that requires knowledge of electric circuits. Applications of E&M include computer graphics techniques like ray tracing, as well as practical uses in embedded systems such as Arduino. Furthermore, familiarity with E&M concepts aids collaboration with engineers in fields like aerospace and computer-aided design.

PREREQUISITES
  • Basic understanding of electromagnetism principles
  • Familiarity with computer architecture and hardware design
  • Knowledge of electric circuits and their components
  • Introduction to computer graphics techniques, particularly ray tracing
NEXT STEPS
  • Study the principles of electromagnetism relevant to computer hardware
  • Learn about embedded systems and their applications in computing
  • Explore computer graphics and ray tracing techniques
  • Investigate the role of E&M in communication systems and device interactions
USEFUL FOR

Computer science students, hardware engineers, software developers working on embedded systems, and professionals in fields requiring collaboration with engineering disciplines.

Kot
I am a computer science major and we are required to take two semesters of physics. First semester is based on mechanics and the second semester is focused on electricity and magnetism. I was wondering how these topics are used in computer science and in what specific area? I have tried looking on the internet about how electricity and magnetism is used in computer science but couldn't find anything. I found that mechanics is useful for video game development, but how is E&M used in computer science?
 
I tried to list some applications, but I now see it's better to say: you can't be a computer scientist without knowing at least the basics of electromagnetism. And if you're going to be a good computer scientist who is involved, at least in part, in computer architecture or communications, then you should know electromagnetism to a good extent.
 
Shyan said:
I tried to list some applications, but I now see it's better to say: you can't be a computer scientist without knowing at least the basics of electromagnetism. And if you're going to be a good computer scientist who is involved, at least in part, in computer architecture or communications, then you should know electromagnetism to a good extent.

I have searched extensively online and found only one thread on Stack Overflow related to this topic. The responses there mostly discussed classical mechanics. Could you list some examples and applications of E&M in computer science?
 
If you're going to work only on software, I guess there is not much application.
If you're going to work on computational techniques or any part of theoretical computer science, again I guess there is not much application.
But when we get to hardware, things are different. Computers are collections of electric circuits, and understanding every little part of a computer's structure requires knowledge of basic electromagnetism.
Almost any kind of communication, human-human, human-computer, device-device, involves EM waves. And if you're going to work on such things, you should know electromagnetism well.
I don't know, maybe there are other applications too.
Also, you're not going to learn all of electromagnetism in that course. It's just an introduction. I think knowing a little about one of the most widely applicable parts of science is good in itself, regardless of any applications.
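To make the communications point concrete: every wireless link is an EM wave, and even the most basic E&M relation, wavelength = c / frequency, shows up in practice (antenna sizing, why Wi-Fi and radio hardware look so different). A quick back-of-the-envelope sketch; the example frequencies are illustrative, not from the thread:

```python
# Free-space wavelength of an EM carrier: lambda = c / f.
C = 299_792_458  # speed of light in vacuum, m/s


def wavelength(freq_hz):
    """Return the free-space wavelength in meters for a given frequency in Hz."""
    return C / freq_hz


# Illustrative carriers (assumed example values):
carriers = {
    "Wi-Fi (2.4 GHz)": 2.4e9,
    "FM radio (100 MHz)": 100e6,
    "Visible light (~600 THz)": 600e12,
}

for name, f in carriers.items():
    print(f"{name}: {wavelength(f):.4g} m")
```

A 2.4 GHz signal has a wavelength of roughly 12.5 cm, which is why quarter-wave Wi-Fi antennas are a few centimeters long.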
 
It sounds to me like your institution just wants you to have some background in physics for the purposes of a well-rounded education. So you take the basic two-semester 'university' style physics course, which includes E&M. Probably the same course all the pre-med majors take.

Specific example: any kind of computer graphics that uses techniques like ray tracing potentially touches on some E&M, in the sense that the field of optics is built on E&M, since light is an E&M wave.
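For instance, the law of reflection that a ray tracer applies at every mirror-like surface is itself a consequence of E&M (it follows from Maxwell's boundary conditions at an interface). A minimal sketch of that one computation, written here from the standard formula rather than taken from the thread:

```python
def reflect(d, n):
    """Reflect direction vector d about unit surface normal n: r = d - 2(d.n)n."""
    dot = sum(di * ni for di, ni in zip(d, n))
    return tuple(di - 2 * dot * ni for di, ni in zip(d, n))


# A ray heading down and to the right hits a horizontal floor (normal points up):
incoming = (1.0, -1.0, 0.0)
normal = (0.0, 1.0, 0.0)
print(reflect(incoming, normal))  # (1.0, 1.0, 0.0): same angle, vertical flipped
```

Refraction (Snell's law) and the Fresnel reflectance used for realistic glass and water come from the same E&M foundations.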
 
I think it would be most useful when dealing with embedded systems and the like. Even a basic knowledge of E&M would be useful when working with tools such as, say, an Arduino.
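Even the first thing most people do with an Arduino, lighting an LED, needs a dab of intro E&M: Ohm's law picks the current-limiting resistor. A sketch with typical hobbyist numbers (the 5 V supply, 2 V forward drop, and 20 mA target are assumed example values, not from the thread):

```python
def led_resistor(v_supply, v_forward, i_amps):
    """Current-limiting resistor from Ohm's law: R = (V_supply - V_forward) / I."""
    return (v_supply - v_forward) / i_amps


# Assumed values: 5 V Arduino pin, red LED with ~2 V forward drop, 20 mA target.
r = led_resistor(5.0, 2.0, 0.020)
print(f"{r:.0f} ohms")  # prints "150 ohms"; round up to a standard value like 220
```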

Also, I expect that computer scientists often work closely with engineers. Knowing some scientific jargon is probably helpful when trying to solve engineering problems with software. Think NASA, computer-aided design and simulation, embedded computers in aircraft, and so on.
 
It's an ABET requirement.

At a bare minimum, if you want to know about hardware at the transistor level and below, you need some basic e&m knowledge.

Most CS majors don't bother with hardware much; if they do, it's at the register level or above.

That said, an intro E&M class teaches about idealized conductors, insulators, and Ohmic resistors. It does not cover semiconductors, which are a couple of orders of magnitude harder to understand, IMO.
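Those idealized Ohmic resistors from the intro course do cover the two combination rules you use constantly when reasoning about simple circuits. A sketch of both (illustrative, not from the thread):

```python
def series(*resistances):
    """Equivalent resistance of resistors in series: R = R1 + R2 + ..."""
    return sum(resistances)


def parallel(*resistances):
    """Equivalent resistance in parallel: 1/R = 1/R1 + 1/R2 + ..."""
    return 1.0 / sum(1.0 / r for r in resistances)


print(f"{series(100, 220):g}")    # prints "320"
print(f"{parallel(100, 100):g}")  # prints "50": two equal resistors halve R
```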
 
You could develop an X-Men computer game with Magneto using his E-M powers :)
 
