Will a Machine Language Course Benefit My Computational Physics Emphasis?

AI Thread Summary
Taking a machine language course can provide valuable insights into how computers operate, which is beneficial for fields like computational physics. The course covers essential topics such as program logic, binary arithmetic, boolean operations, and assembly language syntax, which deepen understanding of memory management and program execution. While many programming tasks may not require assembly language, knowledge gained from the course can enhance debugging skills and improve algorithm efficiency, particularly in high-performance computing scenarios like quantum chromodynamics or astrophysics simulations. Although some argue that self-study might suffice, formal instruction can be advantageous for those who prefer structured learning. Ultimately, if the course aligns with personal interests and does not interfere with more critical classes, it can be a worthwhile endeavor.
tmbrwlf730
I'm thinking of taking machine language over the summer because it seems interesting to me, but I'm not sure if it'll be useful to my emphasis in computational physics. The objectives of the class are below; could anyone give me advice on whether it'll be helpful to me or not? Thank you.

A. Use flowcharts to describe program logic and use procedures when designing program structure
B. Discuss common applications of assembly language and what an assembler does
C. Perform binary arithmetic calculations with signed and unsigned binary integers
D. Explain basic boolean operations and recognize and convert boolean and hexadecimal integers
E. Describe how the operating system loads and executes programs
F. Represent integer constants, expressions, real number constants, character constants, and string constants in assembly language
G. Formulate assembly language instructions using valid syntax
H. Use the OFFSET, ALIGN, PTR, TYPE, LENGTHOF, and SIZEOF operators, and the PUSH and POP instructions
I. Link programs to an external code library
J. Create conditional and looping structures using assembly language
K. Use the high-level MASM decision and looping directives such as .IF, .ELSE, .REPEAT, and .WHILE
L. Explain and use the MUL, IMUL, DIV, and IDIV instructions
M. Discuss how stack frames are used by high-level languages
N. Write recursive functions in assembly language
O. Use the advanced forms of the INVOKE, ADDR, PROC, and PROTO directives
P. Traverse a two-dimensional array using advanced indexed addressing modes
Q. Create nested macros and macros with multiple parameters
R. Use heap allocation functions to create dynamic data structures
 
I cannot imagine why any sane person would choose assembly/machine language for computational anything. Pretty much any other language you can think of (C, C++, VB.NET, Fortran, Java, etc.) would be better.

On the other hand, assembly/machine language will teach you how computers work in a way that none of the others can even approach.
 
phinds said:
On the other hand, assembly/machine language will teach you how computers work in a way that none of the others can even approach.

This is the main point: teaching how computers really work.

There are a lot of interesting things that need attention, especially if you are writing OS components, device drivers, or even optimized code that uses a particular mechanism like the floating-point unit or a vector-processing extension such as SSE or SSE2.
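To give a flavour of the SIMD side of that, here is a minimal C sketch of my own (not from the thread); it uses compiler intrinsics rather than raw assembly and assumes an x86 machine with SSE2:

Code:
#include <stdio.h>
#include <emmintrin.h>   /* SSE2 intrinsics; needs an x86 CPU */

/* Sum an array two doubles at a time with SSE2 packed adds.
 * n is assumed to be even here, purely to keep the sketch short. */
static double sum_sse2(const double *x, int n)
{
    __m128d acc = _mm_setzero_pd();
    for (int i = 0; i < n; i += 2)
        acc = _mm_add_pd(acc, _mm_loadu_pd(&x[i]));

    double tmp[2];
    _mm_storeu_pd(tmp, acc);
    return tmp[0] + tmp[1];
}

int main(void)
{
    double data[8] = {1, 2, 3, 4, 5, 6, 7, 8};
    printf("sum = %f\n", sum_sse2(data, 8));   /* prints 36.000000 */
    return 0;
}

Each intrinsic corresponds closely to a single SSE2 instruction, so reading the assembly the compiler generates from this is a good exercise in its own right.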

What it forces the programmer to do is understand what everything in memory really represents and how control flow actually works on a computer.
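As a concrete illustration (my own minimal C sketch, not something from the course), dumping the raw bytes of a couple of variables shows exactly the view of memory that assembly forces on you:

Code:
#include <stdio.h>

/* Print the raw bytes of any object, lowest address first. */
static void dump_bytes(const void *p, size_t n)
{
    const unsigned char *b = p;
    for (size_t i = 0; i < n; i++)
        printf("%02x ", b[i]);
    printf("\n");
}

int main(void)
{
    int    i = -1;     /* two's complement: every bit set     */
    double x = 0.1;    /* not exactly representable in binary */

    printf("int -1    : "); dump_bytes(&i, sizeof i);
    printf("double 0.1: "); dump_bytes(&x, sizeof x);
    printf("0.1 is actually stored as %.20f\n", x);
    return 0;
}

On a typical little-endian x86 machine the int prints as ff ff ff ff (two's complement) and the double output shows that 0.1 cannot be stored exactly, which is the same material as the binary-arithmetic and data-representation objectives in the course list.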

A lot of people don't realize the complications involved in doing simple things like allocating memory in an environment that separates the kernel from application space (and even processes from each other), or in writing device drivers, where you need a tonne of assembly language not only to do simple things like print a character or set the screen mode, but also to coordinate everything in an optimal and fault-free way. This is not as easy as many people think.

The thing is that if you have done a bit of assembly, you'll have a really good idea of what your compiler does to generate data structures (even in C++), how functions and addresses are computed, and what is going on 'in between process cycles' with all of the other stuff the OS has to take care of. It can also help you become a better debugger.
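A quick way to see that point without writing any assembly is to ask the compiler where it actually put things. A minimal C sketch of my own (the struct and names are just hypothetical examples):

Code:
#include <stdio.h>
#include <stddef.h>

/* The compiler inserts padding so that 'x' sits on an 8-byte boundary;
 * a little assembly experience makes layouts like this unsurprising.   */
struct particle {
    char   tag;     /* 1 byte, then 7 bytes of padding on most ABIs */
    double x;       /* 8-byte aligned                               */
    int    id;      /* 4 bytes, plus tail padding                   */
};

static double kinetic(double m, double v) { return 0.5 * m * v * v; }

int main(void)
{
    printf("sizeof(struct particle) = %zu\n", sizeof(struct particle));
    printf("offsets: tag=%zu x=%zu id=%zu\n",
           offsetof(struct particle, tag),
           offsetof(struct particle, x),
           offsetof(struct particle, id));

    /* A function name is just an address; this is what a CALL ends up using.
     * (Casting a function pointer to void* for %p is a common, if not
     * strictly ISO-portable, idiom.)                                       */
    double (*fn)(double, double) = kinetic;
    printf("kinetic lives at %p and gives %f\n", (void *)fn, fn(2.0, 3.0));
    return 0;
}

On a typical 64-bit compiler this reports a 24-byte struct with the double pushed out to offset 8 by padding, and the printed function pointer is exactly the address a CALL instruction would use.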

Most people won't use it for most of their software development, but it is far from useless.
 
Joel Spolsky, co-founder of Stack Exchange and Fog Creek Software (a very Google-like startup), has some really good things to say about why someone working in software should have some understanding of every level of development:

http://www.joelonsoftware.com/articles/LeakyAbstractions.html

There are certain fields of computational physics, like quantum chromodynamics (QCD) calculations, astrophysics/hydrodynamics, electromagnetics, etc., that will require some very in-depth knowledge of how computers work. The reason is that QCD simulations can sometimes take months (someone mentioned a year -- A YEAR!), and astrophysics simulations can take a month on hundreds or thousands of nodes of a supercomputer. When you're dealing with that kind of scale, it's important to make sure you're not wasting any time because you're implementing stupid/inferior algorithms, creating a bottleneck through some basic piece of hardware architecture you were unaware of, or working in ignorance of how memory management works. Any of these can add days, or even weeks, to your total run time. Knowing how the computer handles and represents data is also crucial if you're getting garbage values or you need extreme precision.
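One concrete example of the "basic hardware architecture you were unaware of" trap is memory access order. A minimal C sketch of my own (the array size and the crude clock() timing are just for illustration):

Code:
#include <stdio.h>
#include <time.h>

#define N 2048              /* 2048*2048 doubles = 32 MB, well past cache */

static double a[N][N];

int main(void)
{
    double sum = 0.0;
    clock_t t0;

    /* Fill the array first so both sweeps touch already-mapped memory. */
    for (int i = 0; i < N; i++)
        for (int j = 0; j < N; j++)
            a[i][j] = i + j;

    /* Row-major sweep: consecutive elements, cache-friendly. */
    t0 = clock();
    for (int i = 0; i < N; i++)
        for (int j = 0; j < N; j++)
            sum += a[i][j];
    printf("row-major:    %.3f s\n", (double)(clock() - t0) / CLOCKS_PER_SEC);

    /* Column-major sweep: same arithmetic, but each access jumps
     * N*8 bytes, so it misses the cache far more often.          */
    t0 = clock();
    for (int j = 0; j < N; j++)
        for (int i = 0; i < N; i++)
            sum += a[i][j];
    printf("column-major: %.3f s\n", (double)(clock() - t0) / CLOCKS_PER_SEC);

    printf("sum = %f\n", sum);   /* use the result so it isn't optimised away */
    return 0;
}

Both loops do identical arithmetic; only the order of memory accesses differs, yet the column-major sweep typically runs several times slower because it strides across cache lines. Scaled up to a month-long run on thousands of nodes, that is exactly the kind of waste being described above.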

Still, these are things you could learn on your own. I wouldn't take a class on it... but that's just me. Most people can't learn those kinds of things by themselves, but physics PhDs (assuming you want to go to graduate school) are a special breed and should be able to. You may also not be very motivated to learn it on your own... but anyway, the point is that you ought to be able to learn it independently.
 
tmbrwlf730 said:
I'm thinking of taking machine language over the summer because it seems interesting to me but I'm not sure if it'll be useful to my emphasis in computational physics.

Why does it have to be? If you can take the course without failing or leaving out a more important one, just do it if it seems interesting! :smile:
 