# Math needed for a CS career

Science Advisor
Homework Helper
I was just fielding a question from a CS major who was asking how much calculus he should expect to use over the course of a career as a computer engineer.

I have been a professional programmer since 1971, a very diverse career including process control, embedded systems, management and financial systems, defense, manufacturing, ... pretty much everything.

With regard to calculus, I have just reviewed several syllabi for Calculus 101 and 102. Essentially everything in Calc 101 has a very high potential of being useful in a software engineering career. The bulk (80+%) of what is in Calc 102 also has very high potential. In fact, I was surprised at how basic those courses are.

Courses become less homogeneous after that. I suspect Calculus becomes less and less critical after 102.

To be clear, there are lots of programmers out there who are clueless as far as calculus is concerned, and they get by.

There are a couple of items that do not seem to be popping up as I peruse the curricula for CS majors. One is what I might call "management statistics" or "pragmatic statistics". I'm thinking Minitab and some aspects of Lean Six Sigma. The software design and test process often involves significant data collection efforts, and SW engineers should know how to do this and how to explain (in speech or writing) what data they need, how they are going to use it, and what they have discovered, in terms that management and others in the organization will understand.

Finally, FFTs need to be squeezed in there someplace. It doesn't have to be everything that a math or EE major might get, but CS majors need to know that if there is periodicity in their data, there is a way of uncovering that information, recognizing harmonics, and identifying the periodic pattern.

I have run into FFT work numerous times: image processing, signal processing, analog and digital video formatting.
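To make the FFT point concrete, here is a minimal sketch (using NumPy; the 5 Hz signal, 100 Hz sampling rate, and noise level are all invented for illustration) of how a magnitude spectrum exposes periodicity hiding in noisy data:

```python
import numpy as np

fs = 100.0                    # sampling rate in Hz (invented)
t = np.arange(1000) / fs      # 10 seconds of samples
rng = np.random.default_rng(0)

# A 5 Hz sine buried in noise -- the "hidden" periodicity.
signal = np.sin(2 * np.pi * 5.0 * t) + 0.5 * rng.standard_normal(t.size)

spectrum = np.abs(np.fft.rfft(signal))         # magnitude spectrum
freqs = np.fft.rfftfreq(t.size, d=1 / fs)      # frequency of each bin

dominant_hz = freqs[np.argmax(spectrum[1:]) + 1]  # skip the DC bin
print(dominant_hz)  # the 5 Hz component stands out clearly
```

The peak bin directly answers "is there periodicity, and at what frequency?"; harmonics would show up as additional peaks at integer multiples of it.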

Raj Harsh

## Answers and Replies

FactChecker
For people on the engineering / scientific side, a lot of math will help. IMO, people on the data processing / storage / retrieval / mining side only really need things like conversion of number base and binary arithmetic.

Sorry, what do you mean by CS?

jedishrfu
Also, could you clarify whether you mean CS or computer engineering?

In my experience, you only need calculus when doing scientific programming, and statistics when doing data science programming or deep learning. For business applications, high school math suffices, plus a bit of Boolean logic and number conversions among binary, octal, hex, and decimal...
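Those base conversions are built into most languages; a quick Python sketch (the value 172 is chosen arbitrarily):

```python
# Round-tripping one value through binary, octal, hex, and decimal.
n = 0b1010_1100          # binary literal; 172 in decimal

as_bin = bin(n)          # '0b10101100'
as_oct = oct(n)          # '0o254'
as_hex = hex(n)          # '0xac'
back = int("ac", 16)     # parse hex text back to an int: 172

print(as_bin, as_oct, as_hex, back)
```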

I've used linear algebra for graphics programming, statistics for database programming, and calculus and differential equations for physics simulations. My higher-level math became less and less useful over time, unless you're doing game programming, where anything and everything could be useful.

jtbell
> Sorry, what do you mean by CS?
I'm rather sure he means "computer science". This is a common term in the US and many other English-speaking countries. In non-English speaking countries a more common term is probably something similar to "informatics."

AgusCF
> I'm rather sure he means "computer science". This is a common term in the US and many other English-speaking countries. In non-English speaking countries a more common term is probably something similar to "informatics."
Thanks!

Science Advisor
Homework Helper
> For people on the engineering / scientific side, a lot of math will help. IMO, people on the data processing / storage / retrieval / mining side only really need things like conversion of number base and binary arithmetic.
Over the decades, people don't stay in the same place. In fact, there is a lot of drifting between EE and Software Engineering. There is certainly a lot of drift among the many SW disciplines.

A freshman might say "I'm only going to do apps", but are you going to give him a BS in Software Engineering suited only to that narrow category? I don't think SW is so broad that a BS program can't place a foundation under almost all of it.

Science Advisor
Homework Helper
> I'm rather sure he means "computer science". This is a common term in the US and many other English-speaking countries. In non-English speaking countries a more common term is probably something similar to "informatics."
Yes, that is what I meant. I didn't realize it was unfamiliar outside the US.

Science Advisor
Homework Helper
> Also, could you clarify whether you mean CS or computer engineering?
>
> In my experience, you only need [ ... ] statistics when doing data science programming or deep learning. For business applications, high school math suffices, plus a bit of Boolean logic and number conversions among binary, octal, hex, and decimal...
If a bug shows itself 30% of the time under certain conditions, and you change the code believing you have addressed it, how many test cases should you run to verify that it is fixed? Right now we have CS graduates who have no idea what a confidence interval is. You can't expect new engineers to be methodical in their testing when they don't have the faintest clue what the methods are.
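To make that test-count question concrete (a back-of-the-envelope sketch, not anyone's official method): if the bug were still present, each run would pass with probability 0.7, so n consecutive passes would occur by luck with probability 0.7^n. For 95% confidence, run until that probability drops below 5%:

```python
import math

p_show = 0.3        # chance the bug manifests on a single run, if still present
alpha = 0.05        # residual doubt we tolerate (i.e., 95% confidence)

# Smallest n such that (1 - p_show) ** n <= alpha
n = math.ceil(math.log(alpha) / math.log(1 - p_show))
print(n)  # → 9 consecutive clean runs
```

Nine clean runs, not two or three, is what "95% confident the fix worked" actually costs here.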

Also, a lot of AI and machine learning has a foundation in statistics and correlations. No one should get their key to machine learning until they've cut their teeth on the practicalities of statistics. Note also that Minitab and Lean Six Sigma have their basis in the manufacturing environment.

jedishrfu
Yes, I agree with your comments. When I said deep learning, I meant AI and machine learning too. With respect to confidence levels, this depends on the shop. We did not do it explicitly. We did do unit testing, functional testing, system testing, and integration testing for multiple locales, which usually surfaced many if not all bugs.

One practice we used was a form of post-mortem analysis on defects found: in what phase of product development they were discovered and what phase introduced them. In this way release managers could get an honest assessment of where they were. A manager might say we are in beta test, whereas the data indicates we are still in a development phase with too many defects popping up. This analysis, handled by a senior programmer or system test engineer, did require knowledge of statistics, although tools were available to generate the reports and mitigate that.

There was a basic rule for the team: fixing a bug in development cost $20, fixing it in system test or beta cost $200, and fixing it in a released product cost $2000. However, I'm sure the intervening years have increased those numbers greatly.

What math you need depends, of course, on what kinds of problems you're going to be addressing. A few things to consider: discrete mathematics and algorithmic graph theory are broadly applicable mathematical disciplines; if you're working with fluid dynamics models, you'll need to understand how and when to apply partial differential equations, among other things; multivariate linear regression analysis and derivatives are useful in financial decision support; and a multitude of applications require basic engineering math.
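As one small illustration of the regression item, multivariate linear regression reduces to a least-squares solve. This sketch uses NumPy on synthetic data; the coefficients 2.0 and -1.0 and the intercept 0.5 are invented:

```python
import numpy as np

rng = np.random.default_rng(1)
X = rng.standard_normal((200, 2))          # two predictor variables
y = 2.0 * X[:, 0] - 1.0 * X[:, 1] + 0.5    # known (invented) linear relationship

A = np.column_stack([X, np.ones(len(X))])  # append an intercept column
coef, *_ = np.linalg.lstsq(A, y, rcond=None)
print(coef)  # recovers approximately [2.0, -1.0, 0.5]
```

In a financial decision-support setting, the columns of X would be candidate explanatory variables and the fitted coefficients their estimated marginal effects.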

Science Advisor
Homework Helper
> What math you need depends, of course, on what kinds of problems you're going to be addressing. A few things to consider: discrete mathematics and algorithmic graph theory are broadly applicable mathematical disciplines; if you're working with fluid dynamics models, you'll need to understand how and when to apply partial differential equations, among other things; multivariate linear regression analysis and derivatives are useful in financial decision support; and a multitude of applications require basic engineering math.
As a software engineer, if you're working with fluid dynamics models, hopefully you're also working with a fluid dynamics expert. Which brings us to another expertise required of software engineers: collecting data and requirements through discussions with stakeholders and others. A specific case is talking to scientists and PhDs who are not algorithm-oriented and who are often ill-practiced and ill-equipped to talk to non-experts about the subject matter of their expertise.

As for partial differential equations, I did not include them in my list because I have needed them only once in my career - when mapping out the area of coverage from air-breathing reconnaissance given "too much" data.

FactChecker
> Over the decades, people don't stay in the same place. In fact, there is a lot of drifting between EE and Software Engineering. There is certainly a lot of drift among the many SW disciplines.
Yes, there is a lot of drifting -- by people capable of drifting. But there is a huge amount of software engineering that has no need at all for advanced math or physics. I think you are talking about a subset of the whole.

Science Advisor
Homework Helper
> Yes, there is a lot of drifting -- by people capable of drifting. But there is a huge amount of software engineering that has no need at all for advanced math or physics. I think you are talking about a subset of the whole.

First, I would distinguish between "advanced math" and "physics". Technically, physics, except as it applies to the computer hardware, is not part of computer science at all. But "advanced math" depends on what you mean by "advanced". I looked specifically at course descriptions for freshman calculus, and I was surprised at how non-advanced it was. A lot of that calculus is more important than the underlying algebra - it's more important to understand the concepts of derivatives and integrals than to know the solution of the quadratic (although the quadratic solution pops up a lot as well).

I picked out the FFT because, again, it is a tool that pops up surprisingly often. It answers the questions: Is there periodicity in my data set? And what is that periodicity? Those are useful questions in almost any setting. I am not saying that SW engineering students should spend a lot of time on this, but they need an introduction and to run through an exercise.

As I said in my original post: there are lots of programmers out there who are clueless as far as calculus is concerned, and they get by. But let's look at those jobs with no need for "advanced math". We're probably thinking of those "COBOL" type jobs where the programmer is much more likely dealing with names and addresses than with pixels and newtons. They are often relatively long-tenured maintenance positions that are a bit too complicated to be handled only by spreadsheets and software packages with support from the IT department.

But even in those jobs, there is the opportunity (and sometimes the expectation) that over time the DP department will use its familiarity with the business data to support business decisions related to marketing, manufacturing, etc. The specific statistics that I recommend are exactly what would support that type of involvement. I wouldn't describe it as "advanced math"; it is what many line managers are expected to know: Measurement Systems Analysis, Gage R&R, ANOVA, etc. These are targeted as much at those "simple" jobs as at any other. And Minitab is not targeted at the high-end programmers; it is targeted at the "masses".
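To show how small the math really is, here is a one-way ANOVA of the kind a Minitab user would run, computed by hand. The measurement data below are invented (three machines, five parts each); in practice you would use Minitab, R, or scipy.stats.f_oneway:

```python
# One-way ANOVA by hand: do three machines produce the same mean measurement?
groups = [
    [10.1, 10.3, 9.9, 10.2, 10.0],   # machine A (invented data)
    [10.6, 10.8, 10.5, 10.7, 10.9],  # machine B
    [10.0, 10.2, 10.1, 9.8, 10.1],   # machine C
]

k = len(groups)                      # number of groups
n = sum(len(g) for g in groups)      # total observations
grand = sum(sum(g) for g in groups) / n

ss_between = sum(len(g) * (sum(g) / len(g) - grand) ** 2 for g in groups)
ss_within = sum((x - sum(g) / len(g)) ** 2 for g in groups for x in g)

f_stat = (ss_between / (k - 1)) / (ss_within / (n - k))
print(f_stat)  # a large F statistic suggests the machine means differ
```

Comparing the F statistic against an F distribution with (k-1, n-k) degrees of freedom gives the p-value; here the large F points at machine B running hot.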

There is also the question of how easy it is for a software engineer to avoid some of the more technical "trig" and "geometry" type applications. And that involves the question of how many jobs a programmer is likely to have during their career.

So I did some searching. I couldn't find solid industry-wide figures for this.
This article shows tenures for large companies in the San Francisco area:
https://hackerlife.co/blog/san-francisco-large-corporation-employee-tenure

They show an employment "half life" at those large corporations to be only about 2.2 years.
But SF has a "fast moving" reputation for development and may be an exception. Other figures I have found suggest about three years - a little less for small companies, a little more for large ones.

And I expect business programming may average a bit more than that three years.

Still, most programmers will see well over a dozen jobs in their careers. An early introduction to basic tools would give them an opportunity to develop skills that would allow them to "drift" if they needed to.

FactChecker
Very little traditional engineering math is needed for software management, configuration management, requirements definition, requirements tracking, test types, test coverage, SW design, SW review processes, verification and validation, data retrieval, big data, etc. etc. etc. There are also a great number of areas where the math involved is quite unlike any that would be addressed in typical math courses and, therefore, one might as well wait till he sees what applies. SW engineering is a massively broad subject for which I would hesitate to make statements about what math is required.

jedishrfu
Often when we write posts about programming we discover that it is far broader than we can imagine.

Those of us who use Matlab might think that Fortran or COBOL is outdated for such work. Yet a very large body of still-running legacy code in Fortran and COBOL is out there doing tasks far beyond pushing customer address data around.

The same goes for many other "outdated" languages.

The TIOBE index ranks actively used programming languages and is useful for deciding what skills you might need:

https://www.tiobe.com/tiobe-index/

However, it doesn't show how much of each language is running in our systems today, which depends more on the age and utility of the particular language. Fortran and COBOL, being among the oldest, are still out there doing their jobs, as prudent programmers tend not to fix systems that aren't broken. Of course, we may hope those systems break so we can replace the language, but we realize our jobs may be on the line if we break a critical system that has been working just fine, or replace it with a much newer, buggier one.

More on COBOL:

https://en.wikipedia.org/wiki/COBOL

and Fortran:

https://en.wikipedia.org/wiki/Fortran

Just as L. Ron Hubbard said of sci-fi writers:
> You don't get rich writing science fiction, you get rich by creating a religion.

https://en.wikiquote.org/wiki/L._Ron_Hubbard

which for programmers could well be:
> You don't get famous writing programs or designing libraries, you get famous by creating a new programming language.

And so most, if not all, programming languages are created by someone who was dissatisfied with the existing ones and came up with one that is better. You can see that best in the C tree:

C --> C++ --> Java --> { Scala / Kotlin / Groovy}

or

C --> Awk --> Perl --> Python --> Ruby --> { Scala / Kotlin / Groovy}

or

Fortran --> Matlab --> Julia

and then there's:

Lisp, Prolog, Forth, Rexx, Lua ....

but I digress.

Last edited:
FactChecker
I think it's pretty important. Areas including AI, data science, graphics/rendering, and engineering all rely heavily on core calculus. I would go so far as to suggest that computer science curricula should include special calculus courses beyond the standard required calculus series, tailored to computer science ("calculus for computer science").