What is Algorithmic Problem Solving?

  • Thread starter: yUNeeC
  • Tags: Problem solving
AI Thread Summary
Algorithmic Problem Solving involves designing algorithms and implementing them in programming languages like Java, as part of a Computer Science course integrated into a math major. The course is expected to cover both theoretical aspects of algorithms and practical programming skills, potentially using various languages. There is a debate about the relevance of programming in a math curriculum, with some arguing that programming skills are essential for modern mathematicians. The discussion also touches on the distinction between computer science and natural sciences, emphasizing that both fields require logical reasoning but serve different purposes. Understanding algorithms is fundamental to effective programming and problem-solving in both disciplines.
yUNeeC
What is "Algorithmic Problem Solving"?

Hi all,

I'm trying to get a head start on next semester's difficult classes and am kind of confused about what a certain class entails.

I have a Computer Science course that is part of the core for my math major:

"CSCI 2310, 2311. Algorithmic Problem Solving and Programming Laboratory (4,0) - Design of algorithms and their implementation as programs in high-level language such as Java."

I was wondering if anyone here had any idea as to what exactly this entails, as well as what I should study in preparation?

Thanks for any help.
 


I offer no insights, though I do recall this:

http://www.youtube.com/view_play_list?p=8B24C31197EC371C

"Introduction To Algorithms" MIT lecture series (Very comp-sci based, apparently). Could give you an idea of what you're in for.


P.S. I did watch the first lecture at some point, and remember the guy indicating that the first half of the semester would be spent analysing the theory of algorithms, before moving on to designing algorithms in the second half.
 


I think you should ask your instructor, or someone else at your university who is familiar with this course.

Based on my experience, this could mean at least two different kinds of courses:

1. A more or less standard introductory programming course in Java. The description is worded so that the instructor can use a different language if he wants, without having to get it approved by a faculty or university administrative committee, and without changing the official course description.

2. A more theoretical course in algorithm design using various modeling techniques such as flowcharts or UML, in which the students can implement the algorithms in whatever programming language they happen to be familiar with. This obviously assumes you already know the basics of some programming language such as C, C++, Java, Fortran, ... (a sketch of a typical exercise follows below).

3. ?
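
Either way, the typical exercise has the same shape: state a classic algorithm, argue why it works, then implement it. A minimal sketch (Python here, purely for illustration; the actual course would likely use Java):

Code:
def gcd(a, b):
    # Euclid's algorithm: replace (a, b) with (b, a % b) until the
    # remainder is zero; the last nonzero value is the gcd.
    while b != 0:
        a, b = b, a % b
    return a

print(gcd(252, 105))  # prints 21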
 


It could be an introductory course, the course typically called "data structures" (which is a continuation of the introductory course), or the in-major algorithms course. It actually sounds most like the latter... at least as offered at my school, except we did very little programming and treated it more like a regular math class.
 


Are you at East Carolina University? Their catalog has a course with that number, name and description. Based on its position in the computer science degree requirements, it's almost certainly the following:

jtbell said:
1. A more or less standard introductory programming course in Java. The description is worded so that the instructor can use a different language if he wants, without having to get it approved by a faculty or university administrative committee, and without changing the official course description.

It might use something other than Java, so if you want to start studying over the summer, you'd better contact the department or instructor and make sure which language they're going to use.
 


jtbell said:
Are you at East Carolina University? Their catalog has a course with that number, name and description. Based on its position in the computer science degree requirements, it's almost certainly the following:



It might use something other than Java, so if you want to start studying over the summer, you'd better contact the department or instructor and make sure which language they're going to use.

Yeah, I'm at ECU. Interesting...I just don't understand why part of the core of the BS degree in math would be programming...but I guess that's just my luck.

Thanks for all of the help guys.
 


"I just don't understand why part of the core of the BS degree in math would be programming"

There's probably a CS major somewhere wondering why Calculus is part of the CS curriculum...
 


Well, math is fundamental to pretty much everything. Computer science, at least in my opinion, is not.
 


Theoretical CS is almost like a branch of math. If indeed this course is an introduction to algorithms course like the one I took, it is a mathematically based subject.

On the other hand if it's a less-theoretical programming course, well, mathematicians do need to know how to program.
 
  • #10


Very true. And I'm starting to take interest in robotics...so I suppose if it is programming oriented I should make the most of it.
 
  • #11


what, in coding, is not an algorithm?
 
  • #12


"what, in coding, is not an algorithm?"

It depends on your definition of "algorithm".
 
  • #13


"Well, math is fundamental to pretty much everything. Computer science, at least in my opinion, is not."

I didn't say a CS major didn't need math. I can't think of any concrete way in which calculus aids the study of (basic, core, undergraduate) computer science. Perhaps you can reference the following: http://www.acm.org//education/curricula/ComputerScience2008.pdf , find the topics where 2 to 3 semesters of calculus are required to understand 10% of the material in the course, and count how many course hours such courses total.

If your argument is that calculus helps CS majors mature and think logically, ditto for programming.

If your argument is that calculus can be used on problems in the real world to which CS is applied, ditto for programming to mathematics.

I can't think of an argument you could make that I couldn't turn on its head.
 
  • #14


AUMathTutor said:
"what, in coding, is not an algorithm?"

It depends on your definition of "algorithm".

Is it like electrical engineering, where a 'bus' means "something that has the look and feel of what other people have called a bus"? Everything that takes a bunch of bits and deterministically produces another set of bits is an algorithm. XOR is an algorithm.
 
  • #15


"Everything that takes a bunch of bits and deterministically produces another set of bits is an algorithm. XOR is an algorithm."

If that's your definition of algorithm, it's almost trivial to write a fully-functioning program which contains no algorithms.
 
  • #16


AUMathTutor said:
I can't think of an argument you could make that I couldn't turn on its head.

Math = natural science
CS ≠ natural science

Math is more fundamental to logical development than is computer science. Therefore, I would venture to say that a CS major would benefit from math more than a pure mathematician would benefit from CS.

Your opinion doesn't turn my opinion on its head.

Edit: You seem to be misunderstanding what I was stating. I was wondering why a CS course was part of the CORE for a math degree at my school...not why it was required. For instance, Calculus I-III and Differential Equations are not considered part of the core for my physics major...they are just requisites that don't count toward my major GPA. The "that's my luck" refers to the CS course being in the core, and consequently counting toward my major GPA for math.
 
  • #17


"CS ≠ natural science"
Would you care to back that up?

"Math is more fundamental to logical development than is computer science."
Differences of opinion are what make the world a wonderful place, I guess.

"Your opinion doesn't turn my opinion on it's head."
That's the pot calling the kettle black.

CS majors at my school take a discrete math class in the math department and it's part of the core. While it should be required of CS majors, why make it a core class? Most of the stuff is rehashed in the CS courses anyway.

So you're just complaining about the politics of what counts towards your major GPA at your school? I guess I thought you were questioning the merit of a mathematician knowing how to program.

I think any college that doesn't make math majors learn how to program is doing their students a great disservice. Ditto for CS majors not learning calculus. I think it's really more your prejudices than any reasoned opinion that leads to your position.

You can be a great computer scientist without knowing calculus, and you can be a great mathematician without knowing how to program. But why would you want to do that? CS and Math majors should have a fair amount of interest in both, as they are pretty fundamental to what constitutes an educated scientist / mathematician / engineer.
 
  • #18


You have got to be kidding me.

CS is not a natural science. Sure, a good amount of logic is involved, but it is logic geared towards implementing a man-made system (hence it not being natural.) Mathematics is a pure language used to explain natural phenomena, while computer science has a heavily invented component.

I've never said programming is not important. Stop inventing arguments...and quit being so fricken arrogant. Just because an opinion differs from your own doesn't make it the byproduct of prejudices.

I've even agreed that programming is an important thing to know, yet because I don't think it should be the only non-pure math course that is considered a part of my major-GPA I have unfounded bias? Nice.
 
  • #19


Neither mathematics nor computer science is a natural science, because they do not study the physical world.
 
  • #20


mXSCNT said:
Neither mathematics nor computer science is a natural science, because they do not study the physical world.

Mathematics can be used to study the natural world. I suppose this would be under the "physics" category...but it is mathematics nonetheless. Also, I suppose "natural mathematics" has realms outside of physics (modeling, etc.)
 
  • #21


AUMathTutor said:
"Everything that takes a bunch of bits and deterministically produces another set of bits is an algorithm. XOR is an algorithm."

If that's your definition of algorithm, it's almost trivial to write a fully-functioning program which contains no algorithms.

That could be. I'd like to see one. Could you explain a small example?
 
  • #22


Consider the following C++ programs (assume #include <iostream> and using namespace std; the last few are schematic):

int main()
{
    // x could be anything...
    // hardly deterministic.
    int x;
    cout << x << endl;
}

int main()
{
    // this doesn't take anything in
    cout << "Cool" << endl;
}

int main()
{
    // or do nothing at all?
}

Anything with randomization would not constitute an algorithm. There are other examples, for instance...

int main()
{
    cout << time() << endl;
}

int main()
{
    cout << date() << endl;
}

int main()
{
    cout << file.read() << endl;
}

etc.
 
  • #23


"You have got to be kidding me. "
Do you hear yourself?

"CS is not a natural science. Sure, a good amount of logic is involved, but it is logic geared towards implementing a man-made system (hence it not being natural.)"
Do you know what CS is about? If CS is not a natural science, neither is Math. Remember, computers are to a computer scientist what telescopes are to an astronomer...

"Mathematics is a pure language used to explain natural phenomena, while computer science has a heavily invented component. "
CS is used to explain natural phenomena too. You don't need people to carry out computation; if you say you do, you have to accept that you need people for math to exist as well.

"I've never said programming is not important. Stop inventing arguments...and quit being so fricken arrogant."
Granted. I already acknowledged that I initially thought you were calling into question the importance of programming for a math major. As far as the arrogance thing goes... I think you need an attitude adjustment.

"Just because an opinion differs from your own doesn't make it the byproduct of prejudices."
So why shouldn't programming count towards a math major GPA? Shouldn't math majors know how to program? Will that not be expected of them upon graduation? Isn't the purpose of a GPA to help tell employers how much of what you're supposed to know you actually do? etc.

"I've even agreed that programming is an important thing to know, yet because I don't think it should be the only non-pure math course that is considered a part of my major-GPA I have unfounded bias? Nice."
What's your basis, then? Most college majors include in the major GPA courses offered in other departments. Check your local listings. What right do you have to complain about it? Surely you knew about it when you signed on. If not, most universities will not hold you to the new requirements.

Excuse me, but what's your freaking point? Is there one? What's the problem?
 
  • #24


AUMathTutor said:
Consider the following C++ programs (assume #include <iostream> and using namespace std; the last few are schematic):

int main()
{
    // x could be anything...
    // hardly deterministic.
    int x;
    cout << x << endl;
}

This takes data embedded in the code in format f, uses some built-in algorithms, and outputs something to the console in format g.

int main()
{
    // this doesn't take anything in
    cout << "Cool" << endl;
}

This moves thousands of bytes around, changing, rechanging, and resetting a bunch of registers before halting without, in principle, having any external effect, but in actuality scrolling the screen in some cases, which is more than doing nothing.

int main()
{
    // or do nothing at all?
}

Same as above.
Anything with randomization would not constitute an algorithm. There are other examples, for instance...

int main()
{
    cout << time() << endl;
}

int main()
{
    cout << date() << endl;
}

int main()
{
    cout << file.read() << endl;
}

etc.

Mia copa. My adjective 'deterministic' was just wrong. Other than that, our apparent point of departure is that you take 'produce' to mean something sent out of the machine, where I haven't. Interestingly, I had assumed external input was included under 'deterministic'. I'll have to refine my definition. Thanks for the help. Nobody I know personally would have a clue.
 
  • #25


In the first one, the indeterminacy comes from the fact that I don't initialize x.

In the second one, no input is provided, although a functionality is provided (I would call what cout does an algorithm, sure).

In the third one, it fails to "take in a bunch of bits", one of your conditions.

And I believe the expression is "mia colpa" in Italian, the closest to what you wrote. Unless you were going for "my cup" in Spanish.
 
  • #26


AUMathTutor said:
In the third one, it fails to "take in a bunch of bits", one of your conditions.

I'm thrilled you found ammunition in a typo.

It takes a bunch of bits from disk, interns them in memory, and changes several registers, after pointers are generated by the supervisor. Maybe you could find something far simpler than int main() {}.
 
  • #27


First of all, a "typo" is where you misspell something, like you did with "mia copa". It's not where you say something you later regret having written. That's just called an error, and we all make them, so don't get so defensive about it.

Generally speaking, one does not say that an algorithm takes its code as input. It's sort of just *there* in the context of the algorithm. Also, one does not usually consider what the algorithm requests as input to the algorithm. An algorithm's input is what you give it when you invoke it.

For instance...

Code:
CalculateSquare(x)
    result := x
    result := result * x
    return result
is an "algorithm" in that it's completely deterministic (at least in theory), whereas
Code:
CalculateSquare()
    x := readFromConsole()
    result := x * x
    writeToConsole(result)
is not an algorithm, at least in the restricted technical sense I think is usually intended. Who knows? Do you like these better than my main()'s above?
 
  • #28


yUNeeC said:
Mathematics can be used to study the natural world. I suppose this would be under the "physics" category...but it is mathematics nonetheless. Also, I suppose "natural mathematics" has realms outside of physics (modeling, etc.)
I think mathematics is just as "unnatural" as computer science. All of mathematics is based on axioms, and who invented those axioms? Right, people did. Those axioms aren't to be found in nature. In the same way, you can say that the Turing machine (or any equivalent machine) is the basis for computer science, which I guess you could see as a set of axioms, defining clear rules of what is allowed in the system and what isn't. So, in that way, mathematics and computer science are kind of equivalent, so if you call computer science "unnatural" then mathematics is also "unnatural".

Of course, mathematics can be used to describe the real world, but it only allows an approximation of what is really going on. Therefore, you cannot say that mathematics fully describes nature. For example, almost all of physics is based on differential equations, in which you make the assumption that the functions you try to find are continuous. However, all things in nature are discrete (can you find me an infinitesimally small charge dq?). A very interesting document related to this is http://www.math.rutgers.edu/~zeilberg/mamarim/mamarimPDF/real.pdf by Doron Zeilberger.

So I think that saying that mathematics is natural is silly.
 
  • #29


Wow, a nice post, yoran. I tend to agree.

Although I would consider both mathematics AND computer science natural (or at least physical) sciences in the sense that they're (a) sciences, (b) not life sciences (nothing to do with biology), and (c) not social sciences (nothing to do with how people behave). They describe, in theory, things that occur naturally... or in the realm of thought.

Perhaps a better characterization is that Math and CS constitute a separate category altogether. Difficulties arise since there exist machines which can approximate things discussed in CS (computers), so things can be tested... whereas in math, there is not really an idea of "testing" things in an empirical way (the existence of experimental mathematics makes some mathematicians very angry...)

Still, I think that basic programming is a tool that should be common to all scientists, engineers, and mathematicians.

As far as the GPA issue... if people will expect you to know it, and it's not something you are expected to know from high school, you should probably put it in the major requirements. Why? Because otherwise there's no guarantee the person will know anything about it. It seems fair to me. The real question is this: should a math major who cannot make a satisfactory grade in an introductory programming course be allowed to graduate from a math program? Should a computer scientist who cannot make an acceptable grade in calculus be allowed to graduate? etc.
 
  • #30


Mathematics and CS are not sciences, except in the third sense of the word ("a systematically organized body of knowledge of any kind"). They do not study the natural world. The use of mathematics to study the natural world is physics (or chemistry, or biology, etc.), not strictly mathematics. CS does not study the natural world either, because it instead studies man-made formal systems.

There is no single definition of algorithm. Some people say algorithms have to terminate, some people say algorithms have to express pure functions (in other words, no randomization or I/O), and some people drop all restrictions and simply describe an algorithm as a series of operations.
 
  • #31


It is well to point out that "algorithm" is a loaded word. Still, one should be consistent with the definition they give, at least in the context it is given. Ergo my providing examples of simple programs that did not constitute algorithms according to the definition given.

I think that any definition of "algorithm" should be consistent, coherent, and useful. I'm not sure defining algorithms as an arbitrary series of operations is useful... but I can definitely see what you're getting at.
 
  • #32


AUMathTutor said:
Still, I think that basic programming is a tool that should be common to all scientists, engineers, and mathematicians.

I agree with that. Computers are ubiquitous in anything related to science or engineering. You can't do science or engineering research anymore without using a computer. Even some pure mathematicians at my university use mathematical software to test hypotheses and gain insight into problems. Needless to say, these mathematical software packages require lots of scripting.
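
For instance, even a throwaway experiment like this (plain Python, purely illustrative) is the kind of quick hypothesis check I mean:

Code:
# Empirically check that the sum of the first n odd numbers equals n^2.
for n in range(1, 1001):
    assert sum(2 * k + 1 for k in range(n)) == n * n
print("holds for all n up to 1000")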

AUMathTutor said:
It is well to point out that "algorithm" is a loaded word. Still, one should be consistent with the definition they give, at least in the context it is given. Ergo my providing examples of simple programs that did not constitute algorithms according to the definition given.

I think that any definition of "algorithm" should be consistent, coherent, and useful. I'm not sure defining algorithms as an arbitrary series of operations is useful... but I can definitely see what you're getting at.

Yes, apparently there is no standard definition of "algorithm". Even on Wikipedia, there is no formal definition. I've always been taught that an algorithm is a finite sequence of operations which takes an input and produces an output. Furthermore, for any input, the program must produce an output after a finite number of steps. There is no notion of usefulness in that definition. I think it is difficult to include usefulness in the definition because it has a subjective meaning. You can't define usefulness formally.
 
  • #33


Well, I think that the definition you provide is useful.

Useful how? It lends itself to proving things about algorithms. There may be no formal definition of usefulness, but it's like porn in that you know it when you see it.

I think any good definition of algorithm should require that the algorithm be as referentially transparent as regular mathematical functions. This excludes certain kinds of things as algorithms, but isn't that the point of definitions?

I am wary of using Wikipedia as a resource in these matters. Their discussion of termination is very misleading, at least in my opinion. They seem to be saying that you cannot know for sure whether a certain algorithm will terminate, which I don't believe has ever been shown. Alan Turing showed there's no algorithm to do it, but that doesn't mean we can't do it.
 
  • #34


AUMathTutor said:
I am wary of using Wikipedia as a resource in these matters. Their discussion of termination is very misleading, at least in my opinion. They seem to be saying that you cannot know for sure whether a certain algorithm will terminate, which I don't believe has ever been shown. Alan Turing showed there's no algorithm to do it, but that doesn't mean we can't do it.
Well, the part on Wikipedia about termination says this:
There is no algorithmic procedure for determining of arbitrary algorithms whether or not they terminate from given starting states.
This is exactly the Halting problem. Of course, as you say, it is possible to look at every algorithm individually (although there are infinitely many different algorithms) and mathematically prove its termination. However, there is no single method (read: algorithm) which works for all of them. I don't think that the Wikipedia entry is misleading about that.
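
For reference, the standard argument that no such universal method exists can be sketched in a few lines of Python (halts below is a hypothetical oracle, not anything you could actually implement):

Code:
def halts(program, data):
    # Hypothetical oracle: True iff program(data) eventually terminates.
    # Turing's argument shows no such function can exist.
    raise NotImplementedError

def paradox(program):
    # Do the opposite of whatever the oracle predicts for
    # the program run on its own source.
    if halts(program, program):
        while True:
            pass  # loop forever
    # otherwise halt immediately

# Feeding paradox to itself is contradictory: if paradox(paradox)
# halts, the oracle said it loops forever, and vice versa.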
 
  • #35


It says that, but it says other things which seem to imply that a lack of an algorithmic procedure means we can't do any better than just testing every algorithm. The presentation is sloppy... I would say Wikipedia is useful for a quick and dirty intro to fundamentals, but when it gets down to details, you should look elsewhere.
 
  • #36


It would only be possible to solve the halting problem if one had access to a more powerful computer (in the formal sense) than a Turing machine. No man-made computer is more powerful than a Turing machine. You might argue that the brain is more powerful, though I would not agree.
 
  • #37


You're missing the point. The point is that there can be no correct algorithm for the problem, but the problem can still be solved on a case-by-case basis.
 
  • #38


SOME of the cases of the problem can be solved; many cannot.
 
  • #39


mXSCNT said:
SOME of the cases of the problem can be solved; many cannot.
That's an interesting question. Practically, it is impossible to prove (or disprove) termination for all algorithms because there are infinitely many. However, assume you start working by taking algorithms at random. Say you generate the random number x_1 and analyze the termination of algorithm_{x_1}. Then you generate the next random number x_2 and analyze algorithm_{x_2}, and so on... Also, you adapt your method for every specific algorithm, since otherwise you "encounter" the Halting problem.

The question is now: would you eventually encounter an algorithm for which you can neither prove nor disprove its termination? Or could you go on this way forever, successfully proving or disproving termination for every algorithm along the way?

I don't know the answer to that question and I would be curious to find out whether someone has an argument in favor of either possibility.

PS: Now I realize that I used the term "algorithm" instead of "Turing machine", which is wrong because I am talking about proving or disproving termination, and, by definition, an algorithm always terminates. So the terminology I used is kind of wrong :-) and you should read "Turing machine" instead of "algorithm".
 
  • #40


I've never seen a proof of that. I would like to point out that the proof of the undecidability of the Halting Problem doesn't show this is true.

I was under the impression that undecidable problems could be solved... you just have to solve them on a case-by-case basis. As in, there's no general-purpose way of solving them.

Could you reference a proof that there exist instances of undecidable problems which literally cannot be decided upon by human beings, in principle?
 
  • #41


Well, I don't know the answer to the question myself and thus would be very interested in a proof (which either proves or disproves the proposition). I have a feeling that it is connected to Gödel's incompleteness theorem. If you see the sentence "this algorithm finishes after a finite number of steps" as a theorem to prove, then I think that there exist algorithms for which this sentence can be neither proved nor disproved. So I would say that there exist algorithms for which termination isn't provable, even on a case-by-case basis. However, this is just a thought and I can't prove it formally.
 
  • #42


By "case by case basis" I believe you mean it is possible for a sufficiently smart mathematician to find a proof of whether any given algorithm halts or not. But that is not possible.

Suppose (for contradiction) that your mathematician is working from some finite set of logical axioms, such that algorithm x halts if and only if the fact that x halts can be proved in those logical axioms, and such that algorithm x does not halt if and only if the fact x does not halt can be proved in those logical axioms. If these logical axioms existed, it would be possible to produce an algorithm y that solves the halting problem. y could start enumerating every possible derivation using those logical axioms, starting with proofs that are 1 line long, going on to proofs that are 2 lines long, and so forth. By our assumption, y will eventually reach a proof that x halts or a proof that x does not halt, whereupon it could output that fact. Therefore y is an algorithm that would solve the halting problem. But it is known that the halting problem can't be solved by a Turing machine, so the assumption that such a set of logical axioms existed must be false.
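
In sketch form, the hypothetical algorithm y looks like this (Python-style; derivations_of_length, proves, halts_statement, and not_halts_statement are stand-ins for a real proof enumerator, proof checker, and statement constructors, which the assumed axioms would make mechanical):

Code:
def y(x):
    # Enumerate every derivation from the axioms, shortest first.
    # By assumption, some derivation proves "x halts" or
    # "x does not halt", so this search always terminates.
    length = 1
    while True:
        for proof in derivations_of_length(length):   # hypothetical enumerator
            if proves(proof, halts_statement(x)):      # hypothetical checker
                return True
            if proves(proof, not_halts_statement(x)):
                return False
        length += 1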
 
  • #43


I'm not sure I like the part in your proof where the "algorithm" recognizes the proof of the termination or the lack thereof. That seems to be just displacing the problem inherent in the halting problem to interpreting proofs; in fact, all you've shown is that interpreting a proof is undecidable if the halting problem is.

How would the algorithm know the proof was a valid proof of termination or lack thereof? I don't see it.
 
  • #44


mXSCNT said:
By "case by case basis" I believe you mean it is possible for a sufficiently smart mathematician to find a proof of whether any given algorithm halts or not. But that is not possible.

Suppose (for contradiction) that your mathematician is working from some finite set of logical axioms, such that algorithm x halts if and only if the fact that x halts can be proved in those logical axioms, and such that algorithm x does not halt if and only if the fact x does not halt can be proved in those logical axioms. If these logical axioms existed, it would be possible to produce an algorithm y that solves the halting problem. y could start enumerating every possible derivation using those logical axioms, starting with proofs that are 1 line long, going on to proofs that are 2 lines long, and so forth. By our assumption, y will eventually reach a proof that x halts or a proof that x does not halt, whereupon it could output that fact. Therefore y is an algorithm that would solve the halting problem. But it is known that the halting problem can't be solved by a Turing machine, so the assumption that such a set of logical axioms existed must be false.

I think this proof is correct. The Turing machine which implements y can see if a proof is correct in the following way. First, you assume you have a formal statement (that is, a logic expression) for "algorithm x always finishes after a finite number of steps". Let's call that expression P. Then, the machine starts generating "proofs", starting from the axioms. If, at some time, it generates the sentence P, then you have a proof for your statement. If it generates not(P), then you know that the statement is false. If you assume you can always find a proof of either the termination or non-termination of an algorithm, this process always stops after a finite time. So you have solved the Halting problem.

However, the proof is still vague about certain points which need clarification. For example, how would the Turing machine which implements y build up the "tree" of proofs? Also, how would you formally state "algorithm x always finishes after a finite number of steps"?
 
  • #45


I still don't like it. I think you can probably construct an equivalence between interpreting or generating valid proofs and solving the halting problem for a given algorithm on a given input.

I think it's well not to lose sight of the original point. The claim is that there exist algorithms for which there is no way to determine whether or not they terminate (a much stronger condition than undecidability!) I remain unconvinced, even in light of the proposed proof.

I wonder whether there exists a simple example of an algorithm for which there is no way to evaluate termination on some input (the input would also be nice). It's pleasant to talk in terms of contradiction proofs and what not, but seeing is believing.*

* There's something about all this I just don't like.
 
  • #46


Unfortunately it is not possible to supply the example you requested (and prove it is such an example). For every algorithm that terminates, there is a proof that it does so (simply run it until it finishes). So if you had a proof that there is no proof of termination for an algorithm, your proof implies the algorithm does not terminate; your proof would be a proof of non-termination.

Technically, it IS possible to supply an example of an algorithm for which there is no proof of termination or non-termination, but it is impossible to prove that the example satisfies that condition.
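
That "simply run it" certificate can itself be sketched (plain Python; step is an assumed interpreter that maps a machine state to the next state, or to None once the machine halts):

Code:
def certify_termination(step, state):
    # Semi-decision procedure: simulate one step at a time. If the
    # machine halts, the finite step count is itself a proof of
    # termination; if it never halts, this loop never returns.
    steps = 0
    while state is not None:
        state = step(state)  # assumed interpreter: next state or None
        steps += 1
    return steps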
 
  • #47


yUNeeC said:
Well, math is fundamental to pretty much everything. Computer science, at least in my opinion, is not.

Most people I work with wouldn't see the point in either, because they just play video games all day...on computers. They have no knowledge of, or interest in, the mechanics of computer hardware or software, and especially not the math behind it all.

Our world is being constantly redefined by the presence and plethora of computers. And while the vast majority can be consumers without any understanding, if you want to actually be a part of the intellectual class or produce any kind of technology, a working knowledge of CS is definitely essential.
 
  • #48


I think that's where I get nervous... when we start talking about truths couched in terms of things that we can't ever know.

It's like looking for the set of keys you can't ever find. Once you find them... they're not what you were looking for anymore.
 
  • #49


Well, whether it makes you nervous or not, the results don't depend on any weird conjectures. They are just simple logic.

One could provide examples of many programs that we don't KNOW terminate--this doesn't preclude a mathematician from at some point proving they do or don't, but they are at least candidates.

Here's one simple example:
Code:
def f(n):
    # Follow the Collatz iteration from n; return True if the
    # trajectory ever returns to n before reaching 1.
    i = n
    while i > 1:
        if i % 2 == 0:
            i = i // 2  # integer division, so i stays an int
        else:
            i = i * 3 + 1
        if i == n:
            return True
    return False

n = 1
while not f(n):
    n = n + 1
Does this terminate? Difficult to say...

You could produce many more examples of algorithms whose termination is unknown, by picking some unproved conjecture C. Then let the algorithm systematically generate every proof that follows from your axioms, and if it proves C it terminates. The algorithm terminates if and only if C is provable--so determining whether it terminates is not easy.
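
For a concrete (and famous) instance, here is a counterexample-search variant of the same idea: the program below halts if and only if Goldbach's conjecture (every even number >= 4 is the sum of two primes) is false.

Code:
def is_prime(k):
    if k < 2:
        return False
    d = 2
    while d * d <= k:
        if k % d == 0:
            return False
        d += 1
    return True

def has_goldbach_split(n):
    # True if the even number n is the sum of two primes.
    return any(is_prime(p) and is_prime(n - p) for p in range(2, n // 2 + 1))

n = 4
while has_goldbach_split(n):
    n += 2
# Reached only if some even number >= 4 is not a sum of two primes,
# i.e. only if Goldbach's conjecture fails.
print(n)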
 
  • #50


yUNeeC said:
Yeah, I'm at ECU. Interesting...I just don't understand why part of the core of the BS degree in math would be programming...but I guess that's just my luck.

Thanks for all of the help guys.

If you're going to do any form of mathematical modelling, simulation, or something related, then you will need to understand some basic programming constructs. Let's say you work in applied mathematics. You will probably need to code up a simulation. It might be scripted, but programming is programming nonetheless.
 