Real advances in Computer Science?

In summary, the conversation discusses the contributions of Computer Science (CS) to the world, both in terms of advancements in other fields and its own developments. Some major breakthroughs in CS include Shannon's information theory, Turing's computation theory, Chomsky's theory of context-free grammars, computational complexity theory, and artificial intelligence theories. However, there is a debate about the importance of these theories and whether CS has truly brought significant advancements to other fields. The conversation also mentions the impact of automation and programming errors, as well as a history of costly software errors. Overall, there is a question about the relevance and impact of CS and whether it is mainly studied for its own sake.
  • #1
Sagant
Hi,
I wanted to start this topic to discuss the real advances in Computer Science (CS) that have actually contributed to the world, both to the sciences and academia, as well as to industry and ordinary people.

Usually, the discoveries in Physics, Engineering, Chemistry and Biology are more easily reported, while math and CS stay a bit in the shadow (the exception perhaps is AI).

TL;DR: What do you think are the major contributions the field of CS has brought about? And how much importance is given to those developments?

P.S.: This post is informative only; I don't mean to start any heated discussion or flame war.
 
  • #2
There's so much overlap in fields. We have computers because of computer scientists and mathematicians, we have fast ones that fit in your pocket because of material scientists and physicists. So I assume you're asking about the purely CS stuff, not things like the work with silicon/transistors behind the computer age. My list of favorite breakthroughs would be:

1) Shannon's information theory - led to the formal notion of information that can be processed and sent over networks.
2) Turing's computation theory - led to a model of all computation (a tiny sketch of it is below).
3) Chomsky's theory of context-free grammars - led to the programming languages in which software is written.
4) Computational complexity theory - led to a deep understanding of the limits of computation.
5) Artificial intelligence theories - many of them; they have been behind most breakthroughs in computer science for decades and will probably shape the world most in the decades to come.

I'm sure there's stuff that should be there that isn't (like something having to do with the Internet), but these are some of the big ones.
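To make item 2 concrete, here's a minimal sketch of a Turing machine simulator (a toy of my own, assuming a simple dictionary-based transition table; not any historical machine). The example machine just flips every bit on the tape and halts at the first blank:

```python
def run_tm(tape, transitions, state="flip", blank=" ", max_steps=1000):
    cells = dict(enumerate(tape))   # sparse tape: position -> symbol
    head = 0
    for _ in range(max_steps):
        if state == "halt":
            break
        symbol = cells.get(head, blank)
        state, write, move = transitions[(state, symbol)]
        cells[head] = write
        head += {"R": 1, "L": -1, "N": 0}[move]
    return "".join(cells[i] for i in sorted(cells))

# Transition table for the toy machine: flip 0s and 1s, halt on blank.
flip_bits = {
    ("flip", "0"): ("flip", "1", "R"),
    ("flip", "1"): ("flip", "0", "R"),
    ("flip", " "): ("halt", " ", "N"),
}

print(run_tm("10110", flip_bits))  # -> "01001" plus a trailing blank
```

The point is that this handful of lines captures the whole model: a finite control, a tape, and a transition table, which is all the theory says any computation needs.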
 
  • #3
The automation of mundane work previously done by human 'computers' in offices is probably the biggest, followed by the damage even a small programming error can wreak on an unsuspecting community.
 
  • #4
Fooality said:
So I assume you're asking about the purely CS stuff, not things like the work with silicon/transistors behind the computer age.
Oh yes. I meant to ask about the pure CS stuff. As in, what advancements did CS alone bring to our world?
To be clear, I'm a CS major trying to understand what my field has contributed. I'm pretty unsure about what we have in fact contributed, and about the level of importance of those contributions.

I think that what you said is pretty important. However, let's take Turing's and Chomsky's theories as examples. They are useful inside CS, no doubt, but were they really necessary to evolve the field? Couldn't we still be creating algorithms and languages without this knowledge?
For example, it would be hard to build a computer without knowledge of electromagnetism. But I don't see the same happening here. I don't think the theoretical knowledge helped that much to evolve things outside CS. I may be wrong, though.

jedishrfu said:
The automation of mundane work previously done by human 'computers' in offices is probably the biggest, followed by the damage even a small programming error can wreak on an unsuspecting community.

If I remember correctly, it was a programming error that led one of NASA's rockets to explode (it was due to a precision error, or something like that).

Well, yes, automation is very important. But it wasn't exactly CS that created it. Computers eventually made it happen, but it could have happened even if all programs had to be written by engineers using FORTRAN 77, although it would have been much harder.

Thanks for the answers by the way :)
 
  • #5
Sagant said:
Oh yes. I meant to ask about the pure CS stuff. As in, what advancements did CS alone bring to our world?
To be clear, I'm a CS major trying to understand what my field has contributed. I'm pretty unsure about what we have in fact contributed, and about the level of importance of those contributions.

I think that what you said is pretty important. However, let's take Turing's and Chomsky's theories as examples. They are useful inside CS, no doubt, but were they really necessary to evolve the field? Couldn't we still be creating algorithms and languages without this knowledge?
Yes, I think so. The contribution described by @jedishrfu is much more significant, IMO.

Another advance that was very significant was the realization by John Von Neumann that computer memory could be used to hold both program code and the data it worked on.
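As a rough illustration of that stored-program idea (a toy sketch with an invented instruction set, not any real machine), here is one memory array holding both the instructions and the data they operate on:

```python
# Toy stored-program machine: code and data share the same memory.
memory = [
    ("LOAD", 5),    # 0: acc = memory[5]
    ("ADD", 6),     # 1: acc += memory[6]
    ("STORE", 7),   # 2: memory[7] = acc
    ("HALT", None), # 3
    None,           # 4: unused
    40,             # 5: data
    2,              # 6: data
    0,              # 7: result goes here
]

acc, pc = 0, 0
while True:
    op, arg = memory[pc]
    pc += 1
    if op == "LOAD":
        acc = memory[arg]
    elif op == "ADD":
        acc += memory[arg]
    elif op == "STORE":
        memory[arg] = acc
    elif op == "HALT":
        break

print(memory[7])  # 42 -- the program and its data live in the same store
```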
Sagant said:
For example, it would be hard to build a computer without knowledge of electromagnetism. But I don't see the same happening here. I don't think the theoretical knowledge helped that much to evolve things outside CS. I may be wrong, though.
If I remember correctly, it was a programming error that led one of NASA's rockets to explode (it was due to a precision error, or something like that).
What you're remembering wasn't a NASA rocket -- it was an Ariane 5 rocket (https://en.wikipedia.org/wiki/Ariane_5). There was a space shuttle (Challenger) that exploded on launch in 1986, killing all of the astronauts aboard, but that was due to an O-ring, not a programming error.
Sagant said:
Well, yes, automation is very important. But it wasn't exactly CS that created it. Computers eventually made it happen, but it could have happened even if all programs had to be written by engineers using FORTRAN 77, although it would have been much harder.

Thanks for the answers by the way :)
 
  • #6
Yes, the Ariane 5 rocket. Although (at the risk of going off-topic) I've read that it was due to the number of bits used to represent numbers and the conversion from one representation to another; a failure that could have been fixed in software.
(A link, though I don't know how trustworthy it is: https://www.google.com.br/url?sa=t&...HRTt2I78Uy0Nowwrw&sig2=jeLlxDEay9dQTpRk7cfbyw).

Back to the topic:
Okay, but as far as I can tell, research in computer science has very little impact on other areas, and most of the important discoveries date from its beginning rather than being recent.

This makes me wonder: Is CS a topic being studied only for the sake of CS? To keep the professors doing something?
 
  • #7
Not so! Much of CS relates purely to CS, partly because it can be used on many problems and partly because it's built on other CS ideas.
 
  • #8
CS undoubtedly has a great impact on our everyday life through its advancements. The thing is that its advancements have been, and are, used at large by other sciences, and there are many cases in which, without them, there would be no advancements in those sciences at all; yet this often goes unnoticed by most people. I'm not talking about technology and implementations in general. Those are more or less the extensions (with all the necessary modifications and adaptations that an implementation demands), and they also, undoubtedly in my opinion, have their own great impact.

I'll second Mark about the Von Neumann model, as this had a very deep impact on every computing machine out there. Also, the list that Fooality gave contains things with really great impact.

Sagant said:
However, let's take Turing's and Chomsky's theories as examples. They are useful inside CS, no doubt, but were they really necessary to evolve the field? Couldn't we still be creating algorithms and languages without this knowledge?

You can't tell for sure what we could have created without these, because that just didn't happen. As we say in the theory of algorithms, the eternal question is "Can we do better?", but unless we actually find that better way, we can't really tell.

Also, the advancements in CS are subject to advancements in other sciences (i.e. the other way around), with math being the most profound, in my opinion. Great algorithms and algorithmic techniques go hand in hand and alternate between CS and math. Because CS is really all about algorithms, the impact lies at a low level, but it is very fundamental and essential to many aspects of everyday life.

As for now, new advancements are achieved frequently, but they go unnoticed by many people until some new product hits the market. Then technology prevails, but there would be no technology without CS. To be fair, technology also plays a major role, as it is the vehicle (regarding experiments and products) that helps CS advance.
 
  • #9
Another major achievement was the binary floating-point format; without it we'd still be doing either integer or fixed-decimal arithmetic.

Konrad Zuse was the first to implement it in his mechanical digital computers.

https://en.m.wikipedia.org/wiki/Floating_point
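For anyone curious, here's a small sketch (standard library only; the example values are arbitrary) showing what the binary floating-point format actually stores: every IEEE 754 double is one sign bit, 11 exponent bits, and 52 fraction bits, which is why huge and tiny magnitudes fit in the same 64-bit word, unlike fixed-decimal arithmetic.

```python
import struct

def double_bits(x):
    # Reinterpret the 8 bytes of an IEEE 754 double as a 64-bit integer.
    (bits,) = struct.unpack(">Q", struct.pack(">d", x))
    s = f"{bits:064b}"
    return s[0], s[1:12], s[12:]   # sign bit, 11 exponent bits, 52 fraction bits

for x in (0.1, 6.02e23, 1e-22):
    sign, exponent, fraction = double_bits(x)
    print(f"{x!r:>10}  sign={sign}  exponent={exponent}  fraction={fraction[:20]}...")

# With a fixed-decimal layout (say, two decimal places in 64 bits) you could not
# hold 6.02e23 and 1e-22 in the same slot: one overflows, the other rounds to zero.
```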
 
  • #10
I am not a CS guy, but this looks interesting.

Sagant said:
research in computer science has very little impact on other areas, and most of the important discoveries date from its beginning rather than being recent.
If you are interested in things in non-CS fields affected by CS advances, I would point to the use of CS (though maybe these would not be strictly CS in the way you intend) in the analysis of many things in modern biology: imaging, genome analysis, and predicting protein structure. These are all very data-intensive and would not be possible otherwise.

Fooality said:
3) Chomsky's theory of context-free grammars - led to the programming languages in which software is written.
Chomsky from linguistics? I was unaware of his impact on CS.
 
  • #11
jedishrfu said:
Not so! Much of CS relates purely to CS, partly because it can be used on many problems and partly because it's built on other CS ideas.

QuantumQuest said:
CS undoubtedly has a great impact on our everyday life through its advancements. The thing is that its advancements have been, and are, used at large by other sciences, and there are many cases in which, without them, there would be no advancements in those sciences at all; yet this often goes unnoticed by most people. I'm not talking about technology and implementations in general. Those are more or less the extensions (with all the necessary modifications and adaptations that an implementation demands), and they also, undoubtedly in my opinion, have their own great impact.

I see. So basically the CS ideas and developments stay behind the scenes, and the developments of those who use these ideas are the ones that really reach the general public.

But then, not to be too specific, what advances in CS do you think helped in, say, Physics or Engineering?
QuantumQuest said:
I'll second Mark about the Von Neumann model, as this had a very deep impact on every computing machine out there. Also, the list that Fooality gave contains things with really great impact.

You can't tell for sure what we could have created without these, because that just didn't happen. As we say in the theory of algorithms, the eternal question is "Can we do better?", but unless we actually find that better way, we can't really tell.

Also, the advancements in CS are subject to advancements in other sciences (i.e. the other way around), with math being the most profound, in my opinion. Great algorithms and algorithmic techniques go hand in hand and alternate between CS and math. Because CS is really all about algorithms, the impact lies at a low level, but it is very fundamental and essential to many aspects of everyday life.

As for now, new advancements are achieved frequently, but they go unnoticed by many people until some new product hits the market. Then technology prevails, but there would be no technology without CS. To be fair, technology also plays a major role, as it is the vehicle (regarding experiments and products) that helps CS advance.

Oh, but math also gets a bit in the shadow when talking about public news of science and development. In Physics or Biology people still have a clue of what's going on, but math and CS are left out. I think they don't have the appeal the others have. Even with new products, most people (at least those outside STEM) are unsure who brought about the advancements; sometimes the credit even goes to the company.

Speaking of algorithms and other such areas: do you think the coming of quantum computers can bring a revolution to CS? I mean, most of the knowledge we have so far would have to be redesigned to fit quantum ideas. Some areas could even disappear (those studying heuristics to solve NP-hard problems). In that case, could we say that CS is very bound to the current technology, and that its knowledge does not necessarily survive over time?
BillTre said:
I am not a CS guy, but this looks interesting. If you are interested in things in non-CS fields affected by CS advances, I would point to the use of CS (though maybe these would not be strictly CS in the way you intend) in the analysis of many things in modern biology: imaging, genome analysis, and predicting protein structure. These are all very data-intensive and would not be possible otherwise. Chomsky from linguistics? I was unaware of his impact on CS.

I've heard of it being used in Biology, along with some optimization ideas too, especially in bioinformatics. I'll take a look.

Some big-data and data-intensive problems could be tackled by statistics or math, I guess.

Yes, Chomsky from linguistics. He did bring some knowledge about grammars to CS, allowing the development of things like compilers.

And here comes another point. Most big discoveries are from people originally outside of CS. Chomsky is an example, Turing another. Even Dijkstra was from the math field, as were Von Neumann, Knuth and others.
This usually gets me wondering whether CS majors have the skill to develop big things in the field, or whether people from other areas are better at it.
 
  • #12
Sagant said:
This usually gets me wondering whether CS majors have the skill to develop big things in the field, or whether people from other areas are better at it.
Back in the days when Dijkstra, Von Neumann, and Knuth were doing their work, computer science wasn't a field of study. Even as late as the 1970s, I don't believe that many universities had departments that specialized in so-called computer science. Several of the CS-related classes I took back then were taught by Math professors. So it's not that people from other areas are better; there simply weren't many people back then whose specialty was computer science.
 
  • #13
EE professors handled the computer engineering part and often the Boolean logic topics.

Are you writing a paper on this topic? Or is this just an interest?
 
  • #14
Sagant said:
I think that what you said is pretty important. However, let's take Turing's and Chomsky's theories as examples. They are useful inside CS, no doubt, but were they really necessary to evolve the field? Couldn't we still be creating algorithms and languages without this knowledge?
For example, it would be hard to build a computer without knowledge of electromagnetism. But I don't see the same happening here. I don't think the theoretical knowledge helped that much to evolve things outside CS. I may be wrong, though.

I mean, it comes down to questions about whether mathematical truth is discovered or created. I think it's discovered, so if you didn't have Turing, whose model describes all computation, you'd basically have someone else discover the same model, and the same mathematical truths like the halting problem, etc. I think they are universal; without Turing they would have been discovered under some other name. With Chomsky's grammars, it's hard to say that with as much certainty... Is there another mathematical formulation of languages that a computer can quickly parse? Possibly. But Chomsky's theory is what they had, so programming languages are specified by CFGs, and so all programming languages rely on this idea of Chomsky's, and so pretty much all programs that run on a computer are based on them.

The main thing is there are underlying mathematical truths about computing you can't get around. If a function is uncomputable by a universal Turing machine, it's not computable by any other computer you can build. Eventually, whatever approach was taken, you'd run into these laws of computer science, which is what the field is all about.
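To make the grammar point a bit more tangible, here's a toy context-free grammar and a recursive-descent parser for it (my own minimal sketch; real compilers apply the same idea at a much larger scale):

```python
# Grammar:  Expr -> Term ('+' Term)*     Term -> NUMBER | '(' Expr ')'
# The parser follows the grammar rule by rule and evaluates as it goes.

def parse_expr(tokens, i=0):
    value, i = parse_term(tokens, i)
    while i < len(tokens) and tokens[i] == "+":
        rhs, i = parse_term(tokens, i + 1)
        value += rhs
    return value, i

def parse_term(tokens, i):
    if tokens[i] == "(":
        value, i = parse_expr(tokens, i + 1)
        assert tokens[i] == ")", "expected ')'"
        return value, i + 1
    return int(tokens[i]), i + 1

tokens = ["(", "1", "+", "2", ")", "+", "3"]
print(parse_expr(tokens)[0])  # 6
```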
 
  • #15
The cost-effective speed and memory increases are huge. I can now run codes on my notebook that used to require a supercomputer.

Look at the link in my sig. There is no real need for FFTs anymore in many data analysis applications. The "slow" Fourier transform is now fast enough, and it can be more accurate than a "fast" Fourier transform for some applications in data analysis.

Not to mention that element-based modeling can handle much finer spatial and time grids than in the past.
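As a quick illustration of the "slow transform is now fast enough" point (a sketch assuming numpy is available; the size is arbitrary), a direct O(N^2) DFT agrees with numpy's FFT and runs essentially instantly at this scale on a laptop:

```python
import numpy as np

def slow_dft(x):
    # Direct O(N^2) discrete Fourier transform, written as a matrix product.
    N = len(x)
    n = np.arange(N)
    k = n.reshape(-1, 1)
    return np.exp(-2j * np.pi * k * n / N) @ x

x = np.random.default_rng(0).standard_normal(512)
print(np.allclose(slow_dft(x), np.fft.fft(x)))   # True
```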
 
  • #16
Okay, it makes sense that those important figures were from outside CS because of the times. However, today I don't hear a lot about new breakthrough findings in CS, or about great names in the field.

I'm a CS major. I'm about to start my MSc degree, also in CS, but doubt has struck me on the matter: is there hope of doing something meaningful in CS? And then I went even further: what are the real advances in CS that impacted the world?
I always liked science and everything about it. I would really love to study physics, but when choosing a degree I ended up with CS. Now I wonder what I could achieve with that. I started this thread to help not only me, but all those who perhaps are in doubt, or maybe are curious about the field.

I too think that math is discovered rather than created. It's a science to me as well, although not all areas of CS are "mathy".

I can see that perhaps the impact is greater than I was thinking. And you did help me realize that.

But, returning to a previous question of mine: what will happen when quantum computing arrives? I think it will drastically change everything in CS, and most of the knowledge built so far could then be rendered useless. I see it as hard times for CS, because quantum computing seems much more a physics field, not to mention a superior type of computing that perhaps won't need any speedups or fancy algorithms to solve problems quickly. We don't even know if Turing's theory will hold completely. What do you think of this advancement?
 
  • #17
Quantum computing will initially be tacked on to our existing computer infrastructure, much like how floating-point coprocessors were added to microprocessors years ago to augment them with more powerful computing capability and to eliminate doing floating-point math in software.

In other words, we would interact with traditional machines that will utilize quantum computing components and report back the results.

Here's one such example:

http://www.research.ibm.com/quantum/

and here's another one:

http://www.quantumplayground.net/#/home

Lastly, here's a list of others to check out:

https://www.quantiki.org/wiki/list-qc-simulators

So to answer your question, we will still need CS people but they will now be superpositions of real programmers entangled with other programmers much like programming teams today.
 
  • #18
Sagant said:
I'm a CS major. I'm about to start my MSc degree, also in CS, but doubt has struck me on the matter: is there hope of doing something meaningful in CS?
...
But, returning to a previous question of mine: what will happen when quantum computing arrives? I think it will drastically change everything in CS, and most of the knowledge built so far could then be rendered useless. I see it as hard times for CS, because quantum computing seems much more a physics field, not to mention a superior type of computing that perhaps won't need any speedups or fancy algorithms to solve problems quickly. We don't even know if Turing's theory will hold completely. What do you think of this advancement?

Hey Sagant, I think you're asking the right questions. My major was in CS too, and as a guy who loves to ask the deep questions, I've found myself more and more of an armchair physics fan, though I didn't study it, especially trying to grasp quantum computers. If you ask me what sort of ideas will really define the future of computer science, my guess would be something like Constructor theory, proposed by David Deutsch and Chiara Marletto at Oxford.
http://constructortheory.org/
My understanding is this theory allows a unified narrative through which physical processes can be viewed as computations/information processes, and vice versa, in a way which includes quantum computation (in which Deutsch is a pioneer), and apparently addresses some other issues in physics. As computing becomes more physical, with robotics and questions of what architectures best serve AI algorithms, it becomes more and more about physical processes, about computers integrating seamlessly with the physical world, and more about theories in which unfolding physical processes and computations can be seen through the same lens. We don't fully have such a theory now, thus computer science and physics are different fields. But if you're feeling drawn to mixing some other science studies, like physics, into your CS degree, my advice is DO IT! The CS info is good (for instance, classical computation is a subset of what quantum computers can do, so it will still be valid in the future), but I honestly think you'll be better prepared for what's to come with a good dose of physics mixed in.
 
  • #19
From what I've read, Traveling Salesman Problems (especially those that don't obey the triangle inequality) are expected to be intractable whether done on a classical von Neumann architecture or a quantum computer.

Despite the lack of proof for ##P \neq NP##, people are virtually certain this is the case on a classical computer. People don't have quite the same degree of confidence for quantum computers, but again, from what I've read, TSP is still expected to be intractable.

The immediate uses for quantum computers seem to be quantum physics and integer factorization. Though in a manner like money flowing into graphics cards over the past couple of decades for video games (but now GPUs are shockingly useful in inference problems -- especially deep learning), I suspect new and somewhat surprisingly potent uses will come up over time.
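For a feel of why exact TSP is considered intractable, here's a brute-force sketch (standard library only; the city coordinates are made up). It is fine for 8 cities, but the number of tours grows factorially, which is the wall both classical machines and, as far as anyone knows, quantum machines run into:

```python
from itertools import permutations
from math import dist, factorial

def brute_force_tsp(points):
    # Fix the first city and try every ordering of the rest: (n-1)! tours.
    start, rest = points[0], points[1:]
    best = float("inf")
    for perm in permutations(rest):
        tour = (start,) + perm + (start,)
        length = sum(dist(a, b) for a, b in zip(tour, tour[1:]))
        best = min(best, length)
    return best

cities = [(0, 0), (1, 5), (4, 2), (6, 6), (3, 1), (7, 0), (2, 8), (5, 4)]
print(round(brute_force_tsp(cities), 2))   # optimal length for 8 cities, instant
print(factorial(29))                        # tour orderings for just 30 cities
```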
 
  • #20
jedishrfu said:
So to answer your question, we will still need CS people but they will now be superpositions of real programmers entangled with other programmers much like programming teams today.

First of all, thanks for pointing out the links; they make very good material on the subject :)
As for the quoted text: yes, okay, I don't expect the "CS guy programming a bunch of code" to go away so soon, with or without QC. However, to be honest, I didn't major in CS to become a coder, as they say. What I am afraid of (in fact I didn't make it clear) is what the future of CS research would be. For example, I work (today) in theoretical computer science, tackling hard optimization problems and data analysis, which all need efficient algorithms, usually heuristics, to solve in reasonable time.

Some people say QC could solve these problems quickly. Some say only a part would be solvable by QC. And a smaller group even says that it won't make much difference to those problems, only to the classical ones (factorization, Grover's search and quantum simulation).

I'd like to share an article I saw some time ago. It is a bit outdated (2008), but if what it says is true, then my concern would be unfounded, and in fact QC would not be able to solve all these problems efficiently, only part of them. Link: http://www.cs.virginia.edu/~robins/The_Limits_of_Quantum_Computers.pdf

Fooality said:
Hey Sagant, I think you're asking the right questions. My major was in CS too, and as a guy who loves to ask the deep questions, I've found myself more and more of an armchair physics fan, though I didn't study it, especially trying to grasp quantum computers.
...
But if you're feeling drawn to mixing some other science studies, like physics, into your CS degree, my advice is DO IT! The CS info is good (for instance, classical computation is a subset of what quantum computers can do, so it will still be valid in the future), but I honestly think you'll be better prepared for what's to come with a good dose of physics mixed in.

Nice to see some fellows from the CS field :)
I hadn't heard about constructor theory; thanks for pointing it out, I'll read up on it.
Yeah, I would love to add some Physics (and more math too!) to my degree and knowledge. However, I would have to learn it by myself (which is a bit hard and prone to errors), or start over with a whole new degree (yup, 4 years), because here in my country we don't have things like major/minor; it is a "fixed" degree program, with not much freedom to take classes outside yours (all I could get was Physics I - Mechanics, though it was pretty cool). A horrible way to do it, I know :(
StoneTemplePython said:
From what I've read, Traveling Salesman Problems (especially those that don't obey the triangle inequality) are expected to be intractable whether done on a classical von Neumann architecture or a quantum computer.

Despite the lack of proof for ##P \neq NP##, people are virtually certain this is the case on a classical computer. People don't have quite the same degree of confidence for quantum computers, but again, from what I've read, TSP is still expected to be intractable.

The immediate uses for quantum computers seem to be quantum physics and integer factorization. Though in a manner like money flowing into graphics cards over the past couple of decades for video games (but now GPUs are shockingly useful in inference problems -- especially deep learning), I suspect new and somewhat surprisingly potent uses will come up over time.

Yes, I've learned it this way too. However, as some people point out, this is not exactly proven; it is believed to be so, but it could also be that QC would solve all NP-complete / NP-hard problems and make the world an easier one. On the other hand, even though it is theoretically intractable, as in, there would be no polynomial-time algorithm to solve it, perhaps a QC would be so insanely fast that even large instances of the TSP (say, 10^9 nodes) could be solved quickly with simpler algorithms - again, this is not known, I'm just making an assumption.

I think one of the doubts I have is about the strength of CS to survive major technological advances like this. It is such a new field (as you guys said) that sometimes I'm not certain how much CS can stand on its own when it comes to this. It is not like math or physics, which have been around for ages and ages, where every new discovery opens more questions than answers. I'm not sure it works this way in CS, though I'm also only starting my academic career, and I may be saying some very wrong stuff here.

(Have we gone off-topic?)
 
  • #21
Do we count the development of various programming languages as part of Computer Science? There can be controversies over the pros and cons of the various programming languages, but generally speaking there has been progress in developing new languages tailored to specific technologies - e.g., database query languages, languages for creating web pages, etc. We could also count the development of encryption schemes as part of Computer Science.
 
  • #22
I would say programming languages illustrate CS concepts of the day.

As an example, OO came into being with Smalltalk. Folks liked the concept of message passing, and so C programmers started using message structs to carry state, which led to Objective-C and then to C++ (a toy sketch of the message-passing idea is below).

The usual language development path is the result of someone supremely unhappy with the present languages going on to design a better alternative: Awk to Perl to Python to Ruby, or Java to Groovy to Scala, or JavaScript to CoffeeScript to Elm, as examples.
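Here's a toy rendering of that message-passing idea (purely illustrative; Smalltalk, Objective-C, and C++ each realize it differently): an "object" is just some state plus a table of messages it responds to.

```python
def make_counter():
    # An "object" as state plus a message handler, in the Smalltalk spirit.
    state = {"count": 0}
    def receive(message, *args):
        if message == "increment":
            state["count"] += args[0] if args else 1
        elif message == "value":
            return state["count"]
        else:
            raise ValueError(f"does not understand: {message}")
    return receive

counter = make_counter()
counter("increment")
counter("increment", 5)
print(counter("value"))   # 6
```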
 
Last edited:
  • #23
I think programming languages could be said to be developments of CS. In fact, there is a huge number of languages nowadays, for all sorts of tastes. From the "applied" point of view, this is good, because it allows people to develop in different environments, for the different purposes that a certain language allows. From the "theoretical" side, it doesn't really make much difference, since most widely used languages are Turing complete, hence equivalent.

I wanted to add something here. Today I watched a video about AI and how it can impact future society and jobs. The name is "Humans Need Not Apply", available on YouTube here.

Do you think that this CS advancement in AI (though it will still take some time) can actually happen? I mean, even if the impact were not as huge as the video predicts, it could still be enormous enough to kick a lot of people out of their jobs. This is especially bad when considering developing countries (I'm from one), where a great majority of people have very mechanical and easily computerized jobs.
 
  • #24
That's an awesome YouTube video. Thanks for sharing it here!
 
  • #25
Stephen Tashi said:
Do we count the development of various programming languages as part of Computer Science? There can be controversies over the pros and cons of the various programming languages, but generally speaking there has been progress in developing new languages tailored to specific technologies - e.g., database query languages, languages for creating web pages, etc.

Sagant said:
I think programming languages could be said to be developments of CS. In fact, there is a huge number of languages nowadays, for all sorts of tastes. From the "applied" point of view, this is good, because it allows people to develop in different environments, for the different purposes that a certain language allows. From the "theoretical" side, it doesn't really make much difference, since most widely used languages are Turing complete, hence equivalent.

In my opinion, only the design and concepts are part of CS. Now, this may seem obvious at first glance, but what I'm saying is that fierce market competition is what mostly drives the implementations of programming languages - especially nowadays, so big software companies are the driving force from some point on. The concepts regarding programming paradigms are more or less already established, as are the building blocks of each programming language (syntax, grammar, etc.). What has created the explosive rate of new languages hitting the market is that they are tied to specific platforms (regarding software and hardware), and this is no longer a CS thing. Of course, the positive side of this is the feedback given back to CS.

Sagant said:
I wanted to add something here. Today I watched a video about AI and how it can impact future society and jobs. The name is "Humans Need Not Apply", available on YouTube here.

Do you think that this CS advancement in AI (though it will still take some time) can actually happen? I mean, even if the impact were not as huge as the video predicts, it could still be enormous enough to kick a lot of people out of their jobs. This is especially bad when considering developing countries (I'm from one), where a great majority of people have very mechanical and easily computerized jobs.

The video is about what is happening now in terms of technological advancements. While that is true regarding technology itself, it does not address whether all that's implied could actually happen. Besides technology - I'll use the term "technology" throughout for the various implementations of designs/advancements in AI, and in any science for that matter, that gradually replace us humans - there is the concept of the economy, as well as the sum of individual economies, i.e. the global economy.

Supposing that things go as described, there would be tens (eventually hundreds or more) of millions of people becoming unemployed. The first question that comes to mind is: where will the companies producing goods or services, or both, sell them? It may be that everything will be automated, but we can't be exiled from the planet. So, with most people being poor, who's going to buy all of this? I cannot really imagine a big company buying its own goods or selling them to another company. All technological advancements and their products include the human factor - as a consumer. Now, does this make any kind of sense? I'm afraid not. Where all this could lead, in my opinion, is to most people getting back to traditional jobs, so it is rather technology that will gradually vanish, if we follow this scenario.

A second question is how individual countries will develop under the conditions described in the video. A very important part is science and technology, but the human factor is also intimately related. The goal of science and technology is to grow the economy and develop a country. So, at least as I see it, this is also a contradiction.

Finally, I can't see how most people would stand living with only the absolute basics - if at all - in a world governed by automated machines.

At the other extreme, the vanishing of technology is not something that can really happen. All the great things that science has given us through technology are things that no one, in my opinion, wants to abandon. So, there will be some "middle point" - speaking in a broad sense - at which we can live using technology.

So, my point is that there are things beyond technology that constrain unlimited development and spread.
 
  • #26
Sagant said:
However, today I don't hear a lot about new breakthrough findings in CS, or about great names in the field.
I'll make a blunt statement: there are none. All the breakthroughs were made before the '80s, and I include "computer languages" and of course the WWW.

The problem is to apply some metric to evaluate the landscape you want to describe. The only truly "new" thing that appeared in CS after the '80s was a direct spawn of gaming. That special hardware required a new type of thinking (the GPU), and when that next revolution came to an end, you got back to square one and enjoyed it on a "hyper" text protocol. Like this, or that. If games and entertainment hadn't done it, one may argue that science would have created it on its own (simulation, simulation, simulation), or maybe scientists would have rolled up their sleeves to create new bits of math, instead of relying on those non-evolving four CPU operations (load-store-add-jmp) to brute-force their way into solutions.

Let's get real, there is no such thing as CS. There are a billion geeks who enjoy the show, and a million nerds who all re-invent the wheel while pretending not to (but with romance, alone in a garage against the rest of the world).

Religious (err, sorry, language) wars rage on, from interpreted to compiled, from imperative to functional, more or fewer brackets, more or fewer parentheses, in fads that last 10 months on average.

The market has ruined all possible evolution, but fortunately open source drives the few "innovations" worth mentioning.
I really recommend reading the whole piece; it hurts so much that it's funny :biggrin:
 
  • #27
Boing3000 said:
The market has ruined all possible evolution, but fortunately open source drives the few "innovations" worth mentioning.
I really recommend reading the whole piece; it hurts so much that it's funny :biggrin:
Hilarious! I didn't realize that jQuery was so "2013".
 
  • #28
Sagant said:
...
Do you think that this CS advancement in AI (though it will still take some time) can actually happen? I mean, even if the impact were not as huge as the video predicts, it could still be enormous enough to kick a lot of people out of their jobs. This is especially bad when considering developing countries (I'm from one), where a great majority of people have very mechanical and easily computerized jobs.

That's part of the bigger picture of where I think it's all going, which involves a lot of other fields. People think of AI as being about how we think, but in a broader sense it's about synthesizing a function with the same inputs and outputs as some natural function. So, for example, insofar as a real estate pro takes a bunch of inputs about a house to assess a value, they are performing a function with inputs (info about the home) and outputs (price range). The AI will learn to simulate, inside the computer, the function that was previously outside the computer in the mind of the professional. That professional is now replaceable. But it works for any function, any natural system, not just minds. How will a chemical system evolve in time? How will a forest ecosystem develop? How will an economy grow? A good AI can emulate any natural system with enough training. So true AI in a sense is the art of universal simulation: being able to make good decisions is secondary to knowing what will happen when a decision is made, accurately simulating the scenario well enough to predict the outcome.

This leads us to some cause for hope: I believe the other poster was right about the economy crashing if people aren't getting paid because they've been replaced by machines, but nothing is better equipped to predict, and therefore avoid, this outcome than an AI capable of simulating economies.
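A bare-bones version of that "learn the professional's function" idea, with made-up data and numbers purely for illustration: fit a linear model mapping (square meters, rooms) to price, then query it the way you would query the expert.

```python
import numpy as np

rng = np.random.default_rng(1)
sqm = rng.uniform(40, 200, size=200)    # house size in square meters (synthetic)
rooms = rng.integers(1, 6, size=200)    # number of rooms (synthetic)
# The "expert's" pricing function plus noise -- the thing we want to learn.
price = 1500 * sqm + 8000 * rooms + rng.normal(0, 5000, size=200)

X = np.column_stack([sqm, rooms, np.ones_like(sqm)])
coef, *_ = np.linalg.lstsq(X, price, rcond=None)   # least-squares "training"

print(coef)                      # roughly [1500, 8000, ~0]: the learned function
print(X[0] @ coef, price[0])     # the model's estimate vs. the observed price
```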
 
  • #29
Boing3000 said:
I'll make a blunt statement: there are none. All the breakthroughs were made before the '80s, and I include "computer languages" and of course the WWW.

The problem is to apply some metric to evaluate the landscape you want to describe. The only truly "new" thing that appeared in CS after the '80s was a direct spawn of gaming. That special hardware required a new type of thinking (the GPU), and when that next revolution came to an end, you got back to square one and enjoyed it on a "hyper" text protocol. Like this, or that. If games and entertainment hadn't done it, one may argue that science would have created it on its own (simulation, simulation, simulation), or maybe scientists would have rolled up their sleeves to create new bits of math, instead of relying on those non-evolving four CPU operations (load-store-add-jmp) to brute-force their way into solutions.

Let's get real, there is no such thing as CS. There are a billion geeks who enjoy the show, and a million nerds who all re-invent the wheel while pretending not to (but with romance, alone in a garage against the rest of the world).

Religious (err, sorry, language) wars rage on, from interpreted to compiled, from imperative to functional, more or fewer brackets, more or fewer parentheses, in fads that last 10 months on average.

The market has ruined all possible evolution, but fortunately open source drives the few "innovations" worth mentioning.
I really recommend reading the whole piece; it hurts so much that it's funny :biggrin:

These are some harsh words. Why do you think there is no such thing as CS? How does research continue, then?

On the matter of advancements, yes, there are a lot of things going on in computer graphics, and in the application of GPUs to many fields as potential speedups.

In my searching and reading, I have now found several applications of CS, though one might argue whether they are meaningful or not. Image processing has allowed industry to make product quality checks entirely computer-based (much faster). Advances in networks allow better protocols and faster message sending, not to mention error correction and database fault tolerance. Combinatorial optimization has a great impact on scheduling, routing and packing (Amazon uses such techniques). Not to mention, obviously, the advances in AI which have been put forward.

I don't think CS is about discussing languages and learning JavaScript. Further, I don't think CS is about programming languages at all, even though some study them.

I could be mistaken. What do you guys think? Is CS as useless as suggested? Would the money invested in CS be better spent in, say, engineering?
 
  • #30
Sagant said:
These are some harsh words. Why do you think there is no such thing as CS? How does research continue, then?
Research did not continue. Research died in the late '70s for two reasons:
-One is economic/political. There was no interest anymore in paying the most gifted and talented researchers to come up with unneeded and totally crazy ideas (like the mouse), with absolutely no goal or objective; which is what research meant back then. That did not only apply to CS but to every other discipline. (I will just remind you that you could board planes going 3 times faster than those we have nowadays.)
-The second one is the only scientifically important truth: everything significant will eventually have been invented, so there will always be an innovation peak, and after that you have to put in more and more effort that yields less and less result.

Sagant said:
On the matter of advancements, yes, there are a lot of things going on in computer graphics, and in the application of GPUs to many fields as potential speedups.
Speedups? Speed of what? Your brain can handle only 60 FPS.
In my experience the GPU (or what was called a custom chipset back then) is the only qualitative innovation, because garage kids could now tap into it to create genuinely new art forms (like Winamp visualizers), new paradigms if you will (that word has been abused so much since then that it is a laughing stock now).

Sagant said:
In my searching and reading, I have now found several applications of CS, though one might argue whether they are meaningful or not. Image processing has allowed industry to make product quality checks entirely computer-based (much faster).
Of course, if computers can relieve some people of the most menial and boring tasks, that is a good thing (at least from my point of view). But I don't know if "science" is ever concerned with this. Technology could be, but then the fact is that those "product quality checks" have probably been used to improve this

Sagant said:
Advances in networks allow better protocols and faster message sending, not to mention error correction and database fault tolerance. Combinatorial optimization has a great impact on scheduling, routing and packing (Amazon uses such techniques).
The quality of communication has decreased by every metric possible. The quantity (and vacuity), on the other hand, has literally exploded.
So I don't know when those better and faster protocols are used, but certainly not when I phone or watch TV. Nowadays, I don't have to call another continent to get a crappy line; calling my neighbor will do. And if a packet arrives in 4 days instead of 6, it is not really going to change anybody's life.

Sagant said:
Not to mention, obviously, the advances in AI which have been put forward.
There is no AI on the surface of the Earth (humans aside :wink:). Seriously though, I have no idea why AI would be put forward.

Sagant said:
I don't think CS is about discussing languages and learning JavaScript. Further, I don't think CS is about programming languages at all, even though some study them.
That may be true. Maybe you should define more specifically what CS means to you, if it is "not software technology". Maybe by CS you meant things like this? Another thing could be "information theory", but that is for serious people like mathematicians, not for people getting their hands dirty assembling things by hand (for whatever reason, we still use keyboards).

Sagant said:
I could be mistaken. What do you guys think? Is CS as useless as suggested? Would the money invested in CS be better spent in, say, engineering?
Pointless? No, not at all. Don't get me wrong. I would like to see people doing research for the sake of it (that is: not trying to solve a problem).
What I am saying is that there is no actual money spent on computer science because it is all spent on computer engineering.

I would like a real science genius (in information theory) to explain how such a complex image can be perfectly "compressed" into 959 bytes (and really only 6 bits of those bytes are actually used...), or a psychologist to explain why we are so dumb as to use PNG, or worse JPEG, for that.
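If the 959-byte example is the kind of demoscene-style image I think it is (an assumption on my part, since the link isn't shown), the trick is usually procedural generation: the file stores a tiny formula rather than pixels. A sketch that writes a full 256x256 grayscale pattern from one line of math:

```python
# Procedural image sketch: the "content" is the formula, not stored pixels.
# Writes a valid binary PGM file; the pattern itself is an arbitrary toy choice.
width = height = 256
with open("rings.pgm", "wb") as f:
    f.write(b"P5 256 256 255\n")
    for y in range(height):
        row = bytes(((x * x + y * y) // 8) % 256 for x in range(width))
        f.write(row)
```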
 
  • #31
Boing3000 said:
Research did not continue. Research died in the late '70s for two reasons:
-One is economic/political. There was no interest anymore in paying the most gifted and talented researchers to come up with unneeded and totally crazy ideas (like the mouse), with absolutely no goal or objective; which is what research meant back then. That did not only apply to CS but to every other discipline. (I will just remind you that you could board planes going 3 times faster than those we have nowadays.)
-The second one is the only scientifically important truth: everything significant will eventually have been invented, so there will always be an innovation peak, and after that you have to put in more and more effort that yields less and less result.

Fair enough, but then it is not CS's fault; it is just the investment in research inside universities, though in companies it continues - my advisor worked at AT&T in their R&D group. And it is also not true worldwide; perhaps it is in the US (or whichever country you are referring to).

Boing3000 said:
Speedups? Speed of what? Your brain can handle only 60 FPS.
In my experience the GPU (or what was called a custom chipset back then) is the only qualitative innovation, because garage kids could now tap into it to create genuinely new art forms (like Winamp visualizers), new paradigms if you will (that word has been abused so much since then that it is a laughing stock now).

I think you have some misconception about the usage of GPUs. First, the speedup is not only to reach higher FPS, but to allow more and better scenes to be rendered, when we are talking purely about animation and games.

However, GPUs have long since expanded their main usage from computer games to scientific computing and research. At my university, the High Performance Computing group uses them to build HPC applications. The mathematics department uses GPUs to accelerate their simulations and calculations. The physics department uses them too, and in fact had to ask the CS department for help a couple of times, as I discovered recently. Not to mention the available supercomputers built around GPUs, such as those here or the famous Titan. Some biotechnology groups also use GPUs for their protein-docking predictions and other work.

So GPUs are not about "garage kids" and nerds playing games anymore; their market has grown a lot into other scientific fields, due to their natural power at handling calculations.

Boing3000 said:
Of course, if computers can relieve some people of the most menial and boring tasks, that is a good thing (at least from my point of view). But I don't know if "science" is ever concerned with this. Technology could be, but then the fact is that those "product quality checks" have probably been used to improve this

Not really only to improve so-called obsolescence. Lots of food companies use it to avoid shipping rotten food or objects with fabrication errors, which can avoid a lot of headaches for the final customers. Also, image processing techniques are used by some astrophysicists to help detect galaxies, stars, planets or other bodies in outer space - which I also learned only recently.

Boing3000 said:
The quality of communication has decreased by every metric possible. The quantity (and vacuity), on the other hand, has literally exploded.
So I don't know when those better and faster protocols are used, but certainly not when I phone or watch TV. Nowadays, I don't have to call another continent to get a crappy line; calling my neighbor will do. And if a packet arrives in 4 days instead of 6, it is not really going to change anybody's life.

That is not due to CS either; it is due to the number of people using telecommunications at the same time. The core of the internet (called its backbone) is really heavy on traffic. If these faster protocols were not being used in the routers along the way, your service could be a lot worse than it is now. This is mostly the fault of the telecom companies that do not improve their infrastructure.

As for the packets, it is not about a packet arriving some days later. Optimizing the whole process of packing and delivering can save millions of dollars and a lot of time, for both the company and the customers. Also, there are a lot more areas where combinatorial optimization can be applied, including networks.

Boing3000 said:
There is no AI on the surface of the Earth (humans aside :wink:). Seriously though, I have no idea why AI would be put forward.

Uh, yes, there is AI. There is no strong AI, but there is so-called weak AI. Some examples are Google's AI that won several Go games against a top champion (link), IBM's famous Watson, which won at Jeopardy! (link), and the latest one, the poker winner Libratus (link), to name a few of the most famous. AI also runs on some robots, such as the famous Mars rover (link), which also uses image processing techniques.

Boing3000 said:
That may be true. Maybe you should define more specifically what CS means to you, if it is "not software technology". Maybe by CS you meant things like this? Another thing could be "information theory", but that is for serious people like mathematicians, not for people getting their hands dirty assembling things by hand (for whatever reason, we still use keyboards).

CS is at some point "software technology", but it is not all about software. Several areas of CS only make use of software as a means to an end, not the entire focus. Perhaps you are thinking of Software Engineering; it is a common misconception that CS == SE. By CS I mean something like this. Why do you say that "information theory" is for serious mathematicians? There are a lot of CS people working on that, especially on the theoretical side of CS, which is very close to math, but not math itself.

Boing3000 said:
Pointless? No, not at all. Don't get me wrong. I would like to see people doing research for the sake of it (that is: not trying to solve a problem).
What I am saying is that there is no actual money spent on computer science because it is all spent on computer engineering.

I would like a real science genius (in information theory) to explain how such a complex image can be perfectly "compressed" into 959 bytes (and really only 6 bits of those bytes are actually used...), or a psychologist to explain why we are so dumb as to use PNG, or worse JPEG, for that.

I disagree. CS is all about problem solving, just as math, physics and engineering are. The question at hand is whether the problem being solved is actually useful to everyday life, or is just highly theoretical stuff that perhaps will be used in a distant future (or not). And again, I don't think the money is being spent on CE rather than CS; both fields receive a certain amount of investment, perhaps one more than the other, but still. If there was no investment, how would there be so many professors and researchers doing research in the field? Whether the research is useful is the main question of the thread, though some of the answers and my readings showed it is.
 
  • #32
Sagant said:
Fair enough, but then it is not CS's fault; it is just the investment in research inside universities, though in companies it continues - my advisor worked at AT&T in their R&D group. And it is also not true worldwide; perhaps it is in the US (or whichever country you are referring to).
CS's fault? What fault? My point is that all this is normal. Nowadays no company in its right mind would ever invest in research (an oxymoron). They invest in development. True innovations, like all the key technologies of your iPhone, date back to the '70s and were invented by public-sector-sponsored researchers/scientists, and certainly with no plan to benefit from them in mind.

Sagant said:
I think you have some misconception about the usage of GPUs. First, the speedup is not only to reach higher FPS, but to allow more and better scenes to be rendered, when we are talking purely about animation and games.
Err, I am the one explaining to you that speed is never an issue. Quality is the issue, not quantity. The GPU was a game changer quality-wise, not only because it gave you access to a 3rd dimension, but because even in 2D it opened so many possibilities (even in 1D => DSP/sound).

Sagant said:
However, GPUs have long since expanded their main usage from computer games to scientific computing and research.
OK, stop. I think I understand you a little better now, because what you say is just soooooooooo wrong. Why do you think Nvidia is developing CUDA? To please scientists? What is the actual industry that dwarfs the entire video/movie sector by nearly an order of magnitude?
Nvidia does not do research; they do development, and rightly so. They want to make every "smart" phone able to decompress 4K videos of cats, and to be able to simulate the "physics" of boobs in real time. That is their main usage, and that tendency is always increasing.
This certainly requires a lot of incredible work by ingenious people doing development. That's not research; that is quite the opposite of it.

Sagant said:
So GPUs are not about "garage kids" and nerds playing games anymore; their market has grown a lot into other scientific fields, due to their natural power at handling calculations.
I am sorry, but you are wrong. As a garage no-longer-kid, I once suggested at work using an early version of CUDA to address a peculiar problem we had in a huge database. That is not research, that is stating the obvious, and most of my co-workers had similar ideas. Any problem expressible in terms of (parallelizable) matrix mathematics will lead you there.
So of course scientists will also climb on board. But I was explaining to you in my first post that supercomputer-type architectures already existed for those reasons. They existed well before the term GPU was even coined.
But what you must realize is that a trillion-dollar worldwide industry was born because of garage kids doing "true research" by chasing their dream with absolutely no other purpose in mind. They were laughed at for 15 years before someone decided to develop the ideas.

Sagant said:
That is not due to CS either; it is due to the number of people using telecommunications at the same time. The core of the internet (called its backbone) is really heavy on traffic. If these faster protocols were not being used in the routers along the way, your service could be a lot worse than it is now. This is mostly the fault of the telecom companies that do not improve their infrastructure.
Vaporware is certainly a sector that got a lot of true research done. It is always somebody else's fault (most probably the customer's).
Truth is, in the '80s your TV cold-booted in 1 second with 50+ channels at your disposal.
Now you wait 1 minute for your modem to initialize, 20 seconds for your PS4 (or TV box) to boot, and enjoy at best 2 HD channels, unless some Chinese hacker decides otherwise. I would hate to call that progress...

Sagant said:
As for the packets, it is not about a packet arriving some days later. Optimizing the whole process of packing and delivering can save millions of dollars and a lot of time, for both the company and the customers. Also, there are a lot more areas where combinatorial optimization can be applied, including networks.
My point was that it is not research, but development. There is no ethics here. Just metrics.

Sagant said:
Uh, yes, there is AI. There is no strong AI, but there is so-called weak AI. Some examples are Google's AI that won several Go games against a top champion (link), IBM's famous Watson, which won at Jeopardy! (link), and the latest one, the poker winner Libratus (link), to name a few of the most famous. AI also runs on some robots, such as the famous Mars rover (link), which also uses image processing techniques.
All these are probably extremely fun projects to work on. But as my C64 could already beat me hands down at chess (OK, I suck badly), I am not really impressed by all that. I am much more impressed by those humans who are so difficult to beat, even by whole teams of genius-grade people using so much computer power.
Now, calling "intelligence" a thing that could not even tie its shoe (or understand what a shoe is, or the purpose of it) is... bizarre.

Sagant said:
CS is, at some point, "software technology", but it is not all about software. Several areas of CS only use software as a means to an end, not as the entire focus. It is a common misconception to think CS == Software Engineering (SE). By CS I mean something like this.
OK, that was my understanding of CS. All these fields had been "completed" before the '80s (maybe not AI, but I don't understand what it is doing in that list).
But then maybe someone will invent something better than Hoare's 1959 quicksort? (A sketch of the classic algorithm is below, for reference.)
The only field that is "kind of" evolving (or chasing its tail) is languages/frameworks.
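For reference, here is a minimal sketch of the classic quicksort in Python. This is the simple out-of-place textbook form, chosen for clarity rather than the tuned in-place version found in real libraries.

Python:
def quicksort(items):
    """Classic quicksort: partition around a pivot, then recurse."""
    if len(items) <= 1:
        return items
    pivot = items[len(items) // 2]
    smaller = [x for x in items if x < pivot]
    equal   = [x for x in items if x == pivot]
    larger  = [x for x in items if x > pivot]
    return quicksort(smaller) + equal + quicksort(larger)

print(quicksort([5, 3, 8, 1, 9, 2, 7]))   # [1, 2, 3, 5, 7, 8, 9]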

Sagant said:
Why do you say that "information theory" is for serious mathematicians? There are a lot of CS people working on it, especially on the theoretical side of CS, which is very close to math but not math itself.
Because I mean it. I take logic seriously, which means I take math and science seriously, and "people in garages" suck at it; that is why they prefer the garage... to play. A small entropy calculation is sketched below.
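To keep the term concrete, here is a minimal sketch of the central quantity of Shannon's information theory, the entropy of a probability distribution, in Python. The example probabilities are made up; this is just the textbook formula, not anything taken from the posts above.

Python:
import math

def entropy(probabilities):
    """Shannon entropy H = -sum(p * log2(p)), in bits, skipping zero terms."""
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

# A fair coin carries exactly 1 bit of information per toss...
print(entropy([0.5, 0.5]))   # 1.0
# ...while a heavily biased coin carries much less.
print(entropy([0.9, 0.1]))   # about 0.47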

Sagant said:
I disagree. CS is all about problem solving, just like math, physics, and engineering.
That is a clear statement that I have to disagree with, for precise reasons:
- Computers are tools, like a hammer. You fix things with them, create things with them, or even break things, depending on the usage. They do solve problems. There may be a "science of it", and that science would be a subset of math. There may be sub-disciplines in it (like the link you provided), but calling it "science" is a clear abuse of language (see below**).
- Math does not solve problems; it actually creates "problems". "Problem" is the wrong word here: an equation is not a problem just because you seek a solution to it; it is not a problem in the sense of "needing fixing". Algorithmics would be the special branch of mathematics applying only to computers, although chaoticians and biologists may disagree.
- Physics/science seeks to DESCRIBE reality. There is no problem with reality, and I don't think there is a problem describing computers either (**).
- Engineering is much more like computing: it actually solves problems, but of a concrete nature, like "crossing a river".

OK, all those disciplines are somewhat siblings, sometimes with some overlap, but what irks me is associating "computer" with "science". It is no more a special science than meditation science or sports science. "Science" is a word very much abused nowadays to give credentials to disciplines that do not deserve it.

Sagant said:
And again, I don't think the money is being spent on CE rather than CS; both fields receive a certain amount of investment, perhaps one more than the other, but still.
You may be right. I would be interested to find hard numbers associated with this claim.
But having been in the field from its start, I have never ever heard of "computer researchers". The only remnant of that was the 20% hoax at Google, and only if it benefited the enterprise (which I suppose you could call "problem solving").

Sagant said:
If there was no investment, how would there be so many professors and researchers doing research in it? Whether the research is useful is the main question of the thread, though some of the answers and my readings showed it is useful.
I know where the money goes, and it is not into research.
I would be delighted to be pointed to a peer-reviewed paper from a "professor" paid to advance the science of computing. A recent one, not a 40-year-old one.
But what I cannot seem to make you understand is that I think we dearly need them, because they are useful. Things like that, for example.
 
  • #33
It seems this thread has run its course and it may be time to close it.

Have we covered everything the OP asked?
 
  • #34
@Boing3000 : Clearly, the problem in the above discussion is that you want to argue that Computer Science is not a science. However, that was not the main purpose of the thread; it was not to argue whether CS is a science or not (there is already too much of that useless discussion online).

My question was only about the advances in CS that have helped other sciences and the world, and this goes way beyond arguing over whether something is research or development. I still don't agree with most of the things you said, because everything I've studied says the opposite, but I won't give it any more time.

@jedishrfu : Well, not everything I asked was answered, but I think it would be impossible to do that anyway. I'd say most of it was answered, and it was a helpful thread to me, and hopefully to others as well. For more specific questions I'll open other threads to keep the discussions "clean".

The thread can be closed.

Thank you all :)
 
  • #35
This thread is now closed.

A big thank you to all who contributed to it.
 

1. What are some recent advancements in Artificial Intelligence (AI)?

Some of the most significant advancements in AI include deep learning, natural language processing, and computer vision. These technologies have allowed computers to perform tasks that were previously thought to require human intelligence, such as image and speech recognition, decision-making, and problem-solving.
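The answer above is survey-level; as a toy illustration of the basic building block behind deep learning (a single artificial neuron trained by gradient descent), here is a short Python sketch. The task (learning logical OR), learning rate, and iteration count are arbitrary, and nothing here is taken from the systems the answer alludes to.

Python:
import math, random

data = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 1)]   # logical OR

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

random.seed(0)
w1, w2, b = random.uniform(-1, 1), random.uniform(-1, 1), 0.0
lr = 1.0

for _ in range(2000):
    for (x1, x2), target in data:
        y = sigmoid(w1 * x1 + w2 * x2 + b)
        err = y - target      # gradient of cross-entropy loss w.r.t. the pre-activation
        w1 -= lr * err * x1
        w2 -= lr * err * x2
        b  -= lr * err

for (x1, x2), target in data:
    print((x1, x2), round(sigmoid(w1 * x1 + w2 * x2 + b)), target)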

2. How has blockchain technology advanced in recent years?

Blockchain technology has advanced significantly in recent years, particularly in the financial sector. It has been used for secure and transparent transactions, smart contracts, and decentralized applications. Additionally, there have been developments in blockchain scalability and interoperability, making it more practical for widespread use.
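As a minimal illustration of the core idea only (each block commits to the hash of the previous one), here is a Python sketch using the standard library. It is a toy hash chain, not a real blockchain: there is no network, no consensus, and no proof of work, and the transaction fields are invented.

Python:
import hashlib, json, time

def make_block(data, previous_hash):
    """Create a block whose hash covers its data and the previous block's hash."""
    block = {"timestamp": time.time(), "data": data, "previous_hash": previous_hash}
    payload = json.dumps(block, sort_keys=True).encode()
    block["hash"] = hashlib.sha256(payload).hexdigest()
    return block

# Tampering with any block changes its hash and breaks every later link,
# which is what makes the ledger tamper-evident.
genesis = make_block("genesis", previous_hash="0" * 64)
second = make_block({"from": "alice", "to": "bob", "amount": 5}, genesis["hash"])
print(second["previous_hash"] == genesis["hash"])   # True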

3. What are some breakthroughs in quantum computing?

Quantum computing has made significant strides in recent years, with companies such as Google, IBM, and Microsoft reporting major breakthroughs. These include claims of quantum supremacy, improvements in qubit stability and coherence, and new algorithms and applications for quantum computers.
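None of that hardware can be reproduced here, but as a minimal sketch of what simulating a qubit means mathematically, the following Python/NumPy snippet applies a Hadamard gate to a single-qubit state vector. This is a classical simulation of the linear algebra, not code for any real quantum device.

Python:
import numpy as np

ket0 = np.array([1.0, 0.0], dtype=complex)             # the |0> state
hadamard = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)

state = hadamard @ ket0                                 # (|0> + |1>) / sqrt(2)
probabilities = np.abs(state) ** 2                      # measurement probabilities
print(probabilities)                                    # [0.5 0.5]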

4. How have advancements in cybersecurity impacted the field of computer science?

With the increase in cyber threats, advancements in cybersecurity have become crucial for protecting sensitive data and systems. Some of the most significant advancements in this field include the use of artificial intelligence for threat detection and prevention, improved encryption techniques, and the development of secure coding practices.
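As one small, concrete example of a secure coding practice (chosen here for illustration, not taken from the answer above), the Python standard library can derive a slow, salted password hash with PBKDF2 and compare it in constant time. The iteration count and passphrases are placeholders; real deployments should follow current parameter guidance.

Python:
import hashlib, hmac, os

def hash_password(password, salt=None):
    """Derive a slow, salted hash of a password (PBKDF2-HMAC-SHA256)."""
    salt = salt or os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 200_000)
    return salt, digest

def verify_password(password, salt, expected):
    _, digest = hash_password(password, salt)
    return hmac.compare_digest(digest, expected)        # constant-time comparison

salt, stored = hash_password("correct horse battery staple")
print(verify_password("correct horse battery staple", salt, stored))   # True
print(verify_password("password123", salt, stored))                    # False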

5. What are some recent developments in the Internet of Things (IoT)?

The Internet of Things has seen significant growth in recent years, with some forecasts projecting around 75 billion connected devices by 2025. Recent developments in this field include the use of edge computing for faster, more local data processing, the integration of AI and machine learning for predictive maintenance, and the development of smart cities and homes.
