Real advances in Computer Science?

SUMMARY

The forum discussion centers on the significant contributions of Computer Science (CS) to various fields, emphasizing key breakthroughs such as Shannon's Information Theory, Turing's Computation Theory, Chomsky's Context-Free Grammars, and Computational Complexity Theory. Participants highlight the impact of these theories on programming languages and computational understanding, while also debating the perceived lack of recent advancements in CS compared to other sciences. The conversation underscores the essential role of CS in automation and its foundational influence on technology, despite its contributions often being overshadowed by more visible scientific fields.

PREREQUISITES
  • Understanding of Shannon's Information Theory
  • Familiarity with Turing's Computation Theory
  • Knowledge of Chomsky's Context-Free Grammars
  • Awareness of Computational Complexity Theory
NEXT STEPS
  • Research the implications of Shannon's Information Theory on modern communication systems
  • Explore Turing's Computation Theory and its applications in algorithm design
  • Investigate the development of programming languages influenced by Chomsky's theories
  • Examine the role of Computational Complexity Theory in understanding algorithm efficiency
USEFUL FOR

Computer Science students, educators, researchers, and professionals interested in understanding the foundational theories of CS and their impact on technology and other scientific fields.

Sagant
Hi,
I wanted to start this topic to discuss what are real advances in Computer Science (CS) that actually have contributed to the world, both to sciences and academia, as well as to the industries and common people.

Usually, discoveries in Physics, Engineering, Chemistry and Biology are more easily reported, while math and CS stay a bit in the shadow (the exception perhaps is AI).

TL;DR: What do you think are the major contributions the field of CS has brought up? And how much importance is given to those developments?

P.S.: This post is meant to be informative only; I don't mean to spark any heated discussion or flame war.
 
There's so much overlap in fields. We have computers because of computer scientists and mathematicians, we have fast ones that fit in your pocket because of material scientists and physicists. So I assume you're asking about the purely CS stuff, not things like the work with silicon/transistors behind the computer age. My list of favorite breakthroughs would be:

1) Shannon's information theory - led to a formal notion of information which can be processed and sent over networks (a toy sketch below).
2) Turing's computation theory - led to a model of all computation
3) Chomsky's theory of context-free grammars - led to programming languages in which software could be written
4) Computational complexity theory - led to a deep understanding of the limits of computation
5) Artificial intelligence theories - many of them; they have been behind most breakthroughs in computer science for decades and will probably shape the world most in the decades to come.

I'm sure there's stuff that should be there that isn't (like something having to do with the Internet), but these are some of the big ones.
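
To give a rough flavor of item 1: Shannon's entropy H = sum of p*log2(1/p) measures the average number of bits per symbol a message really needs. A toy sketch (the example strings are arbitrary, not from any real protocol):

Python:
from collections import Counter
from math import log2

def shannon_entropy(message: str) -> float:
    # H = sum over symbols of p * log2(1/p): average bits of information per symbol.
    counts = Counter(message)
    n = len(message)
    return sum((c / n) * log2(n / c) for c in counts.values())

print(shannon_entropy("aaaaaaaa"))   # 0.0 -- one repeated symbol carries no information
print(shannon_entropy("abcdefgh"))   # 3.0 -- 8 equally likely symbols need 3 bits each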
 
The automation of mundane work previously done by human "computers" in offices is probably the biggest, followed by the damage even a small programming error can wreak on an unsuspecting community.
 
Fooality said:
So I assume you're asking about the purely CS stuff, not things like the work with silicon/transistors behind the computer age.
Oh yes. I meant to ask about pure CS stuff. As in, what advancements did CS alone bring to our world?
To be clear, I'm a CS major trying to understand what my field has brought about. I'm pretty unsure about what we have in fact contributed, and about how important that contribution is.

I think that what you said is pretty important. However, let's take Turing's and Chomsky's theories for example. They are useful inside of CS, no doubt, but were they really necessary to evolve the field? Couldn't we still be creating algorithms and languages without this knowledge?
For example, it would be hard to build a computer without the knowledge about electromagnetism. But I don't see the same happening here. I don't think the theoretical knowledge helped that much to evolve things outside CS. I may be wrong though.

jedishrfu said:
The automation of mundane work previously done by office human computers is probably the biggest followed by the damage even a small programming error wreak on an unsuspecting community.

If I remember correctly, it was a programming error that led one of NASA's rockets to explode (it was due to a precision error, or something like that).

Well, yes, automation is very important. But it wasn't exactly CS that created it. The computers eventually made it happen, but it could have happened even if all programs had to be written by engineers using FORTRAN 77, although it would have been much harder.

Thanks for the answers by the way :)
 
Sagant said:
Oh yes. I meant to ask about pure CS stuff. As in, what advancements did CS alone bring to our world?
To be clear, I'm a CS major trying to understand what my field has brought about. I'm pretty unsure about what we have in fact contributed, and about how important that contribution is.

I think that what you said is pretty important. However, let's take Turing's and Chomsky's theories for example. They are useful inside of CS, no doubt, but were they really necessary to evolve the field? Couldn't we still be creating algorithms and languages without this knowledge?
Yes, I think so. The contribution described by @jedishrfu is much more significant, IMO.

Another advance that was very significant was the realization by John Von Neumann that computer memory could be used to hold both program code and the data it worked on.
Sagant said:
For example, it would be hard to build a computer without the knowledge about electromagnetism. But I don't see the same happening here. I don't think the theoretical knowledge helped that much to evolve things outside CS. I may be wrong though.
If I remember correctly, it was a programming error that led one of NASA's rockets to explode (it was due to a precision error, or something like that).
What you're remembering wasn't a NASA rocket -- it was an Ariane 5 rocket (https://en.wikipedia.org/wiki/Ariane_5). There was a space shuttle (Challenger) that exploded on launch in 1986, killing all of the astronauts inside it, but that was due to an o-ring, not a programming error.
Sagant said:
Well, yes, automation is very important. But it wasn't exactly CS that created it. The computers eventually made it happen, but it could have happened even if all programs had to be written by engineers using FORTRAN 77, although it would have been much harder.

Thanks for the answers by the way :)
 
  • Like
Likes   Reactions: QuantumQuest
Yes, the Ariane 5 rocket. Although (risking going off-topic) I've read that it was due to the number of bits used to represent numbers and the conversion from one format to another; a failure that could have been fixed in software.
(A link, but I don't know how trustworthy it is: https://www.google.com.br/url?sa=t&...HRTt2I78Uy0Nowwrw&sig2=jeLlxDEay9dQTpRk7cfbyw).
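
As a rough illustration of that kind of failure (the number here is made up, and the actual flight software was written in Ada, not Python): a value that exceeds the range of a 16-bit signed integer simply cannot be converted into it.

Python:
import struct

velocity = 50000.0   # hypothetical value, larger than a 16-bit signed integer can hold
try:
    struct.pack(">h", int(velocity))   # ">h" = signed 16-bit integer
except struct.error as err:
    print("conversion failed:", err)   # short format requires -32768 <= number <= 32767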

Back to the topic:
Okay, but as far as I can see, research in computer science has very little impact on other areas, and most of the important discoveries are from its beginnings rather than recent.

This makes me wonder: Is CS a topic being studied only for the sake of CS? To keep the professors doing something?
 
Not so! Much of CS relates purely to CS, partly because it can be used on many problems and partly because it's built on other CS ideas.
 
CS undoubtedly has great impact on our everyday life through its advancements. The thing is that its advancements have been, and are, used widely by other sciences, and in many cases there would be no advancements in those sciences at all without them, but this often goes unnoticed by most people. I'm not talking about technology and implementations in general. These are more or less the extensions (with all the necessary modifications and adaptations that an implementation demands) and, also undoubtedly in my opinion, they have their own great impact.

I'll second Mark about the von Neumann model, as it had a very deep impact on every computing machine out there. Also, the list that Fooality gave contains things with really great impact.

Sagant said:
However, let's take Turing's and Chomsky's theories for example. They are useful inside of CS, no doubt, but were they really necessary to evolve the field? Couldn't we still be creating algorithms and languages without this knowledge?

You can't tell for sure what we could have created without these, because that just didn't happen. As we say in the theory of algorithms, the eternal question is "Can we do better?", but unless we actually find that better way we can't really tell.

Also, the advancements in CS are subject to advancements in other sciences (i.e. the other way around), with math being the most profound in my opinion. Great algorithms and algorithmic techniques go hand in hand and alternate between CS and math. Because CS is really all about algorithms, the impact lies at a low level, but it is very fundamental and essential to many aspects of everyday life.

As for now, new advancements are achieved frequently, but they go unnoticed by most people until some new product hits the market. Then technology prevails, but there would be no technology without CS. To be fair, technology also plays a major role, as it is the vehicle (for experiments and products) that helps CS advance.
 
Another major achievement was the binary floating-point format; without it we'd still be doing either integer or fixed-decimal arithmetic.

Konrad Zuse was the first to implement it in his mechanical digital computers.

https://en.m.wikipedia.org/wiki/Floating_point
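
For anyone curious what the format actually looks like, here is a small sketch (the example value is arbitrary) that pulls apart the IEEE 754 double-precision layout: 1 sign bit, 11 exponent bits, 52 fraction bits.

Python:
import struct

def fp64_fields(x: float):
    # Reinterpret the 64 bits of a double as its sign / exponent / fraction fields.
    bits = int.from_bytes(struct.pack(">d", x), "big")
    sign     = bits >> 63
    exponent = (bits >> 52) & 0x7FF
    fraction = bits & ((1 << 52) - 1)
    return sign, exponent, fraction

# -6.25 = -1.5625 * 2**2, so the stored exponent is 2 + 1023 (the bias) = 1025.
print(fp64_fields(-6.25))   # (1, 1025, ...)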
 
  • #10
I am not a CS guy, but this looks interesting.

Sagant said:
research in computer science has very little impact on other areas, and most of the important discoveries are from its beginnings rather than recent.
If you are interested in things in non-CS fields affected by CS advances I would point to the use of CS (however maybe these would not be strictly CS in the way you intend) in the analysis of many things in modern biology: imaging, genome analysis, and predicting protein structure. These are all very data intensive and would not be possible otherwise.

Fooality said:
3) Chomsky's theory of context-free grammars - led to programming languages in which software could be written
Chomsky from linguistics? I was unaware of his impact on CS.
 
  • #11
jedishrfu said:
Not so! Much of CS relates purely to CS, partly because it can be used on many problems and partly because it's built on other CS ideas.

QuantumQuest said:
CS undoubtedly has great impact on our everyday life through its advancements. The thing is that its advancements have been, and are, used widely by other sciences, and in many cases there would be no advancements in those sciences at all without them, but this often goes unnoticed by most people. I'm not talking about technology and implementations in general. These are more or less the extensions (with all the necessary modifications and adaptations that an implementation demands) and, also undoubtedly in my opinion, they have their own great impact.

I see. So basically the CS ideas and developments stay behind the scenes, and the developments of those who use these ideas are the ones that actually reach the general public.

But then, not that I want to be too specific, but what advances in CS do you think have helped in, say, Physics or Engineering?
QuantumQuest said:
I'll second Mark about the von Neumann model, as it had a very deep impact on every computing machine out there. Also, the list that Fooality gave contains things with really great impact.

You can't tell for sure what we could have created without these, because that just didn't happen. As we say in the theory of algorithms, the eternal question is "Can we do better?", but unless we actually find that better way we can't really tell.

Also, the advancements in CS are subject to advancements in other sciences (i.e. the other way around), with math being the most profound in my opinion. Great algorithms and algorithmic techniques go hand in hand and alternate between CS and math. Because CS is really all about algorithms, the impact lies at a low level, but it is very fundamental and essential to many aspects of everyday life.

As for now, new advancements are achieved frequently, but they go unnoticed by most people until some new product hits the market. Then technology prevails, but there would be no technology without CS. To be fair, technology also plays a major role, as it is the vehicle (for experiments and products) that helps CS advance.

Oh, but math also stays a bit in the shadow when it comes to public news of science and development. In Physics or Biology people still have a clue of what's going on, but math and CS are left out. I think they don't have the appeal the others have. Even with new products, most people (at least those outside STEM) are unsure who brought the advancements; sometimes the credit even goes to the company.

Speaking of algorithms and related areas: do you think the coming of quantum computers can bring a revolution to CS? I mean, most knowledge we have so far would have to be redesigned to fit quantum ideas. Some areas could even disappear (those studying heuristics to solve NP-hard problems). In this way, could we say that CS is very much bound to the current technology, and that its knowledge does not necessarily survive over time?
BillTre said:
In am not a CS guy, but this looks interesting. If you are interested in things in non-CS fields affected by CS advances I would point to the use of CS (however maybe these would not be strictly CS in the way you intend) in the analysis of many things in modern biology: imaging, genome analysis, and predicting protein structure. These are all very data intensive and would not be possible otherwise. Chomsky from linguistics? I was unaware of his impact on CS.

I've heard of it being used in Biology, and some ideas from optimization too, especially in bioinformatics. I'll take a look.

Some big-data and data-intensive problems could be tackled by statistics or math, I guess.

Yes, Chomsky from linguistics. He did bring some knowledge about grammars to CS, allowing the development of things like compilers.

And here comes another point. Most big discoveries are from people originally outside of CS. Chomsky is an example, Turing another. Even Dijkstra was from the math field, as were von Neumann, Knuth and others.
This usually gets me wondering whether CS majors have the skill to develop big things in the field, or whether people from other areas are better at it.
 
  • #12
Sagant said:
This usually gets me wondering whether CS majors have the skill to develop big things in the field, or whether people from other areas are better at it.
Back in the days when Dijkstra, von Neumann, and Knuth were doing their work, computer science wasn't a field of study. Even as late as the 1970s, I don't believe that many universities had departments that specialized in so-called computer science. Several of the CS-related classes I took back then were taught by Math professors. So it's not that people from other areas are better; there simply weren't a lot of people back then whose specialty was computer science.
 
  • #13
EE professors handled the computer engineering part and often the Boolean logic topics.

Are you writing a paper on this topic? Or is this just an interest?
 
  • #14
Sagant said:
I think that what you said is pretty important. However, let's take Turing's and Chomsky's theories for example. They are useful inside of CS, no doubt, but were they really necessary to evolve the field? Couldn't we still be creating algorithms and languages without this knowledge?
For example, it would be hard to build a computer without the knowledge about electromagnetism. But I don't see the same happening here. I don't think the theoretical knowledge helped that much to evolve things outside CS. I may be wrong though.

I mean, it comes down to questions about whether mathematical truth is discovered or created. I think it's discovered, so if you didn't have Turing, whose model describes all computation, you'd basically have someone else discover the same model, and the same mathematical truths like the halting problem, etc. I think they are universal; without Turing they would have been discovered under some other name. With Chomsky's grammars it's hard to say that with as much certainty... Is there another mathematical formulation of languages which a computer can quickly parse? Possibly. But Chomsky's theory is what they had, so languages are specified as CFGs, all programming languages rely on this idea of Chomsky's, and pretty much all programs that run on a computer are based on them.

The main thing is there are underlying mathematical truths about computing you can't get around. If a function is uncomputable by a universal Turing machine, it's not computable by any other computer you can build. Eventually, whatever approach was taken, you'd run into these laws of computer science, which is what the field is all about.
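
To make the grammar point concrete, here is a toy sketch (not any real compiler's code) of a tiny context-free grammar for arithmetic and the recursive-descent evaluator that falls straight out of it; the same basic idea every language front end is built on.

Python:
import re

# Grammar:  expr   -> term (('+'|'-') term)*
#           term   -> factor (('*'|'/') factor)*
#           factor -> NUMBER | '(' expr ')'

def tokenize(src):
    return re.findall(r"\d+|[()+\-*/]", src)

def evaluate(tokens):
    pos = 0

    def peek():
        return tokens[pos] if pos < len(tokens) else None

    def eat():
        nonlocal pos
        tok = tokens[pos]
        pos += 1
        return tok

    def factor():
        if peek() == "(":
            eat()            # consume '('
            value = expr()
            eat()            # consume ')'
            return value
        return int(eat())    # NUMBER

    def term():
        value = factor()
        while peek() in ("*", "/"):
            value = value * factor() if eat() == "*" else value / factor()
        return value

    def expr():
        value = term()
        while peek() in ("+", "-"):
            value = value + term() if eat() == "+" else value - term()
        return value

    return expr()

print(evaluate(tokenize("2*(3+4)-5")))   # 9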
 
  • #15
The cost-effective speed and memory increases are huge. I can now run codes on my notebook that used to require a supercomputer.

Look at the link in my sig. There is no real need for FFTs any more in many data analysis applications. The "slow" Fourier transform is now fast enough, and it can be more accurate than a "fast" Fourier transform for some applications in data analysis.
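
To illustrate with made-up data: the "slow" O(N^2) DFT, written straight from the definition, agrees with numpy's FFT and runs in a blink at sizes that once needed serious hardware.

Python:
import numpy as np

def slow_dft(x):
    # The DFT straight from its definition: an N x N matrix multiply, O(N^2).
    N = len(x)
    n = np.arange(N)
    k = n.reshape((N, 1))
    return np.exp(-2j * np.pi * k * n / N) @ x

x = np.random.rand(1024)
print(np.allclose(slow_dft(x), np.fft.fft(x)))   # True, and fast enough at this size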

Not to mention that element-based modeling can handle much finer spatial and time grids than in the past.
 
  • #16
Okay, it makes sense that those important figures were from outside CS because of the time. However, today I don't hear a lot about new breakthrough findings in CS, or about great names in the field.

I'm a CS major. I'm about to start my MSc degree, also in CS, but doubt has struck me on the matter: is there hope of doing something meaningful in CS? And then I went even further, to: what are the real advances in CS that have impacted the world?
I have always liked science and everything about it. I would really love to study physics, but when choosing a degree I ended up with CS. Now I wonder what I could achieve with that. I started this thread to help not only me, but all those who perhaps are in doubt, or maybe are curious about the field.

I too think that math is discovered rather than created. It's science to me as well, although not all areas of CS are "mathy".

I can see that perhaps the impact is greater than I was thinking. And you did help me realize that.

But, returning to a previous question of mine: what will happen when quantum computing comes? I think it will drastically change everything in CS, and most knowledge built so far could then be considered useless. I see hard times ahead for CS, because quantum computing seems much more of a physics field, not to mention a superior type of computing that perhaps won't need any speedups or fancy algorithms to solve problems quickly. We don't even know whether Turing's theory will hold completely. What do you think of this advancement?
 
  • #17
Quantum computing will initially be tacked on to our existing computer infrastructure, much like how floating-point coprocessors were added to microprocessors years ago to augment them with more powerful computing capability and to eliminate doing floating-point math in software.

In other words, we would interact with traditional machines that utilize quantum computing components and report back the results.

Here's one such example:

http://www.research.ibm.com/quantum/

and here's another one:

http://www.quantumplayground.net/#/home

Lastly, here's a list of others to check out:

https://www.quantiki.org/wiki/list-qc-simulators

So to answer your question, we will still need CS people but they will now be superpositions of real programmers entangled with other programmers much like programming teams today.
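
For a feel of what those simulators do underneath, here is a toy one-qubit sketch in plain numpy (nothing to do with IBM's actual interface): classical code tracks the state vector, applies a gate as a matrix, and samples a measurement result to report back.

Python:
import numpy as np

H = np.array([[1,  1],
              [1, -1]]) / np.sqrt(2)        # Hadamard gate

state = np.array([1.0, 0.0])                # qubit starts in |0>
state = H @ state                           # now an equal superposition of |0> and |1>

probs = np.abs(state) ** 2
print(probs)                                # [0.5 0.5]
print(np.random.choice([0, 1], p=probs))    # "measurement": 0 or 1, each half the time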
 
  • #18
Sagant said:
I'm a CS major. I'm about to start my MSc degree, also in CS, but doubt has struck me on the matter: is there hope of doing something meaningful in CS?
...
But, returning to a previous question of mine: what will happen when quantum computing comes? I think it will drastically change everything in CS, and most knowledge built so far could then be considered useless. I see hard times ahead for CS, because quantum computing seems much more of a physics field, not to mention a superior type of computing that perhaps won't need any speedups or fancy algorithms to solve problems quickly. We don't even know whether Turing's theory will hold completely. What do you think of this advancement?

Hey Sagant, I think you're asking the right questions. My major was in CS too, and as a guy who loves to ask the deep questions, I've found myself more and more of an armchair physics fan, though I didn't study it, especially trying to grasp quantum computers. If you ask me what sort of ideas will really define the future of computer science, my guess would be something like Constructor theory, proposed by David Deutsch and Chiara Marletto at Oxford.
http://constructortheory.org/
My understanding is that this theory allows a unified narrative through which physical processes can be viewed as computations/information processes, and vice versa, in a way which includes quantum computation (in which Deutsch is a pioneer), and apparently addresses some other issues in physics. As computing becomes more physical, with robotics and questions of what architectures best serve AI algorithms, it becomes more and more about physical processes, about computers integrating seamlessly with the physical world, and about theories in which unfolding physical processes and computations can be seen through the same lens. We don't fully have such a theory now, which is why computer science and physics are different fields. But if you're feeling drawn to mixing some other science studies like physics into your CS degree, my advice is DO IT! The CS knowledge is good; for instance, classical computation is a subset of what quantum computers can do, so it will still be valid in the future, but I honestly think you'll be better prepared for what's to come with a good dose of physics mixed in.
 
  • #19
From what I've read, Traveling Salesman Problems (especially those that don't obey the triangle inequality) are expected to be intractable whether done on a classical von Neumann architecture or a quantum computer.

Despite the lack of a proof that ##P \neq NP##, people are virtually certain this is the case on a classical computer. People don't have quite the same degree of confidence for quantum computers, but again, from what I've read, TSP is still expected to be intractable.

The immediate uses for quantum computers seem to be quantum physics and integer factorization. Though, much as money flowed into graphics cards over the past couple of decades for video games (and now GPUs are shockingly useful for inference problems, especially deep learning), I suspect new and somewhat surprisingly potent uses will come up over time.
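
As a toy illustration of the scaling problem (made-up coordinates, brute force only): an exact search has to examine (n-1)! tours, which blows up long before any plausible hardware speedup can help.

Python:
from itertools import permutations
from math import dist, factorial

cities = [(0, 0), (1, 5), (4, 1), (6, 4), (3, 3)]   # made-up points

def tour_length(order):
    return sum(dist(cities[order[i]], cities[order[(i + 1) % len(order)]])
               for i in range(len(order)))

# Fix city 0 as the start and try every ordering of the rest.
best = min(permutations(range(1, len(cities))),
           key=lambda rest: tour_length((0,) + rest))
print("best tour:", (0,) + best, "length:", round(tour_length((0,) + best), 2))

print("tours examined for 5 cities:", factorial(4))    # 24
print("tours needed for 20 cities:", factorial(19))    # ~1.2e17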
 
  • #20
jedishrfu said:
So to answer your question, we will still need CS people but they will now be superpositions of real programmers entangled with other programmers much like programming teams today.

First of all, thanks for pointing out the links; they make very good material on the subject :)
As for the quoted text: yes, okay, I don't expect the "CS guy programming a bunch of code" to go away so soon, with or without QC. However, to be honest, I didn't major in CS to become a coder, as they say. What I am afraid of (in fact I didn't make it clear) is what the future of CS research would be. For example, I work (today) in theoretical computer science, tackling hard optimization problems and data analysis, which all need efficient algorithms, usually heuristics, to solve in reasonable time.

Some people say QC could solve these problems quickly. Some say only a part of them would be solvable by QC. And a smaller group even says that it won't make much difference to those problems, only to the classic ones (factorization, Grover's search and quantum simulation).

I'd like to share an article I saw some time ago. It is a bit outdated (2008), but if what it says is true, then my concern would be completely unfounded, and in fact QC would not be able to solve all those problems efficiently, only part of them. Link: http://www.cs.virginia.edu/~robins/The_Limits_of_Quantum_Computers.pdf

Fooality said:
Hey Sagant, I think you're asking the right questions. My major was in CS too, and as a guy who loves to ask the deep questions, I've found myself more and more of an armchair physics fan, though I didn't study it, especially trying to grasp quantum computers.
...
But if you're feeling drawn to mixing some other science studies like physics into your CS degree, my advice is DO IT! The CS knowledge is good; for instance, classical computation is a subset of what quantum computers can do, so it will still be valid in the future, but I honestly think you'll be better prepared for what's to come with a good dose of physics mixed in.

Nice to see some fellows from the CS field :)
I hadn't heard about constructor theory; thanks for pointing it out, I'll read up on it.
Yeah, I would love to add some Physics (and more math too!) to my degree and knowledge. However, I would have to learn it by myself (which is a bit hard and prone to errors), or start over with a whole new degree (yup, 4 years), because here in my country we don't have things like major/minor; it is a "fixed" degree program, with not much freedom to take classes outside yours (all I could get was Physics I - Mechanics, though it was pretty cool). A horrible way to do it, I know :(
StoneTemplePython said:
From what I've read, Traveling Salesman Problems (especially those that don't obey the triangle inequality) are expected to be intractable whether done on a classical von Neumann architecture or a quantum computer.

Despite the lack of a proof that ##P \neq NP##, people are virtually certain this is the case on a classical computer. People don't have quite the same degree of confidence for quantum computers, but again, from what I've read, TSP is still expected to be intractable.

The immediate uses for quantum computers seem to be quantum physics and integer factorization. Though, much as money flowed into graphics cards over the past couple of decades for video games (and now GPUs are shockingly useful for inference problems, especially deep learning), I suspect new and somewhat surprisingly potent uses will come up over time.

Yes, I've learned it this way too. However, as some people point out, this is not exactly proven; it is believed to be so, but it could also be that QC would solve all NP-complete / NP-hard problems and make the world an easier one. On the other hand, even though it is theoretically intractable, as in there would be no polynomial-time algorithm to solve it, perhaps a QC would be so insanely fast that even large instances of the TSP (say, 10^9 nodes) could be solved quickly with the simpler algorithms; again, this is not known, I'm just making an assumption.

I think one of my doubts is about the ability of CS to survive major technological advances like this. It is such a new field (as you guys said) that sometimes I'm not certain how well CS can stand on its own when it comes to this. It is not like math or physics, which have been around for ages and ages, where every new discovery opens more questions than answers. I'm not sure it works this way in CS, though I'm only starting my academic career, and I may be saying some very wrong stuff here.

(Have we gone off-topic?)
 
  • #21
Do we count the development of various programming languages as part of Computer Science? There can be controversies over the pros and cons of the various programming languages, but generally speaking there has been progress in developing new languages tailored to specific technologies - e.g. database query languages, languages for creating web pages, etc. We could also count the development of encryption schemes as part of Computer Science.
 
  • #22
I would say programming languages illustrate CS concepts of the day.

As an example, OO came into being with Smalltalk. Folks liked the concept of message passing, and so C programmers started using message structs to carry state, which led to Objective-C and then to C++.

The usual language development path is the result of someone supremely unhappy with the present languages who went on to design a better alternative: Awk to Perl to Python to Ruby, or Java to Groovy to Scala, or JavaScript to CoffeeScript to Elm, for example.
 
  • #23
I think programming languages could be said to be developments of CS. In fact, there is a huge number of languages nowadays, for all sorts of tastes. From the "applied" point of view this is good, because it allows people to develop in different environments, for the different purposes a certain language allows. From the "theoretical" side it doesn't really make much difference, since the most-used languages are all Turing complete, hence equivalent.

I wanted to add something here. Today I watched a video about AI and how it can impact future society and jobs. The name is "Humans Need Not Apply", available on YouTube here

Do you think that this CS advancement in AI (though it will take some time yet) can actually happen? I mean, even if the impact were not as huge as the video predicts, it could still be enormous enough to kick a lot of people out of their jobs. This is especially bad when considering developing countries (I'm from one), where a great majority of people have very mechanical and easily computerized jobs.
 
  • #24
That's an awesome YouTube video. Thanks for sharing it here!
 
  • #25
Stephen Tashi said:
Do we count the development of various programming languages as part of Computer Science? There can be controversies over the pros and cons of the various programming languages, but generally speaking there has been progress in developing new languages tailored to specific technologies - e.g. database query languages, languages for creating web pages, etc.

Sagant said:
I think programming languages could be said to be developments of CS. In fact, there is a huge number of languages nowadays, for all sorts of tastes. From the "applied" point of view this is good, because it allows people to develop in different environments, for the different purposes a certain language allows. From the "theoretical" side it doesn't really make much difference, since the most-used languages are all Turing complete, hence equivalent.

In my opinion, only the design and the concepts are part of CS. Now, this may seem obvious at first glance, but what I'm saying is that fierce market competition is mostly what drives the implementations of programming languages, especially nowadays, so big software companies are the driving force from some point on. The concepts regarding programming paradigms are more or less already established, as are the building blocks of each programming language (syntax, grammar, etc.). What has created the explosive rate of new languages hitting the market is that they are tied to specific platforms (regarding software and hardware), and this is no longer a CS thing. Of course, the positive side of this is the feedback given back to CS.

Sagant said:
I wanted to add something here. Today I watched a video about AI and how it can impact future society and jobs. The name is "Humans Need Not Apply", available on YouTube here

Do you think that this CS advancement in AI (though it will take some time yet) can actually happen? I mean, even if the impact were not as huge as the video predicts, it could still be enormous enough to kick a lot of people out of their jobs. This is especially bad when considering developing countries (I'm from one), where a great majority of people have very mechanical and easily computerized jobs.

The video is about what is happening now in terms of technological advancements. While that is true regarding the technology itself, it doesn't say whether all that's implied could actually happen. Besides technology (I'll use the term "technology" throughout for the various implementations of designs/advancements in AI, and in any science for that matter, that gradually replace us humans), there is the concept of the economy, as well as the sum of individual economies, i.e. the global economy.

Supposing things go as described, there would be tens (eventually hundreds or more) of millions of people becoming unemployed. The first question that comes to mind is: where will the companies producing goods or services, or both, sell them? It may be that everything will be automated, but we can't be exiled from the planet. So, with most people being poor, who's going to buy all of this? I cannot really imagine a big company buying its own goods, or one company selling to another. All technological advancements and their products include the human factor as a consumer. Now, does this make any kind of sense? I'm afraid not. Where all this could lead, in my opinion, is for most people to get back to traditional jobs, and so it is rather technology that would gradually vanish, if we follow this scenario.

A second question is how individual countries would develop under the conditions described in the video. A very important part is science and technology, but the human factor is intimately related as well. And the goal of science and technology is to grow the economy and develop a country. So, at least as I see it, this is also a contradiction.

Finally, I can't see how most people would stand living with only the absolute basics, if that, in a world governed by automated machines.

On the other extreme, the vanishing of technology is not something that can really happen. All the great things that science has given us through technology are things that no one, in my opinion, wants to abandon. So there will be some "middle point", broadly speaking, at which we can live using technology.

So my point is that there are things beyond technology that constrain unlimited development and spread.
 
  • #26
Sagant said:
However, today I don't hear a lot about new breakthrough findings in CS, or about great names in the field.
I'll make a blunt statement: there are none. All the breakthroughs were made before the '80s, and I include "computer languages" and of course the WWW.

The problem is to apply some metric to evaluate the landscape you want to describe. The only truly "new" thing that appeared in CS after 1980 was a direct spawn of gaming. That special hardware required a new type of thinking (the GPU), and when that next revolution came to an end, you got back to square one and enjoyed it over a "hyper" text protocol. Like this, or that. If games and entertainment hadn't done it, one may argue that science would have created it on its own (simulation, simulation, simulation), or maybe scientists would have rolled up their sleeves to create new bits of math, instead of relying on those non-evolving four CPU operations (load-store-add-jmp) to brute-force their way into solutions.

Let's get real: there is no such thing as CS. There are a billion geeks who enjoy the show, and a million nerds who all reinvent the wheel while pretending not to (but with romance, alone in a garage against the rest of the world).

Religious (err, sorry, language) wars rage on, from interpreted to compiled, from imperative to functional, more or fewer brackets, more or fewer parentheses, in fads that last 10 months on average.

The market has ruined all possible evolution, but fortunately open source drives the few "innovations" worth mentioning.
I really recommend reading the whole piece; it hurts so much that it's funny :biggrin:
 
  • #27
Boing3000 said:
The market has ruined all possible evolution, but fortunately open source drives the few "innovations" worth mentioning.
I really recommend reading the whole piece; it hurts so much that it's funny :biggrin:
Hilarious! I didn't realize that jQuery was so "2013".
 
  • #28
Sagant said:
...
Do you think that this CS advancement in AI (though it will take some time yet) can actually happen? I mean, even if the impact were not as huge as the video predicts, it could still be enormous enough to kick a lot of people out of their jobs. This is especially bad when considering developing countries (I'm from one), where a great majority of people have very mechanical and easily computerized jobs.

That's part of the bigger picture of where I think it's all going, and it involves a lot of other fields. People think of AI as being about how we think, but in a broader sense it's about synthesizing a function with the same inputs and outputs as some natural function. So for example, insofar as a real estate pro takes a bunch of inputs about a house to assess a value, they are performing a function with inputs (info about the home) and outputs (price range). The AI will learn to simulate, inside the computer, that function which was previously outside the computer, in the mind of the professional. That professional is now replaceable. But it works for any function, any natural system, not just minds. How will a chemical system evolve in time? How will a forest ecosystem develop? How will an economy grow? A good AI can emulate any natural system with enough training. So true AI in a sense is the art of universal simulation: being able to make good decisions is secondary to knowing what will happen when a decision is made, accurately simulating the scenario well enough to predict the outcome.
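
To make the "synthesize the function" idea concrete, here is a toy sketch (the house data is completely made up, and real systems use far fancier models than least squares): fit a model to example input/output pairs from the "expert", then ask it about an input it has never seen.

Python:
import numpy as np

# Made-up training data: (square metres, bedrooms) -> price set by a human "expert".
X = np.array([[50, 1], [80, 2], [120, 3], [200, 4]], dtype=float)
y = np.array([150_000, 230_000, 330_000, 520_000], dtype=float)

X1 = np.hstack([X, np.ones((len(X), 1))])        # add an intercept column
w, *_ = np.linalg.lstsq(X1, y, rcond=None)       # "learn" the expert's pricing function

new_house = np.array([100.0, 3.0, 1.0])          # an input the expert never saw
print(round(float(new_house @ w)))               # the model's predicted price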

This leads us to some cause for hope: I believe the other poster was right about the economy crashing if people aren't getting paid because they've been replaced by machines, but nothing is better equipped to predict, and therefore avoid, that outcome than an AI capable of simulating economies.
 
  • #29
Boing3000 said:
I'll make a blunt statement: there are none. All the breakthroughs were made before the '80s, and I include "computer languages" and of course the WWW.

The problem is to apply some metric to evaluate the landscape you want to describe. The only truly "new" thing that appeared in CS after 1980 was a direct spawn of gaming. That special hardware required a new type of thinking (the GPU), and when that next revolution came to an end, you got back to square one and enjoyed it over a "hyper" text protocol. Like this, or that. If games and entertainment hadn't done it, one may argue that science would have created it on its own (simulation, simulation, simulation), or maybe scientists would have rolled up their sleeves to create new bits of math, instead of relying on those non-evolving four CPU operations (load-store-add-jmp) to brute-force their way into solutions.

Let's get real: there is no such thing as CS. There are a billion geeks who enjoy the show, and a million nerds who all reinvent the wheel while pretending not to (but with romance, alone in a garage against the rest of the world).

Religious (err, sorry, language) wars rage on, from interpreted to compiled, from imperative to functional, more or fewer brackets, more or fewer parentheses, in fads that last 10 months on average.

The market has ruined all possible evolution, but fortunately open source drives the few "innovations" worth mentioning.
I really recommend reading the whole piece; it hurts so much that it's funny :biggrin:

These are some harsh words. Why do you think there is no such thing as CS? How does research continue, then?

On the matter of advancements, yes, there are a lot of things going on in computer graphics, and in the application of GPUs to many fields as potential speedups.

In my searching and reading, I've now found several applications of CS, although one might argue whether they are meaningful or not. Image processing has allowed industry to do product quality checks entirely with computers (much faster). Advances in networks allow better protocols and faster message sending, not to mention error correction and database fault tolerance. Combinatorial optimization has a great impact on scheduling, routing and packing (Amazon uses techniques for these). Not to mention, obviously, the advances in AI which have already been brought up.

I don't think CS is about discussing languages and learning JavaScript. Further, I don't think CS is about programming languages at all, even though some people study them.

I could be mistaken. What do you guys think? Is CS as useless as suggested? Would the money invested in CS be better spent on, say, engineering?
 
  • #30
Sagant said:
These are some harsh words. Why do you think there is no such thing as CS? How does research continue, then?
Research did not continue. Research died in the late '70s for two reasons:
- One is economic/political. There was no interest anymore in paying the most gifted and talented researchers to come up with unneeded and totally crazy ideas (like the mouse), with absolutely no goal or objective, which is what research meant back then. That didn't apply only to CS but to every other discipline. (I will just remind you that you could once board planes going 3 times faster than those we have nowadays.)
- The second is the only scientifically important truth: everything significant will eventually have been invented, so there will always be an innovation peak, and after that you have to put in more and more effort that yields less and less result.

Sagant said:
On the matter of advancements, yes, there are a lot of things going on in computer graphics, and in the application of GPUs to many fields as potential speedups.
Speedups? Speed of what? Your brain can handle only 60 FPS.
In my experience the GPU (or what was called a custom chipset back then) is the only qualitative innovation, because garage kids could now tap into it to create genuinely new art forms (like Winamp visualizers), new paradigms if you will (that word has been abused so much since then that it is a laughing stock now).

Sagant said:
In my searching and reading, I've now found several applications of CS, although one might argue whether they are meaningful or not. Image processing has allowed industry to do product quality checks entirely with computers (much faster).
Of course, if computers can relieve some people of the most menial and boring tasks, that is a good thing (at least from my point of view). But I don't know if "science" is ever concerned with this. Technology could be, but then the fact is that those "product quality checks" have probably been used to improve this.

Sagant said:
Advances in networks allow better protocols and faster message sending, not to mention error correction and database fault tolerance. Combinatorial optimization has a great impact on scheduling, routing and packing (Amazon uses techniques for these).
The quality of communication has decreased by every possible metric. The quantity (and vacuity), on the other hand, has literally exploded.
So I don't know where those better and faster protocols are used, but certainly not when I phone or watch TV. Nowadays I don't have to call another continent to get a crappy line; calling my neighbor will do. And if a package arrives in 4 days instead of 6, it is not really going to change anybody's life.

Sagant said:
Not to mention, obviously, the advances in AI which have already been brought up.
There is no AI on the surface of the Earth (humans aside :wink:). Seriously though, I have no idea why AI would be put forward.

Sagant said:
I don't think CS is about discussing languages and learning javascript. Further, I don't think CS is about programming languages at all, even though some study them.
That may be true. Maybe you should define more specifically what CS means to you, if it is "not software technology". Maybe by CS you meant things like this? Another thing could be "information theory", but that is for serious people like mathematicians, not for people getting their hands dirty assembling things by hand (for whatever reason, we still use keyboards).

Sagant said:
I can be mistaken. What do you guys think? Is CS as useless as pointed? Would the money invested in CS be better spent in, say, engineering?
Pointless? No, not at all. Don't get me wrong, I would like to see people doing research for the sake of it (that is, not trying to solve a problem).
What I'm saying is that there is no actual money spent on computer science because it is all spent on computer engineering.

I would like a real science genius (in information theory) to explain how such a complex image can be perfectly "compressed" into 959 bytes (and really only 6 bits of those bytes are actually used...), or a psychologist to explain why we are so dumb as to use PNG, or worse JPEG, for that.
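
To see the information-theory side of that in a couple of lines (toy data, and zlib rather than PNG/JPEG): compressed size tracks structure, not raw size.

Python:
import os
import zlib

structured = b"0123456789" * 10_000     # 100 kB of pure repetition
noise = os.urandom(100_000)             # 100 kB of randomness

print(len(zlib.compress(structured, 9)))   # tiny compared to 100000 bytes
print(len(zlib.compress(noise, 9)))        # ~100000 bytes: essentially incompressible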
 
