Real advances in Computer Science?

Summary

The discussion centers on the significant contributions of Computer Science (CS) to various fields, emphasizing its impact on both academia and industry. Key advancements highlighted include Shannon's information theory, Turing's computation theory, Chomsky's context-free grammars, computational complexity theory, and artificial intelligence. These breakthroughs have fundamentally shaped programming languages, algorithms, and the understanding of the limits of computation.

Participants express concerns about the visibility of CS advancements compared to other scientific fields, noting that many significant discoveries originate from early theoretical work rather than recent developments. The conversation also touches on the automation of tasks previously performed by humans and the potential consequences of programming errors, exemplified by historical incidents like the Ariane 5 rocket failure.

The role of CS in interdisciplinary applications, particularly in biology and engineering, is acknowledged, with a consensus that CS advancements often remain unnoticed by the general public. The discussion raises questions about the future of CS in light of emerging technologies like quantum computing, which could redefine computational paradigms and challenge existing knowledge.
  • #31
Boing3000 said:
Research did not continue. Research died in the late '70s for two reasons:
-one is economic/political. There was no interest anymore in paying the most gifted and talented researchers to come up with unneeded and totally crazy ideas (like the mouse), with absolutely no goal nor objective, which is what research meant back then. That did not only apply to CS but to every other discipline. (I will just remind you that you could board planes going 3 times faster than those we have nowadays.)
-the second one is the only scientifically important truth: everything significant will eventually have been invented, so there will always be an innovation peak, after which you have to put in more and more effort that yields less and less result.

Fair enough, but then it is not CS's fault; it is just the investment in research inside universities. In companies it continues - my advisor worked at AT&T in their R&D group. And it is also not true worldwide; perhaps it is in the US (or whichever country you are referring to).

Boing3000 said:
Speedups? Speed of what? Your brain can handle only 60 FPS.
In my experience the GPU (or what was called a custom chipset back then) is the only qualitative innovation, because garage kids could now tap into it to create genuinely new art forms (like Winamp visualizers), new paradigms if you will (that word has been abused so much since then that it is a laughingstock now).

I think you have some misconceptions about the usage of GPUs. First, the speedup is not only about higher FPS, but about allowing more and better scenes to be rendered, speaking purely of animation and games.

However, GPUs have long since expanded from computer games into scientific computing and research. In my university the High Performance Computing group uses them to build HPC applications. The mathematics department uses GPUs to accelerate their simulations and calculations. The physics department uses them too, and in fact had to ask the CS department for help a couple of times, as I discovered recently. Not to mention the supercomputers built largely out of GPUs, such as those here or the famous Titan. Some biotechnology groups also use GPUs for their protein docking predictions and other work.

So GPUs are not about "garage kids" and nerds playing games anymore; their market has grown well into other scientific fields, due to their natural strength at handling parallel calculations.
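To make that concrete, here is a minimal sketch of the kind of dense linear algebra GPUs are used for in scientific computing, written against the real CuPy library (which deliberately mirrors the NumPy API). The matrix size and the CPU fallback are purely illustrative choices, not anyone's actual research code:

```python
# A minimal sketch of GPU-accelerated linear algebra. Assumes the
# (real) CuPy library and a CUDA-capable GPU; falls back to NumPy
# on the CPU otherwise. CuPy mirrors the NumPy API, so the same
# code runs either way just by swapping the module.
import numpy as np

try:
    import cupy as xp          # GPU path, if CuPy is installed
    on_gpu = True
except ImportError:
    xp = np                    # CPU fallback
    on_gpu = False

n = 2048                       # arbitrary size for illustration
a = xp.random.rand(n, n)       # two large random matrices
b = xp.random.rand(n, n)

c = xp.matmul(a, b)            # the O(n^3) product the GPU parallelizes

if on_gpu:
    c = xp.asnumpy(c)          # copy the result back to host memory

print(type(c), c.shape)
```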

Boing3000 said:
Of course, if computers can relieve some people of the most menial and boring tasks, that is a good thing (at least from my point of view). But I don't know if "science" is ever concerned with this. Technology could be, but then a fact is that those "product quality checks" have probably been used to improve this

Not really; it is not only used to improve the so-called obsolescence. Lots of food companies use it to avoid processing rotten food, or objects with fabrication errors, which can spare the final customers a lot of headaches. Also, image processing techniques are used by some astrophysicists to help detect galaxies, stars, planets and other bodies in outer space - which I also learned only recently.
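As a toy illustration of such a quality check (not any real inspection system; the "image", the threshold and the pixel budget below are all invented for the example):

```python
# Toy sketch of an automated quality check via brightness thresholding,
# using only NumPy. Real inspection pipelines are far more sophisticated;
# the 8x8 "image" and the threshold of 200 are made-up illustrations.
import numpy as np

def has_defect(gray_image, threshold=200, max_bad_pixels=3):
    """Flag an item if too many pixels are brighter than `threshold`
    (e.g. a bright blemish on an otherwise dark product)."""
    bad = np.count_nonzero(gray_image > threshold)
    return bad > max_bad_pixels

rng = np.random.default_rng(0)
item = rng.integers(0, 100, size=(8, 8))   # a "clean" dark item
item[2:4, 5:7] = 255                       # simulate a bright blemish
print(has_defect(item))                    # True: reject this item
```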

Boing3000 said:
The quality of communication has decreased, by every metric possible. The quantity (and vacuity), on the other hand, has literally exploded.
So I don't know where those better and faster protocols are used, but certainly not when I phone or watch TV. Nowadays I don't have to call another continent to get a crappy line; calling my neighbor will do. And if a packet arrives in 4 days instead of 6, it is not really going to change anybody's life.

That is not due to CS either; it is due to the number of people using telecommunications at the same time. The core of the internet (called its backbone) carries really heavy traffic. If these faster protocols were not being used in the routers along the way, your service could be a lot worse than it is now. This is mostly the fault of the telecom companies that do not improve their infrastructure.

As for the packets, it is not about a single package arriving some days later. Optimizing the whole process of packing and delivering can save millions of dollars, and time, for both the company and its customers. Also, there are a lot more areas where combinatorial optimization (CO) can be applied, including networks; a toy example follows below.
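One classic CO building block behind routing and delivery planning is the shortest-path computation. Here is a minimal Dijkstra sketch over an invented depot graph - nothing more than a sketch of the idea:

```python
# Minimal Dijkstra shortest-path over a graph given as an adjacency
# dict {node: {neighbor: cost}}. The depot graph and the costs below
# are invented for illustration.
import heapq

def dijkstra(graph, source):
    """Return the cheapest known distance from `source` to every node."""
    dist = {source: 0}
    heap = [(0, source)]
    while heap:
        d, u = heapq.heappop(heap)
        if d > dist.get(u, float("inf")):
            continue                      # stale queue entry, skip it
        for v, w in graph.get(u, {}).items():
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                heapq.heappush(heap, (nd, v))
    return dist

depots = {
    "A": {"B": 4, "C": 2},
    "B": {"D": 5},
    "C": {"B": 1, "D": 8},
    "D": {},
}
print(dijkstra(depots, "A"))   # {'A': 0, 'B': 3, 'C': 2, 'D': 8}
```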

Boing3000 said:
There is no AI on the surface of the Earth (humans aside :wink:). Seriously though, I have no idea why AI would be put forward.

Uh, yes there is AI. There is no strong AI, but there is the so-called weak AI. Some examples are Google's AI that won several Go games against a top champion (link), the famous IBM Watson, which won the game show Jeopardy! (link), and the latest one, the poker winner Libratus (link). That is to name a few of the most famous. AI is also present in some robots, such as the famous Mars rovers (link), which also use image processing techniques.
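To be clear, none of those systems are simple; the sketch below only shows the classic core idea of adversarial game search (minimax), not how AlphaGo or Libratus actually work (they combine tree search with learned models). The tiny game here is made up for illustration:

```python
# Toy minimax: the classic core of game-playing AI. NOT how modern
# systems work, just the textbook idea of adversarial search.
def minimax(state, maximizing, moves, result, score, is_terminal):
    """Generic minimax over a game described by plain functions."""
    if is_terminal(state):
        return score(state)
    values = (minimax(result(state, m), not maximizing,
                      moves, result, score, is_terminal)
              for m in moves(state))
    return max(values) if maximizing else min(values)

# Made-up game: players alternately add 1 or 2 to a running total;
# the game ends at >= 4; the maximizer wants an even final total.
print(minimax(
    0, True,
    moves=lambda s: [1, 2],
    result=lambda s, m: s + m,
    score=lambda s: 1 if s % 2 == 0 else -1,
    is_terminal=lambda s: s >= 4,
))  # -> 1: the first player can force an even total
```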

Boing3000 said:
That may be true. Maybe you should define more specifically what CS means to you, if it is "not software technology". Maybe by CS you meant things like this? Another thing could be "information theory", but that is for serious people like mathematicians, not for people getting their hands dirty assembling things by hand (for whatever reason we still use keyboards).

CS is at some point "software technology", but it is not all about software. Several areas of CS only make use of software as a means to an end, not as the entire focus. Perhaps you are thinking of Software Engineering; it is a common misconception to think CS == SE. By CS I mean something like this. Why do you say that "information theory" is for serious mathematicians? There are a lot of CS people working on it, especially on the theoretical side of CS, which is very close to math, but not math itself.

Boing3000 said:
Pointless? No, not at all. Don't get me wrong. I would like to see people doing research for the sake of it (that is: not trying to solve a problem).
What I am saying is that there is no actual money spent on computer science because it is all spent on computer engineering.

I would like a real science genius (in information theory) to explain how such a complex image can be perfectly "compressed" into 959 bytes (and really only 6 bits of each byte are actually used...), or a psychologist to explain why we are so dumb as to use PNG, or worse JPEG, for that.

I disagree. CS is all about problem solving, just as are math, physics and engineering. The question at hand is whether the problem being solved is actually useful to everyday life, or is just highly theoretical material that will perhaps be used in a distant future (or not). And again, I don't think the money is being spent on CE rather than CS; both fields receive a certain amount of investment, perhaps one more than the other, but still. If there were no investment, how would there be so many professors and researchers working in the area? Whether the research is useful is the main question of this thread, and some of the answers, together with my readings, showed that it is.
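On the 959-byte image raised above: such demoscene images are procedurally generated rather than compressed. But the underlying information-theory point - that a regular signal needs far fewer bytes than its raw pixels - can be sketched with toy run-length encoding. This is not what PNG or JPEG do (PNG uses DEFLATE, JPEG uses lossy transform coding); it is only a minimal illustration:

```python
# Toy run-length encoding: a highly regular "image" collapses to a
# description far smaller than its raw pixels. Purely illustrative;
# real formats (PNG/DEFLATE, JPEG) are much more involved.
def rle_encode(data: bytes) -> list[tuple[int, int]]:
    """Collapse runs of identical bytes into (value, count) pairs."""
    runs = []
    for b in data:
        if runs and runs[-1][0] == b:
            runs[-1] = (b, runs[-1][1] + 1)
        else:
            runs.append((b, 1))
    return runs

# A 10,000-"pixel" image that is almost entirely one flat color:
image = bytes([0] * 9_000 + [255] * 1_000)
runs = rle_encode(image)
print(len(image), "bytes of pixels ->", len(runs), "runs:", runs)
# 10000 bytes of pixels -> 2 runs: [(0, 9000), (255, 1000)]
```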
 
  • #32
Sagant said:
Fair enough, but then it is not CS's fault; it is just the investment in research inside universities. In companies it continues - my advisor worked at AT&T in their R&D group. And it is also not true worldwide; perhaps it is in the US (or whichever country you are referring to).
CS's fault? What fault? My point is that all this is normal. Nowadays no company in its right mind would ever invest in research (an oxymoron). They invest in development. True innovations, like all the key technologies of your iPhone, date back to the '70s and were invented by public-sector-sponsored researchers/scientists, certainly with no plan to profit from them in mind.

Sagant said:
I think you have some misconceptions about the usage of GPUs. First, the speedup is not only about higher FPS, but about allowing more and better scenes to be rendered, speaking purely of animation and games.
Err, I am the one explaining to you that speed is never the issue. Quality is an issue, not quantity. The GPU was a game changer quality-wise, not only because it gave you access to a 3rd dimension, but because even in 2D it opened so many possibilities (even in 1D => DSP/sound).

Sagant said:
However, GPUs have long since expanded from computer games into scientific computing and research.
OK, stop. I think I understand you a little better now, because what you say is just soooooooooo wrong. Why do you think Nvidia is developing CUDA? To please scientists? What is the actual industry that dwarfs the entire video-movie sector by nearly an order of magnitude?
Nvidia does not do research; it does development, and rightly so. They want to make every "smart" phone able to decompress 4K videos of cats, and to simulate the "physics" of boobs in real time. That is their main usage, and that tendency keeps increasing.
This certainly requires a lot of incredible work by ingenious people doing development. That's not research; that is quite the opposite of it.

Sagant said:
So GPUs are not about "garage kids" and nerds playing games anymore; their market has grown well into other scientific fields, due to their natural strength at handling parallel calculations.
I am sorry, but you are wrong. As a garage no-more-kid, I once suggested at work using an early version of CUDA to address a peculiar problem we had with a huge database. That is not research; that is stating the obvious, and most of my co-workers had similar ideas. Any problem expressible in terms of (parallelizable) matrix mathematics will lead you there.
So of course scientists will also climb on board. But I was explaining to you in my first post that supercomputer-type architectures already existed for those reasons. They existed way before the term GPU was even coined.
What you must realize is that a trillion-dollar worldwide industry was born because of garage kids doing "true research" by chasing their dream with absolutely no other purpose in mind. They were laughed at for 15 years before someone decided to develop the ideas.

Sagant said:
That is not due to CS either; it is due to the number of people using telecommunications at the same time. The core of the internet (called its backbone) carries really heavy traffic. If these faster protocols were not being used in the routers along the way, your service could be a lot worse than it is now. This is mostly the fault of the telecom companies that do not improve their infrastructure.
Vaporware is certainly a sector that got a lot of true research done. It is always somebody else's fault (most probably the customer's).
The truth is that in the '80s your TV cold-booted in 1 second with 50+ channels at your disposal.
Now you wait 1 minute for your modem to initialize and 20 seconds for your PS4 (or TV box) to boot, and enjoy at best 2 HD channels, unless some Chinese hacker decides otherwise. I would hate to call that progress...

Sagant said:
As for the packets, it is not about a single package arriving some days later. Optimizing the whole process of packing and delivering can save millions of dollars, and time, for both the company and its customers. Also, there are a lot more areas where combinatorial optimization (CO) can be applied, including networks.
My point was that it is not research, but development. There is no ethics here. Just metrics.

Sagant said:
Uh, yes there is AI. There is no strong AI, but there is the so-called weak AI. Some examples are Google's AI that won several Go games against a top champion (link), the famous IBM Watson, which won the game show Jeopardy! (link), and the latest one, the poker winner Libratus (link). That is to name a few of the most famous. AI is also present in some robots, such as the famous Mars rovers (link), which also use image processing techniques.
All of these are probably extremely fun projects to work on. But since my C64 could already beat me hands down at chess (OK, I suck badly), I am not really impressed by all that. I am much more impressed by those humans who are so difficult to beat, even by whole teams of genius-grade people using so much computing power.
Now calling a thing that could not even tie its shoe (or understand what a shoe is, or its purpose) "intelligence" is ... bizarre.

Sagant said:
CS is at some point "software technology", but it is not all about software. Several areas of CS only make use of software as a means to an end, not as the entire focus. Perhaps you are thinking of Software Engineering; it is a common misconception to think CS == SE. By CS I mean something like this.
OK, that was my understanding of CS. All these fields had been "completed" before the '80s (maybe not AI, but I don't understand what it is doing on that list).
But then maybe someone will invent something better than Hoare's 1959 quicksort (a sketch follows below)?
The only field that is "kind of" evolving (or chasing its tail) is languages/frameworks.
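For reference, the quicksort mentioned above fits in a few lines. Here is a minimal functional sketch in Python; production sorts are far more engineered (CPython's built-in sort is Timsort), but the 60-year-old idea really is this short:

```python
# Hoare's quicksort (1959) as a minimal functional Python sketch.
# Not a production sort: it copies lists and picks a naive pivot,
# but it is a faithful illustration of the divide-and-conquer idea.
def quicksort(xs):
    if len(xs) <= 1:
        return xs
    pivot = xs[len(xs) // 2]
    less    = [x for x in xs if x < pivot]
    equal   = [x for x in xs if x == pivot]
    greater = [x for x in xs if x > pivot]
    return quicksort(less) + equal + quicksort(greater)

print(quicksort([5, 1, 4, 1, 5, 9, 2, 6]))  # [1, 1, 2, 4, 5, 5, 6, 9]
```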

Sagant said:
Why do you say that "information theory" is for serious mathematicians? There are a lot of CS people working on it, especially on the theoretical side of CS, which is very close to math, but not math itself.
Because I mean it. I take logic seriously, which means I take math and science seriously, and "people in garages" suck at it; that's why they prefer the garage .. to play.

Sagant said:
I disagree. CS is all about problem solving, just as are math, physics and engineering.
That is a clear statement, which I have to disagree with, for precise reasons:
-Computers are tools. Like a hammer. You fix things with them, you create things with them, or even break things, depending on the usage. They do solve problems. There may be a "science of it", and that science would be a subset of math. There may be sub-disciplines in it (like the link you provided), but calling it "science" is a clear abuse of language (see below**).
-Math doesn't solve problems. It actually creates "problems". "Problem" is entirely the wrong word here. An equation is not a problem just because you seek a solution to it. It is not a problem in the sense of "needs fixing". Algorithmics would be the special branch of mathematics applying only to computers, although chaoticians and biologists may disagree.
-Physics/science seeks to DESCRIBE reality. There is no problem with reality. I don't think there is a problem describing computers either (**).
-Engineering is much more like computing: it actually solves problems, but of a concrete nature, like "crossing a river".

OK, all those disciplines are somewhat siblings, with some overlap at times, but what irks me is associating "computer" with "science". It is no more a special science than meditation science or sports science. "Science" is a word very much abused nowadays to lend credibility to disciplines that do not deserve it.

Sagant said:
And again, I don't think the money is being spent on CE rather than CS; both fields receive a certain amount of investment, perhaps one more than the other, but still.
You may be right. I would be interested to see hard numbers associated with these claims.
But having been in the field from its start, I have never ever heard of "computer researchers". The only remnant of that was the 20% hoax at Google, and only if it benefits the enterprise (which I suppose you could qualify as "problem solving").

Sagant said:
If there were no investment, how would there be so many professors and researchers working in the area? Whether the research is useful is the main question of this thread, and some of the answers, together with my readings, showed that it is.
I know where the money goes. And it is not into research.
I would be delighted to be pointed to a peer-reviewed paper from a "professor" paid to advance the science of computing. A recent one, not a 40-year-old one.
But what I cannot seem to make you understand is that I think we dearly need them, because they are useful. Things like that, for example.
 
  • #33
It seems this thread has run its course and it may be time to close it.

Have we covered everything the OP asked?
 
  • #34
@Boing3000 : Clearly the problem in the above discussion is that you want to argue that Computer Science is not a science. However, the main purpose of the thread was not to argue whether CS is a science or not (there is already too much of that useless discussion online).

My question was only about the advances in CS that have helped other sciences and the world, and that goes way beyond arguing whether this is research or development. I still don't agree with most of the things you said, because everything I've studied says the opposite, but I won't give it any more time.

@jedishrfu : Well, not everything I asked was answered, but I think it would be impossible to do that anyway. I'd say most of it was answered, and it was a helpful thread to me, and hopefully to others as well. For more specific questions I'll open other threads to keep the discussions "clean".

The thread can be closed.

Thank you all :)
 
  • #35
This thread is now closed.

A big thank you to all who contributed to it.
 
