Sagant said:
Fair enough, but then it is not CS's fault; it is just about the investment in research inside universities, though in companies it continues - my advisor worked at AT&T in their R&D group. And it is also not true worldwide; perhaps it is in the US (or the country you are referring to).
CS's fault? What fault? My point is that all of this is normal. Nowadays no company in its right mind would ever invest in research (an oxymoron). They invest in development. True innovations, like all the key technologies in your iPhone, date back to the '70s and were invented by public-sector-sponsored researchers/scientists, certainly with no plan to profit from them in mind.
Sagant said:
I think you have some misconception about the usage of GPUs. First, the speedup is not only about higher FPS, but about allowing more and better scenes to be rendered, if we are talking purely about animation and games.
Err, I am the one who is explaining to you that speed is never an issue. Quality is an issue, not quantity. The GPU was a game changer quality-wise, not only because it gave you access to a third dimension, but because even in 2D it opened up so many possibilities (even in 1D => DSP/sound).
Sagant said:
However, GPUs have long since expanded their main usage from computer games alone to scientific computing and research.
OK, stop. I think I understand you a little better now, because what you say is just soooooooooo wrong. Why do you think Nvidia is developing CUDA? To please scientists? What is the actual industry that dwarfs the entire video/movie sector by nearly an order of magnitude?
Nvidia does not do research; it does development, and rightly so. They want to make every "smart" phone able to decompress 4K videos of cats, and to simulate the "physics" of boobs in real time. That is their main usage, and that tendency keeps increasing.
This certainly requires a lot of incredible work by ingenious people doing development. That is not research; it is quite the opposite.
Sagant said:
So GPUs are not about "garage kids" and nerds playing games anymore; their market has grown a lot into other scientific fields, due to their natural power at handling calculations.
I am sorry, but you are wrong. As a garage no-longer-kid, I once suggested at work that we use an early version of CUDA to address a peculiar problem we had with a huge database. That is not research; that is stating the obvious, and most of my co-workers had similar ideas. Any problem expressible in terms of (parallelizable) matrix mathematics will lead you there.
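To make the point concrete, here is a minimal sketch of the kind of mapping I mean. It is not the actual database problem we had; the kernel, data, and sizes are made up purely for illustration. The point is that any bulk job you can phrase as independent per-element (or matrix) work drops straight onto the GPU:

    // Minimal CUDA sketch (illustrative only): one thread per record,
    // an embarrassingly parallel "weight every value" pass.
    #include <cstdio>
    #include <cstdlib>
    #include <cuda_runtime.h>

    __global__ void weigh(const float* value, const float* weight, float* out, int n)
    {
        int i = blockIdx.x * blockDim.x + threadIdx.x;
        if (i < n)
            out[i] = value[i] * weight[i];   // independent work, no synchronization needed
    }

    int main()
    {
        const int n = 1 << 20;                        // ~1 million made-up records
        const size_t bytes = n * sizeof(float);

        // Host-side data (all ones and twos, purely for illustration).
        float* h_value  = (float*)malloc(bytes);
        float* h_weight = (float*)malloc(bytes);
        float* h_out    = (float*)malloc(bytes);
        for (int i = 0; i < n; ++i) { h_value[i] = 1.0f; h_weight[i] = 2.0f; }

        // Copy to the GPU, launch one thread per element, copy the result back.
        float *d_value, *d_weight, *d_out;
        cudaMalloc(&d_value, bytes);
        cudaMalloc(&d_weight, bytes);
        cudaMalloc(&d_out, bytes);
        cudaMemcpy(d_value,  h_value,  bytes, cudaMemcpyHostToDevice);
        cudaMemcpy(d_weight, h_weight, bytes, cudaMemcpyHostToDevice);

        const int threads = 256;
        const int blocks  = (n + threads - 1) / threads;
        weigh<<<blocks, threads>>>(d_value, d_weight, d_out, n);
        cudaMemcpy(h_out, d_out, bytes, cudaMemcpyDeviceToHost);

        printf("out[0] = %.1f\n", h_out[0]);          // expect 2.0

        cudaFree(d_value); cudaFree(d_weight); cudaFree(d_out);
        free(h_value); free(h_weight); free(h_out);
        return 0;
    }

Nothing clever in there, and that is exactly my point: any engineer who sees a parallelizable workload ends up reaching for the GPU.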
So of course scientists will also climb on board. But I was explaining to you in my first post that supercomputer-type architectures already existed for those reasons, way before the term GPU was even coined.
But what you must realize is that a trillion-dollar worldwide industry was born because of garage kids doing "true research" by chasing their dream with absolutely no other purpose in mind. They were laughed at for 15 years before someone decided to develop the ideas.
Sagant said:
It is not due to CS either; it is due to the number of people using telecommunications at the same time. The core of the internet (called its backbone) is really heavy on traffic. If these faster protocols were not being used in the routers along the way, your service could be a lot worse than it is now. This is mostly the fault of the telecom companies that do not improve their infrastructure.
Vaporware is certainly a sector where lots of "true research" gets done. It is always somebody else's fault (most probably the customer's).
Truth is, in the '80s your TV cold-booted in 1 second with 50+ channels at your disposal.
Now you wait 1 minute for your modem to initialize and 20 seconds for your PS4 (or TV box) to boot, and enjoy at best 2 HD channels, unless some Chinese hacker decides otherwise. I would hate to call that progress...
Sagant said:
As for the packets, it is not about a packet arriving some days later. The whole process of packing and delivering can save millions of dollars and a lot of time for the company and customers. Also, there are a lot more areas where CO can be applied, including networks.
My point was that it is not research, but development. There is no ethics here. Just metrics.
Sagant said:
Uh, yes, there is AI. There is no strong AI, but there is the so-called weak AI. Some examples are Google's AI that won several Go games against a top champion (link), IBM's famous Watson, which won the game show Jeopardy (link), and the latest one, the poker winner Libratus (link), to name a few of the most famous. AI is also used in some robots, such as the famous Mars rover (link), which also uses image processing techniques.
All of these are probably extremely fun projects to work on. But since my C64 could already beat me hands down at chess (OK, I suck badly), I am not really impressed by all that. I am much more impressed by those humans who are so difficult to beat, even by whole teams of genius-grade people using so much computing power.
Now, calling "intelligence" a thing that could not even tie its shoe (or understand what a shoe is, or its purpose) is... bizarre.
Sagant said:
CS is at some point "software technology", but it is not all about software. Several areas of CS only use software as a means to an end, not the entire focus. You may be talking about Software Engineering; it is a common misconception to think CS == SE. By CS I mean something like this.
OK, that was my understanding of CS. All these fields had been "completed" before the '80s (maybe not AI, but I do not understand what it is doing in that list).
But then maybe someone will invent something better than the 1959 quicksort?
The only field that is "kind of" evolving (or chasing its tail) is languages/frameworks.
Sagant said:
Why do you say that "information theory" is for serious mathematicians? There are a lot of CS people working on that, especially on the theoretical side of CS, which is very close to math but not math itself.
Because I mean it. I take logic seriously, which means I take math and science seriously, and "people in garages" suck at it; that is why they prefer the garage... to play.
Sagant said:
I disagree. CS is all about problem solving, just like math, physics, and engineering.
That is a clear statement that I have to disagree with, for precise reasons:
- Computers are tools, like a hammer. You fix things with them, you create things with them, or even break things, depending on the usage. They do solve problems. There may be a "science of it", and that science would be a subset of math. There may be sub-disciplines within it (like the link you provided), but calling it "science" is a clear abuse of language (see below**).
- Math does not solve problems. It actually creates "problems". Problem is absolutely the wrong word here: an equation is not a problem just because you seek a solution to it; it is not a problem in the sense of "needs fixing". Algorithmics would be the special branch of mathematics applying only to computers, although chaoticians and biologists may disagree.
- Physics/science seeks to DESCRIBE reality. There is no problem with reality. I do not think there is a problem describing computers either (**).
- Engineering is much more like computing: it actually solves problems, but of a concrete nature, like "crossing a river".
OK, all those disciplines are somewhat siblings, with some overlap at times, but what irks me is associating computers with science. It is no more a special science than meditation science or sports science. Science is a word very much abused nowadays to give credentials to disciplines that do not deserve it.
Sagant said:
And again, I don't think the money is being spent on CE rather than CS; both fields receive a certain amount of investment, perhaps one more than the other, but still.
You may be right. I would be interested to find hard numbers associated with these claims.
But having been in the field from its start, I have never ever heard of "computer researchers". The only remnant of that was the 20% hoax at Google, and only if it benefited the enterprise (which I suppose you could qualify as "problem solving").
Sagant said:
If there were no investment, how would there be so many professors and researchers doing research in that? Whether the research is useful is the main question of the thread, though some of the answers and my readings showed it is useful.
I know where the money goes, and it is not into research.
I would be delighted to be pointed to a peer-reviewed paper from a "professor" paid to advance the science of computing. A recent one, not a 40-year-old one.
But what I cannot seem to make you understand is that I think we dearly need them, because they are useful. Things like that, for example.