Sagant
Boing3000 said: Research did not continue. Research died in the late '70s for two reasons:
- One is economic/political. There was no longer any interest in paying the most gifted and talented researchers to come up with unneeded and totally crazy ideas (like the mouse), with absolutely no goal or objective, which is what research meant back then. That applied not only to CS but to every other discipline. (I will just remind you that you could once board planes going three times faster than those we have nowadays.)
- The second is the only scientifically important truth: everything significant will eventually have been invented, so there will always be an innovation peak, after which you have to put in more and more effort that yields less and less result.
Fair enough, but then it is not CS's fault; it is a matter of how much is invested in research inside universities, and in companies research continues - my advisor worked at AT&T in their R&D group. It is also not true worldwide; perhaps it is in the US (or whatever country you are referring to).
Boing3000 said: Speedups? Speed of what? Your brain can handle only 60 FPS.
In my experience the GPU (or what was called a custom chipset back then) is the only qualitative innovation, because garage kids could now tap into it to create a genuinely new art form (like Winamp visualizers), a new paradigm if you will (that word has been abused so much since then that it is a laughing stock now).
I think you have some misconceptions about the usage of GPUs. First, the speedup is not only about reaching higher FPS but about allowing richer and more detailed scenes to be rendered, and that is only when we are talking about animation and games.
However, GPUs have long since expanded their main usage from computer games to scientific computing and research. In my university the High Performance Computing group uses them to build HPC applications. The mathematics department uses GPUs to accelerate their simulations and calculations. The physics department uses them too, and in fact had to ask the CS department for help a couple of times, as I discovered recently. Not to mention the supercomputers built largely around GPUs, such as those here or the famous Titan. Some biotechnology groups also use GPUs to run their protein-docking predictions and other work.
So GPUs are not just about "garage kids" and nerds playing games anymore; their market has grown into many other scientific fields, thanks to their natural strength at handling massively parallel calculations.
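Just to make that concrete, here is a minimal sketch (in Python, with an array size I picked arbitrarily) of the kind of element-wise, data-parallel arithmetic GPUs are built for. NumPy runs it on the CPU; the remark about CuPy is an assumption about what a GPU-enabled setup could look like, not a claim about any particular group's code.

```python
# Sketch of the data-parallel arithmetic GPUs accelerate well.
# NumPy executes this on the CPU; libraries such as CuPy expose a very
# similar array interface, so (assuming a CUDA GPU and CuPy are
# installed) swapping "numpy" for "cupy" moves the same work to the GPU.
import numpy as np

n = 1_000_000                              # arbitrary problem size
positions = np.random.rand(n, 3)           # n particles in 3D
velocities = np.random.rand(n, 3)
dt = 0.01

# One simulation step: the same cheap operation applied to every element
# independently, which is exactly what maps well onto thousands of GPU cores.
positions += velocities * dt
kinetic = 0.5 * np.sum(velocities ** 2, axis=1)
print(kinetic.mean())
```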
Boing3000 said: Of course, if computers can relieve some people of the most menial and boring tasks, that is a good thing (at least from my point of view). But I don't know if "science" is ever concerned with this. Technology could be, but then the fact is that those "product quality checks" have probably been used to improve this.
Not really; quality checks are not only used to engineer the so-called planned obsolescence. Lots of food companies use them to avoid shipping rotten food or items with fabrication defects, which spares the final customers a lot of headaches. Also, image processing techniques are used by some astrophysicists to help detect galaxies, stars, planets and other bodies in outer space - something I also learned only recently.
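As a toy illustration of that idea (not a description of any real astronomy pipeline), the most basic detection step is just thresholding an image and grouping the bright pixels. Everything in this sketch, including the synthetic "sky" and the 5-sigma cut, is invented for the example; it assumes NumPy and SciPy are available.

```python
# Toy source detection: threshold an image, then label connected bright regions.
import numpy as np
from scipy import ndimage  # used only for connected-component labelling

rng = np.random.default_rng(0)
sky = rng.normal(loc=100.0, scale=5.0, size=(256, 256))  # background noise
sky[40:44, 60:64] += 80.0                                 # a fake bright source
sky[200:203, 120:123] += 60.0                             # another one

threshold = sky.mean() + 5 * sky.std()     # keep only clearly bright pixels
mask = sky > threshold
labels, n_sources = ndimage.label(mask)    # group adjacent bright pixels
print(f"detected {n_sources} candidate sources")
```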
Boing3000 said: The quality of communication has decreased by every metric possible. The quantity (and vacuity), on the other hand, has literally exploded.
So I don't know where those better and faster protocols are used, but certainly not when I phone or watch TV. Nowadays I don't have to call another continent to get a crappy line; calling my neighbor will do. And if a packet arrives in 4 days instead of 6, it is not really going to change anybody's life.
That is not due to CS either; it is due to the number of people using telecommunications at the same time. The core of the internet (called its backbone) carries really heavy traffic. If those faster protocols were not being used in the routers along the way, your service could be a lot worse than it is now. This is mostly the fault of the telecom companies that do not improve their infrastructure.
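To give one concrete example of what happens in those routers: link-state routing protocols such as OSPF pick paths by running Dijkstra's shortest-path algorithm over the network graph. Here is a small sketch; the topology and link costs are invented for illustration.

```python
# Dijkstra's shortest-path algorithm, the computation behind
# link-state routing protocols such as OSPF.
import heapq

def dijkstra(graph, source):
    """Return the cheapest known cost from source to every reachable node."""
    dist = {source: 0}
    queue = [(0, source)]
    while queue:
        cost, node = heapq.heappop(queue)
        if cost > dist.get(node, float("inf")):
            continue  # stale queue entry
        for neighbor, weight in graph[node]:
            new_cost = cost + weight
            if new_cost < dist.get(neighbor, float("inf")):
                dist[neighbor] = new_cost
                heapq.heappush(queue, (new_cost, neighbor))
    return dist

# Hypothetical router topology: node -> [(neighbor, link cost), ...]
topology = {
    "A": [("B", 1), ("C", 4)],
    "B": [("A", 1), ("C", 2), ("D", 5)],
    "C": [("A", 4), ("B", 2), ("D", 1)],
    "D": [("B", 5), ("C", 1)],
}
print(dijkstra(topology, "A"))  # {'A': 0, 'B': 1, 'C': 3, 'D': 4}
```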
As for the packets, it is not about one packet arriving a few days later. Optimizing the whole process of packing and delivering can save the company and the customers millions of dollars and a lot of time. Also, there are many more areas where combinatorial optimization (CO) can be applied, including networks.
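For a flavor of the packing side, the classic first-fit-decreasing heuristic for bin packing is a textbook stand-in for the kind of combinatorial optimization used in logistics; the parcel sizes and bin capacity below are made up.

```python
# First-fit decreasing: sort items largest-first and put each into the
# first bin with enough remaining capacity, opening a new bin if needed.
def first_fit_decreasing(items, capacity):
    bins = []
    for item in sorted(items, reverse=True):
        for b in bins:
            if sum(b) + item <= capacity:
                b.append(item)
                break
        else:
            bins.append([item])
    return bins

parcels = [4, 8, 1, 4, 2, 1, 5, 7]  # hypothetical parcel sizes
print(first_fit_decreasing(parcels, capacity=10))
# -> [[8, 2], [7, 1, 1], [5, 4], [4]]
```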
Boing3000 said: There is no AI on the surface of the Earth (humans aside). Seriously though, I have no idea why AI would be put forward.
Uh, yes there is AI. There is no strong AI yet, but there is so-called weak AI. Some examples are Google's AI that won several Go games against a top champion (link), IBM's famous Watson, which won at Jeopardy (link), and most recently the poker winner Libratus (link), to name only the most famous. AI is also present in some robots, such as the famous Mars rovers (link), which also use image processing techniques.
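None of those systems works like this, of course, but the textbook building block behind game-playing "weak AI" is adversarial search. Here is a toy minimax for a take-1-2-or-3-stones game, purely as an illustration of the idea.

```python
# Toy minimax for the subtraction game: players alternately take 1, 2 or 3
# stones, and whoever takes the last stone wins.
from functools import lru_cache

@lru_cache(maxsize=None)
def best_move(stones):
    """Return (score, move) for the player to move; score 1 means a forced win."""
    if stones == 0:
        return (-1, None)  # the previous player took the last stone: we lost
    best = (-2, None)
    for take in (1, 2, 3):
        if take <= stones:
            opponent_score, _ = best_move(stones - take)
            score = -opponent_score  # our result is the opposite of the opponent's
            if score > best[0]:
                best = (score, take)
    return best

print(best_move(10))  # (1, 2): with 10 stones, taking 2 forces a win
```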
Boing3000 said: That may be true. Maybe you should define more specifically what CS means for you if it is "not software technology". Maybe you meant by CS things like this? Another thing could be "information theory", but that is for serious people like mathematicians, not for people getting their hands dirty by assembling things by hand (for whatever reason we still use keyboards).
CS is to some extent "software technology", but it is not all about software. Several areas of CS only use software as a means to an end, not as the entire focus. Perhaps you are thinking of Software Engineering; it is a common misconception to think CS == SE. By CS I mean something like this. Why do you say that "information theory" is only for serious mathematicians? There are a lot of CS people working on it, especially on the theoretical side of CS, which is very close to math, but is not math itself.
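For instance, a first exercise on that theoretical side is computing the Shannon entropy of a message, the lower bound on the average number of bits per symbol that any lossless code can achieve. A tiny sketch:

```python
# Shannon entropy of a short text, in bits per symbol.
from collections import Counter
from math import log2

def entropy(text):
    counts = Counter(text)
    total = len(text)
    return -sum((c / total) * log2(c / total) for c in counts.values())

message = "abracadabra"
print(f"{entropy(message):.3f} bits per symbol")  # about 2.040
```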
Boing3000 said: Pointless? No, not at all. Don't get me wrong. I would like to see people doing research for the sake of it (that is: not trying to solve problems).
What I am saying is that there is no actual money spent on C science because it is all spent on C engineering.
I would like a real science genius (in information theory) to explain how such a complex image can be perfectly "compressed" into 959 bytes (and really only 6 bits of those bytes are actually used...), or a psychologist to explain why we are so dumb as to use PNG or, worse, JPEG for that.
I disagree. CS is all about problem solving, just like math, physics and engineering. The question at hand is whether the problem being solved is actually useful to everyday life, or is just highly theoretical stuff that will perhaps be used in a distant future (or not). And again, I don't think the money is being spent on CE rather than CS; both fields receive a certain amount of investment, perhaps one more than the other, but still. If there were no investment, how would there be so many professors and researchers doing research in these areas? Whether the research is useful is the main question of this thread, and some of the answers, together with my own reading, show that it is.