Is BigData/MachineLearning/DeepLearning really the future?

  • Thread starter: davidbenari
  • Tags: Future

Summary
Big Data, Machine Learning, and Deep Learning are increasingly recognized as significant fields, but their potential is often overstated. Understanding data quality and collection methods is crucial for accurate analysis, as many analysts overlook these factors, leading to flawed conclusions. While Big Data is a reality, the real advancements may lie in mundane applications rather than the flashy AI projects often highlighted in the media. Collaboration with domain experts is essential for effective data analysis, yet many data scientists fail to engage with those who understand the data's context. Ultimately, pursuing a career in these areas requires a genuine interest and a commitment to understanding the complexities involved.
  • #31
davidbenari said:
Browsing through the internet, I keep hearing of these areas as if they hold so much promise. But it could be that the internet is exaggerating. Do you think humanity will see great advances in these areas and will come to depend largely on those advances? Why? Anything interesting you can tell me about these areas?

I'm thinking about doing a PhD in this area, so I wanted to know what you guys thought about these areas.

Thanks.

Great question.

I would be skeptical of the AI/deep learning area. AI has gone through fads before: the 1980s with Mathematica-like symbolic systems, the '90s with rule-based systems, the 2000s with machine learning, and now, in the 2010s, deep learning. Each of the previous booms looked promising and companies were started. People talked about AI being relevant, but the excitement faded when the real limitations were found. I think convolutional networks are a tool, and they have limits. No one has shown that there is substantial additional leverage we will get from these new neural-network tools. I believe there will be disillusionment, as before, when people realize the limitations of the technology and fast follow-up breakthroughs don't materialize. That does not mean it's not a useful tool, or that machine learning is passé. Each of these things adds to our arsenal for making computers smarter. But it will have serious limits in terms of what it can accomplish and where it can add value. The excitement will fade. That's my prediction.

So, where should you study? There is so much: IoT, VR, and Big Data in general all have legs. Even if AI isn't going to create sentient humans, it is still one of the only ways to make some sense of Big Data. Everybody will want to use that to make things incrementally better, and AI is a tool for it. I also think the movement to the cloud is accelerating, so any area that has to do with helping companies move to the cloud is big.

The tremendous reduction in the cost of communications is leading to networking being put into everything. I really think IoT is going to be huge. IoT generates massive amounts of data, and we need all the tools we can get to make sense of that data and help IoT devices work better and work together. There are enormous network effects from combinations of all these devices that we have no idea about yet.
 
  • #32
I don't particularly care whether machine learning/big data/AI is a fad, but this so-called fad saved my company $1.2 billion last year and is on track to do much more than that this year. Call it what you like, be excited about it to whatever level you wish, but I'm fairly confident it's here to stay.
 
  • #33
As long as information increases, and the need to utilize it (for whatever reason) increases, more resources will be needed in some form to meet that need.

Those resources don't necessarily need to be people (since computers are often programmed to do the work), but I'd imagine there should be growth in human resources to meet that demand.
 
  • #34
I would just like to add that we need to be clear about the distinction between AI, machine learning, and big data, because these three terms are sometimes used interchangeably when they in fact deal with different things.

Artificial intelligence (AI), broadly speaking, is the research field within computer science and related disciplines that seeks to imbue computing machines with intelligence.

Machine learning is a subfield within AI that explores the study and construction of algorithms that can learn from and make predictions from data (and has considerable overlap with the field of statistics).

"Big data" is a poorly defined term, but I've seen it described as data sets so large and complex that traditional data processing methods are inadequate.

Clearly, these three fields/definitions overlap considerably, as machine learning algorithms and methods are important tools for handling "big data" (although they are not the only tools -- I have often seen "data scientists" use traditional statistical methods to at least provide some initial analyses of "large" datasets). And as long as information increases and we keep creating ever larger datasets in a wide range of areas (from various areas of science, business, etc.), I expect the scope of machine learning/statistics/data science to keep growing.
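To make the "learn from and make predictions from data" definition above concrete, here is a minimal sketch of the simplest possible machine learning workflow: fitting a straight line to noisy synthetic data by least squares, then predicting on unseen inputs. The data here is entirely made up for illustration; real ML methods differ mainly in the flexibility of the model being fit.

```python
import numpy as np

# Hypothetical toy data: inputs x and noisy outputs from y = 2x + 1
rng = np.random.default_rng(0)
x = np.linspace(0, 10, 50)
y = 2.0 * x + 1.0 + rng.normal(scale=0.5, size=x.size)

# "Learning": estimate slope and intercept by least squares
X = np.column_stack([x, np.ones_like(x)])
slope, intercept = np.linalg.lstsq(X, y, rcond=None)[0]

# "Prediction": apply the learned model to inputs it has never seen
x_new = np.array([12.0, 15.0])
y_pred = slope * x_new + intercept
print(slope, intercept)  # should land near the true values 2.0 and 1.0
```

This also illustrates the overlap with statistics noted above: the same computation is ordinary least-squares regression, a classical statistical method.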
 
  • #35
MarneMath said:
I don't particularly care whether machine learning/big data/AI is a fad, but this so-called fad saved my company $1.2 billion last year and is on track to do much more than that this year. Call it what you like, be excited about it to whatever level you wish, but I'm fairly confident it's here to stay.
He was just arguing that you should be wary of the hype, not that it's all useless (at least that's how I interpreted it).

There are lots of absurd pronouncements made by fanboys of the subject, as is the wont of fanboys of anything. The CEO (or some other higher-up) of Kaggle, in an interview with Slate, suggested that expert knowledge is actually a detriment and that most problems will be solved using data science approaches, for instance (which is partially true but mostly delusional). Less obnoxiously, I've had numerous encounters with famous ML researchers making bullish proclamations that applying ML to field X will have result Y, only to later see the media uncritically propagate them. Could novel ML techniques be useful in control engineering? In principle, yes. In practice, every control engineer I know finds the idea amusing at best. Can you do electronic structure calculations with neural nets, and will this be the end of all other computational chemistry methods? Probably not. Etc., etc.

And in the hype phase, fanboys breed prodigiously. Just think critically.
 
  • #36
Crass_Oscillator said:
He was just arguing that you should be wary of the hype, not that it's all useless (at least that's how I interpreted it).

There are lots of absurd pronouncements made by fanboys of the subject, as is the wont of fanboys of anything. The CEO (or some other higher-up) of Kaggle, in an interview with Slate, suggested that expert knowledge is actually a detriment and that most problems will be solved using data science approaches, for instance (which is partially true but mostly delusional). Less obnoxiously, I've had numerous encounters with famous ML researchers making bullish proclamations that applying ML to field X will have result Y, only to later see the media uncritically propagate them. Could novel ML techniques be useful in control engineering? In principle, yes. In practice, every control engineer I know finds the idea amusing at best. Can you do electronic structure calculations with neural nets, and will this be the end of all other computational chemistry methods? Probably not. Etc., etc.

And in the hype phase, fanboys breed prodigiously. Just think critically.

I agree in general, but regarding machine learning and control engineering, how about things like http://video.mit.edu/watch/meet-2011-tr35-winner-pieter-abbeel-4/ ?
 
  • #37
Crass_Oscillator said:
He was just arguing that you should be wary of the hype, not that it's all useless (at least that's how I interpreted it).

There are lots of absurd pronouncements made by fanboys of the subject, as is the wont of fanboys of anything. The CEO (or some other higher-up) of Kaggle, in an interview with Slate, suggested that expert knowledge is actually a detriment and that most problems will be solved using data science approaches, for instance (which is partially true but mostly delusional). Less obnoxiously, I've had numerous encounters with famous ML researchers making bullish proclamations that applying ML to field X will have result Y, only to later see the media uncritically propagate them. Could novel ML techniques be useful in control engineering? In principle, yes. In practice, every control engineer I know finds the idea amusing at best. Can you do electronic structure calculations with neural nets, and will this be the end of all other computational chemistry methods? Probably not. Etc., etc.

And in the hype phase, fanboys breed prodigiously. Just think critically.

I have two points of contention with your post. I think there's a fundamental difference between saying "be skeptical of entire fields of study and their applications" and saying "understand the limitations of those fields and their applications." The former implies there are fundamental problems with the field that one should be wary of, while the latter implies that being aware of the shortcomings of your tools is a good thing.

My second disagreement is with using Jeremy Howard, who happens to be someone I know, as your reference for a fanboy. First off, he is not the CEO of Kaggle, nor was he back in 2012 when he made those statements; he was the President and Chief Senior Data Scientist at Kaggle. His job is literally to get companies to invest time and money in using Kaggle. What do you expect him to say: "Yes, sometimes Kaggle isn't the right tool and we can't solve all your problems"? He's immensely supportive of what "amateur" freelance data scientists have accomplished for companies, and he does truly believe that subject-matter expertise can hamper insights. However, in my experience his views are outside the norm in the real world; after all, Jeremy is a bit of an outside-the-norm type of guy.

I find that the rhetoric of "data science can solve everything" or "expertise doesn't matter" tends to exist in a vacuum on the internet. It's easy to find people who say this online, but hard to find people who say it without some agenda, such as pushing their new BI products or selling their expertise. In reality, I haven't encountered many speakers or working data scientists who hold those views. In fact, the most successful projects I've seen were accomplished through close collaboration between engineers, data scientists, and technicians. The constant feedback loop of question -> understanding -> more questions -> insight -> question can only really be sustained by cross-functional, diverse teams.
 
  • #38
@atyy

Looks neat. I will say that what I do is not related to control engineering. It doesn't bother me that people are considering applying ML to control engineering or other areas. What does bother me is when the application is dubious, or when absurd claims about its potential successes are put forth; it's not a complaint about the field in particular, it's a complaint about the sociology I've encountered.

I also dislike the token papers where somebody had the bright idea of taking a flavor-of-the-month algorithm and applying it to their problem without apparent rational justification.

@MarneMath

Well, I suppose I was a bit harsh there. I will say that I don't consider "he's selling his products, thus he must make strong remarks" to be a valid defense. If you send a strong opinion into the public forum, it should be, and will be, interpreted as your honest opinion.

It may be an experience issue. The more bullish individuals I've encountered tend to be younger, but not always.
 
  • #39
http://www.rogerschank.com/fraudulent-claims-made-by-IBM-about-Watson-and-AI

Roger Schank's views seem relevant to this thread.
 
