We have many threads on AI, which are mostly AI/LLM, e.g., ChatGPT, Claude, etc. It is important to draw a distinction between AI/LLM and AI/ML/DL, where ML = Machine Learning and DL = Deep Learning.
AI is a broad technology; AI/ML/DL is being developed to handle large, even seemingly disparate, data sets, rapidly evaluating the data and determining the quantitative relationships in order to understand what those relationships (among the variables) mean.
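As a toy illustration (my own sketch, not from any of the linked articles) of what "determining the quantitative relationships" in a data set can mean at its simplest, here is a pure-Python computation of the Pearson correlation coefficient between two variables:

```python
import math

# Toy sketch: quantify the relationship between two variables
# with the Pearson correlation coefficient (pure Python, no ML libraries).
def pearson(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sdx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sdy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sdx * sdy)

xs = [1, 2, 3, 4, 5]
ys = [2, 4, 6, 8, 10]     # perfectly linear in xs
print(pearson(xs, ys))    # → 1.0 (a perfect positive relationship)
```

Real AI/ML systems do this kind of relationship-finding across millions of variables at once, which is where the specialized hardware discussed below comes in.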
At the Harvard & Smithsonian Center for Astrophysics, AI is being developed to evaluate data and solve problems.
https://www.cfa.harvard.edu/research/astroai
AstroAI strives to bring experts in artificial intelligence together with scientists to tackle the most exciting and challenging problems in astrophysics. By facilitating interdisciplinary collaborations and drawing on the expertise of the Smithsonian, Harvard and Boston area science community, we hope to advance our understanding of the universe and drive forward technology that will revolutionize and accelerate scientific discovery at the CfA.
There will probably be an AI/LLM component as well.
https://www.redhat.com/en/blog/what-aiml-and-why-does-it-matter-your-business
What is machine learning?
Machine learning (ML) is a subset of AI that falls within the “limited memory” category in which the AI (machine) is able to learn and develop over time.
There are a variety of different machine learning algorithms, with the three primary types being supervised learning, unsupervised learning and reinforcement learning.
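To make the supervised-learning idea concrete, here is a minimal sketch of my own (not from the Red Hat article): the machine "learns" the parameters of a model from labeled examples, here by fitting a line to (x, y) pairs with ordinary least squares:

```python
# Toy supervised-learning sketch: learn y ≈ a*x + b from labeled
# examples via ordinary least squares (pure Python, no ML libraries).
def fit_line(points):
    n = len(points)
    sx = sum(x for x, _ in points)
    sy = sum(y for _, y in points)
    sxx = sum(x * x for x, _ in points)
    sxy = sum(x * y for x, y in points)
    a = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    b = (sy - a * sx) / n
    return a, b

# "Training data": labeled examples generated from y = 2x + 1.
data = [(0, 1), (1, 3), (2, 5), (3, 7)]
a, b = fit_line(data)
print(a, b)  # → 2.0 1.0 (the model recovers the underlying rule)
```

Unsupervised learning would find structure in the x values alone (no labels), and reinforcement learning would learn from rewards rather than labeled examples.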
https://ai.engineering.columbia.edu/ai-vs-machine-learning/
https://en.wikipedia.org/wiki/Machine_learning
https://www.cengn.ca/information-centre/innovation/difference-between-ai-ml-and-dl/
Artificial Intelligence (AI): Developing machines to mimic human intelligence and behaviour.
Machine Learning (ML): Algorithms that learn from structured data to predict outputs and discover patterns in that data.
Deep Learning (DL): Algorithms based on highly complex neural networks that mimic the way a human brain works to detect patterns in large unstructured data sets.
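As a minimal illustration of the "neural network" idea underlying DL (my own sketch, not from the linked article), each layer computes weighted sums of its inputs plus a bias, passed through a nonlinearity; deep learning stacks many such layers:

```python
import math

# Minimal feed-forward neural network sketch (pure Python):
# each layer computes activation(W·x + b); "deep" learning stacks many layers.
def layer(inputs, weights, biases):
    # weights: one row of weights per output neuron
    return [
        math.tanh(sum(w * x for w, x in zip(row, inputs)) + b)
        for row, b in zip(weights, biases)
    ]

x = [0.5, -1.0]                                       # input features
h = layer(x, [[1.0, -0.5], [0.3, 0.8]], [0.0, 0.1])   # hidden layer, 2 neurons
y = layer(h, [[0.7, -1.2]], [0.05])                   # output layer, 1 neuron
print(y)
```

Training a real network means adjusting all those weights from data, and the weighted sums are exactly the kind of parallel arithmetic that the GPUs discussed below excel at.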
The New Yorker Radio Hour had a discussion with Steven Witt about the rise of AI in conjunction with Nvidia and the development of the specialized microchips essential to the AI revolution. The first half-hour is about AI (neural networks) and Nvidia (chips using parallel computing).
https://www.wnycstudios.org/podcast...-plus-elaine-pagels-on-the-mysteries-of-jesus
Witt makes the comment that neural nets (networks) developed concurrently with the advanced microchips.
Across the country, data centers that run A.I. programs are being constructed at a record pace. A large percentage of them use chips built by the tech colossus Nvidia. The company has nearly cornered the market on the hardware that runs much of A.I., and has been named the most valuable company in the world, by market capitalization. But Nvidia’s is not just a business story; it’s a story about the geopolitical and technological competition between the United States and China, about what the future will look like. In April, David Remnick spoke with Stephen Witt, who writes about technology for The New Yorker, about how Nvidia came to dominate the market, and about its co-founder and C.E.O., Jensen Huang. Witt’s book “The Thinking Machine: Jensen Huang, Nvidia, and the World’s Most Coveted Microchip” came out this year.
Witt also mentions that what we would call the first modern AI system was built by Alex Krizhevsky (in his bedroom), a computer scientist at the University of Toronto, using two Nvidia gaming cards (GeForce-branded GPU cards) in 2011-2012.
https://en.wikipedia.org/wiki/Alex_Krizhevsky
https://www.cs.toronto.edu/~kriz/
https://en.wikipedia.org/wiki/AlexNet
https://en.wikipedia.org/wiki/Neural_network_(machine_learning)
https://en.wikipedia.org/wiki/Convolutional_neural_network