What Is The Data Center Story?

  • Thread starter: Hornbein
I want to know what the data center craze is about. I was able to find only superficial discussions. Can anyone recommend a good online reference? In particular, it seems to me that sooner or later there will be a shakeout and the centers belonging to the losers will go bust. Second, computer hardware goes obsolete so rapidly. Will all those chips doing linear algebra be worthless in three years? And it seems quite likely that more efficient methods will be discovered. That always seems to happen.
 
Hornbein said:
Can anyone recommend a good online reference?
Turn it against itself: ask Google's AI, then follow your nose.
Hornbein said:
Second, computer hardware goes obsolete so rapidly. Will all those chips doing linear algebra be worthless in three years?
Upgrading a module in a rack costs far less than replacing the entire building. The present hardware modules will be swapped out as lower-power, larger-memory, faster replacements become available. Out with the old, in with the new. Market forces will decide the direction of the evolution.
Hornbein said:
And it seems quite likely that more efficient methods will be discovered. That always seems to happen.
More efficient methods do not always demand a hardware upgrade. Often, existing hardware is used more efficiently.

Above all, connectivity allows a digital task to be sent to wherever it can be handled most economically. You no longer need to maintain in-house processing capacity; you can hire what you need, when you need it.
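That rent-versus-own tradeoff can be put in back-of-envelope numbers. The sketch below uses entirely hypothetical prices (purchase cost, power draw, hourly rental rate) chosen only to illustrate the break-even logic, not to reflect real cloud or hardware pricing:

```python
# Rent-vs-own break-even sketch. All figures are hypothetical.

PURCHASE_PRICE = 30_000.0   # upfront cost of owning the hardware ($)
POWER_KW = 0.7              # power draw while running (kW)
PRICE_PER_KWH = 0.12        # electricity cost ($/kWh)
RENT_PER_HOUR = 2.50        # hourly rate to hire equivalent capacity ($)

def owned_total(hours_used):
    """Total cost of owning: full purchase price plus electricity for use."""
    return PURCHASE_PRICE + POWER_KW * PRICE_PER_KWH * hours_used

def rented_total(hours_used):
    """Total cost of renting: pay only for the hours actually used."""
    return RENT_PER_HOUR * hours_used

# Light utilization favors renting; near-continuous use favors owning.
for hours in (1_000, 20_000):
    print(hours, round(owned_total(hours), 2), round(rented_total(hours), 2))
```

Under these made-up numbers the break-even sits around 12,000 hours of use: below that, hiring capacity is cheaper; above it, owning wins. The same logic drives businesses toward cloud providers for bursty workloads.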
 
The rise of data centers is tied closely to the explosion of AI services and cloud computing. AI companies compete for subscribers much like streaming companies compete for viewers, but their product is computational intelligence delivered over the internet.

Modern AI models require enormous computing resources both to train and to serve millions of users in real time. Because most advanced AI systems are too computationally expensive to run locally, companies rely on massive data centers filled with GPUs and specialized hardware.

Cloud companies such as Amazon recognized early that computing itself could be sold as a utility. What began as internal infrastructure for large internet businesses evolved into cloud platforms like AWS, where companies rent storage and computing power on demand.

The competition among AI companies has intensified this trend. Firms pursuing increasingly advanced systems, including possible AGI and ASI successors to today’s models, require huge amounts of computation, electrical power, networking, and cooling infrastructure. As a result, data centers have become critical economic and strategic assets in the modern AI race.

---

Who suffers from this expansion?

The rapid expansion of AI and cloud data centers also creates social and environmental costs. Consumers increasingly face subscription-based access to advanced computational tools, which can create pressure to pay for AI services simply to remain competitive in education, business, and creative work.

Large-scale data centers require enormous amounts of electricity and water for computation and cooling. In regions where electrical grids are already strained, utilities may expand power generation capacity through natural gas, coal, nuclear, or renewable energy projects depending on local economics and policy.

Critics argue that this expansion can increase environmental stress through higher energy consumption, carbon emissions from fossil-fuel generation, and heavy water usage for cooling systems. Supporters counter that AI and cloud infrastructure can also improve efficiency in science, medicine, engineering, logistics, and energy management.

The debate is therefore not simply about technology itself, but about who benefits, who pays the costs, and how society manages the infrastructure demands of large-scale AI systems.

---

The energy market is tiered: the cheapest generation is dispatched first, and the most expensive plant that must run sets the price. When demand is low, it can be met by low-marginal-cost sources such as hydro, nuclear, wind, and solar, so consumers pay less. But wind and solar are intermittent, and as demand rises, costlier plants such as natural gas and coal come online, pushing prices up. Those higher peak prices in turn push consumers toward rooftop solar and batteries to offset the higher-tier cost.
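The tiering described above is essentially merit-order dispatch: generators are stacked from cheapest to most expensive, and the marginal tier sets the price. A minimal sketch, with invented capacities and prices purely for illustration:

```python
# Simplified merit-order dispatch: cheapest tiers are used first, and the
# marginal (most expensive dispatched) tier sets the market price.
# Capacities (GW) and prices ($/MWh) are invented for illustration.

TIERS = [  # (name, capacity_gw, price_per_mwh)
    ("wind/solar",    15, 25),
    ("hydro/nuclear", 20, 30),
    ("natural gas",   25, 70),
    ("coal",          20, 90),
]

def marginal_price(demand_gw):
    """Price set by the last tier needed to meet demand."""
    remaining = demand_gw
    # Dispatch in order of increasing price (merit order).
    for name, capacity, price in sorted(TIERS, key=lambda t: t[2]):
        remaining -= capacity
        if remaining <= 0:
            return price
    raise ValueError("demand exceeds total capacity")

print(marginal_price(30))  # low demand: cheap tiers suffice -> 30
print(marginal_price(55))  # high demand: gas sets the price -> 70
```

Adding a data center's constant load shifts every hour's demand to the right in this model, so the expensive tiers set the price more often, which is the mechanism behind the consumer-cost concern.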

---

One recent development is the sale of compute nodes to consumers as an alternative to data centers. People could trade space at home for a compute node that draws power from their electrical grid to create a distributed data center. The idea is that it reduces transmission problems and makes the grid more resilient.

The compute node idea follows in the footsteps of the home battery: you store power to charge your EV, and any leftover energy can be pushed back to the grid for additional consumer savings.

We will become islands in a gridded sea, owning our own energy generation (solar), energy storage (battery), and local compute nodes, and sharing it with the community.

 
Different companies may have different reasons. It's my understanding that Meta is transferring a lot of its Metaverse resources to AI. That is to put some of their useless Metaverse equipment to a different useless purpose. In my opinion, the resulting AI-generated fake videos will be about as useful to me as the Metaverse was.
 
I think Gustafson's law applies here: as compute capacity increases, the workload will grow to fill it.
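For reference, Gustafson's law states that the scaled speedup on N processors with serial fraction α is S(N) = N − α(N − 1): rather than a fixed problem hitting Amdahl's ceiling, the problem size grows with the machine. A quick check of the formula:

```python
def gustafson_speedup(n_processors, serial_fraction):
    """Scaled speedup under Gustafson's law: S = N - alpha * (N - 1)."""
    return n_processors - serial_fraction * (n_processors - 1)

# With a 5% serial fraction, 100 processors still give ~95x scaled speedup,
# because the parallel part of the workload grows with the machine.
print(gustafson_speedup(100, 0.05))  # 95.05
```

That is the sense in which load follows capacity: bigger data centers do not just run old workloads faster, they invite proportionally bigger workloads.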
 
