Here are the takeaways from today's Morning Brief, which you can sign up to receive in your inbox every morning.
It used to be enough to mention AI on an earnings call to be celebrated by Wall Street. But a harsher reality is beginning to emerge.
The grand ambitions of AI technology come with enormous costs, from extreme demands on natural resources to huge hardware investments. Against that backdrop, Big Tech's lofty valuations look harder to justify.
If 2024 is the year of “Show Me,” we’re still waiting.
Now, with earnings season upon us, AI is once again driving a small number of giant stocks. But a recent wave of skepticism suggests the hyped returns may never materialize.
The web's trove of content, the material that powers sophisticated models to generate artificial images or churn out compelling LinkedIn posts, is itself a finite resource. Even the vastness of the internet ends somewhere.
This has sent AI companies on a mad dash for more content: scraping copyrighted works, transcribing videos into text, and even using AI-generated material as training data for AI systems.
However, as researchers have shown, relying on synthetic data reduces the quality and reliability of AI models, highlighting significant limits to the potential of advanced AI.
Researchers at Rice University compared the dangers of training generative models on synthetic material to “feeding cows the remains of other cows, including their brains,” likening the feedback loop to mad cow disease.
The explosion of AI tools has already littered the web with synthetic content, and its share of the internet keeps growing. You've probably noticed it yourself: authorless, synthetic, and ultimately useless articles that game search engine rankings, collecting clicks and a brief burst of attention when you go looking for information from a source you trust.
Of course, this means that existing AI systems are already ingesting their own output.
“This is really about the brain corrupting the brains of the future,” said study co-author Richard Baraniuk, a professor of electrical and computer engineering at Rice University.
The limit of human-generated content is just the latest example of AI running up against a seemingly insurmountable constraint. There are plenty to choose from.
Rene Haas, CEO of chip designer Arm, said earlier this week that AI models are “insatiable in terms of their thirst” for power.
“By the end of the decade, AI data centers could consume 20% to 25% of U.S. electricity demand,” Haas told The Wall Street Journal. “That’s not very sustainable.”
And these words are coming from a CEO, not a hater.
His comments echo a January report from the International Energy Agency, which found that a ChatGPT query requires nearly 10 times as much electricity as the average Google search. The agency also projected that electricity demand from the AI industry will grow at least tenfold between 2023 and 2026.
Another problem standing in the way of AI dreams is closer to home.
Tech companies are spending billions of dollars on hardware and infrastructure as they scramble to reduce their dependence on outside suppliers of AI chips. Google (GOOG, GOOGL) and Meta (META) both unveiled new in-house chips this week, underscoring just how expensive the undertaking is.
These investments are pitched as the ticket to prosperity in an AI-driven future. But the mounting expenditures, like the warnings over data and resources, push these companies closer to the moment when they'll have to prove it.
Hamza Shaban is a reporter for Yahoo Finance, covering markets and economics. Follow Hamza on Twitter @hshaban.