How much does it really cost to train AI? A lot more than you’d think
Startup News 2 months ago


Training artificial intelligence used to be pretty cheap. But these days? It's a whole different story. A recent chart from the 2025 AI Index (based on data from Epoch AI) shows just how wild the costs of building the latest AI models have gotten, ranging from a few hundred bucks to nearly $200 million.

From pocket change to millions

Back in 2017, Google trained the original Transformer model, the backbone of today’s AI tools, for just $670. Yep, that’s not a typo. Fast forward a couple of years, and Facebook's RoBERTa Large in 2019 cost around $160K. Still pricey, but manageable.

Then things took off. OpenAI’s GPT-3 (the brains behind a lot of modern AI tools) cost $4 million to train in 2020. And that was just the beginning.

The price explosion

By 2023, costs were climbing fast. GPT-4? $79 million. Google’s PaLM 2? $29 million. Even Meta’s Llama 2-70B, which was relatively “lightweight,” still came in at $3 million.

And 2024? Things got even crazier. Google’s Gemini 1.0 Ultra took the crown, costing a jaw-dropping $192 million to train. Meta wasn’t far behind with its Llama 3 (405B) at $170 million. Even Elon Musk’s xAI project, Grok-2, cost $107 million.

Let’s just say: training these models is not for anyone on a budget.
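To put the jump from $670 to $192 million in perspective, here's a quick back-of-the-envelope sketch in Python, using only the figures quoted above, that works out the implied compound annual growth rate:

```python
# Illustrative only: training-cost figures as quoted in the article,
# used to estimate how fast frontier-model training costs have grown.
costs = {
    2017: 670,            # Transformer (Google)
    2019: 160_000,        # RoBERTa Large (Facebook)
    2020: 4_000_000,      # GPT-3 (OpenAI)
    2023: 79_000_000,     # GPT-4
    2024: 192_000_000,    # Gemini 1.0 Ultra (Google)
}

first_year, last_year = min(costs), max(costs)
growth_factor = costs[last_year] / costs[first_year]
years = last_year - first_year

# Compound annual growth rate: the constant yearly multiplier that
# turns the 2017 cost into the 2024 cost over seven years.
cagr = growth_factor ** (1 / years) - 1

print(f"Total increase: ~{growth_factor:,.0f}x over {years} years")
print(f"Compound annual growth: ~{cagr:.0%} per year")
```

By these numbers, costs rose roughly 286,000-fold in seven years, which works out to around a 500% compound annual growth rate, i.e. costs multiplying about sixfold every single year.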

Why so expensive?

There are a bunch of reasons these costs are blowing up:

  • Bigger models need more computing power—tons of it.
  • Massive datasets take time and money to collect and clean.
  • Electricity and hardware (like super powerful GPUs) cost a fortune.
  • Top-tier talent doesn’t come cheap: think researchers, engineers, and data scientists.

What this means going forward

These rising costs bring up a few big questions:

  • Who can even afford this? Only huge companies with deep pockets can play at this level.
  • Will it slow down smaller AI startups? Maybe, but it might also push folks to find smarter, more efficient ways to train AI.
  • What about the environment? With so much energy being used, sustainability is becoming a bigger concern.

Final thoughts

Building powerful AI is super exciting, but also super expensive. If the trend keeps going this way, only the biggest players might be able to build the next generation of models. Unless, of course, someone figures out how to do it faster, cheaper, and greener.

Tags: PaLM, OpenAI, Llama, Grok, Google