Training artificial intelligence used to be pretty cheap. But these days? It's a whole different story. A recent chart from the 2025 AI Index (based on data from Epoch AI) shows just how steep the cost of building the latest AI models has become, ranging from a few hundred bucks to nearly $200 million.
From pocket change to millions
Back in 2017, Google trained the original Transformer model, the backbone of today’s AI tools, for just $670. Yep, that’s not a typo. Fast forward a couple of years, and Facebook's RoBERTa Large in 2019 cost around $160K. Still pricey, but manageable.
Then things took off. OpenAI’s GPT-3 (the brains behind a lot of modern AI tools) cost $4 million to train in 2020. And that was just the beginning.
The price explosion
By 2023, costs were climbing fast. GPT-4? $79 million. Google’s PaLM 2? $29 million. Even Meta’s Llama 2-70B, which was relatively “lightweight,” still came in at $3 million.
And 2024? Things got even crazier. Google’s Gemini 1.0 Ultra takes the crown, costing a jaw-dropping $192 million to train. Meta wasn’t far behind with its Llama 3 (405B) at $170 million. Even Elon Musk’s xAI project, Grok-2, cost $107 million.
Let's just say that training these models is not for anyone on a budget.
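To put the trend in perspective, here's a rough back-of-the-envelope calculation using only the figures cited above: the original Transformer ($670 in 2017) versus Gemini 1.0 Ultra ($192 million in 2024).

```python
# Back-of-the-envelope growth rate of training costs,
# using the two endpoint figures cited in this article.
transformer_2017 = 670            # USD, original Transformer (Google, 2017)
gemini_ultra_2024 = 192_000_000   # USD, Gemini 1.0 Ultra (Google, 2024)

years = 2024 - 2017
total_multiple = gemini_ultra_2024 / transformer_2017
annual_multiple = total_multiple ** (1 / years)

print(f"Total increase: ~{total_multiple:,.0f}x over {years} years")
print(f"That works out to roughly {annual_multiple:.1f}x per year")
```

In other words, by these numbers, headline training costs grew by a factor of almost 300,000 in seven years, or roughly sixfold every year.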
Why so expensive?
There are a few big drivers behind these ballooning costs:
- Scale: frontier models pack hundreds of billions of parameters and train on ever-larger datasets, which means far more compute.
- Hardware: all that compute runs on tens of thousands of cutting-edge accelerators (GPUs and TPUs), which are expensive and in high demand.
- Energy: training runs last weeks or months and burn through enormous amounts of electricity.
- People: the research and engineering teams behind these models don't come cheap either.
What this means going forward
These rising costs raise a few big questions:
- Will only a handful of tech giants be able to afford training frontier models?
- What happens to academic labs and open-source projects that can't spend hundreds of millions?
- And what about the environmental footprint of all that compute?
Final thoughts
Building powerful AI is super exciting, but also super expensive. If the trend keeps going this way, only the biggest players might be able to build the next generation of models. Unless, of course, someone figures out how to do it faster, cheaper, and greener.