Artificial intelligence is not just getting smarter and faster; it's also getting cheaper. Much cheaper. Nearly 100% cheaper in just the last two years. From more powerful chips and more efficient storage to advances in AI research that reduce computational load, the cost of a byte has never been lower.

OpenAI's latest chatbot, GPT-4o, is faster than its previous model and half the price. The company estimates that the cost of accessing its large language model (LLM) has declined 99% in the last two years.1 Similarly, Google's latest Gemini model is more powerful than its previous model and also half the price.1

Determining the cost of running an AI model is complicated and involves a lot of behind-the-scenes players. Companies like OpenAI and Google price their models by how much it costs to process tokens. A token is the smallest unit of text a model processes, typically a word or a piece of a word.2 At the end of 2024, GPT-4o cost $4 per million tokens. In March 2023, GPT-4 cost $36 per million tokens. That's a price reduction of nearly 90% in less than two years.3
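To give a rough sense of what those rates mean in practice, the back-of-the-envelope arithmetic below uses only the prices cited above; the 10,000-token job size is a made-up example, not a benchmark.

```python
# Back-of-the-envelope token-cost math, using only the prices cited above.
GPT4_MARCH_2023 = 36.00  # USD per million tokens
GPT4O_LATE_2024 = 4.00   # USD per million tokens

def cost(tokens: int, price_per_million_tokens: float) -> float:
    """Cost in USD to process a given number of tokens."""
    return tokens / 1_000_000 * price_per_million_tokens

# Example: a 10,000-token job at each price point.
print(f"March 2023: ${cost(10_000, GPT4_MARCH_2023):.2f}")  # $0.36
print(f"Late 2024:  ${cost(10_000, GPT4O_LATE_2024):.2f}")  # $0.04
print(f"Price reduction: {1 - GPT4O_LATE_2024 / GPT4_MARCH_2023:.0%}")  # 89%
```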

But running an AI model also requires storage and computational power, both of which have gotten significantly cheaper over time. From 2000 to 2023, the cost of storing one terabyte of data dropped from $1 million to $1,000, yet another nearly 100% decrease.4

The computational performance of advanced AI chips has also become drastically cheaper. Price-performance is measured in floating-point operations per second (FLOPS) per US dollar. Nvidia's best chip in 2008 delivered 0.96 gigaFLOPS per dollar. Its best chip in 2022 delivered 51.66 gigaFLOPS per dollar. That's a 5,281% increase in 14 years.5
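That percentage is simply the ratio of the two price-performance figures cited above; a quick sanity check:

```python
# Sanity check of the price-performance improvement cited above
# (gigaFLOPS per US dollar, from Our World in Data's GPU series).
flops_per_dollar_2008 = 0.96
flops_per_dollar_2022 = 51.66

ratio = flops_per_dollar_2022 / flops_per_dollar_2008
print(f"{ratio:.1f}x more compute per dollar "
      f"({(ratio - 1) * 100:,.0f}% increase)")
# -> 53.8x more compute per dollar (5,281% increase)
```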

Anthropic has also been slashing its AI costs and attributes the gains to research in sparsity and quantization. Sparsity techniques prune as much unnecessary data from a model as possible without compromising accuracy, which reduces computational load and storage needs.6 Quantization achieves a similar goal by reducing numerical precision, essentially rounding the numbers a model stores and computes with, without compromising accuracy.7
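To make the jargon a bit more concrete, here is a toy NumPy sketch of both ideas. It illustrates the general techniques only; the pruning threshold and the simple 8-bit scheme are illustrative assumptions, not any particular lab's implementation.

```python
import numpy as np

# Toy illustration of sparsity and quantization on a fake weight vector.
rng = np.random.default_rng(0)
weights = rng.standard_normal(1_000).astype(np.float32)

# --- Sparsity: zero out (prune) the smallest-magnitude weights --------------
threshold = np.quantile(np.abs(weights), 0.5)           # prune the bottom 50%
sparse = np.where(np.abs(weights) < threshold, 0.0, weights)
print(f"Nonzero weights: {np.count_nonzero(sparse)} of {weights.size}")

# --- Quantization: store 32-bit floats as 8-bit ints plus a scale factor ----
scale = np.abs(weights).max() / 127.0
quantized = np.round(weights / scale).astype(np.int8)   # 1 byte per weight
dequantized = quantized.astype(np.float32) * scale      # recovered at runtime
print(f"Storage: {weights.nbytes} bytes -> {quantized.nbytes} bytes")
print(f"Max rounding error: {np.abs(weights - dequantized).max():.4f}")
```

Either way, the model needs fewer bytes to store and fewer operations to run, which is where the cost savings come from.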

Technical jargon aside … cheaper AI ultimately means greater possibilities. As the cost of a byte continues to decline, the potential for companies operating in the AI and technology space will continue to rise. TrueShares Technology, AI, and Deep Learning ETF (LRNZ) operates at the forefront of this space with a concentrated, actively managed portfolio of category killers, from companies using AI to make lifesaving discoveries in the biomedical industry to leaders in chip-making, data storage, cloud computing, and cybersecurity. LRNZ is an accessible entry point into an industry that is only just getting started.

For more information on LRNZ, including current holdings, visit: www.true-shares.com/lrnz/

  1. https://www.deccanherald.com/opinion/ai-is-getting-cheaper-that-wont-fix-everything-3137255 
  2. https://help.openai.com/en/articles/4936856-what-are-tokens-and-how-to-count-them 
  3. https://www.deeplearning.ai/the-batch/falling-llm-token-prices-and-what-they-mean-for-ai-companies/ 
  4. https://ourworldindata.org/grapher/historical-cost-of-computer-memory-and-storage?time=earliest..2023
  5. https://ourworldindata.org/grapher/gpu-price-performance
  6. https://blogs.nvidia.com/blog/sparsity-ai-inference/ 
  7. https://huggingface.co/docs/optimum/concept_guides/quantization 
  8. https://www.semafor.com/article/12/04/2024/this-is-amazing-how-samsara-unglamorous-ai-could-drive-productivity-gains