AI Growth Trends: A Comprehensive Analysis

1. Contextual Background

AI has experienced rapid growth in recent years, driven by breakthroughs in deep learning and large language models (LLMs). Since around 2012, the field has seen accelerating compute usage and model capabilities, producing systems far more powerful than those of a decade ago. In particular, the period from 2012 to 2018 saw roughly a 300,000x increase in the compute used for the largest AI training runs – an exponential trend with a 3.4-month doubling time, far outpacing Moore’s Law. This growth is underpinned by algorithmic innovation, massive training datasets, and specialized hardware such as GPUs and TPUs.
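To make these figures concrete, the back-of-the-envelope arithmetic below (an illustrative check, not part of OpenAI's original analysis) shows what a 3.4-month doubling time implies over various time spans:

```latex
% Illustrative check: growth factor over t months implied by a 3.4-month doubling time
\[
  G(t) = 2^{t/3.4}
\]
% One year:   G(12) = 2^{12/3.4} \approx 11.5   (roughly a 10x increase per year)
% 2012--2018: G(60) \approx 2 \times 10^{5} and G(66) \approx 7 \times 10^{5},
% i.e. a few hundred thousand-fold, consistent in order of magnitude with the
% reported 300,000x figure (the exact value depends on the endpoints chosen).
```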

Key developments over the past few years include the rise of foundation models (e.g., GPT-3, GPT-4, BERT, PaLM) trained on vast internet-scale data. AI systems have made major strides in natural language understanding, computer vision, robotics, and scientific discovery. For example, OpenAI’s GPT-4 (2023) demonstrated human-level performance on numerous benchmarks, scoring in the top 10% of test-takers on a simulated bar exam (GPT-3.5 scored in the bottom 10%). Similarly, DeepMind’s AlphaFold2 (2020) solved the 50-year-old protein folding challenge, predicting protein structures with atomic accuracy. These milestones illustrate AI’s crystallized intelligence: LLMs and related models act as repositories of human knowledge, excelling at tasks that draw on accumulated information such as language, facts, and patterns (The Value of Age: Crystallized Intelligence and Metcalfe’s Law – Neil Aitken – Blog).

Overall, the last few years have been transformative, with AI moving from niche applications to broad societal impact. AI now assists with coding, generates art and text, supports scientific research, and is increasingly integrated into consumer products and enterprise workflows. This background sets the stage for analyzing growth indicators and exploring whether AI is nearing a tipping point in its development trajectory.

2. Growth Indicators

To assess AI’s growth across all niches, we examine multiple indicators: compute scaling, model performance benchmarks, investment trends, research output, and breakthrough innovations.

Compute Scaling Trends

Compute power for AI has grown exponentially. OpenAI’s analysis found that, since 2012, the compute used in the largest AI training runs grew with a 3.4-month doubling time – roughly a 10x increase per year. By 2018, this translated into more than a 300,000-fold increase in training compute relative to 2012. Key drivers of this trend: