The performance of machine learning models is closely tied to the amount of training data and compute used to train them, and to their number of parameters. At Epoch, we’re investigating the key inputs that enable today’s AIs to reach new heights.
Our recently expanded Parameter, Compute and Data Trends database traces these details for hundreds of landmark ML systems and research papers.
In the past six months, we’ve added 240 new language models and 170 compute estimates. We will continue to maintain this dataset, updating it with more historical information and adding significant new releases. It's a valuable resource for journalists, academics, policymakers, and anyone interested in understanding the trajectory of AI.
Explore the interactive visualization, check out the documentation, and access the data for your own research at epochai.org/data/pcd.
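If you want to work with the data programmatically, here is a minimal sketch of loading a local export with pandas and summarizing training-compute trends. The filename and column names below are assumptions for illustration; check the documentation at epochai.org/data/pcd for the actual schema and download format.

```python
import pandas as pd

# Minimal sketch: load a local CSV export of the PCD database.
# "pcd_models.csv" and the column names are hypothetical.
df = pd.read_csv("pcd_models.csv", parse_dates=["Publication date"])

# Keep rows that have a training-compute estimate.
known = df.dropna(subset=["Training compute (FLOP)"])

# Largest training run per year, as a quick look at the compute trend.
yearly_max = (
    known.groupby(known["Publication date"].dt.year)["Training compute (FLOP)"]
    .max()
)
print(yearly_max)
```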