A long report on the state of AI, for all those interested in this multidimensional, rapidly evolving phenomenon that holds the sum of all fears and hopes.
Created and launched as a project of the One Hundred Year Study on AI at Stanford University (AI100), the AI Index is an open, not-for-profit project to track activity and progress in AI. It aims to facilitate an informed conversation about AI that is grounded in data. This is the inaugural annual report of the AI Index, in which we look at activity and progress in Artificial Intelligence through a range of perspectives. We aggregate data that exists freely on the web, contribute original data, and extract new metrics from combinations of data series.
All of the data used to generate this report will be openly available on the AI Index website at aiindex.org. Providing data, however, is just the beginning. To become truly useful, the AI Index needs support from a larger community. Ultimately, this report is a call for participation. You can provide data, analyze the collected data, and make a wish list of the data you think should be tracked. Whether you have answers to offer or questions to ask, we hope this report inspires you to reach out to the AI Index and become part of the effort to ground the conversation about AI.
The first half of the report showcases data aggregated by the AI Index team. This is followed by a discussion of key areas the report does not address, expert commentary on the trends displayed in the report, and a call to action to support our data collection efforts and join the conversation about measuring and communicating progress in AI technology.

Data Sections

The data in the report is broken into four primary parts:
- Volume of Activity
- Technical Performance
- Derivative Measures
- Towards Human-Level Performance?
The Volume of Activity metrics capture the “how much” aspects of the field, such as attendance at AI conferences and VC investment in startups developing AI systems. The Technical Performance metrics capture the “how good” aspects; for example, how well computers can understand images and prove mathematical theorems. The methodology used to collect each data set is detailed in the appendix.
These first two sets of data confirm what is already well recognized: all graphs are “up and to the right,” reflecting the increased activity in AI efforts as well as the progress of the technology. In the Derivative Measures section we investigate the relationship between trends. We also introduce an exploratory measure, the AI Vibrancy Index, that combines trends across academia and industry to quantify the liveliness of AI as a field.
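To make the idea of combining trends concrete, the sketch below shows one way heterogeneous yearly indicators could be normalized and averaged into a single composite score. It is a minimal illustration under assumed inputs: the indicator names, example values, base-year normalization, and equal weighting are our assumptions, not the AI Index's published methodology.

```python
# Minimal sketch: combine yearly indicators (e.g., publishing, enrollment,
# VC investment) into one composite "vibrancy"-style score.
# The example values, base-year normalization, and equal weighting are
# illustrative assumptions, not the AI Index methodology.

indicators = {
    "publishing": {2010: 100, 2013: 140, 2016: 210},   # hypothetical counts
    "enrollment": {2010: 100, 2013: 180, 2016: 350},   # hypothetical counts
    "investment": {2010: 100, 2013: 160, 2016: 400},   # hypothetical dollars
}

def normalize_to_base_year(series, base_year):
    """Scale a yearly series so its base-year value equals 1.0."""
    base = series[base_year]
    return {year: value / base for year, value in series.items()}

def composite_index(indicators, base_year=2010):
    """Average the base-year-normalized indicators, year by year."""
    normalized = {name: normalize_to_base_year(series, base_year)
                  for name, series in indicators.items()}
    years = sorted(next(iter(normalized.values())))
    return {year: sum(series[year] for series in normalized.values()) / len(normalized)
            for year in years}

if __name__ == "__main__":
    for year, score in sorted(composite_index(indicators).items()):
        print(f"{year}: {score:.2f}")
```

Equal weighting is the simplest possible choice here; any real composite index would also have to decide how to weight indicators, deflate dollar figures, and handle series that start in different years.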