
Will a 100 trillion parameter deep learning model be trained before 2026?

Result: Yes

When will AI outperform humans on argument reasoning tasks?

17 May 2026

How many Computation and Language e-prints will be published on arXiv over the 2020-12-14 to 2021-06-14 period?

Result: 3,938

How many Natural Language Processing e-prints will be published on arXiv over the 2021-01-14 to 2022-01-14 period?

Result: 8,066

How many Natural Language Processing e-prints will be published on arXiv over the 2021-01-14 to 2030-01-14 period?

This question is closed for forecasting. Latest community prediction: 110k

How many Natural Language Processing e-prints will be published on arXiv over the 2021-02-14 to 2023-02-14 period?

Result: 17,199

Where is the AGI Roadmap?

5 · 2 comments · AI Progress Essay Contest

AI Safety ∩ AI/DL Research

10 · no comments · AI Progress Essay Contest

Deep Learning ourselves into the unknown

3 · no comments · AI Progress Essay Contest

Will transformer models be the state-of-the-art on most natural language processing benchmarks on January 1, 2027?

Key Factor: Attention mechanisms have seen seven years of gradual innovation, showing that the transformer architecture is still evolving and improving.

90% chance
