Metaculus
Will transformer derived architectures still be state of the art for language modeling in December 2025?
Total Forecasters: 61
Community Prediction: 85%
Histogram of forecasts: median 85.0%, mean 84.0% (x-axis: 0% to 100%)
Authors: Matthew_Barnett
Opened:
Closes:
Scheduled resolution:
AI Training and Compute
AI Technical Benchmarks
Artificial Intelligence
Neuroscience
Rnn (software)
Long short-term memory
Artificial intelligence
Technology
Computer science
Software engineering
Google Scholar
News Match
The Long Road to Genuine AI Mastery (Time, Sep 12, 2024)
Do Large Language Models Have a Subconscious? (Psychology Today, Sep 12, 2024)
The Evolution of LLMs Through Real-Time Learning (Psychology Today, Sep 13, 2024)
Similar Questions
Will transformer models be the state-of-the-art on most natural language processing benchmarks on January 1, 2027? (72%)
Will transformer derived architectures accelerate progress in deep learning? (32%)
Will high-impact research on reducing the sample complexity of Large Language Model pretraining be forthcoming before 2026? (65%)