Major shifts at OpenAI spark skepticism about impending AGI timelines


volcano.authors

Smack-Fu Master, in training
78
Having followed AI off and on since the Prolog/Society of Mind days, I've never understood how a scaled-up LLM is supposed to make the leap to AGI. Now, I'm not an AI researcher or scientist, but perhaps the answer is "it's not".
GPT-3.5 (the model behind ChatGPT) ≈ 175 billion parameters, according to publicly reported figures

Different studies give slightly varying numbers for the human brain, but it's roughly 500 to 850 times more: 0.1 to 0.15 quadrillion (100 to 150 trillion) synaptic connections. Source: https://www.scientificamerican.com/article/100-trillion-connections/ (among others)

While it would likely take more than just scaling up the model size, I thought this gives some sense of the scale involved. I agree with you: perhaps the answer is that it's not scaling.
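
For what it's worth, here's the back-of-the-envelope arithmetic behind that ratio, using the reported figures above. It's a rough sketch only: a synapse is not equivalent to a model parameter, so treat this as an order-of-magnitude comparison at best.

```python
# Crude order-of-magnitude comparison: reported GPT-3.5 parameter count
# vs. estimated synaptic connections in a human brain. A synapse is not
# equivalent to a model parameter; this only illustrates the size gap.

GPT_35_PARAMS = 175e9    # ~175 billion parameters (publicly reported)
SYNAPSES_LOW = 100e12    # ~0.1 quadrillion synaptic connections
SYNAPSES_HIGH = 150e12   # ~0.15 quadrillion (upper estimate)

print(f"low estimate:  {SYNAPSES_LOW / GPT_35_PARAMS:.0f}x")
print(f"high estimate: {SYNAPSES_HIGH / GPT_35_PARAMS:.0f}x")

# low estimate:  571x
# high estimate: 857x
```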
 