A machine was fed with the equivalent of 10 million books. Then it was taught humanity.
The result? GPT-3 began solving advanced mathematics. Without anyone teaching it. It translated between programming languages. Detected invisible emotional signals. Wrote sonnets under seemingly impossible constraints.
Then GPT-4.5 arrived. On certain tests, it performed worse than its predecessor. With no obvious explanation.
As if these models had their own cognitive weather.
Between 2018 and 2022, behind the closed doors of research laboratories, something was created that we still do not fully understand. Something that develops abilities no one explicitly taught it. And sometimes loses them, without anyone knowing why.
The cost of the operation: 200 million dollars. 25,000 GPUs. 20,000 hours of human labor just to teach it basic politeness.
If their creators discover abilities only after the fact, what other surprises might still be waiting for us?