Going into 2026 I’ve been trying to figure out how I feel about 2025. Here are some very rough and incomplete thoughts.
Humans evolved to live on earth; it's unlikely anywhere else in the universe will be more hospitable. If there is a place better than earth, it's probably too far away. We should do everything we can to preserve earth, because all the alternatives are worse. Exploration and space travel should still be pursued, not to find an Earth 2 because we've ruined our home, but because we learn more about the universe, about ourselves, and about how to prevent further ruination of the earth we have.
Humans exhibit broad intelligence in the mundane; just look at how much engineering, money, and computing power it takes for a machine to write coherent sentences or drive a car. It's not purely mechanical, either: knowing how to use a pencil does not make you an author. This "everyday intelligence" is undervalued by individuals and by society.
No amount of data about me, be it writings, video recordings, brain scans, or anything else fed into a computer, will be me. It might be a high-resolution 2D photograph of me, but it will never fully replicate nor replace me. To me, the mind and body are one highly integrated system, and there is a material difference, in both form and function, between what I do and what a computer does with data about me. Perhaps this photograph of me is even conscious, but it is still not me; it's something new that may share many similarities. Similarly, reproductions of art are not the same as the original. Like the camera and the internet before it, AI is a copy machine. Invariably it will change the way we see art and ourselves.
Scaling feels like a brute-force approach that, as others have said, has diminishing returns. I'm not convinced the authors of Attention Is All You Need intended for us to bet the world economy on scaling alone, just that it can get us pretty far in a wide variety of situations. That said, a lot of interesting software and hardware innovation has happened as a result, even if we ignore the wild infrastructure build-out underway. Regardless, brute-force techniques are a good place to start, but they inevitably give way to smarter, more advanced approaches. If an LLM comes up with a novel solution to a problem, it's not due to insight, intuition, or understanding; it's due to having a load of data and effectively trying everything until something works. Not to mention that all of this rests on a corpus of human works built over millennia of hard work, creativity, and empathy.
The AI data center builders' goal is to "monetize energy," no different than it was when their data centers ran bitcoin mines. In terms of both energy and society's data, AI is an extractive industry. AI in its current form is useless without all our data, much like social media is useless unless we log in and post. Often this feels like work I perform to keep the computers and companies running rather than work the computers and companies do for me.