I watched an interesting video last night about artificial intelligence, and one point in particular stuck with me. The argument was that AI may have already consumed most of the useful human knowledge available online. Books, articles, code repositories, forums, research papers, social media posts, Wikipedia, technical manuals, Stack Overflow answers, cat memes, conspiracy theories, recipes for banana bread during the pandemic... all of it. The machines have basically eaten the internet. And now the question is: what happens when there's nothing substantial left to feed them?
That got me thinking about how we humans learn. When I was younger, every new computer book I picked up felt like discovering new alien technology. You'd learn one new command in DOS or one clever VBA trick in Access and suddenly your whole mental map expanded. But eventually you hit a point where new information doesn't come as fast anymore. You stop making giant leaps and start making tiny refinements. You already know the foundations. You're just optimizing around the edges. Experience becomes more valuable than raw information intake. (1)
AI seems to be hitting a similar wall. The early models improved dramatically because they were vacuuming up gigantic amounts of human-created data. Every iteration felt smarter. Like watching Commander Data slowly become more human over the course of The Next Generation. But even Data had limits. Dr. Soong could only program him with the knowledge and experiences available at the time. Data grew afterward through interaction, observation, and experience. He didn't magically become omniscient just because he had a positronic brain.
That's where things get interesting. The next leap for AI probably won't come from simply throwing more text at it. There's only so much left to scrape. A lot of the remaining content online is either repetitive garbage, AI-generated slop, misinformation, or twenty-seven copies of the same article rewritten for SEO. At some point, feeding AI more internet content is like trying to improve your diet by eating bigger portions of cotton candy. You're consuming volume without adding much nutritional value.
As a programmer, this actually makes perfect sense to me. More data doesn't always mean better output. I've seen databases with fifty duplicate tables, six copies of customer records, and enough redundant queries to make Geordi cry into his VISOR. At some point, the issue isn't quantity anymore. It's quality, structure, and context. You don't improve a database by dumping random junk into it. You improve it by refining the design.
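That "quality over quantity" point is easy to demonstrate. Here's a minimal sketch of deduplicating a table, using Python's built-in sqlite3 instead of Access (the Customers table and its columns are made up for illustration), but the same GROUP BY / HAVING idea works in Access SQL or SQL Server:

```python
import sqlite3

# In-memory database with a hypothetical Customers table containing a duplicate.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE Customers (ID INTEGER, Name TEXT, Email TEXT)")
conn.executemany(
    "INSERT INTO Customers VALUES (?, ?, ?)",
    [
        (1, "Jean-Luc", "picard@example.com"),
        (2, "Geordi", "laforge@example.com"),
        (3, "Jean-Luc", "picard@example.com"),  # duplicate of row 1
    ],
)

# Find duplicates: any Name/Email pair that appears more than once.
dupes = conn.execute(
    """
    SELECT Name, Email, COUNT(*) AS copies
    FROM Customers
    GROUP BY Name, Email
    HAVING COUNT(*) > 1
    """
).fetchall()
print(dupes)  # [('Jean-Luc', 'picard@example.com', 2)]

# Clean up: keep only the lowest ID for each Name/Email pair.
conn.execute(
    """
    DELETE FROM Customers
    WHERE ID NOT IN (SELECT MIN(ID) FROM Customers GROUP BY Name, Email)
    """
)
```

Nothing clever there, and that's the point: the database gets better because junk was removed and the design rule ("one row per customer") was enforced, not because more rows were poured in.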
I think that's where AI companies are now. They've already built warp engines. Now they're discovering that warp engines alone don't solve every problem. The next breakthroughs may come from better reasoning models, memory systems, contextual awareness, or entirely new architectures we haven't thought of yet. Maybe AI needs experiences instead of just information. Maybe it needs long-term interaction with the world. Maybe we're still in the "room-sized computer with blinking lights" phase of this whole thing and don't even realize it yet.
Of course, there is also a slightly hilarious irony here. Humans created AI by feeding it human knowledge, and now AI companies are quietly realizing that humans may not actually produce enough high-quality knowledge fast enough to sustain exponential growth forever. Considering how much of the internet is arguments about politics, celebrity gossip, and whether pineapple belongs on pizza, maybe we've reached the practical limits of the dataset.
I still think AI is going to transform society in ways we're only beginning to understand. I use it every day now for coding help, brainstorming, customer service drafts, image generation, and organizing ideas. It's incredibly useful at speeding up things I already know how to do but just want an assistant to help with. But I also think we're moving out of the explosive "holy crap" phase and into the slower engineering phase where progress becomes incremental instead of magical. That's normal. That's how most technologies mature.
The funny thing is, this might actually be good news. Slower progress gives humanity time to adapt. Historically, humans are terrible at adapting to rapid technological change. We still haven't figured out how to handle social media responsibly, and we've had that for over twenty years now. Maybe a little breathing room isn't the worst thing.
What do you think? Have we reached the first real ceiling for AI, or is this just another temporary plateau before the next big breakthrough?
End Notes

(1) Although I will say now that I'm excited to learn things that are Access and VBA adjacent: what you can do with Access using other tools, like PowerShell, Python scripts, or AI that works alongside it. That's one of the reasons I'm excited to dive into SQL Server as well. Even though I've been using Access myself for decades, I'm still discovering new things with it almost every day.