The purpose of education is to show people how to learn for themselves. The other part is actually reaching that state of independence and self-reliance.
I've been saying this for a while now, and it's finally starting to play out in the real world. A couple years ago, the tech industry was running around like Chicken Little screaming that AI was going to replace 80% of developers by 2025. Companies bought into the hype hard. Massive layoffs, hiring freezes, "AI-first" mandates. The narrative was that we'd have tireless digital co-workers cranking out perfect code while the humans rode off into the sunset. Yeah... about that.
Turns out, software development isn't just typing code. It's architecture, context, long-term thinking, and knowing how and why something was built the way it was in the first place. A lot of these companies are now discovering that AI-generated code looks great in a demo but creates a maintenance nightmare in production. More bugs, more security holes, more duplicated logic, and a whole lot of "slop layer" code that works... until it doesn't. And when it breaks, guess who has to fix it? The same developers they thought they didn't need anymore.
There's also a pipeline problem brewing. Junior hiring got slashed because leadership assumed AI could handle entry-level work. But that's how seniors are made. You don't magically spawn experienced engineers out of thin air. You train them. Cut off the juniors, and five years from now you've got a talent drought. We're already seeing the early signs.
The lesson here isn't that AI is useless. It's not. It's a tool, and a powerful one when used correctly. But it's not a replacement for skilled human developers any more than a calculator replaces mathematicians. Companies that treated it like a silver bullet are now paying the price in technical debt, security risks, and re-hiring sprees.
Funny how the pendulum swings. The winners in this next phase aren't the ones who fired their engineers. They're the ones who invested in them... and gave them AI as an assistant, not a substitute.
I never thought about it like this, but you are absolutely right, and it makes total sense.
Matt Hall
@Reply 34 days ago
The same is true with skilled and semi-skilled labor. Ford's Jim Farley was complaining about the lack of skilled labor. What these companies forget is that their skilled employees used to be unskilled. It was these companies that trained people to fill their own pipeline. If Ford is so short of skilled labor, maybe they could hire unskilled people and train them.
I have watched this decline throughout my career. When I started out in the construction/maintenance/engineering field, I was sent out for factory/vendor training on a regular basis. I have watched this move from off-site training in a proper classroom, to on-site training in an ad-hoc classroom, to train-the-trainer, to lunch-and-learn, to nothing at all. Now they tell themselves that they are just hiring people "pre-trained". It is going to take an actual leader in manufacturing to "invent" the concept of training employees and the rest of the lemmings will follow. Maybe they could call it something clever, like an apprenticeship...not that I have an opinion on the matter. :)
Lisa Snider
@Reply 34 days ago
Matt So true, in nearly every industry these days. My own position became necessary when the company decided getting the most labor for the least amount of money was of the utmost importance. So they rely almost entirely on temporary workers, who receive (at best) fifteen minutes of training. My position is that of troubleshooter: unraveling the myriad mistakes made while processing our products to determine WHY the units processed do not precisely match the units for which we are billed by our vendors.
Robert Heifler
@Reply 34 days ago
I was a VBA programmer and database developer for over 40 years. The things that really solve specific company needs require creative thinking to ensure the system handles different contingencies. Each company I worked with, even within the same business niche, worked slightly differently. It was those differences that required custom application design. That being said, there are certain repetitive tasks that AI is very capable of handling. The creation of Microsoft Access objects can be assisted by this automation. But the decision of what those objects should do and how they should relate to each other needs to be controlled by someone who has found out what is really needed and wanted from that specific business or activity, and who then delivers on that specifically.
Michael Olgren
@Reply 32 days ago
Wanna be really scared? This is happening in healthcare too.
Michael yeah, I just watched a news report last night that there are several lawsuits being filed right now because AI told physicians and surgeons to do specific things, and that was wrong. Now they're being sued.
Lisa Snider
@Reply 31 days ago
Unfortunately, if physicians are taking medical advice from AI, those physicians should NOT be in practice. All those years of training, and they follow AI rather than science? They've no one but themselves to blame if things go badly for their patients.
Michael Olgren
@Reply 30 days ago
Lisa Are you therefore disagreeing with Rick et al. that AI can be a tool from which a developer, builder, etc. can take advice?
I am saying that AI does NOT merit the status of a medical textbook or a vetted medical resource. IMHO, AI gathers slop (which has a higher error rate), whereas a trusted resource will have a lower error rate. I believe this is true for any field. For example, Black's Law Dictionary will be more reliable than AI when writing a legal opinion. Thus, anyone using AI in any field uses it at their peril and should be held liable for trusting a sloppy resource.
Lisa Snider
@Reply 30 days ago
Not at all, but when it comes to some fields, AI should never be given a higher level of confidence than actual science and advice from actual physicians. Just as an honest journalist would not rely on the word of a single source when investigating an important story, no professional should rely on AI as their only source of information. AI has to be trained to spit out the facts it knows. And as the recent incident of Grok creating disgusting depictions of naked children showed, how it is trained, and by whom, will determine what it spits out as fact. Sadly, the same people who attained much of their education from the supermarket check-out lane are just as likely to take whatever AI spits out as fact, rather than doing the additional research to ensure what AI says is accurate. AI can be a useful tool, but only when used correctly as an adjunct source of information, not the primary source.
You have to take everything AI says with a grain of salt, though. I've had it tell me things about Microsoft Access, for example, that I knew were not true. Then, when I called it out and said, "Search the web or do some research on that topic," it came back with, "Oh, you're right. Thanks for calling me out on that." It's trained with a certain amount of data, but it's trained on basically the whole web, and there's a lot of garbage out there that it gets trained on. It's only when you actually make it stop and think about what it's saying that it sometimes takes a step back and goes, "You're right." You always have to verify everything it tells you.