AI Hallucinations Are Real and the Hype Is Getting Loud

11 January 2026

As a solo builder, solopreneur, entrepreneur - whatever label fits - you can’t avoid social media. It’s one of the strongest channels for organic growth.

So naturally, when I open my feed, I see a lot of posts that feel completely out of reach.
Not aspirational. Just disconnected.


Hard to fathom, because my own lived experience simply doesn’t match what’s being claimed.


As I sit here trying to integrate a payment gateway for ThaiCopilot.com, I’m doing what usually works for me: asking Claude to plan, Gemini to review, switching models, switching sessions, using whatever technique gets me unstuck.

This works for simple features.

Suddenly, it doesn’t.

And this isn’t my first payment integration. I’ve built payment systems from scratch before. Many times. I know how these systems need to be designed and implemented. I know where things go wrong. I know what questions matter.

And AI is just not good enough.

I tried multiple sessions.
Multiple models.
Different prompts.

None of them worked.


Yet I keep hearing that software engineering is dead.

That triggers me.

Because people who can still code by hand should be considered a goldmine. AI will hallucinate. It hallucinates confidently. It spits out nonsense with authority. Someone has to catch that.

Maybe it will improve.
Maybe it will reach AGI.
Maybe it will become super-intelligent and humans won’t have to do anything.

But that path also risks degrading humans as a species.

If everyone relies on AI, nobody learns. Nobody builds intuition. Nobody thinks deeply. Why bother learning when you can just ask AI?

AI as a tutor might end up having too much power.

We’re not there yet.

But we can already see early signs.


Open LinkedIn.

Everyone is going gaga over AI. Founders. Investors. Engineers. Marketers. Pick a profession, everyone has a hot take. Everyone is claiming something.

What’s really happening is this: some people are picking up AI too early, some too late, and some without the fundamentals. They feel AI can do everything, so human skill no longer matters.

AI can write a ton of content.

Does that make writers irrelevant?

What about lived experience? Does AI have 30, 40, 60 years of lived experience?

It doesn’t.

It has data. It predicts the next token. Yes, it’s inspired by the human brain. But what is it actually doing? Does anyone truly understand it?

I don’t. And I’m okay admitting that.
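
But the “predicts the next token” part is at least easy to see for yourself. Here’s a minimal sketch, assuming the Hugging Face transformers library and GPT-2 as a stand-in model - my choice for illustration, not something any of the tools I mentioned necessarily use:

```python
# Minimal sketch: what "predicting the next token" looks like in code.
# Assumes: pip install torch transformers. GPT-2 is used only because it's
# small and public; the prompt is made up for illustration.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("gpt2")
model = AutoModelForCausalLM.from_pretrained("gpt2")

prompt = "To integrate a payment gateway, the first step is"
inputs = tokenizer(prompt, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits  # shape: (batch, sequence_length, vocab_size)

# The model's entire output is a score for every token in its vocabulary.
# "Generation" is just picking one token and repeating the loop.
next_token_id = int(logits[0, -1].argmax())
print(repr(tokenizer.decode([next_token_id])))
```

That’s the machinery underneath: a probability distribution over the next token, applied over and over. The confidence, the authority - that’s how we read the output, not something the model knows it has.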

So when a “noob” like me opens LinkedIn and sees endless AI-generated content, it feels off. Some of it even feels sponsored, manufactured hype meant to attract more investment.


Sure, AI has its place. It’s too good to ignore.

For the last year and a half, I’ve used AI consistently at work. One of the great things about working at Agoda was company-wide AI adoption. Leadership believed in GenAI. Employees were empowered to explore.

AI was always a topic - town halls, tech talks, leadership sessions.

At one point, I relied on AI too much.

And I noticed something uncomfortable.

My own growth stalled.
My thinking weakened.
My problem-solving muscles softened.

Then Max gave an internal tech talk. One thing he said stuck with me. I’m paraphrasing, but the idea was clear: research shows that over-reliance on AI reduces cognitive capability.

AI can make you lazy.

But if you use AI as a guide, as a thinking partner, as something that sharpens your reasoning instead of replacing it, it can actually improve your cognition. It can be a great force multiplier.

We should not be in the first camp.

We should be in the second.

Max helped me validate my thought process, and from there it was an upward trajectory.


Interestingly, the second camp isn’t loud. They aren’t constantly posting on social media sites like LinkedIn. They aren’t flooding everyone’s feed with AI-generated posts.

They’re building.

I think I fall into that camp.

That’s why I want to write about this. Not to preach. Not to predict the future. Just to document what I’m seeing, in real time, from the trenches.

AI is powerful.

But it is not a replacement for experience, judgment, or deep work.
At least not yet.

And pretending otherwise is dangerous.


What do you think?

What’s been your experience from the trenches?




All comments and opinions are welcome.
