You Should Care About AI

Nov 7, 2025 · 4 minute read · AI · LLMs

In conversations with developers lately, I’ve been noticing a lot of discomfort when bringing up AI. After talking to people and reflecting on it myself, I think there are a couple of reasons for this:

  • Overinflation. AI is certainly overhyped by some. People have business interests in raising money or increasing the valuation of their companies, and they tend to exaggerate the capabilities of agents and language models. This makes it feel like we’re in yet another hype cycle that will pass, so why waste energy?
  • Fear. Developers see how AI automates some of the tasks they do. Combined with the overinflation, this often leads to fear that software engineers will become obsolete. We already see how new grads are struggling to find jobs.

Overinflation

I’ve been using AI daily in my work over the past couple of years — for coding, scaffolding documents, understanding stack traces, and so much more. It’s certainly not perfect, and it is certainly overhyped by some. At the same time, there are thousands of use cases that are real and can improve your productivity.

When I think of an idea, I often use Gemini CLI to prototype its first version, and from there I iterate on my project using Cursor, Gemini CLI, and manual coding. This makes the process so much more enjoyable because I don’t have to rewrite variations of trivial code that I’ve already written thousands of times. You can learn more about the development process I’ve been experimenting with in my article “Generative Development”.

We’re still far from AI delivering on the capabilities its loudest promoters describe, but we can certainly start benefiting from automating certain use cases today.

Fear

My belief is that this fear stems primarily from being unfamiliar with what’s out there, and secondarily from resistance to change.

Darkness is scary in a similar way: we don’t know what’s out there. When we shine some light, things change. It’s the same with AI - it’ll be scary until we spend some time familiarizing ourselves with the fundamental concepts and tools.

As for resistance to change, I definitely get it. I’ve loved the craftsmanship of developing software: installing Linux on your machine; configuring your dotfiles, vimrc, and plugins; thinking about beautiful ways to solve challenging problems; reducing the number of keystrokes you make by memorizing shortcuts. There’s a lot of beauty in this, and many developers, myself included, take a lot of pride in it.

The world is constantly changing, and we can explore how to achieve a similar level of craftsmanship with the new tools available to us. What’s even more exciting is that there are tons of opportunities to build such tools…

Opportunity

Opportunity is worth talking about, especially because it’s not trivial to find the right one. Many startups put too much belief in AI, and they’ll likely under-deliver.

I like software foundations and infrastructure, so I’m interested in the roles that focus on building or training models. There are researchers who spend a lot of time experimenting with different models — feeding them input and anticipating certain outputs — while barely understanding how they work internally, even though they build them. All the math and machine learning expertise involved can be overwhelming.

At the same time, there are also a lot of opportunities in the space:

  • Emerging fields of research
  • Need for experienced engineers

AI will continue changing the industry. I’m certain of this. It’s good for everyone to follow the space and understand it well. To get involved, you don’t have to have a PhD in machine learning. Context engineering will likely lead to huge advancements in the next couple of years, and the background you need to do research in this area is minimal. From there you can easily jump to agent architecture and keep peeling the layers until you get to your area of interest.

Don’t sleep on it.