About two and a half years ago, I wrote about the AI make-or-buy decision. The core idea was simple: treat it as a continuum, not a binary choice. That said, most companies should lean heavily toward buying, especially when getting started.

That core advice still holds in 2026. But the landscape underneath it has shifted so much that the decision continuum deserves a second look.

The short version: Buy the technology, then build your process around it. Only build more when really needed.

Let me explain.

Why "Make" Keeps Getting Harder

In 2023, "building AI" could mean many things – training a model, racking up infrastructure, serving a RAG system, developing an internal ChatGPT clone. Some of those were reasonable choices back then.

In 2026, the bar for building your own AI is higher than it's ever been. Not lower. Higher.

The cost of training frontier AI models has grown by roughly 2 to 3x per year, and the largest training runs are expected to cost over a billion dollars by 2027. The cost of compute, the pace of foundation model improvement, and the talent required to do it well have all moved in the same direction: up. Unless you're a well-funded research lab, investing in this foundational area is almost never justified.

Ben Cottier, Robi Rahman, Loredana Fattorini, Nestor Maslej, and David Owen. "The rising costs of training frontier AI models." arXiv:2405.21015 [cs.CY], 2024. https://arxiv.org/abs/2405.21015.

Looking at the infrastructure required to run these AI models, it doesn't get much better. GPUs are depreciating before you can say hardware acceleration. And storage prices are through the roof. (Congrats to everyone who bought SanDisk.)

What's becoming clear: as AI gets more powerful, the case for building your own large-scale systems from scratch gets weaker, not stronger. Every improvement in foundation models makes the "buy" side more capable and the "make" side comparatively more expensive – basic economics.

That's why, even today, the most ambitious "make" approach I see with my clients is building on top of managed services from existing (cloud) platforms like Azure, which essentially charge for access to their systems on a pay-per-use basis. (E.g., use Azure AI Search instead of building your own RAG system from scratch.)
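To make that gap concrete, here is a toy sketch of just the smallest piece of a hand-rolled retrieval layer: naive keyword scoring over an in-memory corpus. Everything here is illustrative, not production code; a managed service like Azure AI Search replaces all of it (plus real ranking, filters, and scaling) with an API call.

```python
# Toy retrieval layer: the smallest piece of a hand-rolled RAG system.
# Illustrative only -- a managed search service replaces this (plus real
# ranking, filtering, and scaling) with a pay-per-use API call.

def tokenize(text: str) -> set[str]:
    return {w.strip(".,!?").lower() for w in text.split()}

def retrieve(query: str, docs: list[str], top_k: int = 2) -> list[str]:
    """Rank documents by naive keyword overlap with the query."""
    q = tokenize(query)
    scored = sorted(docs, key=lambda d: len(q & tokenize(d)), reverse=True)
    return scored[:top_k]

docs = [
    "Invoices are approved by the finance team within 5 days.",
    "Support tickets are triaged by severity every morning.",
    "New vendors must pass a security review before onboarding.",
]
context = retrieve("How are invoices approved?", docs, top_k=1)
print(context[0])
```

And this toy version still has no ingestion pipeline, no vector search, no access control – each of which you would also have to build and maintain yourself.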

"But what about small language models?" and open source on-prem deployments?

Fair point. Fine-tuning an SLM for a specific, high-volume task can make sense. But even that path starts with buying. You need a large model first: to validate the use case, to generate training data, to set the quality bar. You buy before you build. The same goes for open-source on-prem deployments: you wouldn't buy a local AI workstation unless you had a proven use case (proven in the sense that you actually tried it out).

The sequence is always: general-purpose model → proven value → then maybe own and specialize.

The path to "make" runs through "buy" now. Every time.

The Real Moat: Data, Process, and Organizational Readiness

So if the technology is commoditized, what's left to compete on?

I believe it’s the following three things:

  1. Your data

  2. Your processes

  3. Your organizational readiness

Tools like Power Automate, n8n, or even vibe-coded prototypes let you assemble AI solutions without building the underlying technology. You literally stand on the shoulders of giants. The platforms are all there and easily accessible. It’s hard to imagine you could build better building blocks than the ones available.

But what those platforms can't do for you is feed themselves the right context, at the right time.

The best LLM in the world is useless if it can't access your data. If your processes aren't documented. If nobody knows what "good output" looks like for your specific business.

Here’s what you actually need to build because no one can build it for you:

  • Data readiness. Can your AI access the context it needs? Can your systems talk to each other?

  • Process clarity. Do you know what the workflow looks like end-to-end? Not the AI workflow – the business workflow that AI is supposed to improve. If you can't draw it on a whiteboard, AI can't automate it.

  • Observability. Can you see what the AI is doing? Are you tracking outputs, accuracy, costs? Most companies deploy AI and then never look at it again. That's hopeful deployment, not automation.

  • Governance. Who reviews outputs? Who owns the process? At what point do you pull the plug? If the answer to any of these is "we'll figure it out later," you'll never move past the prototype stage.
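The observability point in particular is cheap to start. Even a thin wrapper that records every call's input, output, latency, and a rough cost estimate already puts you ahead of hopeful deployment. A minimal sketch, where the model function, the token estimate, and the per-token price are all stand-in assumptions, not real vendor figures:

```python
import time

# Minimal observability wrapper: log input, output, latency, and a rough
# cost estimate for every AI call. The model function and the price are
# stand-ins -- swap in your real client and your vendor's actual pricing.

COST_PER_1K_TOKENS = 0.002  # assumed price, not a real quote

call_log: list[dict] = []

def fake_model(prompt: str) -> str:
    return f"Summary of: {prompt[:20]}"

def tracked_call(prompt: str) -> str:
    start = time.perf_counter()
    output = fake_model(prompt)
    tokens = (len(prompt) + len(output)) // 4  # crude token estimate
    call_log.append({
        "prompt": prompt,
        "output": output,
        "latency_s": round(time.perf_counter() - start, 4),
        "est_cost_usd": tokens / 1000 * COST_PER_1K_TOKENS,
    })
    return output

tracked_call("Summarize this customer complaint about late delivery.")
print(len(call_log))
```

From there, dashboards and accuracy spot checks are an incremental step rather than a separate project.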

“Everything is a wrapper” they say. But who’s building your organizational wrapper around AI?

You can't buy organizational readiness. Not from a vendor, not from a consultant. Consultants can help (shameless plug), but eventually YOU have to build it. And that's where your time and budget should actually go: not into rebuilding tech that someone else has already built 10x cheaper.

Three Buying Modes

In 2023, I mapped four scenarios based on strategic value and development stage. That was useful then, but the market has since simplified the decision even further.

Here's how I think about AI Make or Buy now.

Mode 1: AI Features From Your Existing Vendors → Buy, Generally

Your CRM adds AI-powered summaries. Your email tool gets smart sorting. Your project management software starts suggesting task assignments.

Turn it on. But don't run a project around it. Don't form a committee. Don't write a business case. Just activate it for a group of test users and see what happens. Buy more licenses as needed.

This is Copilot territory ("Productivity AI"). It covers 99% of what most employees will interact with as "AI" in their daily work. The vendor handles the model, the integration, and the updates. Your job is to flip the switch and let people use it. Your competitors will likely do the same.

The only mistake here is overthinking it.

Mode 2: General-Purpose Platforms → Buy the Platform, Build Your Workflows

This is where it gets interesting. Tools like ChatGPT, Claude, n8n, Make, or Power Automate give you the building blocks. You assemble the workflow.

This is the mode where most real business value gets created right now. Not from the platform itself, but from what you build on top of it — your “wrapper”, essentially. The specific combination of your data, process logic, and business rules running on a general-purpose engine.

My advice: don’t buy specialized point solutions unless there is absolutely no other way. Don't bet on the vertical AI tool that promises to solve your exact industry problem out of the box.

Why? Because these tools are built on the same foundation models you can access directly. And those foundation models improve every few months. The specialized tool that impressed you in January is fighting to stay relevant by June. Even poster-child AI startups like Harvey are seeing headwinds. Whatever vendor solution you're using – if they can't innovate faster than the platforms they're built on, they'll be gone quickly (or worse – remain a zombie app with you locked in as a customer). Remember, the real leverage in niche applications comes from data, process knowledge, and your organization's ability to execute. The underlying tech is pretty much the same everywhere.

So instead, buy the building blocks and build your specific solution with them. When the underlying building blocks (e.g. AI model) improve, your solution improves with it. When you need to change the workflow, you change it — you don't file a feature request with a vendor and wait.
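That upgrade path works best when the model is a swappable parameter of your workflow rather than baked into it. A toy illustration, where both "models" are stand-ins for real API clients:

```python
from typing import Callable

# Toy illustration: keep the model as a swappable building block so the
# workflow (your data, rules, and process knowledge) survives model
# upgrades. Both "models" here are stand-ins for real API clients.

def model_v1(prompt: str) -> str:
    return prompt.upper()          # stand-in for last year's model

def model_v2(prompt: str) -> str:
    return prompt.upper() + "!"    # stand-in for the improved model

def triage_workflow(ticket: str, model: Callable[[str], str]) -> str:
    """Your business logic stays fixed; only the engine changes."""
    prompt = f"classify: {ticket}"  # your process knowledge lives here
    return model(prompt)

# Swapping the engine is a one-line change, not a vendor feature request.
print(triage_workflow("printer on fire", model_v1))
print(triage_workflow("printer on fire", model_v2))
```

The workflow function is the "wrapper" that belongs to you; the model underneath is rented and replaceable.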

The moat isn't the platform you chose but what you chose to do with it.

Mode 3: Specialized Tools → Buy When the Tech Gap Is Real

There's one exception to the "don't buy point solutions" rule, and it's quite narrow.

Some capabilities are genuinely hard to replicate with general-purpose platforms. Right now, that's mostly in audio and video: tools like ElevenLabs for voice synthesis, Descript or OpusClip for video editing, Synthesia or HeyGen for virtual avatars. These tools do things that no amount of prompt engineering on a general-purpose model can match.

Plus, running AI audio and video processing is significantly more expensive than text-based applications. That's why it's hard to imagine you'll get these capabilities out of the box for $30/month from the big labs anytime soon.

But I expect this category to get attacked fast. Audio generation for video was a specialized-tool-only capability until Google released Veo 3 with native audio support. Today's exception might be tomorrow's built-in feature served using the same API you’re already using.

Buy these tools, but don't build your business around them.

Conclusion

In 2023, the question was "should I make or buy my AI solution?"

In 2026, the question is: “Which tech should I buy so I can build on top of it fast and future-proof?”

In an age where everyone can rent the same intelligence, the companies pulling ahead are the ones who built the systems and capabilities that know what to do with it.

Your very own “wrapper” is your real moat.

So go ahead and map your processes, assign ownership, get the data ready, build governance – and ship without looking back.

That’s the real AI investment – the one that actually compounds.

See you next Saturday,
Tobias

Two and a half years ago, I laid out the original framework. Here's the original 2023 version if you want to see what changed.
