Last edited: September 16, 2025


AI and precision

tl;dr

LLMs are non-deterministic, which is a problem for enterprises that depend on precision. Simply setting a parameter like "temperature" to zero won't solve this; the issue lies in the underlying infrastructure. True precision comes from system design. To build reliable AI, you need to provide two things: context (specific brand data and guidelines) and guardrails (templates, rules, and structured workflows). This is precisely how deckd works. We use AI as a powerful tool that scales production within your pre-defined brand system, ensuring every asset is consistently on-brand, every time.

In a previous article, we explained why AI alone won't solve brand asset creation. While it's a powerful tool, it lacks the creative intent and brand context essential for a strong, consistent brand. But the challenges of implementing AI for business go even deeper than that.

The very nature of current Large Language Models (LLMs) presents a fundamental problem for businesses that rely on precision and predictability. LLMs are non-deterministic, meaning that even with the same input, the output can vary. While this might be a feature in creative brainstorming, it's a critical bug for most enterprise use cases.
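To make the point concrete, here is a minimal, self-contained Python sketch (toy tokens and probabilities, not any real model's decoding code): an LLM outputs a probability distribution over possible next words, and the decoder samples from it, so the same input can yield a different continuation on every run.

```python
import random

# Toy next-token distribution a model might assign after the prompt
# "Our new feature is" -- the tokens and probabilities are made up.
next_token_probs = {
    "powerful": 0.40,
    "simple": 0.30,
    "innovative": 0.20,
    "here": 0.10,
}

def sample_next_token(probs: dict) -> str:
    """Sampling-based decoding: draw a token according to its probability."""
    tokens, weights = zip(*probs.items())
    return random.choices(tokens, weights=weights, k=1)[0]

# Same prompt, same distribution -- yet repeated runs can differ,
# because the decoder samples instead of always picking one answer.
for run in range(1, 4):
    print(f"run {run}: Our new feature is {sample_next_token(next_token_probs)}...")
```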

The demo trap

As Eoghan McCabe (co-founder of Intercom) recently put it:

“It’s not hard to take Sonnet or GPT and wrap it in some light software. It’s extremely hard to do that with the accuracy and precision most enterprise customers require. You can make cool demos. But it’s going to fail hard in the market.”

When businesses rely on a tool, they need it to be reliable. They need to know it will produce high-quality output every single time. A non-deterministic AI that produces a brilliant result one moment and a nonsensical one the next is not a tool you can build dependable workflows or processes around.

The path to precision: Beyond temperature = 0

Many believe that making an LLM predictable, or its output reproducible, is as simple as setting a parameter like temperature to zero. The reality is far more complex. The root of the problem isn't always the model's settings but the infrastructure through which LLMs process information, as explained in a recent article from Thinking Machines AI (co-founded by the former CTO of OpenAI).
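To see why "temperature = 0" isn't a silver bullet, consider the sketch below (toy values, not any provider's actual serving code). Temperature zero means greedy decoding: always take the highest-scoring token. But those scores come from floating-point arithmetic, where the order of additions changes the result, and in a serving stack that order can shift with batch size or kernel choice, which is the infrastructure issue the Thinking Machines article describes.

```python
# Floating-point addition is not associative: summing the same numbers in a
# different order gives a slightly different result.
a, b, c = 0.1, 0.2, 0.3
print((a + b) + c)                   # 0.6000000000000001
print(a + (b + c))                   # 0.6
print((a + b) + c == a + (b + c))    # False

# Greedy decoding ("temperature = 0") just takes the highest-scoring token.
# With two near-tied candidates (made-up scores), the order of additions
# decides the winner -- the kind of shift that, in real serving stacks,
# can come from batching or kernel differences rather than your settings.
scores_run_1 = {"ship": 0.6, "launch": (a + b) + c}   # launch scores a hair higher
scores_run_2 = {"ship": 0.6, "launch": a + (b + c)}   # now the two scores tie

print(max(scores_run_1, key=scores_run_1.get))   # launch
print(max(scores_run_2, key=scores_run_2.get))   # ship (tie resolved to the first key)
```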

And this is where simple wrappers fail. Precision doesn't come from a single setting; it comes from the system's design. To achieve it, you need to create a predictable framework for an inherently unpredictable technology. You need to give the AI two things: context and guardrails.

Context is about providing the AI with the specific brand voice, product and client information, previously shared assets, or other strategic data it needs to understand the task. The more context you provide, the less the AI has to guess.
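As a rough illustration (the field and function names below are hypothetical, not deckd's actual system), providing context simply means packing your brand data into every request instead of leaving the model to guess:

```python
# Hypothetical brand context -- in practice this comes from your brand
# guidelines, asset library, and client or product data.
brand_context = {
    "voice": "confident, concise, no exclamation marks",
    "audience": "enterprise marketing teams",
    "banned_phrases": ["game-changer", "revolutionary"],
}

def build_prompt(task: str, context: dict) -> str:
    """Assemble a request that carries brand context alongside the task,
    so the model works from your data instead of guessing."""
    return "\n".join([
        "You are writing on-brand marketing copy.",
        f"Brand voice: {context['voice']}",
        f"Audience: {context['audience']}",
        f"Never use these phrases: {', '.join(context['banned_phrases'])}",
        "",
        f"Task: {task}",
    ])

prompt = build_prompt("Write a one-line headline for the Q3 product update.", brand_context)
print(prompt)  # this prompt would then go to whichever LLM provider you use
```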

Guardrails are about setting the boundaries within which the AI operates. This means using templates, constraints, and structured workflows that don't just ask the AI what to create, but show it how to create it within a predictable framework.
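And again purely as a sketch with hypothetical names, a guardrail can be as simple as a fixed template plus validation: the model may only fill predefined slots, and anything that breaks the rules is rejected before it ever reaches a slide.

```python
import re

# A fixed slide template: the AI may only fill these slots, nothing else.
TEMPLATE_SLOTS = {
    "headline": {"max_chars": 60},
    "subline": {"max_chars": 120},
}

BANNED_PHRASES = ["game-changer", "revolutionary"]

def validate(slot_values: dict) -> list:
    """Check AI-filled slots against the template's constraints."""
    errors = []
    for slot, rules in TEMPLATE_SLOTS.items():
        text = slot_values.get(slot, "")
        if not text:
            errors.append(f"{slot}: missing")
            continue
        if len(text) > rules["max_chars"]:
            errors.append(f"{slot}: {len(text)} chars exceeds {rules['max_chars']}")
        for phrase in BANNED_PHRASES:
            if re.search(re.escape(phrase), text, re.IGNORECASE):
                errors.append(f"{slot}: contains banned phrase '{phrase}'")
    return errors

# Pretend this came back from the model; in a real workflow you would
# regenerate (or escalate to a human) until validation passes.
ai_output = {
    "headline": "A revolutionary new way to build decks",
    "subline": "Scale on-brand presentations without the manual work.",
}

problems = validate(ai_output)
print(problems or "on-brand: safe to drop into the template")
```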

Choosing predictability

For enterprise teams, the key isn’t running experiments themselves. It’s choosing partners who already have. Partners who can demonstrate, not just promise, consistent results.

An AI tool that behaves like a black box is a liability. Enterprises need a solution that is transparent, predictable, and built to scale with confidence: one that operates within a structured system and combines human creativity and thinking with machine efficiency.

How deckd delivers precision

At deckd, we’ve built our system around this principle. AI doesn’t invent your brand. Your brand defines the system through templates, media libraries, and context.

Within that structure:

  • Human-led brand systems and templates with constraints set the standards.
  • AI-powered scaling adapts, repurposes, and accelerates production within those standards.

This approach turns a non-deterministic technology into a reliable business asset. Every output is fast, scalable, and, most importantly, predictably on-brand.

Leon Jacken