How to Generate Real Business Value with Digital Workers?

Over the past two years, the agenda around generative AI has become clearer. Today, concepts like “agent” and “digital worker” are no longer just buzzwords; the conversation has shifted to how these technologies can be embedded into organizational operations, how they can be scaled, and most importantly, how they can be turned into measurable business value. In short, our focus is clear: not a tech showcase, but producing results.

McKinsey’s The State of AI in 2025 report provides concrete data on this issue. According to the report, only a small portion of organizations are reaching a level where they can generate high value by applying generative AI across end-to-end processes. This group, which constitutes just 6%, reports that AI can create meaningful financial impact at the organizational level. This tells us that success is not just about “choosing the right model,” but about embedding and scaling the technology in the right way.

This ratio aligns with the findings we shared last year in our content titled “MIT Explained: Why Only 5% of GenAI Projects Generate Millions in Value.” Two different sources point to the same truth: significant impact doesn’t come from well-intentioned experiments, but from well-designed workflows, clear KPIs, and sustainable ownership.

At CBOT, we approach digital workers within this framework: not as tools that automate isolated tasks, but as systems that take ownership of specific workflows, smartly hand off to humans when needed, and deliver clearly measurable outcomes. In this article, we explore the question “How to Generate Real Business Value with Digital Workers?” by combining McKinsey’s 2025 findings with on-the-ground experience.

Enjoy the read,

The issue isn’t “Do we have AI?”; it’s “How are we embedding AI into our work?”

When generative AI is discussed in organizations, two statements often follow one another: “We’re using it too,” and “But it hasn’t fully settled in yet.” McKinsey’s The State of AI in 2025 report translates this sentiment into numbers: 88% of organizations use AI in at least one business function (up from 78% last year), but most are still at the pilot or early stage when it comes to scaling. So the real issue is not “Do we have AI?”; it’s “How are we embedding AI into our work?”

Now we come to the truly critical point: McKinsey shows that the organizations achieving the highest returns are not merely experimenting more — they are systematically applying specific practices that enable value creation. The report highlights six key areas: Strategy, Technology, Adoption & Scaling, Operating Model, Talent, and Data.

These six areas all revolve around a single question: Do we want the digital worker to simply “operate,” or do we want to make it the “owner of the work” and drive real outcomes?

This is exactly where we see the distinction on the ground at CBOT. In the first approach, the digital worker remains a “smart tool” positioned next to a team. It delivers impressive demos and speeds up a few tasks, but because value creation depends on individual initiative and isolated usage, scaling becomes difficult. In the second approach, the digital worker becomes a natural part of the workflow: it is clear when it will be triggered, what decisions it will make and on which data, when it will hand over to a human, and which KPI its output will be measured by. What we call “value” stems from this clarity.
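To make the second approach concrete, here is a minimal sketch of the four definitions that turn a digital worker from a side tool into the owner of a workflow. All names and values are illustrative assumptions, not CBOT's actual product API:

```python
from dataclasses import dataclass

# Hypothetical sketch: the four things that must be defined up front.
# Field names and the example values are illustrative only.
@dataclass
class DigitalWorkerSpec:
    trigger: str             # when the worker is triggered
    data_sources: list[str]  # which data its decisions are based on
    handoff_rule: str        # when the case is handed over to a human
    kpi: str                 # which KPI its output is measured by

spec = DigitalWorkerSpec(
    trigger="new_invoice_received",
    data_sources=["erp.invoices", "crm.vendors"],
    handoff_rule="confidence < 0.9 or amount > 10_000",
    kpi="invoice_processing_time_minutes",
)
```

The point of writing these four fields down explicitly is that scaling stops depending on individual initiative: anyone reading the spec knows where the worker's mandate begins and ends.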

That’s why the practices McKinsey groups under six headings shouldn’t be read as a checklist, but as components of a single system. Strategy clarifies the question “what and why are we automating?” Technology establishes a secure foundation for the digital worker to operate. Adoption and scaling embed the solution into the process — meaning the work is redesigned. The operating model treats this not as a project but as a product, evolving through rapid iterations. Talent ensures ownership is sustained both technically and on the business side. Data allows the digital worker to operate not based on “guesswork” but on institutional truth.

In short, high return doesn’t come from writing better prompts; it comes from better workflow design and stronger operational discipline.

Let’s start reading the chart from the top. McKinsey is doing two things simultaneously here: first, it shows which practices are most commonly implemented by organizations with the highest returns (Highest Prevalence). Second, it identifies which of these practices are more effective in explaining high performance (Relative Importance). In other words, “Who’s doing what?” and “What really makes a difference?” are both on the same page.

The macro picture is this: organizations with high returns treat AI not as a tech project but as a way of working. That’s why you won’t see items like “they picked this model” or “they implemented this use case” on the list. Instead, it highlights often-overlooked but growth-driving muscles like strategy, process, governance, delivery model, data, and talent. On the ground, the same pattern shows up in value-generating projects: model excitement is short-lived; lasting impact comes from processes and ownership.

Now let’s zoom in. The first row under Highest Prevalence is “Human in the loop.” This is no coincidence. In high-performing organizations, it’s clearly defined from the start where the model output will be subject to human oversight and who will have decision authority. Once this definition is made, two things happen: operations feel “safe,” and the solution can expand more broadly. Without it, every team gets stuck on the same question: “Who’s responsible here?” When this goes unanswered, scaling slows down — often unintentionally.
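As a minimal sketch of what “defined from the start” means for human-in-the-loop: the routing rule below fixes, in advance, which outputs the worker may act on itself and who has decision authority otherwise. The thresholds and role names are assumptions for illustration:

```python
# Illustrative human-in-the-loop gate. Thresholds and the role names
# ("operations", "supervisor") are hypothetical, not from the report.
def route(output: dict) -> str:
    """Return who acts on a model output: the worker or a named human role."""
    if output["confidence"] >= 0.95 and not output["flags"]:
        return "auto_execute"             # within the worker's mandate
    if output["confidence"] >= 0.70:
        return "human_review:operations"  # operations team approves
    return "human_review:supervisor"      # low confidence escalates higher

print(route({"confidence": 0.98, "flags": []}))  # auto_execute
print(route({"confidence": 0.80, "flags": []}))  # human_review:operations
```

Because the answer to “who is responsible here?” is encoded in the rule rather than negotiated case by case, teams can expand usage without re-opening the question each time.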

The second strong signal is the technology infrastructure. The chart tells us that the top performers don’t stop at “embedding AI into a single application”; they build an architecture to support it. In practice, this means identity and access management, logging, integration layers, cost tracking, security, and versioning. If you want to position the digital worker in a real business role, this operational foundation is a must. Without it, the solution behaves like a pilot, not like a production system.
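A small sketch of what that operational foundation looks like in practice: every step a digital worker takes carries an identity, an audit log entry, a cost figure, and a correlation id. The wrapper below is a hypothetical illustration of this plumbing, not a real framework:

```python
import logging
import time
import uuid

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("digital_worker")

# Hypothetical sketch of production plumbing around a worker's step:
# identity (actor), audit logging, cost tracking, and a run id that
# lets logs be correlated across integrated systems.
def run_step(actor: str, action: str, cost_usd: float, fn, *args):
    run_id = uuid.uuid4().hex[:8]
    start = time.time()
    result = fn(*args)
    log.info("run=%s actor=%s action=%s cost=%.4f elapsed=%.3fs",
             run_id, actor, action, cost_usd, time.time() - start)
    return result

total = run_step("worker:invoices", "sum_line_items", 0.0021, sum, [1, 2, 3])
```

None of this is visible in a demo, which is exactly why pilots that skip it behave like pilots forever.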

The third group of rows falls under strategy: a clear roadmap and leadership alignment around value creation. The key nuance here is that the roadmap isn’t a “list of AI projects” — it’s a list of business impacts. That’s when scaling happens. Because the digital worker is no longer something to “try out,” but something that is expected to change a KPI. This sharpens prioritization and moves the project from personal enthusiasm to the institutional agenda.

And now, the heart of the chart: rewiring business processes. McKinsey defines this as “embedding solutions into business processes.” For digital workers, this means not just adding automation but redesigning the work. For instance, if a digital worker doesn’t just make suggestions and sit idle — but initiates tasks, pulls the right data, triggers decision steps, escalates exceptions to humans, and delivers measurable outcomes — that’s where real value begins. That’s why the “digital worker” concept, if not paired with process design, only leads to faster experimentation, not business value.
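The difference between “adding automation” and “redesigning the work” can be sketched as an ownership loop: the worker pulls data, runs the decision step, escalates exceptions, and otherwise acts to produce a measurable outcome. Every function here is an illustrative stub, not a real integration:

```python
# Sketch of an end-to-end ownership loop. All callables are stubs
# standing in for real integrations; names are hypothetical.
def handle_case(case: dict, pull_data, decide, execute, escalate) -> dict:
    data = pull_data(case)               # 1. pulls the right data itself
    decision = decide(case, data)        # 2. triggers the decision step
    if decision["exception"]:
        return escalate(case, decision)  # 3. escalates exceptions to a human
    outcome = execute(decision)          # 4. acts, producing a measurable result
    return {"status": "done", "kpi_delta": outcome}

result = handle_case(
    {"id": 42},
    pull_data=lambda c: {"balance": 100},
    decide=lambda c, d: {"exception": False, "action": "approve"},
    execute=lambda d: {"minutes_saved": 12},
    escalate=lambda c, d: {"status": "escalated"},
)
```

A worker that only produces the `decide` step and stops is the “suggests and sits idle” case; value begins when it owns the whole loop, including the escalation path.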

So far, we’ve looked at “who is doing what more.” Now the Relative Importance section tells us this: some practices are common, but others are more critical levers in explaining performance gaps. For example, product development / product delivery stands out here. This means that when AI is managed like a product rather than a project, outcomes grow. It involves iteration, feedback, release management, live monitoring, backlogs, and ownership. Digital workers are living systems — as processes evolve, so must they. Without a product discipline, even the smartest solution today can become outdated within months.

Similarly, governance emerges strongly in this section. Because scaling isn’t just about tech capacity — it’s about decision and responsibility architecture. When questions like “Who approves? Who monitors? How is risk managed? What happens if there’s a deviation?” are clearly answered, organizations don’t hit the brakes. When they’re not, even the best model never moves beyond cautious use.

Finally, on the data side, there’s emphasis on data products and iterative solution development. The performance of digital workers doesn’t stabilize through one-time integrations — it requires reusable, domain-defined data products. If the same concept has three different meanings across systems, the decision quality of the digital worker fluctuates. Data products standardize this; iterative development gradually builds more reliable, more predictable behavior.
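A minimal sketch of what a data product does for decision quality: one owned, versioned definition of a business concept that every digital worker reuses, instead of three systems each applying their own variant. The concept, fields, and 90-day rule below are invented for illustration:

```python
# Hypothetical data product: a single, owned, versioned definition of
# "active customer" shared across systems. All details are illustrative.
ACTIVE_CUSTOMER = {
    "name": "active_customer",
    "version": "1.2.0",
    "owner": "crm-team",
    "definition": "customer with >=1 paid order in the last 90 days",
}

def is_active(customer: dict, today: int) -> bool:
    """Apply the shared definition rather than a per-system variant."""
    return any(today - o["day"] <= 90 and o["paid"]
               for o in customer["orders"])

print(is_active({"orders": [{"day": 400, "paid": True}]}, today=450))  # True
```

When the definition changes, it changes in one place and one version bump, which is what makes the worker's behavior predictable rather than fluctuating across systems.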

If we sum up the main message of this chart in one sentence: high returns don’t come from more experimentation; they come from the right operating model. Model selection still matters, but what wins the game is redesigning processes, placing human oversight appropriately, establishing product-like delivery discipline, and integrating governance with data.

Real business value with digital workers doesn’t come from saying “we built an agent” — it comes from setting up a system that updates how work is done. And the practices McKinsey sees in high-performing organizations all point to the same outcome: redesign processes, apply human oversight thoughtfully, improve continuously with a product mindset, and embed data and governance into the workflow.

We see the same picture on the ground. Value-generating organizations don’t position digital workers as “side tools” — they make them owners of the workflow. They define KPIs upfront, clarify the decision boundaries of the digital worker, design the human handoff moments, measure outcomes, and improve through iteration. When done this way, the digital worker moves beyond being a demo — it becomes a trusted part of operations.