A practice built on one question:
where does the value actually go?

Most technology programmes spend the first six months answering the wrong question. They ask which tools to buy, which vendors to partner with, which platforms to migrate to. SentAIent starts somewhere different: what change does this organisation need to make, and what will prove that the change happened?

The conventional model
has a structural problem.

Large advisory firms are built to deliver outputs: reports, roadmaps, frameworks, presentations. The commercial model rewards the volume of work produced, not the change it generates. That is not a criticism of the people inside those firms. It is a structural observation about how the incentives are set up.

SentAIent was built around a different structure. Every engagement starts with a defined business outcome and a measurable test of whether that outcome was reached. The work is designed to produce the change, not to document the possibility of it.

The frameworks that underpin this practice were not adapted from existing methodologies. They were built in response to failure patterns observed across dozens of client engagements over three decades — AI programmes that collapsed without architectural foundations, transformation programmes that produced strategy documents instead of results, and technology investments that were never connected to the business outcomes they were supposed to serve.

The failure patterns this practice was built to address

AI investment without architectural foundations

95% of generative AI pilots fail to reach production. The reason is almost never the model. It is the absence of data quality, governance, and architectural integration beneath it.

Transformation programmes that produce documents

A strategy document is not a transformation. The test is whether the organisation can do something after the engagement that it could not do before.

Technology decisions disconnected from business outcomes

Platform selection, vendor choice, and architecture decisions made without a clear line to a specific, measurable business result produce expensive technical debt and no commercial return.

Advisory that confirms rather than challenges

The most expensive advisory relationship is the one that validates the direction the organisation was already heading. The value is in the challenge, not the confirmation.

Value streams mapped but never orchestrated

Most organisations have separate processes for value stream mapping, customer journey analysis, and benefits tracking. The disconnection between them is where investment decisions go wrong.

The methodology at the
centre of everything we do.

Holistic Value Orchestration is a proprietary framework that integrates three disciplines that most organisations treat as separate: Value Stream Mapping, Customer Journey Analysis, and Benefits Dependency Mapping. The reason they are treated separately is organisational — different teams own each one. The reason they need to be integrated is commercial — the value only appears when all three are read together.

Phase 1

Map the Value Stream

  • Identify every step that contributes to the delivery of value to the customer
  • Separate value-adding activity from process waste
  • Establish the current-state baseline before any technology decision is made

Phase 2

Identify Touchpoints

  • Map the customer journey against the value stream
  • Identify where the customer experience and the internal process are misaligned
  • Locate the moments of highest commercial and strategic impact

Phase 3

Analyse Benefits and Risks

  • Apply Benefits Dependency Mapping to connect each initiative to a specific outcome (sketched below, after Phase 4)
  • Identify the enabling changes and business changes required
  • Quantify the risk of inaction alongside the risk of investment

Phase 4

Apply the Strategic Alignment Test

  • Place every initiative on the Strategic Impact Grid (Customer Impact vs Strategic Impact)
  • Eliminate low-value initiatives before budget is committed
  • Produce a prioritised investment roadmap with measurable success criteria
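
To make Phase 3 concrete, the sketch below shows one way the Benefits Dependency Mapping chain can be represented: every initiative has to trace through its enabling changes and business changes to a named outcome and a measurable test, or it does not qualify for investment. The structure, field names, and example initiatives are illustrative assumptions made for this sketch, not part of the framework itself.

```python
from dataclasses import dataclass

@dataclass
class Initiative:
    """Illustrative record for Benefits Dependency Mapping.

    Field names are assumptions made for this sketch; the framework only
    requires that each initiative traces to a measurable business outcome.
    """
    name: str
    enabling_changes: list[str]   # e.g. new tooling, data feeds, skills
    business_changes: list[str]   # e.g. revised processes, roles, behaviours
    business_outcome: str         # the specific outcome the initiative serves
    success_measure: str          # the measurable test of that outcome

def unmapped(initiatives: list[Initiative]) -> list[str]:
    """Initiatives that cannot yet justify investment: anything missing
    a defined outcome or a measurable test of success."""
    return [i.name for i in initiatives
            if not (i.business_outcome and i.success_measure)]

# Hypothetical pipeline entries
pipeline = [
    Initiative(
        name="Customer data platform migration",
        enabling_changes=["Single customer identifier", "Data quality rules"],
        business_changes=["Service teams work from one customer record"],
        business_outcome="Reduce repeat contact in the service journey",
        success_measure="Repeat-contact rate down 20% within two quarters",
    ),
    Initiative(
        name="Generative AI chatbot pilot",
        enabling_changes=["Vendor selection"],
        business_changes=[],
        business_outcome="",   # no outcome defined, so it is flagged below
        success_measure="",
    ),
]

print(unmapped(pipeline))   # ['Generative AI chatbot pilot']
```

In this sketch, anything the check flags would need a defined outcome and measure before it reaches the Strategic Alignment Test in Phase 4.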

The Strategic Impact Grid

The grid plots every initiative on two axes: the impact it has on the customer experience, and the impact it has on the organisation's strategic position. Initiatives that score low on both axes are eliminated. Initiatives that score high on both are prioritised and resourced first.

The output is not a list of things to do. It is a ranked, evidence-based investment roadmap where every line item is connected to a specific business outcome and a measurable test of success. In a typical engagement, this process eliminates between 30% and 50% of the initiatives that were already in the pipeline — saving capital before a single line of code is written.

Prioritise: High Customer / High Strategic
Evaluate: Low Customer / High Strategic
Optimise: High Customer / Low Strategic
Eliminate: Low Customer / Low Strategic
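
The sketch below illustrates the grid logic only; the scores themselves come from the evidence gathered in Phases 1 to 3 and are specific to each engagement. The 0 to 10 scale, the threshold, and the example initiatives are assumptions made for this illustration.

```python
# Minimal sketch of the Strategic Impact Grid classification.
# The 0-10 scale, the threshold of 5, and the example initiatives
# are assumptions made for this illustration.

def quadrant(customer_impact: float, strategic_impact: float,
             threshold: float = 5.0) -> str:
    """Place one initiative on the grid."""
    high_customer = customer_impact >= threshold
    high_strategic = strategic_impact >= threshold
    if high_customer and high_strategic:
        return "Prioritise"
    if high_strategic:
        return "Evaluate"    # low customer / high strategic
    if high_customer:
        return "Optimise"    # high customer / low strategic
    return "Eliminate"       # low on both axes

# Hypothetical pipeline: (initiative, customer impact, strategic impact)
pipeline = [
    ("Real-time order tracking",     8.5, 7.0),
    ("Legacy reporting refresh",     3.0, 2.5),
    ("Core platform consolidation",  4.0, 8.0),
    ("Self-service account changes", 7.5, 3.5),
]

for name, customer, strategic in pipeline:
    print(f"{quadrant(customer, strategic):<11} {name}")

# The roadmap keeps everything except the Eliminate quadrant, ranked here
# by combined score as a simple stand-in for the evidence-based ordering
# produced in a real engagement.
roadmap = sorted(
    (item for item in pipeline if quadrant(item[1], item[2]) != "Eliminate"),
    key=lambda item: item[1] + item[2],
    reverse=True,
)
print([name for name, _, _ in roadmap])
```

Ranking the survivors by combined score is a simplification for this sketch; the ordering of the real roadmap also draws on the benefit and risk evidence from Phase 3.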

A proprietary approach to
technology that thinks together.

Most technology programmes treat AI, behavioural simulation, real-time data, and IoT as separate workstreams with separate budgets and separate teams. The insight behind Convergent Intelligence Modelling is that the value is not in any one of those technologies — it is in the point where they intersect. When they are designed to work together from the start, they produce something that none of them can produce alone: a working model of how a system actually behaves, and a reliable basis for deciding what to change.

The approach was developed through applied engagements across transport, financial services, and logistics — contexts where the cost of a wrong decision is measurable and the tolerance for theoretical frameworks is low. It is not a research concept. It is a working methodology with a track record of results.

01

Artificial Intelligence

Pattern recognition, prediction, and decision support applied to real operational data. Not AI as a product — AI as a component of a larger analytical system.

02

Behavioural Simulation

Models of how people actually behave in a system, built from observed data rather than assumed averages. The difference between a model that works in a spreadsheet and one that works in the real world.

03

Real-Time Data Integration

Live operational data feeds that keep the model current. A simulation built on historical data alone is a record of the past. One connected to live data is a tool for the present.

04

IoT & Sensor Intelligence

Physical-world signals — location, movement, environmental conditions, usage patterns — translated into inputs that the model can act on. The bridge between the digital model and the physical system it represents.

What it produces

A working model of a real system — not a diagram, not a report, not a set of recommendations. A model that can be interrogated: what happens if we change this variable? What does the data say about that assumption? Where is the gap between what we expect and what is actually happening?

In practice, this has been applied to demand forecasting, modal shift modelling, customer behaviour simulation, and operational process optimisation. In each case, the output was a decision — not a document — and the decision was grounded in evidence that the organisation could interrogate and challenge before committing to it.
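
As a deliberately small illustration of what it means for a model to be interrogated, the sketch below simulates daily demand from observed per-segment behaviour rather than an assumed average, then answers one what-if question by changing a single variable. The scenario, segment sizes, and journey samples are hypothetical; an actual engagement would draw them from the client's operational data.

```python
import random

# Hypothetical observed data: daily journeys per customer in each segment,
# taken from operational records rather than an assumed average.
OBSERVED_DAILY_JOURNEYS = {
    "commuter": [2, 2, 2, 0, 2, 2, 4, 2, 0, 2],
    "leisure":  [0, 0, 1, 0, 2, 0, 0, 3, 0, 1],
}

def simulate_demand(population, commuter_reduction=0.0, runs=1000, seed=7):
    """Mean simulated journeys per day across `runs` Monte Carlo draws.

    `commuter_reduction` is the single variable being interrogated: the
    assumed fraction of commuter journeys that disappear (for example,
    under a hybrid-working scenario). All numbers here are illustrative.
    """
    rng = random.Random(seed)
    totals = []
    for _ in range(runs):
        total = 0.0
        for segment, size in population.items():
            samples = OBSERVED_DAILY_JOURNEYS[segment]
            journeys = sum(rng.choice(samples) for _ in range(size))
            if segment == "commuter":
                journeys *= (1.0 - commuter_reduction)
            total += journeys
        totals.append(total)
    return sum(totals) / runs

population = {"commuter": 400, "leisure": 250}   # hypothetical segment sizes

baseline = simulate_demand(population)
scenario = simulate_demand(population, commuter_reduction=0.15)
print(f"Baseline demand:            {baseline:,.0f} journeys/day")
print(f"If commuter trips fall 15%: {scenario:,.0f} journeys/day")
```

In a deployed model, the observed samples and segment sizes would be refreshed from the live data and sensor feeds described above rather than hard-coded.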

Six principles that
do not change from one engagement to the next.

These are not values statements. They are operational commitments that shape how every engagement is scoped, delivered, and measured. They are the reason some clients find this practice uncomfortable at first, and the reason the outcomes are different from what they have seen before.

The practice is independent by design. There are no vendor partnerships, no platform commissions, and no preferred technology stack. The recommendation is always the one that serves the client's outcome — not the one that serves a commercial relationship.

Outcome first, always

No engagement starts with a technology recommendation. Every engagement starts with a defined business outcome and a measurable test of whether it was reached.

Architecture before investment

AI and digital transformation programmes fail when the architectural foundation is absent. The foundation is designed before the investment is committed.

Original frameworks, not recycled playbooks

The frameworks used in this practice were built from observed failure patterns across real client engagements. They are not adapted from existing methodologies.

Eliminate before you build

The most valuable work in any programme is identifying what not to do. Eliminating low-value initiatives before budget is committed saves more than any optimisation after the fact.

Independence is non-negotiable

No vendor partnerships. No platform commissions. No preferred stack. The recommendation is always the one that serves the client's outcome.

Measurement is built in, not bolted on

Every engagement defines its success criteria before the work begins. The measurement framework is part of the design, not an afterthought at the end.

Lionel Schotter —
built from the ground up.

The career started not in a graduate scheme or an MBA programme, but in the engine room of technology delivery — support engineering, infrastructure, systems administration, and IT outsourcing. That ground-level experience is the source of something that cannot be replicated in a classroom: a full-stack understanding of what technology actually does to an organisation when it is deployed, and what it does when it fails.

The progression from technical delivery to strategic advisory was not a pivot. It was an accumulation. Every layer added a different kind of understanding — what the engineers see, what the project managers see, what the programme directors see, and what the board sees. The frameworks built along the way were the result of noticing the gaps between those layers and building something to close them.

The career spans financial services, transport, logistics, mining, and public sector organisations across 16 European countries. In each of those contexts, the measure of success was the same: could the organisation do something after the engagement that it could not do before? That question has not changed.

TOGAF and IASA certified. BSc Biology — a discipline that trained the systems thinking that underpins every framework in this practice. Co-author of published whitepapers on Convergent Intelligence Modelling and human behaviour simulation in complex systems.

Credentials

TOGAF 9 Certified Practitioner
IASA Business Architecture Associate
IASA IT Architect Foundation Core
Leading SAFe Certified
BSc Biology — University of Natal
GIBS Corporate Innovation Masterclass

Sectors

Financial Services · Transport · Logistics · Mining · Public Sector · Retail Banking · Sustainable Mobility · IT Outsourcing

Four formats.
One standard of outcome.

The engagement format is determined by what the organisation needs, not by what is easiest to sell. Each format has a defined scope, a defined output, and a defined test of success. None of them produce a document as the primary deliverable.

Format 01 · 3–12 months

Fractional Advisory

Ongoing strategic advisory at board or C-suite level. Typically one to two days per week, with defined focus areas agreed at the start of each quarter.

Outputs

  • Quarterly strategic review and priority reset
  • AI governance and architecture oversight
  • Technology investment decision support
  • Stakeholder alignment and challenge function

Right for organisations that need a senior technology strategist without the cost or commitment of a full-time hire.

Format 02 · 4–8 weeks

Diagnostic Engagement

A structured assessment of a specific problem: a technology programme that is not delivering, an AI investment that has not reached production, or an architecture that is not supporting the business strategy.

Outputs

  • Root cause analysis with evidence
  • Gap assessment against defined outcomes
  • Prioritised remediation roadmap
  • Board-ready findings presentation

Right for organisations that know something is not working but need an independent view of why, and what to do about it.

Format 03 · 1–5 days

Strategy Workshop

A facilitated working session that brings business and technology leadership into the same room to build shared priorities, resolve misalignment, and produce a cohesion plan that both sides own.

Outputs

  • Shared priority list with ownership assigned
  • Decision log with rationale
  • 90-day action plan
  • Agreed success criteria for each initiative

Right for organisations where the gap between business strategy and technology execution is a conversation problem, not a capability problem.

Format 04 · 3–18 months

Programme Advisory

Embedded advisory on a specific transformation programme: AI implementation, enterprise architecture redesign, operating model change, or digital platform migration. Focused on ensuring the programme delivers the outcome it was designed for.

Outputs

  • Programme governance and measurement framework
  • Architecture and AI readiness oversight
  • Benefits tracking against original business case
  • Escalation and course-correction support

Right for organisations running a significant programme that needs an independent advisor focused on outcomes, not on protecting the delivery contract.

If the question is about value —
this is the right conversation.

The practice takes a small number of engagements at any one time. If your organisation is trying to understand where its technology investments are actually going, or why a programme is not delivering what it was supposed to, get in touch.

© 2026 SentAIent Ltd. All rights reserved.