Monday, 26 January 2026

The creative technologist is a bridge, not a destination

My fascination with code started thirty-two years ago. Then I moved into design. Then brand. Then digital. Then strategy.

That progression felt natural at the time. I didn't realise I was living proof of something that would become urgent for every creative professional in 2025.

The technical skills I picked up early became my competitive advantage. Not because I planned it that way. Because the industry shifted beneath all of us.

The uncomfortable gap in creative talent

When I interview creatives now, I'm looking for something different from what I looked for five years ago.

Historically, the best creatives had a solid foundation in design history and understood trends that came and went. They read design blogs. They attended exhibitions. They stayed current through osmosis and curiosity.

That's not enough anymore.

Now I want to hear about their process, and that process needs to include AI. I want to know how they see that changing. What they're doing to stay current. The really good ones talk about agentic AI and how they're building their own workflows.

Here's what I'm seeing in interviews: senior creatives at the top of their game understand this. Graduate creatives fresh out of university understand this.

The middle layer is vulnerable.

The next generation is coming armed with the tools and expectations of how they need to work. If you're in that middle layer with ten years of traditional experience, you need to prove that experience still matters.

At MTM, our approach is human-centric. We want our creatives in front of clients, meeting in person. Experience with client expectations remains valuable. The osmosis between technically fluent juniors and client-savvy seniors creates something powerful.

But only if the seniors actually absorb the technical skills instead of delegating them.

Using AI tools poorly versus using them well

When I talk to new client partners, I'm hearing a consistent pattern. Agencies aren't refusing to adopt AI tooling. They're using it poorly.

Quality of output is suffering across the market.

Here's what using AI poorly looks like: logging into Midjourney and giving a full prompt to produce a creative output. Done. Ship it.

Here's the alternative: creating an agentic production line of many agents, with each one focused on a single task or outcome. When you layer the ability to choose which LLM you use at each stage of that process, you have something powerful.

This transforms creative artworkers and designers into art directors.
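
Under the hood, that kind of production line is mostly orchestration. Here's a minimal sketch, assuming a generic call_llm helper that stands in for whichever provider (or mix of providers) you actually use; the agent names, models, and instructions are illustrative, not a prescribed workflow.

```python
# A minimal sketch of an agentic production line: each agent owns a single
# task, and each stage can be pointed at whichever model suits it best.
from dataclasses import dataclass

def call_llm(model: str, prompt: str) -> str:
    # Placeholder: swap in your real client call (hosted API, local model, etc.)
    return f"[{model}] draft based on: {prompt[:60]}..."

@dataclass
class Agent:
    name: str         # e.g. "brief-parser", "concept-writer", "qa-reviewer"
    model: str        # the LLM chosen for this single task
    instruction: str  # the one job this agent is responsible for

    def run(self, payload: str) -> str:
        return call_llm(self.model, f"{self.instruction}\n\n{payload}")

# Each agent does one thing; the output of one stage becomes the input of the next.
pipeline = [
    Agent("brief-parser",   "model-a", "Extract audience, tone and mandatories from this brief."),
    Agent("concept-writer", "model-b", "Propose three creative routes for the parsed brief."),
    Agent("copy-refiner",   "model-c", "Tighten the copy for the chosen route, keeping brand voice."),
    Agent("qa-reviewer",    "model-a", "Check the output against the original mandatories."),
]

def run_production_line(brief: str) -> str:
    output = brief
    for agent in pipeline:
        output = agent.run(output)  # human review can sit between any two stages
    return output

print(run_production_line("Launch campaign for a mid-market B2B SaaS product."))
```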

The difference isn't subtle. Amazon's agentic AI tool reduces ad creative production from weeks and tens of thousands of dollars to hours at no additional cost. That's not incremental improvement. That's a fundamental shift in how creative work gets done.

But building multi-agent workflows requires understanding systems, logic, and process architecture. That's not traditional creative thinking. You're asking creatives to become systems designers.

I realised this while working hands-on with tools to build solutions for our client partners. It forced me to understand what's out there and the best ways to produce things.

The holding company platform race

The larger agency groups are doing something interesting. They're redefining themselves as platform companies, developing operating systems designed to automate production, media, and optimisation.

Stagwell built The Machine. Omnicom produced Omni. Havas has Converged.AI.

These aren't future plans. Stagwell is investing $20 million per quarter into AI integration. The Machine promises 15% cost savings and is scheduled for full network rollout by early 2026.

At CES 2026, the vision converged: agencies as managed ecosystems of AI agents, built on proprietary data, wrapped in compliance, plugged into end-to-end marketing execution.

So what happens to mid-sized integrated agencies like MTM?

These holding company systems feed the advertising channel engine. Our offer is different. We do the hard work of thinking. Production sits in the middle. Then comes the strategic execution of how campaigns roll out.

Our service offering is fully integrated. We include PR, custom SaaS, full film production, SEO, and social. These differentiate us from the homogenised output brands get from the larger groups' SaaS systems.

Here's the thing everyone misses: at the moment, only the package differs. Everyone has access to exactly the same LLMs and agentic AI framework options.

The competitive advantage isn't about having technical skills. It's about how you apply them within a strategic context.

Why the creative part still matters

You might think I'm arguing for "strategic-technologist" or "systems-thinking creative" as the more accurate job description.

I'm not.

We can't lose the creative part of this role. Absorbing cultural influences from society plays a huge part in design, and that's not going to happen with an LLM alone: everything a model is trained on already exists in the past.

We need creative humans thinking about what's coming and how they can shape that creatively.

I stand by the term creative-technologist. At least until AI in creative tooling becomes commonplace, which is ultimately where it will end up. The playing field will level once again.

At MTM, these processes fast-track the visualisation of ideas. But you still need to have the ideas at the start. Turning insight, research, and understanding into an idea is still very much for humans to drive.

The idea is what matters. Making things matter.

What separates good from great when tools become accessible

When AI tools become commonplace and easy to use, what separates a great creative from an average one?

If everyone has the same accessible AI tools, what becomes the new scarcity?

The answer brings us full circle: the idea. The ability to make things matter.

Research shows that humans aren't being displaced; they're being elevated. The emphasis shifts from executing tasks to designing the systems that execute tasks. Skills like creativity, critical thinking, and emotional intelligence remain vital.

So why are we putting creatives through this technical gauntlet right now?

Because the tools aren't intuitive yet. We're in a transitional phase. The creative-technologist exists specifically because of this gap.

There's a risk that by forcing creatives to become systems designers and agentic workflow builders, we dull the very thing that will matter most in three years. I think about this balance constantly.

The answer is that these technical processes serve the idea, not replace it. Speed to output is the most visible signpost that AI is here to stay. Production timelines are being reduced by 80%. That efficiency creates space for more thinking, not less.

Amara's law and what comes next

I'm a huge believer in Amara's law: "We tend to overestimate the effect of a technology in the short run and underestimate the effect in the long run."

AI use is going to change, then change again before we see the full effect in the creative industries.

I'm certain there will be "another thing" in the not-too-distant future. The creative-technologist role is temporary. It's a bridge, not a destination.

Five years from now, when the tools have caught up and become intuitive, we'll look back at this messy middle period. We might miss parts of it. We'll definitely carry forward lessons we learned.

What's certain is that speed to output proves AI's staying power. The efficiency gains are undeniable. 79% of marketers chose increased efficiency as the top benefit of adopting AI. 83.82% report increased productivity since adoption.

But efficiency was never the end goal. The goal is making things matter.

What this means for your team

If you're hiring creatives right now, you face a choice.

You can wait for the tools to become easier. You can hope your traditional creative talent will adapt when forced to. You can assume the holding companies will solve this problem for everyone.

Or you can recognise that we're in a transitional phase that rewards early adopters.

The UK creative industries face a £400 billion AI skills gap. Research shows that by 2035, around 10 million workers will be in roles where AI is part of their responsibilities. Too many freelancers and smaller employers are using the technology without training, creating quality control issues.

The agencies that recognise this moment for what it is will have a competitive advantage.

At MTM, we're building teams where technically fluent juniors work alongside client-savvy seniors. We're investing in understanding agentic workflows and multi-agent systems. We're doing the hard work of thinking while using AI to accelerate production.

We're treating the creative-technologist as a bridge to cross, not a destination to reach.

Because on the other side of this bridge, when the tools become intuitive and accessible to everyone, the playing field levels again. And the thing that matters most will be the thing that always mattered: the quality of the idea and the ability to make things matter.

The creatives who survive this transition won't be the ones with the best technical skills. They'll be the ones who used technical skills to protect and amplify their creative thinking.

That's the balance worth fighting for.

Friday, 9 January 2026

The liability gap: why your AI contracts won't protect you when things go wrong

I've spent an awful lot of time talking to marketing directors, compliance teams, and operational leaders about their AI strategies. The conversations follow a pattern.

They're excited about AI's potential. They've integrated tools across their operations. They're seeing results.

Then I ask about liability.

The room gets quiet.

The assumption that's costing organisations millions

Here's what I keep hearing: "We use ChatGPT, but it's OpenAI's problem if something goes wrong. We have contracts."

This assumption is dangerous.

When an employee enters sensitive information into ChatGPT, it can be used to train the model and may surface in responses to other users asking related questions. That's a data leakage problem. That's a compliance violation.

And OpenAI's liability? Limited to $100 (approximately £75) or the fees you paid in the past 12 months, whichever is greater.

The contract sits between OpenAI and the individual employee using the tool, not with your enterprise. You can't bring a claim about confidentiality or security risks. The legal protection you assumed existed doesn't.

The deployer versus developer confusion

The EU AI Act brings fines up to 7% of global annual turnover or €35 million (whichever is greater) for the most serious breaches. For UK businesses operating in the EU market, full compliance is required by August 2026.

In the UK, the Information Commissioner's Office continues to enforce data protection standards, with notable fines in 2025 including £14 million against Capita for a cyber-attack that exposed the personal data of 6.6 million people. Whilst the UK's approach to AI regulation remains principles-based rather than prescriptive, UK GDPR fines can still reach £17.5 million or 4% of annual global turnover, whichever is greater.
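
To make the "whichever is greater" mechanics concrete, here's the arithmetic using hypothetical turnover figures rather than any real company's numbers.

```python
# Rough illustration of the "whichever is greater" fine ceilings described above.
eu_market_turnover = 800_000_000   # € global annual turnover (hypothetical)
uk_turnover        = 200_000_000   # £ global annual turnover (hypothetical)

eu_ai_act_ceiling = max(0.07 * eu_market_turnover, 35_000_000)  # 7% of turnover or €35m
uk_gdpr_ceiling   = max(0.04 * uk_turnover, 17_500_000)         # 4% of turnover or £17.5m

print(f"EU AI Act maximum fine: €{eu_ai_act_ceiling:,.0f}")  # €56,000,000 — the 7% figure wins
print(f"UK GDPR maximum fine:   £{uk_gdpr_ceiling:,.0f}")    # £17,500,000 — the fixed floor wins
```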

The EU AI Act creates two categories: developers and deployers.

Developers build the AI system. They control design decisions, training data, and core functionality.

Deployers use the AI system in their operations. They don't control how it was built, but they control how it's used.

The problem? The line between these categories blurs fast.

If you put your name or trademark on a high-risk AI system, you become a provider. If you make substantial modifications to how the system works, you become a provider. If you customise the system beyond basic configuration, you might become a provider.

Most organisations don't realise they've crossed this line until they're facing regulatory scrutiny.

What the data reveals about your exposure

The numbers tell a story about how unprepared organisations are for this reality.

According to EY's 2025 Responsible AI Pulse survey, nearly 98% of UK respondents reported experiencing financial losses due to unmanaged AI risks, with an average loss estimated at approximately £3 million.

What changed? Organisations realised one AI lapse cascades into customer attrition, investor scepticism, regulatory scrutiny, and litigation.

Almost two-thirds (64%) of UK companies surveyed allow 'citizen developers' (employees independently creating or deploying AI agents), but only 53% have formal policies in place to ensure responsible AI practices. The gap between adoption and governance creates massive exposure.

Organisations adopting AI governance measures, such as real-time monitoring and oversight committees, report significant improvements. Of the UK respondents interviewed, those with an oversight committee reported 35% more revenue growth, a 40% increase in cost savings, and a 40% rise in employee satisfaction.

Yet the skills gap remains stark. Many compliance professionals now handle AI governance responsibilities without specific training for these expanded roles.

UK AI regulation: the principles-based approach

Unlike the EU's prescriptive AI Act, the UK has adopted a principles-based framework built on five core principles: safety and security, transparency and explainability, fairness, accountability and governance, and contestability and redress.

The UK government has announced plans to introduce legislation addressing AI risks, though recent comments indicate the first UK AI Bill is unlikely before the second half of 2026. In the interim, existing regulators, including the ICO, Financial Conduct Authority, and Competition and Markets Authority, enforce AI standards within their respective sectors.

For UK businesses with EU operations, this creates a dual compliance challenge. The EU AI Act's extraterritorial reach means UK firms developing or deploying AI systems for the EU market must comply with both frameworks.

That word matters: both.

You can't point to your vendor and claim immunity. You can't hide behind a service agreement. Whether you're operating under UK principles or EU requirements, if you're deploying the system, you're responsible for how it affects people.

Why your contracts don't work the way you think

I've reviewed dozens of AI service agreements over the past year. They follow a pattern.

The organisation using the AI tries to push all liability onto the provider. The provider limits their exposure to minimal amounts. Nobody addresses who's responsible when things go wrong in practice.

OpenAI's terms require you to indemnify and hold them harmless from third-party claims arising from your use of their services. You're protecting them, not the other way around.

Whilst Section 230 immunity is a US legal concept, the principle applies globally: AI vendors aren't platforms hosting your content. They're providing tools that generate new content, and the legal protections you assumed don't apply.

What compliance teams need to understand now

The division of responsibility between deployers and developers isn't academic. It determines who pays when something goes wrong.

A deployer using an AI system doesn't control design decisions made by the company that developed it. A developer doesn't control how another organisation deploys their system.

But both can be liable.

Here's what that means for your compliance approach:

Document everything about how you're using AI tools. Not just what tools you're using, but how you've configured them, what data you're feeding them, and what decisions they're informing.

Audit your AI vendors' compliance posture. Don't assume they're handling the regulatory side. Ask specific questions about their data handling, their security measures, and their own compliance programmes.

Map your AI systems to regulatory categories. Understand which systems qualify as high-risk under various frameworks. The EU AI Act has specific risk categories; the UK framework assesses risk proportionally within each sector.

Create clear policies about personal data and AI. Your team needs to know what information can and cannot be entered into AI tools. One employee mistake can create enterprise-wide liability under UK GDPR.

Review your insurance coverage. Most policies weren't written with AI liability in mind. You may have gaps you don't know about.
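
To make the documentation and mapping points concrete, here's a rough sketch of what a single entry in an internal AI system register might capture. The field names are assumptions for illustration, not a regulatory checklist.

```python
# Illustrative shape for an AI system register entry, mirroring the points above:
# what the tool is, how it's configured, what data it touches, and what it informs.
from dataclasses import dataclass

@dataclass
class AISystemRecord:
    tool: str                      # what you're using, e.g. a hosted LLM or agent platform
    vendor: str                    # who provides it, for vendor-audit follow-up
    configuration: str             # how it's set up beyond basic defaults
    data_categories: list[str]     # what data is fed in (personal data flagged explicitly)
    decisions_informed: list[str]  # what business decisions its output feeds
    risk_category: str             # e.g. mapping under the EU AI Act or sector guidance
    policy_reference: str          # the internal policy that governs its use
    insurance_reviewed: bool       # whether cover has been checked against this use

register = [
    AISystemRecord(
        tool="general-purpose chat assistant",
        vendor="example-provider",
        configuration="enterprise tenant, training on inputs disabled",
        data_categories=["marketing copy", "no personal data permitted"],
        decisions_informed=["first-draft campaign copy"],
        risk_category="limited risk (assumed)",
        policy_reference="AI-USE-001",
        insurance_reviewed=False,
    ),
]
```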

The integration challenge nobody's talking about

According to the EY survey, 80% of UK respondents reported that adopting AI has led to improvements in innovation, whilst 79% said it had improved efficiency and productivity.

But adoption doesn't mean effectiveness.

The challenge isn't creating an AI policy. The challenge is integrating AI governance into existing compliance frameworks whilst maintaining operational agility.

Your team is already managing data protection, privacy regulations, industry-specific compliance requirements, and security protocols. Now you're adding AI governance on top.

The frameworks don't align neatly. UK GDPR focuses on personal data. The EU AI Act focuses on risk levels and use cases. The UK's principles-based approach delegates to sector regulators. Your industry regulations focus on sector-specific concerns.

You need a compliance approach that addresses all of these simultaneously without creating so much friction that your organisation can't move.

What I'm seeing work in practice

The organisations handling this well share common approaches.

They've created cross-functional AI governance teams that include legal, compliance, IT, and business stakeholders. No single department owns this problem because it touches everything.

They're conducting regular AI audits to understand what systems are in use, how they're configured, and what data they're processing. Shadow AI is a bigger problem than most organisations realise.

They're building AI literacy across the organisation. Compliance teams need to understand how the technology works. Technical teams need to understand the regulatory landscape. Business teams need to understand the risks.

They're treating AI governance as an ongoing process, not a one-time project. The technology evolves. The regulations evolve. Your approach needs to evolve with them.

The path forward

You can't avoid AI. The competitive pressure is too strong. The efficiency gains are too significant. The customer expectations are too high.

But you can't ignore the liability gap either.

The organisations that succeed will be the ones that build robust governance frameworks now, before regulatory enforcement ramps up and before a major incident forces their hand.

Start by understanding your actual exposure. Map your AI systems. Review your contracts. Identify the gaps between your assumed protection and your actual protection.

Then build a governance approach that's proportional to your risk. Not every AI system needs the same level of oversight. Focus your resources where the consequences of failure are highest.

The liability gap is real. Your contracts won't protect you the way you think they will. But clear-eyed assessment and proactive governance can.

The question isn't whether to integrate AI into your operations. The question is whether you'll do it with your eyes open to the legal realities or whether you'll learn about them the hard way.

Tuesday, 6 January 2026

The metrics your CEO actually cares about (and why your current reports aren't showing them)

I've been in enough boardrooms to know what happens when marketing presents their monthly report.

The CMO walks through slides showing click-through rates, social media engagement, and website traffic. The numbers look impressive. The graphs trend upward.

And the CEO nods politely while thinking about revenue targets.

This disconnect isn't new, but it's getting worse. According to Gartner's 2026 research, 46% of CMOs identify their most urgent question as how to prioritise marketing initiatives most likely to drive growth. Meanwhile, 63% cite budget and resource constraints as their top challenge.

The pressure is real. The stakes are higher. And the old reporting playbook isn't working.

Why traditional marketing metrics fail in the boardroom

Here's what I've learned after two decades building marketing strategies: the metrics marketers love and the metrics CEOs need are often completely different.

Your CEO doesn't care about your click-through rate. They care about whether marketing is contributing to the company's ability to hit revenue targets, expand market share, and improve profitability.

The numbers tell the story. Just 14% of CEOs and CFOs view their CMO as highly effective at market shaping, according to Gartner. Yet companies with CMOs who are considered market shapers are more than twice as likely to exceed revenue and profit goals.

That gap? It's a trust problem. And it starts with how we report.

CFOs want clear proof of ROI. They speak the language of contribution margin, customer acquisition cost, and lifetime value. When you present vanity metrics that can't connect back to these core business objectives, you're speaking a different language entirely.

The vanity metric trap

Let me be direct about what qualifies as a vanity metric: any number you can't connect back to a core business objective.

That includes:

  • Social media followers (unless you can prove they convert)

  • Page views (unless they correlate with pipeline)

  • Email open rates (unless they drive measurable action)

  • Impressions (unless they build brand equity that translates to business outcomes)

These metrics feel good. They're easy to track. They often trend upward, which makes for nice presentations.

But they don't answer the question your CEO is actually asking: "Is marketing helping us grow the business?"

The most significant red flag of a vanity metric is simple. You can't draw a line from that number to revenue, profit, or market share. If the metric goes up but business results stay flat, you're measuring the wrong thing.

What CEOs actually want to see

I've watched this shift happen in real time. The conversation in 2025 moved decisively away from attribution obsession and toward business impact.

As one industry analysis I saw on LinkedIn put it: "Brands stopped obsessing over perfect attribution and started focusing on directional impact: sales uplift, contribution, and business outcomes."

This is the measurement shift that matters.

Your CEO wants to know:

How much revenue did marketing influence? Not just last-click attribution. The full contribution across the customer journey. Research shows that brands using advanced analytics report 5–8% higher marketing ROI than competitors. That advantage comes from better measurement, not better creative.

What's our customer acquisition cost relative to lifetime value? This ratio tells the story of sustainable growth. If you're spending £500 to acquire a customer worth £300, the maths doesn't work (there's a quick worked example of this below).

How is brand health tracking against business performance? Data I've seen from over 7,000 consumers covering 11,000 customer-provider relationships showed a statistically significant correlation between brand health and sales. The healthiest brands have twice as many customers reporting increased spending as the worst-performing brands.

What's the return on marketing investment by channel? Not vanity metrics by channel. Actual contribution to pipeline and revenue. This enables strategic reallocation decisions.

How are we performing against market share targets? Marketing exists to build brand salience and capture market position. If your reporting doesn't connect to this objective, you're missing the point.
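
Here's the CAC-to-LTV arithmetic mentioned above, using the same illustrative figures.

```python
# The CAC-to-LTV check from the example above: £500 to acquire a customer worth £300.
cac = 500.0   # customer acquisition cost (£)
ltv = 300.0   # customer lifetime value (£)

ratio = ltv / cac
print(f"LTV:CAC = {ratio:.2f}")  # 0.60 — every £1 spent on acquisition returns £0.60

# Many teams treat an LTV:CAC comfortably above 1 (often around 3) as healthy;
# below 1, every new customer destroys value rather than creating it.
```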

The integration of qualitative and quantitative data

Here's where most marketing reports get it wrong. They focus exclusively on quantitative metrics while ignoring the qualitative signals that predict future performance.

Brand health matters. A lot.

Healthy brands get more attention, generate more trust, and convert more efficiently. That translates to lower acquisition costs, better margins, and stronger long-term customer value. If your brand is doing its job, your marketing spend works harder.

The qualitative metrics that belong in your CEO report:

  • Brand awareness and consideration trends

  • Net Promoter Score and customer satisfaction

  • Share of voice in your category

  • Brand perception against key competitors

  • Customer sentiment analysis from reviews and feedback

These aren't soft metrics. They're leading indicators of business performance. When brand health declines, revenue follows. When consideration increases, conversion rates improve.

The key is connecting these qualitative signals to quantitative outcomes. Show the correlation. Demonstrate how improvements in brand perception translated to pipeline growth. Prove that increased share of voice preceded market share gains.

Building reports that drive strategic decisions

The best marketing reports don't just present data. They enable decisions.

Think about it this way. Your CEO doesn't want a history lesson. They want actionable intelligence that helps them allocate resources and adjust strategy.

This is what strategic reporting looks like:

"Given these ROI figures, we plan to reallocate £100K from underperforming channels to the top two drivers next quarter."

That sentence turns your report into a springboard for strategic choices. It's exactly what the C-suite wants.

Research shows that 83% of high-performing marketers have the executive team's complete commitment to their marketing strategy. That's 2.6 times more than what underperforming teams report. You earn that commitment by demonstrating business impact through metrics that matter.

The measurement shift that happened in 2025 was cultural, not just technical. Measurement became shared infrastructure for finance, marketing, and analytics teams. It stopped being a retrospective reporting layer and became a strategic business function.

The practical framework for better reporting

You need a reporting structure that works for both marketing operations and executive strategy sessions.

Here's the framework I use:

Executive summary (one page maximum)

  • Revenue contribution and pipeline impact

  • ROI by major channel or campaign

  • Key performance trends (up or down, with context)

  • Strategic recommendations based on the data

Business impact metrics (the core section)

  • Customer acquisition cost and lifetime value

  • Marketing-influenced revenue and pipeline

  • Conversion rates by stage and channel

  • Market share movement and competitive position

Brand health indicators

  • Awareness and consideration tracking

  • Brand perception and sentiment

  • Share of voice analysis

  • Customer satisfaction and NPS trends

Channel performance (with context)

  • ROI and contribution by channel

  • Cost efficiency trends

  • Performance against benchmarks

  • Optimisation opportunities identified

Strategic implications and next steps

  • What the data tells us about strategy

  • Recommended resource reallocation

  • Tests and experiments planned

  • Expected impact on business objectives

This structure answers the questions your CEO is actually asking. It connects marketing activity to business outcomes. And it positions marketing as a strategic function, not a cost centre.
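
If you want to treat that framework as shared infrastructure rather than a slide template, a rough sketch of the report as a data structure might look like this. The section and field names mirror the framework above and are illustrative, not a standard.

```python
# Illustrative data shape for the board report so finance, marketing, and
# analytics can work from the same numbers rather than separate decks.
from dataclasses import dataclass

@dataclass
class ExecutiveSummary:
    revenue_contribution: float       # marketing-influenced revenue this period (£)
    pipeline_impact: float            # pipeline generated or influenced (£)
    roi_by_channel: dict[str, float]  # e.g. {"paid search": 3.2, "email": 5.1}
    recommendations: list[str]        # the strategic asks the data supports

@dataclass
class BrandHealth:
    awareness: float        # tracked awareness (%)
    consideration: float    # tracked consideration (%)
    nps: int                # net promoter score
    share_of_voice: float   # category share of voice (%)

@dataclass
class BoardReport:
    summary: ExecutiveSummary
    brand: BrandHealth
    cac: float                  # customer acquisition cost (£)
    ltv: float                  # customer lifetime value (£)
    market_share_change: float  # points gained or lost versus last period
```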

Making the transition

Changing your reporting approach isn't just about new dashboards. It requires a shift in how you think about measurement.

Start by auditing your current metrics. Ask yourself: "If this number improves but revenue stays flat, does it matter?" If the answer is no, you're tracking a vanity metric.

Work with finance to align on definitions. Marketing-influenced revenue means different things to different people. Get clear on the methodology. Agree on attribution models. Establish shared KPIs that both teams understand and trust.

Build the infrastructure to track what matters. You need systems that connect marketing activity to pipeline and revenue. You need brand tracking that measures perception and salience. You need analytics that show contribution, not just correlation.

This takes time. It requires investment. But the alternative is continuing to present reports that don't resonate with the people who control your budget.

The competitive advantage of better measurement

Companies that get measurement right move faster and allocate resources more effectively.

When you can clearly demonstrate which marketing initiatives drive growth, you earn the trust and budget to do more of what works. When you can show how brand health predicts future performance, you make the case for long-term investment.

The measurement maturity shift isn't about perfection. It's about moving from retrospective reporting to strategic intelligence. From vanity metrics to business impact. From marketing language to executive language.

Your CEO doesn't need to understand marketing tactics. But you need to understand business strategy. The metrics you choose to report signal whether you see marketing as a creative function or a growth driver.

The gap between what marketers measure and what executives need is closing. The agencies and marketing leaders who adapt their reporting to focus on business impact will earn the seat at the strategic table.

The ones who keep presenting click-through rates and social media engagement will keep fighting for budget and credibility.

The choice is yours.

What this means for your next board presentation

Before you build your next marketing report, ask yourself one question: "Would our CFO find this data useful for making resource allocation decisions?"

If the answer is no, you're reporting the wrong metrics.

The pressure on marketing to prove value isn't going away. Budget constraints are real. The expectation that marketing directly contributes to growth is only intensifying.

You can respond by defending your vanity metrics and explaining why engagement matters. Or you can shift your measurement approach to focus on the metrics your CEO actually cares about.

Revenue contribution. Customer acquisition efficiency. Brand health that predicts future performance. Market share movement. ROI that enables strategic decisions.

These are the numbers that matter. These are the metrics that earn executive commitment. These are the reports that position marketing as a strategic function.

The measurement infrastructure exists. The frameworks are proven. The competitive advantage goes to the marketing leaders who make the shift.

Your next board presentation is an opportunity to demonstrate that you understand what the business needs from marketing. Not just creative campaigns and tactical execution, but measurable contribution to growth objectives.

The metrics you choose to report tell a story about how you see marketing's role. Make sure it's the right story.
