Monday, 26 January 2026

The creative technologist is a bridge, not a destination

My fascination with code began thirty-two years ago. Then I moved into design. Then brand. Then digital. Then strategy.

That progression felt natural at the time. I didn't realise I was living proof of something that would become urgent for every creative professional in 2025.

The technical skills I picked up early became my competitive advantage. Not because I planned it that way. Because the industry shifted beneath all of us.

The uncomfortable gap in creative talent

When I interview creatives now, I'm looking for something different than I did five years ago.

Historically, the best creatives had a solid foundation in design history and understood trends that came and went. They read design blogs. They attended exhibitions. They stayed current through osmosis and curiosity.

That's not enough anymore.

Now I want to hear about their process, and that process needs to include AI. I want to know how they see that changing. What they're doing to stay current. The really good ones talk about agentic AI and how they're building their own workflows.

Here's what I'm seeing in interviews: senior creatives at the top of their game understand this. Graduates fresh out of university understand this.

The middle layer is vulnerable.

The next generation is coming armed with the tools and expectations of how they need to work. If you're in that middle layer with ten years of traditional experience, you need to prove that experience still matters.

At MTM, our approach is human-centric. We want our creatives in front of clients, meeting in person. Experience with client expectations remains valuable. The osmosis between technically fluent juniors and client-savvy seniors creates something powerful.

But only if the seniors actually absorb the technical skills instead of delegating them.

Using AI tools poorly versus using them well

When I talk to new client partners, I'm hearing a consistent pattern. Agencies aren't refusing to adopt AI tooling. They're using it poorly.

Quality of output is suffering across the market.

Here's what using AI poorly looks like: logging into Midjourney and giving a full prompt to produce a creative output. Done. Ship it.

Here's the alternative: creating an agentic production line of many agents, with each one focused on a single task or outcome. When you layer the ability to choose which LLM you use at each stage of that process, you have something powerful.
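Sketched in code, that production line might look something like the following. This is a minimal, hypothetical sketch: the stage names, model labels, and lambda "agents" are placeholders standing in for real LLM calls, not any particular framework.

```python
from dataclasses import dataclass
from typing import Callable

# Each stage is a single-purpose "agent" paired with whichever model
# suits that one task -- the key idea behind the production line.
@dataclass
class Stage:
    name: str
    model: str                      # e.g. a fast model for drafts, a stronger one for critique
    run: Callable[[str], str]       # transforms the work-in-progress

def pipeline(brief: str, stages: list[Stage]) -> str:
    """Pass the work through each focused agent in turn."""
    work = brief
    for stage in stages:
        work = stage.run(work)      # one focused job, then hand off
    return work

# Placeholder agents; in practice each would call its chosen LLM.
stages = [
    Stage("concept",  "model-a", lambda b: b + " -> three concept directions"),
    Stage("copy",     "model-b", lambda b: b + " -> headline variants"),
    Stage("critique", "model-c", lambda b: b + " -> flagged weakest variant"),
]

print(pipeline("Brief: launch campaign", stages))
```

The design choice that matters is the per-stage model field: because each agent is narrow, you can swap the model behind any one stage without touching the rest of the line.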

This transforms creative artworkers and designers into art directors.

The difference isn't subtle. Amazon's agentic AI tool reduces ad creative production from weeks and tens of thousands of dollars to hours at no additional cost. That's not incremental improvement. That's a fundamental shift in how creative work gets done.

But building multi-agent workflows requires understanding systems, logic, and process architecture. That's not traditional creative thinking. You're asking creatives to become systems designers.

I realised this while working hands-on with tools to build solutions for our client partners. It forced me to understand what's out there and the best ways to produce things.

The holding company platform race

The larger agency groups are doing something interesting. They're redefining themselves as platform companies, developing operating systems designed to automate production, media, and optimisation.

Stagwell built The Machine. Omnicom produced Omni. Havas has Converged.AI.

These aren't future plans. Stagwell is investing $20 million per quarter into AI integration. The Machine promises 15% cost savings and is scheduled for full network rollout by early 2026.

At CES 2026, the vision converged: agencies as managed ecosystems of AI agents, built on proprietary data, wrapped in compliance, plugged into end-to-end marketing execution.

So what happens to mid-sized integrated agencies like MTM?

These holding company systems feed the advertising channel engine. Our offer is different. We do the hard work of thinking. Production sits in the middle. Then comes the strategic execution of how campaigns roll out.

Our service offering is fully integrated. We include PR, custom SaaS, full film production, SEO, and social. These are differentiators against brands that get homogenised output from larger agency SaaS systems.

Here's the thing everyone misses: at the moment, only the package differs. Everyone has access to exactly the same LLMs and agentic AI framework options.

The competitive advantage isn't about having technical skills. It's about how you apply them within a strategic context.

Why the creative part still matters

You might think I'm arguing for "strategic-technologist" or "systems-thinking creative" as the more accurate job description.

I'm not.

We can't lose the creative part of this role. Embracing cultural influences from society plays a huge part in design. Without a human in the loop, that's not going to happen with LLMs. Everything they're trained on exists in the past.

We need creative humans thinking about what's coming and how they can shape that creatively.

I stand by the term creative-technologist. At least until AI in creative tooling becomes commonplace in the future. That's ultimately where it will end up. The playing field will level once again.

At MTM, these processes fast-track the visualisation of ideas. But you still need to have the ideas at the start, and ideas fed by insight, research, and understanding are still very much for humans to drive.

The idea is what matters. Making things matter.

What separates good from great when tools become accessible

When AI tools become commonplace and easy to use, what separates a great creative from an average one?

If everyone has the same accessible AI tools, what becomes the new scarcity?

The answer brings us full circle: the idea. The ability to make things matter.

Research shows that humans aren't being displaced; they're being elevated. The emphasis shifts from executing tasks to designing the systems that execute tasks. Skills like creativity, critical thinking, and emotional intelligence remain vital.

So why are we putting creatives through this technical gauntlet right now?

Because the tools aren't intuitive yet. We're in a transitional phase. The creative-technologist exists specifically because of this gap.

There's a risk that by forcing creatives to become systems designers and agentic workflow builders, we dull the very thing that will matter most in three years. I think about this balance constantly.

The answer is that these technical processes serve the idea, not replace it. Speed to output is the most visible signpost that AI is here to stay. Production timelines are being reduced by 80%. That efficiency creates space for more thinking, not less.

Amara's law and what comes next

I'm a huge believer in Amara's law: "We tend to overestimate the effect of a technology in the short run and underestimate the effect in the long run."

AI use is going to change, then change again before we see the full effect in the creative industries.

I'm certain there will be "another thing" in the not-too-distant future. The creative-technologist role is temporary. It's a bridge, not a destination.

Five years from now, when the tools have caught up and become intuitive, we'll look back at this messy middle period. We might miss parts of it. We'll definitely carry forward lessons we learned.

What's certain is that speed to output proves AI's staying power. The efficiency gains are undeniable. 79% of marketers chose increased efficiency as the top benefit of adopting AI. 83.82% report increased productivity since adoption.

But efficiency was never the end goal. The goal is making things matter.

What this means for your team

If you're hiring creatives right now, you face a choice.

You can wait for the tools to become easier. You can hope your traditional creative talent will adapt when forced to. You can assume the holding companies will solve this problem for everyone.

Or you can recognise that we're in a transitional phase that rewards early adopters.

The UK creative industries face a £400 billion AI skills gap. Research shows that by 2035, around 10 million workers will be in roles where AI is part of their responsibilities. Too many freelancers and smaller employers are using the technology without training, creating quality control issues.

The agencies that recognise this moment for what it is will have a competitive advantage.

At MTM, we're building teams where technically fluent juniors work alongside client-savvy seniors. We're investing in understanding agentic workflows and multi-agent systems. We're doing the hard work of thinking while using AI to accelerate production.

We're treating the creative-technologist as a bridge to cross, not a destination to reach.

Because on the other side of this bridge, when the tools become intuitive and accessible to everyone, the playing field levels again. And the thing that matters most will be the thing that always mattered: the quality of the idea and the ability to make things matter.

The creatives who survive this transition won't be the ones with the best technical skills. They'll be the ones who used technical skills to protect and amplify their creative thinking.

That's the balance worth fighting for.

Friday, 9 January 2026

The liability gap: why your AI contracts won't protect you when things go wrong

I've spent an awful lot of time talking to marketing directors, compliance teams, and operational leaders about their AI strategies. The conversations follow a pattern.

They're excited about AI's potential. They've integrated tools across their operations. They're seeing results.

Then I ask about liability.

The room gets quiet.

The assumption that's costing organisations millions

Here's what I keep hearing: "We use ChatGPT, but it's OpenAI's problem if something goes wrong. We have contracts."

This assumption is dangerous.

When you enter sensitive information into ChatGPT, it becomes part of the chatbot's data model. It may be shared with others who ask relevant questions. That's a data leakage problem. That's a compliance violation.

And OpenAI's liability? Limited to $100 (approximately £75) or the fees you paid in the past 12 months, whichever is greater.

The contract is between OpenAI and the individual employee using the tool, not your enterprise. You can't bring a claim about confidentiality or security risks. The legal protection you assumed existed doesn't.

The deployer versus developer confusion

The EU AI Act brings fines up to 7% of global annual turnover or €35 million (whichever is greater) for the most serious breaches. For UK businesses operating in the EU market, full compliance is required by August 2026.

In the UK, the Information Commissioner's Office continues to enforce data protection standards, with notable fines in 2025 including £14 million against Capita for a cyber-attack that exposed the personal data of 6.6 million people. Whilst the UK's approach to AI regulation remains principles-based rather than prescriptive, UK GDPR fines can still reach £17.5 million or 4% of annual global turnover, whichever is greater.
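Both regimes use the same "whichever is greater" formula, which makes exposure easy to sanity-check against your own accounts. The turnover figure below is hypothetical; the caps and percentages are the ones quoted above.

```python
# Maximum-fine formula shared by the EU AI Act and UK GDPR:
# the greater of a fixed cap and a percentage of global annual turnover.
def max_fine(turnover: float, fixed_cap: float, pct: float) -> float:
    return max(fixed_cap, turnover * pct)

turnover = 2_000_000_000  # hypothetical 2bn global annual turnover

eu_ai_act = max_fine(turnover, 35_000_000, 0.07)  # EUR 35M or 7%
uk_gdpr   = max_fine(turnover, 17_500_000, 0.04)  # GBP 17.5M or 4%

print(f"EU AI Act exposure: {eu_ai_act:,.0f}")
print(f"UK GDPR exposure:   {uk_gdpr:,.0f}")
```

Note that for any business above roughly £437 million turnover, the percentage, not the fixed cap, drives the UK GDPR figure; the fixed caps only bite for smaller firms.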

The EU AI Act creates two categories: developers and deployers.

Developers build the AI system. They control design decisions, training data, and core functionality.

Deployers use the AI system in their operations. They don't control how it was built, but they control how it's used.

The problem? The line between these categories blurs fast.

If you put your name or trademark on a high-risk AI system, you become a provider. If you make substantial modifications to how the system works, you become a provider. If you customise the system beyond basic configuration, you might become a provider.

Most organisations don't realise they've crossed this line until they're facing regulatory scrutiny.

What the data reveals about your exposure

The numbers tell a story about how unprepared organisations are for this reality.

According to EY's 2025 Responsible AI Pulse survey, nearly 98% of UK respondents reported experiencing financial losses due to unmanaged AI risks, with an average loss estimated at approximately £3 million.

What changed? Organisations realised one AI lapse cascades into customer attrition, investor scepticism, regulatory scrutiny, and litigation.

Almost two-thirds (64%) of UK companies surveyed allow 'citizen developers' (employees who independently create or deploy AI agents), but only 53% have formal policies in place to ensure responsible AI practices. The gap between adoption and governance creates massive exposure.

Organisations adopting AI governance measures, such as real-time monitoring and oversight committees, report significant improvements. Of the UK respondents interviewed, those with an oversight committee reported 35% more revenue growth, a 40% increase in cost savings, and a 40% rise in employee satisfaction.

Yet the skills gap remains stark. Many compliance professionals now handle AI governance responsibilities without specific training for these expanded roles.

UK AI regulation: the principles-based approach

Unlike the EU's prescriptive AI Act, the UK has adopted a principles-based framework built on five core principles: safety and security, transparency and explainability, fairness, accountability and governance, and contestability and redress.

The UK government has announced plans to introduce legislation addressing AI risks, though recent comments indicate the first UK AI Bill is unlikely before the second half of 2026. In the interim, existing regulators, including the ICO, Financial Conduct Authority, and Competition and Markets Authority, enforce AI standards within their respective sectors.

For UK businesses with EU operations, this creates a dual compliance challenge. The EU AI Act's extraterritorial reach means UK firms developing or deploying AI systems for the EU market must comply with both frameworks.

That word matters: both.

You can't point to your vendor and claim immunity. You can't hide behind a service agreement. Whether you're operating under UK principles or EU requirements, if you're deploying the system, you're responsible for how it affects people.

Why your contracts don't work the way you think

I've reviewed dozens of AI service agreements over the past year. They follow a pattern.

The organisation using the AI tries to push all liability onto the provider. The provider limits their exposure to minimal amounts. Nobody addresses who's responsible when things go wrong in practice.

OpenAI's terms require you to indemnify and hold them harmless from third-party claims arising from your use of their services. You're protecting them, not the other way around.

Whilst Section 230 immunity is a US legal concept, the principle applies globally: AI vendors aren't platforms hosting your content. They're providing tools that generate new content, and the legal protections you assumed don't apply.

What compliance teams need to understand now

The division of responsibility between deployers and developers isn't academic. It determines who pays when something goes wrong.

A deployer using an AI system doesn't control design decisions made by the company that developed it. A developer doesn't control how another organisation deploys their system.

But both can be liable.

Here's what that means for your compliance approach:

Document everything about how you're using AI tools. Not just what tools you're using, but how you've configured them, what data you're feeding them, and what decisions they're informing.

Audit your AI vendors' compliance posture. Don't assume they're handling the regulatory side. Ask specific questions about their data handling, their security measures, and their own compliance programmes.

Map your AI systems to regulatory categories. Understand which systems qualify as high-risk under various frameworks. The EU AI Act has specific risk categories; the UK framework assesses risk proportionally within each sector.

Create clear policies about personal data and AI. Your team needs to know what information can and cannot be entered into AI tools. One employee mistake can create enterprise-wide liability under UK GDPR.

Review your insurance coverage. Most policies weren't written with AI liability in mind. You may have gaps you don't know about.

The integration challenge nobody's talking about

According to the EY survey, 80% of UK respondents reported that adopting AI has led to improvements in innovation, whilst 79% said it had improved efficiency and productivity.

But adoption doesn't mean effectiveness.

The challenge isn't creating an AI policy. The challenge is integrating AI governance into existing compliance frameworks whilst maintaining operational agility.

Your team is already managing data protection, privacy regulations, industry-specific compliance requirements, and security protocols. Now you're adding AI governance on top.

The frameworks don't align neatly. UK GDPR focuses on personal data. The EU AI Act focuses on risk levels and use cases. The UK's principles-based approach delegates to sector regulators. Your industry regulations focus on sector-specific concerns.

You need a compliance approach that addresses all of these simultaneously without creating so much friction that your organisation can't move.

What I'm seeing work in practice

The organisations handling this well share common approaches.

They've created cross-functional AI governance teams that include legal, compliance, IT, and business stakeholders. No single department owns this problem because it touches everything.

They're conducting regular AI audits to understand what systems are in use, how they're configured, and what data they're processing. Shadow AI is a bigger problem than most organisations realise.

They're building AI literacy across the organisation. Compliance teams need to understand how the technology works. Technical teams need to understand the regulatory landscape. Business teams need to understand the risks.

They're treating AI governance as an ongoing process, not a one-time project. The technology evolves. The regulations evolve. Your approach needs to evolve with them.

The path forward

You can't avoid AI. The competitive pressure is too strong. The efficiency gains are too significant. The customer expectations are too high.

But you can't ignore the liability gap either.

The organisations that succeed will be the ones that build robust governance frameworks now, before regulatory enforcement ramps up and before a major incident forces their hand.

Start by understanding your actual exposure. Map your AI systems. Review your contracts. Identify the gaps between your assumed protection and your actual protection.

Then build a governance approach that's proportional to your risk. Not every AI system needs the same level of oversight. Focus your resources where the consequences of failure are highest.

The liability gap is real. Your contracts won't protect you the way you think they will. But clear-eyed assessment and proactive governance can.

The question isn't whether to integrate AI into your operations. The question is whether you'll do it with your eyes open to the legal realities or whether you'll learn about them the hard way.

Tuesday, 6 January 2026

The metrics your CEO actually cares about (and why your current reports aren't showing them)

I've been in enough boardrooms to know what happens when marketing presents their monthly report.

The CMO walks through slides showing click-through rates, social media engagement, and website traffic. The numbers look impressive. The graphs trend upward.

And the CEO nods politely while thinking about revenue targets.

This disconnect isn't new, but it's getting worse. According to Gartner's 2026 research, 46% of CMOs identify their most urgent question as how to prioritise marketing initiatives most likely to drive growth. Meanwhile, 63% cite budget and resource constraints as their top challenge.

The pressure is real. The stakes are higher. And the old reporting playbook isn't working.

Why traditional marketing metrics fail in the boardroom

Here's what I've learned after two decades building marketing strategies: the metrics marketers love and the metrics CEOs need are often completely different.

Your CEO doesn't care about your click-through rate. They care about whether marketing is contributing to the company's ability to hit revenue targets, expand market share, and improve profitability.

The numbers tell the story. Just 14% of CEOs and CFOs view their CMO as highly effective at market shaping, according to Gartner. Yet companies with CMOs who are considered market shapers are more than twice as likely to exceed revenue and profit goals.

That gap? It's a trust problem. And it starts with how we report.

CFOs want clear proof of ROI. They speak the language of contribution margin, customer acquisition cost, and lifetime value. When you present vanity metrics that can't connect back to these core business objectives, you're speaking a different language entirely.

The vanity metric trap

Let me be direct about what qualifies as a vanity metric: any number you can't connect back to a core business objective.

That includes:

  • Social media followers (unless you can prove they convert)

  • Page views (unless they correlate with pipeline)

  • Email open rates (unless they drive measurable action)

  • Impressions (unless they build brand equity that translates to business outcomes)

These metrics feel good. They're easy to track. They often trend upward, which makes for nice presentations.

But they don't answer the question your CEO is actually asking: "Is marketing helping us grow the business?"

The most significant red flag of a vanity metric is simple. You can't draw a line from that number to revenue, profit, or market share. If the metric goes up but business results stay flat, you're measuring the wrong thing.

What CEOs actually want to see

I've watched this shift happen in real time. The conversation in 2025 moved decisively away from attribution obsession and toward business impact.

As one industry analysis I saw on LinkedIn put it: "Brands stopped obsessing over perfect attribution and started focusing on directional impact: sales uplift, contribution, and business outcomes."

This is the measurement shift that matters.

Your CEO wants to know:

How much revenue did marketing influence? Not just last-click attribution. The full contribution across the customer journey. Research shows that brands using advanced analytics report 5–8% higher marketing ROI than competitors. That advantage comes from better measurement, not better creative.

What's our customer acquisition cost relative to lifetime value? This ratio tells the story of sustainable growth. If you're spending £500 to acquire a customer worth £300, the math doesn't math.

How is brand health tracking against business performance? Data I've seen from over 7,000 consumers, covering 11,000 customer-provider relationships, showed a statistically significant correlation between brand health and sales. The healthiest brands have twice as many customers reporting increased spending compared to the worst-performing brands.

What's the return on marketing investment by channel? Not vanity metrics by channel. Actual contribution to pipeline and revenue. This enables strategic reallocation decisions.

How are we performing against market share targets? Marketing exists to build brand salience and capture market position. If your reporting doesn't connect to this objective, you're missing the point.
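The CAC-versus-LTV check above is simple arithmetic, and worth making explicit. A minimal sketch using the article's own example figures (£500 to acquire a customer worth £300):

```python
# Ratio of customer lifetime value to acquisition cost.
# Below 1.0, every new customer destroys value; healthy SaaS-style
# benchmarks are often quoted around 3.0, though that varies by sector.
def ltv_cac_ratio(lifetime_value: float, acquisition_cost: float) -> float:
    return lifetime_value / acquisition_cost

ratio = ltv_cac_ratio(300, 500)  # the article's example
print(f"LTV:CAC = {ratio:.2f}")
```

A ratio of 0.60 here is the number a CFO reads instantly: each acquired customer costs more than they will ever return.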

The integration of qualitative and quantitative data

Here's where most marketing reports get it wrong. They focus exclusively on quantitative metrics while ignoring the qualitative signals that predict future performance.

Brand health matters. A lot.

Healthy brands get more attention, generate more trust, and convert more efficiently. That translates to lower acquisition costs, better margins, and stronger long-term customer value. If your brand is doing its job, your marketing spend works harder.

The qualitative metrics that belong in your CEO report:

  • Brand awareness and consideration trends

  • Net Promoter Score and customer satisfaction

  • Share of voice in your category

  • Brand perception against key competitors

  • Customer sentiment analysis from reviews and feedback

These aren't soft metrics. They're leading indicators of business performance. When brand health declines, revenue follows. When consideration increases, conversion rates improve.

The key is connecting these qualitative signals to quantitative outcomes. Show the correlation. Demonstrate how improvements in brand perception translated to pipeline growth. Prove that increased share of voice preceded market share gains.

Building reports that drive strategic decisions

The best marketing reports don't just present data. They enable decisions.

Think about it this way. Your CEO doesn't want a history lesson. They want actionable intelligence that helps them allocate resources and adjust strategy.

This is what strategic reporting looks like:

"Given these ROI figures, we plan to reallocate £100K from underperforming channels to the top two drivers next quarter."

That sentence turns your report into a springboard for strategic choices. It's exactly what the C-suite wants.

Research shows that 83% of high-performing marketers have the executive team's complete commitment to their marketing strategy. That's 2.6 times more than what underperforming teams report. You earn that commitment by demonstrating business impact through metrics that matter.

The measurement shift that happened in 2025 was cultural, not just technical. Measurement became shared infrastructure for finance, marketing, and analytics teams. It stopped being a retrospective reporting layer and became a strategic business function.

The practical framework for better reporting

You need a reporting structure that works for both marketing operations and executive strategy sessions.

Here's the framework I use:

Executive summary (one page maximum)

  • Revenue contribution and pipeline impact

  • ROI by major channel or campaign

  • Key performance trends (up or down, with context)

  • Strategic recommendations based on the data

Business impact metrics (the core section)

  • Customer acquisition cost and lifetime value

  • Marketing-influenced revenue and pipeline

  • Conversion rates by stage and channel

  • Market share movement and competitive position

Brand health indicators

  • Awareness and consideration tracking

  • Brand perception and sentiment

  • Share of voice analysis

  • Customer satisfaction and NPS trends

Channel performance (with context)

  • ROI and contribution by channel

  • Cost efficiency trends

  • Performance against benchmarks

  • Optimisation opportunities identified

Strategic implications and next steps

  • What the data tells us about strategy

  • Recommended resource reallocation

  • Tests and experiments planned

  • Expected impact on business objectives

This structure answers the questions your CEO is actually asking. It connects marketing activity to business outcomes. And it positions marketing as a strategic function, not a cost centre.

Making the transition

Changing your reporting approach isn't just about new dashboards. It requires a shift in how you think about measurement.

Start by auditing your current metrics. Ask yourself: "If this number improves but revenue stays flat, does it matter?" If the answer is no, you're tracking a vanity metric.

Work with finance to align on definitions. Marketing-influenced revenue means different things to different people. Get clear on the methodology. Agree on attribution models. Establish shared KPIs that both teams understand and trust.

Build the infrastructure to track what matters. You need systems that connect marketing activity to pipeline and revenue. You need brand tracking that measures perception and salience. You need analytics that show contribution, not just correlation.

This takes time. It requires investment. But the alternative is continuing to present reports that don't resonate with the people who control your budget.

The competitive advantage of better measurement

Companies that get measurement right move faster and allocate resources more effectively.

When you can clearly demonstrate which marketing initiatives drive growth, you earn the trust and budget to do more of what works. When you can show how brand health predicts future performance, you make the case for long-term investment.

The measurement maturity shift isn't about perfection. It's about moving from retrospective reporting to strategic intelligence. From vanity metrics to business impact. From marketing language to executive language.

Your CEO doesn't need to understand marketing tactics. But you need to understand business strategy. The metrics you choose to report signal whether you see marketing as a creative function or a growth driver.

The gap between what marketers measure and what executives need is closing. The agencies and marketing leaders who adapt their reporting to focus on business impact will earn the seat at the strategic table.

The ones who keep presenting click-through rates and social media engagement will keep fighting for budget and credibility.

The choice is yours.

What this means for your next board presentation

Before you build your next marketing report, ask yourself one question: "Would our CFO find this data useful for making resource allocation decisions?"

If the answer is no, you're reporting the wrong metrics.

The pressure on marketing to prove value isn't going away. Budget constraints are real. The expectation that marketing directly contributes to growth is only intensifying.

You can respond by defending your vanity metrics and explaining why engagement matters. Or you can shift your measurement approach to focus on the metrics your CEO actually cares about.

Revenue contribution. Customer acquisition efficiency. Brand health that predicts future performance. Market share movement. ROI that enables strategic decisions.

These are the numbers that matter. These are the metrics that earn executive commitment. These are the reports that position marketing as a strategic function.

The measurement infrastructure exists. The frameworks are proven. The competitive advantage goes to the marketing leaders who make the shift.

Your next board presentation is an opportunity to demonstrate that you understand what the business needs from marketing. Not just creative campaigns and tactical execution, but measurable contribution to growth objectives.

The metrics you choose to report tell a story about how you see marketing's role. Make sure it's the right story.

Tuesday, 2 December 2025

The next 12 months will show which creative agencies can thrive with AI

The last year has been a bit of a fever dream: day after day, I've watched creative professionals experiment with AI tools. Some are thriving. Others are paralysed by uncertainty.

Ultimately, I think the difference is going to come down to how they're using AI over the next 12 months.

83% of creative professionals have integrated AI into their workflows. That's not a future trend. That's today. And if you're leading a creative team or managing marketing for your organisation, you need to understand what this transformation actually means for the work you do.

AI has become exceptionally good at execution. It can generate variations, automate repetitive tasks, and accelerate production timelines. What it hasn't done is replace the human ability to generate original ideas, understand emotional context, or build the strategic frameworks that drive meaningful brand growth.

The current state: AI as an amplifier, not originator

The data tells a clear story about where we are right now.

Content marketing teams save approximately 11.4 hours per week per employee using generative AI. That time doesn't disappear. It gets redirected towards strategic thinking, client relationships, and the kind of creative problem-solving that machines can't replicate.

The sensible trend in agencies at the moment seems to be letting AI handle the mechanical parts of creative work. Resizing assets for different platforms. Generating initial copy variations for A/B testing. Producing first-draft layouts that human designers then refine. 86% of organisations report revenue growth of 6% or more after integrating AI into creative processes.

The revenue growth isn't coming from replacing humans with machines. It's coming from freeing humans to focus on higher-value work.

But it’s not all as rosy as it sounds: 35% of agencies cite concerns about compromised creative quality as their top challenge with AI adoption. This concern is valid. When you optimise for speed and efficiency, you risk losing the nuance, emotion, and strategic depth that separate memorable work from forgettable content.

What's happening in the next 12 months

The creative industry is entering a critical phase of experimentation.

97% of creatives learn AI tools through self-experimentation rather than formal training. This matters because the agencies that encourage systematic experimentation right now will build competitive advantages that compound over time.

I'm seeing three distinct approaches emerge:

The resisters are avoiding AI tools entirely, hoping the trend will fade or prove less disruptive than predicted. This approach carries significant risk. The technology isn't going away, and clients are increasingly expecting AI-enhanced efficiency.

The adopters are implementing AI tools without clear strategy, using them because they feel they should rather than because they've identified specific value. This creates inconsistent results and often reinforces concerns about quality compromise.

The integrators are taking a measured approach. They're testing AI tools across different parts of their workflow, documenting what works, and building processes that combine AI efficiency with human creativity and strategic thinking.

The integrators are winning. This is what IDHL Labs is all about.

58% of professionals plan to increase AI use for creative generation in the next year, along with expanded use in chatbots, targeting, and forecasting. The question isn't whether to adopt AI. The question is how to adopt it in ways that enhance rather than diminish your creative output.

The consolidation that's coming

The creative AI landscape looks chaotic right now. Hundreds of tools promise to revolutionise different aspects of creative work. New platforms launch weekly. Features overlap. Capabilities blur together.

This won't last.

The creative tech sector has seen significant consolidation activity in 2025, aimed at pooling IP and scaling AI expertise. The market is moving towards a smaller set of dominant platforms, similar to how Amazon consolidated e-commerce and Google consolidated search.

I expect we'll see three to five major AI creative platforms emerge as industry standards over the next 18-24 months. These platforms will offer integrated capabilities across design, copywriting, video production, and campaign management.

Agencies that build expertise with these emerging platforms now will have significant advantages. They'll understand the tools' capabilities and limitations. They'll have refined workflows. They'll know when to use AI and when human creativity delivers better results. IDHL Labs will allow us to plan strategically, test with purpose and ensure we have the knowledge required to be a leader in the space when the dust settles.

Leading agencies are more advanced than advertisers in AI adoption across marketing use cases, from measurement and insights to creative and content. This gap will widen as consolidation accelerates. Through IDHL Labs, we will be positioned to capitalise on this advantage for our client partners.

What this means for creative agencies

The role of creative agencies is shifting, but not disappearing.

62% of marketers believe generative AI will augment human creativity, enhancing unique human qualities such as intuition, emotion, and context understanding. This belief aligns with what I've seen in practice.

The agencies thriving in this transition are redefining their value proposition around three core capabilities:

  1. Strategic thinking becomes more valuable, not less. AI can generate content variations, but it can't determine which strategic direction will resonate with your specific audience in your specific market context. It can't identify the emotional drivers that turn awareness into loyalty.

  2. Creative direction and curation matter more than ever. When AI can generate hundreds of options in minutes, the ability to recognise which option captures the right tone, message, and emotional resonance becomes the differentiating skill.

  3. Integration and orchestration across channels, platforms, and touchpoints requires human judgement. AI tools excel at individual tasks, but connecting those tasks into coherent brand experiences requires strategic oversight that understands business objectives, audience psychology, and market dynamics.

The agencies that position themselves as strategic partners who happen to use AI tools will outperform agencies that position themselves as AI tools that happen to employ humans. One of the founding principles of IDHL Labs is that AI will make our people better: it will act as an accelerator, amplifying the skills our specialists already have.

The governance gap that needs addressing

Here's a problem that's not getting enough attention: AI governance remains unclear in many agencies, with most lacking formal oversight structures.

This governance gap creates real risks around brand consistency, data privacy, intellectual property, and quality control. It also creates opportunity for agencies that establish clear AI governance frameworks.

With the launch of IDHL Labs, we are setting a clear marker and position. Our focus will be innovation and our processes will be transparent for our client partners and colleagues across IDHL.

It will be essential for clients to know how you're using AI in their work. They need transparency about what's human-created and what's AI-assisted. They need assurance that their data, brand voice, and strategic positioning remain protected and consistent.

The agencies building robust AI governance frameworks now, like us, will differentiate themselves as responsible partners who understand both the capabilities and the limitations of these tools.

Real-world impact: what the numbers show

The theoretical benefits of AI sound compelling. The practical results are even more convincing.

AI campaign deployment times can be reduced by up to 50%, with documented cases showing improved click-through rates and significant cost reductions. These aren't marginal improvements. They're transformational results that change what's possible within budget and timeline constraints.

But here's what the numbers don't capture: the human judgement required to achieve those results. The AI didn't decide which audience to target, which message would resonate, or which creative approach would break through competitive noise.

Humans made those decisions. AI accelerated execution.

The global AI in creative industries market is expected to grow from $1.3 billion in 2020 to $7.4 billion by the end of 2025, at a compound annual growth rate of 37.3%. This growth represents both opportunity and disruption, depending on how you respond.

What IDHL Labs will be doing in the next 12 months

If you're leading a creative team or managing marketing for your organisation, here's what I recommend, and what we'll be doing at IDHL Labs:

We will experiment systematically
We won’t sit around waiting for the market to settle. We will run controlled tests across multiple AI platforms, model types and workflow configurations. Every experiment will be documented, benchmarked and fed back into a live knowledge base that improves group-wide performance. This creates compounding advantage: institutional intelligence that no competitor can replicate.

We will formalise AI governance
Labs will define, implement and maintain the group-wide AI governance framework. That includes rules of use, client transparency protocols, data privacy protection, IP guardrails, model-selection criteria and automated compliance checks baked into workflows. Everyone working through IDHL Labs will operate with clarity and zero ambiguity.

We will build strategic firepower
As execution becomes increasingly automated, Labs will double down on what machines can’t replace: strategic insight, behavioural understanding and systems-level thinking. We will train and equip teams across the group to elevate briefs, sharpen propositions and craft creative and comms that move people emotionally and commercially.

We will engineer hybrid workflows
Labs will design, test and deploy integrated human plus AI workflows across strategy, creative, content, UX, dev and performance. This isn’t “AI in the corner.” This is AI embedded into production lines, with clear rules about when human judgement leads, when automation accelerates, and how the two compound output, speed and quality.

We will stay outcome-obsessed
We are not here to “use AI.” We are here to drive measurable client partner impact. Every AI deployment will be tied to a performance goal: lower cost, faster delivery, higher quality, improved conversion, stronger brand consistency or better insight extraction. If AI improves the outcome, it stays. If human work outperforms, we default to human.

The optimistic view

I'm optimistic about where this transformation leads.

The creative professionals I know didn't enter this field to resize assets or generate copy variations. They came to solve interesting problems, build meaningful brands, and create work that resonates with real people.

AI handles the mechanical work, freeing creative professionals to focus on what they do best: strategic thinking, emotional intelligence, and original ideas that connect brands with audiences in meaningful ways.

The agencies that embrace this shift, like we are, while maintaining their focus on human creativity, strategic depth, and client partner relationships will thrive. The agencies that resist change or adopt AI without strategic purpose will struggle.

The next 12 months will separate these two groups.

Thursday, 27 November 2025

Is your 2026 marketing plan already extinct?

Across various blogs, webinars, and social media, I’ve seen a fair few marketing directors (or marketing quick-fix bros) create their 2026 plans using 2024 playbooks.

Same budget allocation models. Same channel mix. Same assumption that incremental optimisation beats strategic reassessment.

The problem? 2025 changed the game in ways that make those plans obsolete before implementation.

What actually shifted in 2025

Every channel in your marketing mix hit a breaking point simultaneously.

Paid search costs continued their relentless climb, whilst AI-powered search experiences fragmented where your audience actually looks for information. Traditional search engine traffic (to websites) is predicted to drop by 25% in 2026 as ChatGPT, Google AI Overviews, and Perplexity become primary discovery interfaces rather than supplementary tools.

Your SEO strategy targets traditional search. Your audience increasingly doesn't use it, especially in the research phase of their discovery.

Paid social reached a different inflection point. iOS privacy changes that started years ago finally cascaded through campaign performance data. Targeting capabilities you'd relied on simply disappeared. The platforms compensated with AI-driven audience targeting, but that meant surrendering strategic control to algorithmic decision-making you can't interrogate or refine.

Cost per acquisition across paid social climbed 15-20% year-on-year for most sectors. Not because your creative declined, but because everyone's competing for a smaller targetable pool with less precise tools.

PR and earned media underwent their own disruption. Journalists adopted AI research tools en masse, fundamentally changing how they discover stories and sources. The traditional press release became even less effective. Media coverage value shifted as publications themselves struggled with AI-generated content policies and audience trust issues.

The outlets your audiences trust changed. The formats that earn coverage changed. The timescales for building media relationships compressed.

Meanwhile, organic social media reach continued its decade-long decline, but 2025 marked the point where even paid amplification couldn't compensate. Algorithm changes prioritised platform-native content over external links, making social traffic to owned properties increasingly difficult to generate, regardless of budget.

This wasn't one channel underperforming. It was a simultaneous disruption across your entire marketing mix.

The authenticity backlash nobody planned for

AI adoption accelerated exactly as predicted. 92% of marketers report AI impacting their role, with 88% using it daily.

What the predictions missed: consumer rejection. A great example here is the backlash Coca-Cola received for its ‘all-AI’ Christmas ad. You are welcome to your opinion of it, but for me it’s a soulless example of everything wrong with using AI to support your creative.

50% of consumers can now spot AI-generated content. 52% are less engaged when they suspect AI authorship without human input.

This created immediate problems across channels. AI-generated social posts achieved lower engagement rates. AI-written PR pitches got deleted faster. AI-optimised ad copy converted worse than human-crafted alternatives despite better theoretical performance metrics.

The efficiency gains you planned for? They're being offset by audience disengagement across every channel.

PPC campaigns running AI-generated display creative saw click-through rates decline as audiences developed detection skills. Social media content calendars filled with AI-assisted posts achieved reach levels 30-40% below human-created equivalents. PR coverage secured through AI-enhanced pitching generated lower authority scores and less audience engagement.

Marketing leaders are shifting focus back to brand building and authentic storytelling as the most defensible asset in an environment where everyone has access to the same AI tools. The competitive advantage isn't speed anymore. It's genuine human insight and connection.

This creates a strategic tension most 2026 plans ignore. You need AI to maintain operational efficiency across paid media management, content production, and campaign optimisation. But you need human creativity to maintain audience trust across social, PR, and brand communications.

Why your planning assumptions broke

Annual planning relies on stability. You assume channel effectiveness remains relatively consistent, that audience behaviour evolves gradually, that competitive dynamics shift predictably.

2025 violated all three assumptions across every channel simultaneously.

Your PPC budget allocation assumed stable cost-per-click trajectories and consistent conversion rates. Both assumptions failed as AI search adoption and privacy changes compounded. Your social media strategy assumed targeting capabilities that privacy updates eliminated. Your PR approach assumed media relationship dynamics that AI research tools disrupted.

The planning cycle itself became the problem. By the time you analysed 2024 data, developed 2025 insights, and built 2026 strategies, the market had already moved.

I'm seeing this across sectors. Marine industry marketers planning PPC campaigns using historical CPA benchmarks that no longer apply. Energy sector communications teams doubling down on AI-generated content whilst their audiences develop detection skills. Professional services firms allocating 40% of budget to LinkedIn when algorithm changes have halved organic reach and inflated paid costs.

The gap between planning timelines and market velocity widened dramatically. Your annual plan locked in channel budgets that became inappropriate within weeks. Your content calendar committed resources to formats that audiences stopped engaging with. Your media strategy targeted publications that changed editorial priorities mid-year.

What 2026 planning actually requires

You need a different approach. Not better forecasting, but adaptive frameworks that acknowledge uncertainty across your entire marketing mix.

Start with channel agnosticism. Don't allocate budget by historical channel performance. Allocate by strategic flexibility and ability to shift as behaviour changes. Hold 20-30% of budget in reserve for mid-year reallocation in case channels underperform or new opportunities emerge.
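As a toy illustration of that reserve rule, here is a minimal sketch of holding back a fixed share before splitting the rest by channel. The 25% reserve, the channel names, and the weights are all hypothetical, not figures from any real plan:

```python
def plan_budget(total, channel_weights, reserve_share=0.25):
    """Hold back a reserve for mid-year reallocation, then split the
    remainder across channels by weight (weights should sum to 1.0)."""
    reserve = total * reserve_share
    allocatable = total - reserve
    allocations = {ch: allocatable * w for ch, w in channel_weights.items()}
    return reserve, allocations

# Hypothetical numbers: a £100k budget with a 25% flexibility reserve
reserve, allocations = plan_budget(
    100_000, {"paid_search": 0.4, "paid_social": 0.35, "content": 0.25}
)
# £25,000 stays in reserve; £75,000 is allocated up front
```

The point of the structure is that the reserve is carved out first, so mid-year reallocation never has to claw budget back from a channel owner.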

Build integrated measurement frameworks that track cross-channel impact rather than channel-specific metrics. Your SEO content supports PPC conversion rates. Your PR coverage influences social engagement. Your paid social drives branded search volume. Stop measuring channels in isolation (I think I’ve said this every year for 10 years).

Invest in human creativity as a competitive differentiator across every channel. AI handles PPC bid management and social media scheduling. Your team's job is strategic thinking, authentic storytelling, and genuine audience understanding that machines can't replicate. This applies to PR pitching, social content creation, ad copywriting, and strategic planning equally.

Prepare for continued search transformation. Traditional SEO still matters, but answer engine optimisation, conversational discovery, and AI-mediated research require fundamentally different content strategies. Your content needs to perform in ChatGPT responses and Google AI Overviews, not just traditional search results.

Rebuild your paid media strategy around privacy-first targeting and first-party data. The targeting capabilities you lost aren't coming back. Your conversion tracking will continue to degrade. Plan campaigns that work with limited data rather than assuming historical tracking precision.

Develop PR strategies for an AI-researching media landscape. Journalists using AI research tools find sources differently. Your traditional press release distribution achieves less. Direct relationship building and genuine story value matter more than ever.

Most importantly, shorten your planning cycles. Quarterly strategic reviews aren't bureaucratic overhead anymore. They're survival mechanisms in a market that shifts faster than annual plans can accommodate. Review channel performance monthly. Reallocate budget quarterly. Reassess strategic assumptions every six months.

The uncomfortable truth

Your 2026 plan isn't broken because you made mistakes. It's broken because you built it on assumptions that 2025 invalidated.

The marketing directors I referenced at the start of this article weren't incompetent. They were applying proven planning methodologies to a market that stopped rewarding historical pattern recognition.

The question isn't whether your current plan will work. It's whether your planning process can adapt fast enough to matter.

2025 didn't just change marketing tactics. It changed the rate of change itself. Your planning approach needs to account for that acceleration, or you'll spend 2026 executing strategies that were already obsolete when you approved them.

Monday, 24 November 2025

It’s time to gear up for ‘search everywhere’

Search moved and your brand stayed put. Or, to put it into context, your brand is harder to find than ever.

At our agency, we are continually tracking search behaviour data. We, along with other leaders in the space, are watching a pattern emerge that most marketing teams likely haven’t internalised yet. The numbers reveal something fundamental: Google's grip on search just broke (or, at least, the signs are there).

In the UK, Google's search dominance slipped to 93.35% in August 2025, down from its peak. Globally, it dropped below 90% for the first time in over a decade. That alone should trigger alarm bells.

But the real story sits in how people actually find information now.

The generational fracture

Research shows 74% of Gen Z use TikTok for search, with 51% preferring it over Google. The gap between traditional search and social discovery? Nearly non-existent for younger audiences.

In the UK specifically, 71% of TikTok's 25 million users are Gen Z, and they're using the platform as their primary search tool. These aren't entertainment platforms anymore. They're discovery engines.

The data gets more striking when you look at traffic impact. UK businesses experienced an 86% collapse in website traffic growth following Google's AI search rollout in August 2024. The algorithm didn't just change. The entire search ecosystem fragmented.

What's driving this

The shift isn't about Google failing. It's about audiences fragmenting their search behaviour across platforms that serve different needs.

TikTok for visual how-to and product discovery. Reddit for community-validated recommendations. ChatGPT for conversational queries and synthesis. YouTube for deep-dive explanations. Traditional search engines for transactional intent.

Each platform operates as a distinct search engine with its own ranking factors, user expectations, and content formats. Your audience doesn't live on one platform anymore. They search everywhere.

The strategic implication

If your SEO strategy still centres on Google and Bing alone, you're optimising for a shrinking portion of search behaviour. The marine sector marketing director searching for sustainability solutions might start on LinkedIn, validate on Reddit, and deep-dive on YouTube before ever touching Google.

The professional services CMO researching brand positioning might use ChatGPT for initial research, TikTok for trend validation, and traditional search only for vendor vetting.

This changes resource allocation completely. Equal SEO attention across platforms isn't experimental anymore. It's table stakes for visibility.

The brands capturing attention into 2026 and beyond won't be the ones with the best Google rankings. They'll be the ones showing up wherever their audiences actually search. That requires integrated strategies across traditional search, social platforms, AI engines, and content hubs.

The question isn't whether to expand beyond Google. The data already answered that. The question is how quickly you can redistribute your SEO efforts before your competitors do.

Friday, 14 November 2025

Is your AI strategy ready for the next 12 months?

My ‘full on’ AI exposure is about to tick over two years. In that time, I've seen maybe 1 in 2 generative AI initiatives launch before they're operationally ready. Expectations missed, features forgotten and, worst of all, users neglected.

That's not a minor oversight. That's a strategic failure at the foundation level.

I've been analysing AI implementation patterns across UK marketing organisations, and the data reveals something critical. Budget allocation doesn't predict success. Operational readiness does.

The vast majority of AI pilot programs stall in what experts call "pilot purgatory," delivering minimal impact on actual business outcomes. Only a small fraction achieves rapid revenue acceleration.

The readiness framework

Six operational factors separate successful AI implementation from expensive experiments.

Data foundations come first. Your AI outputs will only be as good as your data inputs. Companies that skip this step encounter predictable problems. In my observation, data quality issues represent the most significant AI implementation obstacle, with the majority of organisations already using generative AI reporting problems with their data sources.

We've seen organisations learn this the expensive way. Poor data quality in machine learning models can trigger significant stock price impacts, substantial market value loss, and material revenue damage.

Pilot before you scale. The overwhelming majority of AI proofs of concept never reach production. The primary reason? Organisations lack the data infrastructure, processes, and technical maturity required for deployment.

Testing reveals gaps. Pilots expose integration challenges. Small-scale implementation protects you from large-scale failure.

Transparency builds trust. Most senior technology leaders express concerns about integrating AI into operations. Your teams share those concerns. Systems that operate as black boxes generate resistance.

Organisations are responding. Companies now actively manage multiple AI-related risks, significantly more than just a few years ago. Transparency about how AI systems work and what they can't do creates the foundation for adoption.

Creative differentiation matters more now. As AI tools become commoditised, brand voice and creative judgement become competitive advantages. Most consumers expect personalised interactions, and are substantially more likely to purchase when those expectations are met.

But personalisation requires human judgement about brand positioning and emotional connection. AI handles execution. Strategy remains human territory.

AI literacy is organisational, not individual. Knowledge gaps represent the primary AI failure factor. Only a minority of organisations believe their talent is ready to leverage AI fully.

The payoff for addressing this is measurable. Organisations investing in targeted AI education see substantially higher project success rates.

Partner selection should follow results, not presentations. Companies that purchase AI tools from established vendors see more reliable results than those building custom solutions internally. Success requires partners who can integrate deeply and adapt over time.

The competitive reality

The majority of marketers now use AI regularly, with many achieving impressive returns on investment while reducing customer acquisition costs considerably. Organisations investing strategically in AI typically see meaningful improvements in sales ROI.

The question isn't whether to adopt AI. It's how quickly you can implement it effectively.

AI won't replace marketers. But marketers using AI will replace those who don't.

The difference between those outcomes comes down to operational readiness, not budget size. Get the foundations right, and the tools become force multipliers. Skip the preparation, and you're just funding expensive experiments.

Thursday, 13 November 2025

AI amplifies everything you feed it

AI doesn't create problems. It magnifies the ones you already have.

I've seen countless individuals and organisations rush toward AI implementation without asking the fundamental question: what exactly are we scaling?

The answer matters more than the technology itself.

The amplification effect

Research from UCL and MIT reveals something striking. AI doesn't just mirror human biases. It amplifies human biases by a factor of three.

When participants disagreed with AI recommendations, they changed their decisions 32.72% of the time. When disagreeing with other humans? Just 11.27%.
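As a quick sanity check on those two figures, the gap between the switch rates is itself roughly a factor of three:

```python
human_switch_rate = 11.27  # % of decisions changed after disagreeing with a human
ai_switch_rate = 32.72     # % of decisions changed after disagreeing with an AI

# People changed their minds for AI recommendations about 2.9x as often
amplification = ai_switch_rate / human_switch_rate
```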

The technology creates a feedback loop. Small biases become large ones. Large ones become systemic.

This pattern extends beyond bias. AI scales whatever you feed it: insight or ignorance, precision or sloppiness, strategy or guesswork.

The opportunity side

The upside is real. McKinsey found that fast-growing organisations drive 40% more revenue from personalisation than slower competitors. Companies using AI-powered personalisation report an average 25% increase in marketing ROI.

71% of consumers now expect personalised content. AI makes that scale possible.

The same amplification that magnifies risk also multiplies opportunity. AI can process customer data, identify patterns, and generate personalised content at speeds human teams can't match.

But speed without direction is just expensive noise.

The data quality problem

Here's where most implementations fail. Poor data quality costs UK businesses £900 billion annually. More telling: 85% of AI projects fail because of poor data quality.

Garbage in, garbage out. The old programming axiom applies with exponential force in AI systems.

Amazon's AI recruitment tool had to be shut down entirely. It discriminated against female candidates because it learned from a decade of biased hiring data. The bias couldn't be eliminated because the foundation was flawed.

When you train AI on incomplete, outdated, or biased data, you don't get neutral results. You get amplified versions of those flaws, deployed at scale.

The blandness crisis

Even with clean data, another risk emerges. Generic input produces generic output, multiplied across every channel.

Marketing experts warn of an emerging "blandness crisis." When brands rely on AI without human oversight, everything starts sounding the same. Over 200 overused AI phrases now signal generic content.

Gen Z, the most digitally fluent generation, actively rejects AI-generated content that feels fake. They can detect when the human spark disappears.

AI can scale your voice. But if you don't have a distinctive voice to scale, you're just producing more noise.

The human factor

AI is a tool designed to enhance, not replace, human creativity. The best results come from AI-augmented work, not AI-replaced work.

Generic prompts produce generic content. Specific direction and constraints applied by humans produce distinctive, engaging output from the same AI.

The technology amplifies your strategic thinking, your brand understanding, your audience insight. Or it amplifies your lack of those things.

Good business leaders see AI as a catalyst for job creation rather than destruction. The technology reshapes roles, allowing humans to focus on higher-order tasks: creativity, strategic thinking, emotional intelligence.

What this means

AI will scale your marketing capabilities. That's certain.

The question is what you're scaling. Clear strategy or confusion? Brand distinction or generic messaging? Accurate data or flawed assumptions?

The technology doesn't judge. It just multiplies.

Before you implement AI tools, audit what you're feeding them. Check your data quality. Examine your strategic clarity. Define your brand voice with precision.

AI will amplify everything you give it. Make sure you're giving it something worth scaling.

Monday, 10 November 2025

Your agency is performing integration theater

They call it collaboration. I call it expensive theater.

Your agency presents integrated teams in pitch meetings. They show org charts with dotted lines connecting departments. They use words like "synergy" and "cross-functional excellence."

Behind the curtain, each department operates independently. Different tools. Different metrics. Different definitions of success.

The performance costs you real money.

The act looks convincing

Watch any agency status meeting. The paid search lead presents metrics. The social team shares engagement numbers. The content strategist discusses editorial calendar. Everyone nods.

But they're not actually working together. They're taking turns presenting work that happened in isolation.

Research shows agencies now juggle a growing fragmentation of marketing technology. More platforms mean more silos. UK agencies report inefficient processes as their primary operational challenge.

That inefficiency flows directly to your budget.

Your budget bleeds while they perform

Large organisations waste up to 10% of operational budgets on redundant efforts caused by silos. For a company spending £100 million on operations, that's £10 million evaporating annually.

Think about your marketing spend through that lens.

Your paid team bids on keywords without consulting content. Your social team creates campaigns disconnected from your email strategy. Your analytics sit in separate dashboards nobody reconciles.

Each silo duplicates work. Each handoff loses context. Each disconnection multiplies cost.

The coordination tax compounds daily.

The trust gap nobody discusses

There's a fascinating perception gap in client-agency relationships. Research shows a significant disconnect between how clients and agencies view their working relationships, with trust remaining a persistent challenge across the UK marketing industry.

That's not a communication problem. That's a trust collapse.

When agencies perform integration instead of practicing it, clients sense the disconnect. They just can't always articulate what feels wrong. The metrics look fine. The meetings seem productive. But something's off.

What's off is the expensive performance hiding the operational reality.

Recognition changes everything

Integration theater thrives on clients not knowing what real collaboration looks like. Once you recognise the performance, you see it everywhere.

Watch your next agency meeting differently. Notice who actually coordinates before presenting. Observe whether teams reference each other's work or just share space on the agenda. Ask how decisions get made between departments, not just within them.

Real integration shows up in shared tools, unified metrics, and genuine coordination costs baked into workflows.

Fake integration shows up in presentations.

Your budget knows the difference, even when the performance looks convincing.

Thursday, 6 November 2025

Why separated teams always produce mediocre brands


Most brands are bleeding revenue through a crack they can't see.

The separation between creative and strategy teams feels like smart organisational design. Strategists analyse markets and develop positioning. Creatives bring ideas to life with design and copy.

Clean division of labour.

I've seen this pattern across dozens of organisations, and the data reveals a different reality. When these teams operate in silos, brands become forgettable. Messaging fragments. Visual identity drifts. The customer experience feels disjointed because it is.

Almost half of companies admit their marketing still suffers from silo thinking, according to ISBA research. Only one-quarter feel satisfied with how their teams coordinate.

The dissatisfaction makes sense when you see the cost.

The hidden tax of separation

Teams waste more than 20 hours monthly because of poor collaboration. That adds up to six work weeks annually. In North America, the number climbs to 28 hours monthly.

Six weeks of productive time lost to organisational friction.

Meanwhile, campaign effectiveness depends heavily on creative quality. Google's research shows up to 70% of performance ties directly to the creative work itself. When strategy and creative teams don't collaborate deeply, that 70% suffers.

The maths gets worse. Brand consistency can drive revenue growth up to 20%. But consistency requires tight coordination between strategic direction and creative execution. Separated teams produce inconsistent brands.

Why the dysfunction persists

Organisations separate these functions believing specialisation improves outcomes. Strategists need space to think. Creatives need freedom to explore.

The assumption sounds reasonable until you examine what actually happens.

Strategy teams develop frameworks in isolation. They hand over briefs that feel complete but lack creative input. Creatives receive these briefs and spot immediate problems. The strategic direction doesn't account for execution realities. The positioning sounds good in theory but falls flat visually.

So creatives adapt. They interpret. They fill gaps.

The result resembles a game of telephone. Strategic intent gets diluted through translation. Creative execution drifts from the original vision. Neither team feels ownership of the final output.

Customers experience this disconnect as brand mediocrity. The messaging doesn't quite land. The visuals feel disconnected from the promise. The overall experience lacks the coherence that builds loyalty.

AI amplifies the divide

AI tools promise to bridge the gap between strategy and creative execution. Instead, they're making the separation worse.

Strategists now use AI to generate positioning frameworks and messaging architectures without creative input. Creatives use AI to produce variations and assets without strategic context. Both teams move faster in isolation, which sounds productive until you realise they're moving in different directions.

The speed creates an illusion of efficiency. Strategy teams can produce ten positioning options in the time it used to take for one. Creative teams can generate fifty visual concepts before lunch. But none of it connects because the collaboration still doesn't happen.

AI becomes another layer of translation rather than integration. The tools optimise for individual team productivity while the brand suffers from the same fragmentation, just at higher velocity.

The integration advantage

Companies that align cross-functional teams generate 208% more revenue than those operating in silos, according to LinkedIn research. The advantage comes from eliminating translation layers.

When strategists and creatives collaborate from the beginning, strategy becomes executable. Creative work stays anchored to clear positioning. The feedback loops tighten. Problems surface earlier when they're easier to fix.

McKinsey found that brands scoring high on creativity metrics see 70% better organic revenue growth and shareholder returns. But creativity without strategy produces novelty without purpose. Strategy without creative partnership produces clarity without resonance.

The integration produces something neither team creates alone.

What this means

The organisational structure you choose determines the brand you build. Separated teams produce separated experiences. Integrated teams produce coherent brands that customers remember and choose.

The question becomes whether your current structure serves the brand you want to create or the one you're accidentally building.
