
The Question Nobody in the Extended C-Suite Is Asking Loudly Enough

  • Writer: Charles Baker
  • 18 hours ago
  • 8 min read


I published a blog post early last week about how AI is reshaping the core C-suite. The piece got a decent response, but what I didn't expect was the number of direct messages that followed. Not from CEOs or CFOs. From the people who sit one layer out from that inner circle.


Chief Information Officers. Chief Data Officers. Chief Strategy Officers. Chief Marketing Officers, and one Chief Sustainability Officer. I don't want to overstate the numbers; it was around 20 direct messages. Not a large number, but big enough to warrant a deeper dig into the topic and the research.


The messages weren't asking me to validate the importance of their roles, or whether their jobs were secure. Mostly, people were asking a subtly different question. I can't list all 20 here, but the common thread was: where do I actually stand?


I want to give that question the honest answer it deserves, rather than have the same conversation fifty times over coffee. I'll do my best not to rewrite last week's post, though I expect a fair amount of overlap.


The uncomfortable starting point

Most of the extended C-suite was created to manage complexity.


Organisations got bigger, faster, and harder to coordinate. Regulatory environments got thornier. Data became overwhelming. Sustainability went from a PR line to a balance sheet risk. Customer relationships fractured across a dozen channels. So companies created senior roles to hold those problems.


That was a reasonable response to the world as it was, until recently.


Here's the problem. AI is dissolving certain kinds of complexity. Not all of it, and not overnight, but the specific kinds that many of these roles were built to manage: coordination, reporting, monitoring, compliance tracking, data synthesis, campaign execution, content production.


Which means the question is no longer just "what skills do I need going forward?"

The more important question is: does my role have a legitimate reason to exist beyond the one it was originally created for?


That is an uncomfortable thing to sit with. But the leaders who ask it honestly are the ones who will still matter in five years. The ones who don't will spend the next decade defending a role that nobody has the heart to cut but nobody can quite justify keeping.


As an interesting side topic, go back in time and track the lifespan of certain roles:


Is anyone reading this old enough to remember the "Chief Administrator", the "Chief Reporting Officer", or the "Chief Information Gatekeeper"? All were once considered essential C-suite roles until twenty or thirty years ago (forgive me if you've seen versions of these roles more recently).


Fast forward 100 years and we will no doubt think about some of today's roles in much the same way.


What this looks like for roles today

I'll do my best to be specific, because vague reassurance is the least useful advice I can offer here.


Chief Data Officer / Chief AI Officer

This role is paradoxically positioned. It is responsible for the technology that is restructuring everything else, yet genuinely at risk of being bypassed if AI capability migrates into individual business units rather than remaining centralised.


The CDO or CAIO who survives the next few years is not the one who owns the tools. It is the one who owns the judgment around them. Governance, ethics, decision architecture, and the harder question of how an organisation uses AI in ways that build trust rather than quietly erode it.


If your value proposition is "I understand the technology," you are probably one good hire away from being replaceable. If it is "I help this organisation make better decisions about how and where to trust AI," that is a different conversation entirely, and one that will only become more important.


Chief Sustainability Officer

This role has moved through a strange cycle in the past decade, from compliance function to strategic priority to, in some organisations, a political liability. AI does not resolve that tension. What it does is change how sustainability is measured, reported, and integrated across the business.


The risk for CSOs is staying in the reporting and communications lane while the strategic substance migrates to the CFO, the COO, or the board directly. The opportunity is becoming a genuine systems integrator, connecting ESG commitments to capital allocation, supply chain decisions, workforce strategy, and regulatory risk in ways that create tangible business value. One of those is a defensible role. The other is a function waiting to be absorbed or eliminated.


Chief Strategy Officer

The honest truth about this role is that AI is better at parts of it than most practitioners want to admit. Market scanning, scenario modelling, competitive intelligence, pattern recognition across large data sets. These are not peripheral activities for strategy teams. They are the core of what many of them do day to day.


What AI cannot do is navigate the politics, the cultural resistance, and the human judgment calls that determine whether a strategy actually gets implemented. A Chief Strategy Officer who repositions around facilitation, synthesis, and closing the gap between insight and action is genuinely more valuable in an AI-augmented organisation. One who is primarily the analyst-in-chief is in real trouble.


General Counsel / Chief Legal Officer

Legal work sits in an interesting place. AI is already transforming contract review, compliance monitoring, discovery, and legal research. The transactional, precedent-heavy, high-volume work is automating quickly.


What cannot be automated is judgment in genuinely novel territory. Advising a board on reputational exposure. Navigating regulatory ambiguity in a new market. Holding the line on ethics when commercial pressure is running in the other direction. The General Counsel whose value is built on being the most senior lawyer in the room is becoming less differentiated. The one who is the organisation's most trusted thinker on risk, consequence, and institutional trust is becoming more central.


Chief Communications Officer

AI can generate content, personalise at scale, optimise for engagement, and do a reasonable impression of brand voice with the right prompting. What it cannot do is decide what an organisation stands for, or how it should speak when things go wrong, or what kind of trust it is trying to build with the people it serves over time.


The communications leaders who will matter most are the ones who move firmly into the territory of meaning and institutional character, and let AI handle the execution layer they were probably doing manually anyway. The ones who stay in production and channel management are managing work that is becoming less defensible by the quarter.


Chief Customer Officer

Customer experience is becoming more AI-mediated, which makes the human judgment calls around it more consequential, not less. Where is automation improving the experience and where is it eroding it? At what point does efficiency tip into feeling uncared for? How do you build genuine loyalty when most interactions are algorithmically managed?


The CCO who becomes the organisation's conscience on customer humanity, the person who pushes back when efficiency arguments are being used to justify things that quietly damage trust, has a role no model can replicate. The one who is primarily managing the CRM stack and the NPS dashboard probably does not.


The skills that actually matter now

This is what most of the messages I received were really asking about. So let me be direct.


Work with AI rather than around it

This does not mean becoming a data scientist or understanding how models are trained. It means developing enough working fluency that you can ask precise questions, challenge outputs when they feel wrong, recognise when a model is confidently incorrect, and make sensible decisions about where AI should and should not be trusted with consequential work.


The leaders who are going to struggle are the ones who treat AI as either a threat to be managed or a black box to be deferred to. The ones who engage with it actively, with genuine curiosity and appropriate scepticism, will be far better placed to govern it well. Start practically. Use the tools. Build your intuitions through experience rather than briefings.


Develop a real systems view

Most extended C-suite leaders became excellent within a domain. That expertise still matters. But the premium is shifting toward understanding how your domain connects to the rest of the enterprise.


How does your work affect capital allocation? Where does it interact with the talent strategy? How does it touch the customer? Where does it create or absorb risk that sits elsewhere in the organisation? Leaders who can answer those questions fluently are harder to cut and easier to elevate. Leaders whose thinking stops at the boundary of their own function are more exposed than they realise.


Get more comfortable with genuinely incomplete information

AI will improve the quality of analysis available to decision-makers. It will not remove the situations where two reasonable people read the same analysis differently, or where the right call conflicts with what the data suggests, or where speed and rigour are both defensible priorities pulling in opposite directions.


The capacity to make a call under those conditions, and to be genuinely accountable for it, is something you develop through exposure and honest reflection on how your decisions land. You cannot outsource it, and you cannot automate it. It is the work that remains.


Build your influence, not just your authority

The extended C-suite has always operated more through influence than direct control. AI accelerates this because it further reduces the value of owning or controlling information. The value is in what you do with it.


That means communication, persuasion, and the ability to make complex ideas land with different audiences who have different priorities and different levels of patience. These are not soft skills. They are the primary mechanism through which extended C-suite leaders create impact, and they are worth investing in deliberately.


Take ethical judgment seriously as a professional discipline

As AI moves into operational and strategic decision-making, the questions of fairness, trust, and consequence become leadership questions rather than compliance questions. Someone needs to ask them. Someone needs to ask them loudly enough to be heard, and credibly enough to be taken seriously.


The leaders who build a genuine reputation for that kind of judgment will be trusted with decisions that matter. It is worth treating it as a capability to develop rather than a value to declare.


What to stop investing in

Most conversations about future-proofing focus on what to add. I think the harder conversation is about what to subtract.


The capabilities most worth de-prioritising are not broad skill categories. They are the specific activities that AI will increasingly do faster, cheaper, and with less friction than a senior executive can. Manual reporting and analysis assembly. Repeating information that already lives in data systems. Process administration that exists to manage complexity AI is now managing. Execution-layer work in communications, legal, marketing, and strategy that once required specialist expertise but increasingly requires a well-constructed prompt.

The risk is not losing those tasks. The risk is continuing to invest your identity, your time, and your energy in them after they stop being differentiators.


Your time is the scarcest resource you have. The honest question is whether you are spending it on work that is becoming less valuable or work that is becoming more so.


The harder truth

Several of the people who messaged me were really asking whether they should be worried.


Honestly, yes, a little. But not about the thing most people think.


The extended C-suite roles that will struggle are not the ones facing a sudden moment of disruption. They are the ones that simply fail to evolve, that stay anchored in what justified their creation decades ago, managed by people who are excellent at a version of their role that is quietly shrinking.


The ones that will matter are the roles that have been honestly interrogated and deliberately repositioned around judgment, trust, and the kind of integration that no system can do alone.


That does not require reinventing yourself. It requires being clear-eyed about where your role is heading, making deliberate choices about what to build, and staying curious enough to keep adapting as the landscape shifts underneath you.


The leaders I have the most confidence in are not the ones who were best prepared for the last version of the job.


They are the ones who started asking the hard questions early enough to do something useful with the answers.




