Delegating tasks was more than just ‘splitting up work’
For some time now, I’ve noticed something different in the way I work. Not so long ago, for certain tasks — researching information, structuring data, drafting outlines, even sketching first hypotheses — I used to rely on an associate, usually an entry-level analyst or someone in their first role with real responsibility. It was a natural part of the system: someone young, eager to prove they could contribute, would take on those small pieces of a bigger whole.
AI and automation: tasks I no longer delegate
Now, many of those tasks no longer need to pass through someone else. I handle them myself — but not because I’ve suddenly become a wizard juggling spreadsheets, summaries, drafts and presentations. The difference lies in the tools I use: artificial intelligence.
Tools that are increasingly accessible and powerful allow me to internalise work I used to delegate. I don’t do it all myself: I do it with AI. This has subtly shifted my own professional balance, but it also externalises a cost we don’t always see: by taking on those microtasks myself, I shut out people who could have done them as part of their learning curve. Those tasks were small pieces of something bigger; they helped juniors build the context, judgement and confidence to take on more responsibility in the future.
“Internalising tasks without considering this cost means externalising an impact that weakens the professional ecosystem by shrinking the space where the next generation learns to grow.”
I’m not saying I don’t need an analyst anymore. I still do — and a lot — but for fewer things. Or rather, for different things. What used to be a long list of fragmented tasks has shrunk, because the most routine, repetitive or low-value pieces now get done in minutes with a good prompt. This shift isn’t just happening in my world: even the Big Four are cutting graduate roles as AI reshapes the nature of consulting work, as reported here.

Efficiency and its side effects
So far, so good: efficiency, speed, autonomy. But there’s a part that makes me uneasy.
Those tasks I no longer delegate didn’t just produce a deliverable for a client. They also produced a learning opportunity. Each request came with an explanation: “I need this analysis because it’s the basis for testing this hypothesis. If we validate it, we can take this approach. If not, we adjust here.”
The deliverable was just the excuse. The real value was the rationale behind it. Understanding why something is requested, how it fits into a wider process, how each part feeds the whole: analysis → strategy → execution → monitoring → feedback.
The risk of skipping the learning stage
That bigger picture is gold for someone starting out. If you just execute without asking questions, you learn nothing. But if you execute and hear the ‘why’ behind each step, you start spotting patterns, connections and dependencies. That’s where an analyst, for example, stops being just an extra pair of hands and starts becoming someone ready to take the next leap.
“If AI takes over those microtasks, it also takes over part of the learning journey. And it does so at a critical moment in a professional’s development.”
This isn’t a problem with the technology itself — it’s about how we use it. If efficiency simply replaces pedagogy, we lose something valuable.
Why explaining the context still matters
These days, every time I choose not to pass a task to someone because AI can handle it faster, I try to ask myself: What part of the context am I no longer sharing? Sometimes it’s worth explaining the reasoning anyway, even if there’s no spreadsheet left to delegate. Because understanding the ‘why’ — and the ‘what’s next’ — still makes the difference between a good associate and the future top professional they could become.