If you work in an agency today, you don’t need a panel discussion to tell you that generative AI has entered the building. It’s already there — quietly sitting inside decks, brainstorm docs, caption drafts, mood boards, and editing timelines. What’s interesting, though, is how rarely it shows up in the work that truly stands out. Not because teams aren’t using AI, but because the best teams aren’t letting it lead. From where I sit, working closely with social and content teams, generative AI has become less of a headline-grabbing disruption and more of an infrastructure shift. It hasn’t replaced creativity. It has simply forced us to confront what parts of our creative process were never that creative to begin with.
In practical terms, AI has changed the shape of creative workflows, not their soul. GPT-based tools are being used to open up thinking — exploring alternate hooks, reframing briefs, stress-testing ideas before they go into full production. Visual teams are using image generators to explore directions faster, not to finalise executions. Automated editing tools are speeding up turnarounds, allowing creatives to spend less time fixing captions or resizing assets and more time arguing about ideas — which is where the real work happens anyway. The mistake many teams make is treating AI like a vending machine: insert prompt, collect output, move on. That approach produces content, not creativity. The teams seeing real value are the ones using AI to provoke thinking, not conclude it. They ask better questions. They discard more outputs. They know when to stop listening to the tool and start trusting instinct.
What becomes very clear, very quickly, is that AI does not remove the need for judgment — it magnifies it. When you suddenly have ten versions of an idea instead of two, someone still has to decide what feels right, what fits the moment, and what could quietly backfire. This is especially true in a country like India, where cultural context changes block by block, language carries layered meaning, and audiences are far more perceptive than dashboards give them credit for. AI doesn’t know when silence is more powerful than content. It doesn’t know when a joke will land or when it will offend. Those decisions still come down to human experience. As Nandan Nilekani has pointed out in conversations around technology and society, “Technology can amplify human capability, but empathy, creativity and leadership remain fundamentally human.” In creative work, amplification without empathy is just noise.
There’s also an uncomfortable but necessary conversation happening inside agencies about ethics — and it’s a good thing. Questions around originality, authorship, bias, and transparency aren’t theoretical anymore; they’re operational. Who owns an idea that was shaped with AI assistance? How do we ensure we’re not reinforcing stereotypes or recycling the same cultural references endlessly? The more thoughtful agencies are responding not by banning AI, but by putting guardrails around it. Clear internal guidelines. Honest conversations with clients. An understanding that using AI doesn’t absolve anyone of responsibility — if anything, it increases it. Creativity has always involved borrowing, remixing, and reinterpreting. AI just makes that process faster, which means the ethical line needs to be drawn more consciously, not less.
What’s becoming increasingly obvious is that where AI enters the workflow matters more than whether it enters at all. Used only at the end — to generate captions or automate edits — it becomes a productivity hack. Used early — during ideation, framing, and exploration — it becomes a creative accelerant. It allows teams to test more ideas, challenge their own biases, and arrive at stronger thinking before committing time and budgets. But it also exposes weak briefs, unclear strategies, and lazy thinking very quickly. AI doesn’t fix those problems; it highlights them. In that sense, it’s an uncomfortable mirror for the industry.
The fear that AI will “replace creatives” misunderstands what creativity actually is. Creativity isn’t output. It’s decision-making under uncertainty. It’s taste. It’s timing. It’s knowing when to break the pattern everyone else is following. Algorithms are designed to optimise patterns, not disrupt them. Some of the most impactful creative work doesn’t perform immediately, doesn’t look obvious, and doesn’t test well in advance. It works because someone believed in it before the data did. That belief — that leap — cannot be automated.
As generative AI becomes standard across agencies, the advantage won’t lie in who uses it, but in who uses it thoughtfully. The future belongs to teams that understand technology deeply enough to know its limits, and themselves well enough to trust their instincts. AI will continue to evolve. Tools will get faster, cheaper, and more impressive. What won’t change is the need for ideas that feel intentional, culturally grounded, and emotionally real. In the end, generative AI doesn’t replace creativity. It exposes what creativity was always supposed to be.