
Using AI for marketing strategy can lead you to falsely believe you're differentiating your brand.
Nowhere is AI causing more disruption in business than in marketing. From automated campaigns to digital customer service agents, generative AI is taking over much of the marketing process.
Marketing strategy is one of the most easily rationalized functions for outsourcing to AI. It's costly, it's a time sink for key executives, it's a grinding process, and the outcome can feel highly subjective.
Comparatively, a few minutes with AI can produce a strategic brief that reads as professionally and credibly as the product of an expensive consulting engagement.
But several factors make this a bad bet, one that delivers apparent short-term gains at the cost of long-term strategic weakness.
Watch Me Pull a Rabbit out of a Hat
Let's illustrate one of the key under-reported weaknesses of AI with a simple parlor trick. I'm going to give you a generative prompt for ChatGPT that in no way indicates the details of what AI will create, and yet before you run this prompt I'll tell you with surprising specificity what the result will look like. Here's the prompt:
Generate an image that looks like an iPhone 6 flash photo, that looks chaotic and uncanny.
Go ahead and try this with ChatGPT before you read on, then compare your result to the description below. (You can do this on Gemini, but due to differences in training, the result will be somewhat different.)
Your photo will almost certainly look like the depressing aftermath of a raging house party with creepy dolls and perhaps other odd characters like stuffed animals looking into the camera amid all the post-party trash and clutter.
Why Does This Work?
No two people will generate exactly the same image, and yet the vast majority of images generated from this prompt on the same LLM will share the same elements and the same vibe.
Each permutation seems unique to each user, and yet the results are so similar that anyone cross-checking outputs across users would immediately recognize them as belonging to the same group. For each individual who runs the prompt, the result doesn't seem like it can be predicted from the words, so each person believes their result is unique to them.
Run the same prompt on a different model and you may get a different vibe, but the vibe will look the same across all results from the same model.
What makes this parlor trick work is that AI is looking for words in the prompt to understand your meaning. It then searches through its training to find the most common themes attached to those words to come up with ideas for the image.
In this case, three elements in the prompt map to a very predictable outcome: "iPhone 6 flash photo," "chaotic," and "uncanny." While humans see these elements as vague and relatively generic, to AI they immediately map to thousands or even millions of records in its training.
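The mechanism can be sketched in a few lines of code. This is a toy model with an invented frequency table, not how a real LLM works internally, but it shows the core point: each generic cue retrieves its most common associations, so different users' prompts collapse to the same output.

```python
# Toy sketch (hypothetical data): each prompt cue retrieves the themes most
# frequently attached to it in a made-up "training corpus" table.
THEME_FREQUENCY = {
    "iphone 6 flash": ["harsh lighting", "dated low-res look", "candid snapshot"],
    "chaotic": ["post-party clutter", "scattered trash", "overturned furniture"],
    "uncanny": ["creepy dolls", "stuffed animals staring at the camera"],
}

def generate_themes(prompt: str) -> list:
    """Return the highest-frequency themes attached to each cue in the prompt."""
    themes = []
    for cue, associations in THEME_FREQUENCY.items():
        if cue in prompt.lower():
            themes.extend(associations)
    return themes

# Two different users phrase the prompt differently, but because they use
# the same generic cues, they retrieve the same theme set.
user_a = generate_themes("An iPhone 6 flash photo, chaotic and uncanny")
user_b = generate_themes("Something uncanny and chaotic, like an iPhone 6 flash photo")
print(sorted(user_a) == sorted(user_b))  # True
```

Each user sees a result that feels generated just for them, yet the underlying lookup is identical for everyone who supplies the same cues.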

Why This Matters for Strategy
What's happening in this demonstration is similar to what happens when companies turn to AI for strategy. Generic inputs map to high-density regions of the training data. The output feels personalized because you supplied the prompt, but the model produces a similar result for every other prompt built from the same generic inputs, including the prompts your competitors are feeding to their LLMs.
Since the late 90s, I've worked in marketing strategy in Silicon Valley. I've worked with hundreds of technology companies on brand strategy, positioning, advertising and sales. One of the most time-honored truisms of this space is that the vast majority of companies are very weakly differentiated.
Differentiating your brand clearly from competitors is one of the most critical elements of marketing strategy. Yet the vast majority of technology leaders focus on positioning elements just like the ones we see in the uncanny parlor trick—elements that seem unique to the user, but map to generic concepts that aren't in fact differentiated at all.
Ask technology companies what makes them unique to their customers, and the majority will list attributes like quality of service, customer focus, responsiveness, innovation, expertise, security and ROI.
Put that list into a prompt asking for a differentiated positioning strategy, and you'll get a response that feels distinctive but will likely be functionally interchangeable with your competitors'.
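You can make the overlap concrete with a quick back-of-the-envelope check. The company names and attribute sets below are invented for illustration; the point is that when everyone draws "differentiators" from the same generic pool, pairwise overlap is high before any prompt is even written.

```python
# Toy illustration (invented companies and attributes): measure how much
# claimed "differentiators" overlap when drawn from the same generic pool.
from itertools import combinations

claimed_differentiators = {
    "CompanyA": {"quality of service", "customer focus", "innovation", "security"},
    "CompanyB": {"customer focus", "responsiveness", "innovation", "ROI"},
    "CompanyC": {"quality of service", "expertise", "innovation", "security"},
}

def jaccard(a: set, b: set) -> float:
    """Jaccard similarity: size of intersection over size of union."""
    return len(a & b) / len(a | b)

for (name1, s1), (name2, s2) in combinations(claimed_differentiators.items(), 2):
    print(f"{name1} vs {name2}: {jaccard(s1, s2):.0%} overlap")
```

Any attribute shared by every pair (here, "innovation") is by definition not a differentiator, no matter how distinctive the AI-generated prose around it feels.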
Getting technology leaders to dive deeper to find the differentiating characteristics that make their brand unique requires challenging cherished beliefs, digging into customer and employee experiences, forcing difficult comparisons to competitors, and often questioning everything the company's market approach previously treated as settled.
This is hard work that AI appears to solve. AI offers a magical shortcut that saves time and money and produces an immediate, credible-sounding strategic brief. We accept the brief as unique because, just like a horoscope or a fortune-teller's reading, we immediately personalize what's presented to us as uniquely specific to us.
We have no idea that the generic "differentiating" ideas we put into our strategy prompts index directly to the same ideas in the same models that every one of our competitors is using for the same purpose.
But Wait, Can't We Solve that with AI?
The most common solution I hear for this dilemma is leveraging AI to constantly modulate content to improve response, as if A/B testing unleashed by AI at scale could become a strategy engine through continuous market validation.
But this is a fallacy. Testing optimizes response within a given frame; it doesn't tell you whether the frame is right. If your conversion rate goes from 2% to 5% that looks like a big win, but it doesn't tell you whether or not that's the customer you should be acquiring. You may have just built an efficient pipeline for customers who churn in six months, or for a segment that caps your growth at a fraction of your real market.
Formulating strategy requires judgment, qualitative signals, and the formulation and challenging of hypotheses. When you outsource this critical process to AI, you're outsourcing thinking to a machine that doesn't think, but instead provides the most credible-sounding result according to thousands of previous strategy documents in its training set.
To put it simply, testing can help refine your message, not formulate it. In a similar way, AI can help you identify and work through the right processes for developing a strategy, but handing it the task of formulating your strategy is a fool's errand.
The takeaway: the same property that makes AI so powerful is what makes it strategically dangerous. Pattern matching against a giant corpus of data makes AI look genius. But it’s optimizing for plausibility within that data set, while strategy requires finding white-space positions outside of it.
Scaling models is unlikely to solve this problem in the short run. But there are ways to improve strategic output by approaching it in a different way.
Over the next few posts, we'll talk more about the challenges and solutions for working on marketing strategy with AI.