
What To Do When The Customer Isn’t Always Right


There are three legs to the lean startup concept: agile product development, low-cost (fast to market) platforms, and rapid-iteration customer development. When I have the opportunity to meet startups, they usually have one of these aspects down, and need help with one or two of the others. The most common need is becoming more customer-centric. They need to incorporate customer feedback into the product development and business planning process. I usually recommend two things: try to get the whole team to start talking to customers (“just go meet a few”) and get them to use split-testing in their feature release process (“try it, you’ll like it”).
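
To make the split-testing suggestion concrete, here is a minimal sketch of one way to gate a feature release behind an experiment. The experiment name, user ID, and helper function are purely illustrative assumptions, not anything prescribed by the lean startup method itself:

    import hashlib

    def assign_variant(user_id: str, experiment: str,
                       variants=("control", "new_feature")) -> str:
        """Deterministically bucket a user so they always see the same experience."""
        digest = hashlib.sha256(f"{experiment}:{user_id}".encode()).hexdigest()
        return variants[int(digest, 16) % len(variants)]

    # Example: roll out a hypothetical new onboarding flow to roughly half of users.
    variant = assign_variant("user-42", "new-onboarding-flow")
    if variant == "new_feature":
        print("render the new onboarding flow")   # placeholder for the new experience
    else:
        print("render the existing onboarding flow")

Because the assignment is deterministic, you can later compare conversion between the two buckets without having to store who saw what.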

However, that can’t be the end of the story. If all we do is mechanically embrace these tactics, we can wind up with a disaster. Here are two specific ways it can go horribly wrong. Both are related to a common brain defect we engineers and entrepreneurs seem to be especially prone to. I call it “if some is good, more is better” and it can cause us to swing wildly from one extreme of belief to another.

What’s needed is a disciplined methodology for understanding the needs of customers and how they combine to form a viable business model. In this post, I’ll discuss two particular examples, but for a full treatment, I recommend Steve Blank’s The Four Steps to the Epiphany.

Let’s start with the “do whatever customers say, no matter what” problem. I’ll borrow this example from randomwalker’s journal – Lessons from the failure of Livejournal: when NOT to listen to your users.

The opportunity was just mind-bogglingly huge. But none of that happened. The site hung on to its design philosophy of being an island cut off from the rest of the Web, and paid the price. … The site is now a sad footnote in the history of Social Networking Services. How did they do it? By listening to their users.

randomwalker identifies four specific ways in which LJ’s listening caused them problems, and they are all variations on a theme: listening to the wrong users. The early adopters of LiveJournal didn’t want to see the site become mainstream, and the team didn’t find a way to stand up for their business or vision.

I remember having this problem when I first got the “listening to customers” religion. I felt we should just talk to as many customers as possible, and do whatever they say. But that is a bad idea. It confuses the tactic, which is listening, with the strategy, which is learning. Talking to customers is important because it helps us deal in facts about the world as it is today. If we’re going to build a product, we need to have a sense of who will use it. If we’re going to change a feature, we need to know how our existing customers will react. If we’re working on positioning for our product, we need to know what is in the mind of our prospects today.

If your team is struggling with customer feedback, you may find this mantra helpful. Seek out a synthesis that incorporates both the feedback you are hearing and your own vision. Any path that leaves out one aspect or the other is probably wrong. Have faith that this synthesis is greater than the sum of its parts. If you can’t find a synthesis position that works for your customers and for your business, it either means you’re not trying hard enough or your business is in trouble. Figure out which one it is, have a heart-to-heart with your team, and make some serious changes.


Especially for us introverted engineering types, there is one major drawback to talking to customers: it’s messy. Customers are living, breathing, complex people, with their own drama and issues. When they talk to you, it can be overwhelming to sort through all that irrelevant data to capture the nuggets of wisdom that are key to learning. In a perfect world, we’d all have the courage and stamina to persevere, and implement a complete Ideas-Code-Data rapid learning loop. But in reality, we sometimes fall back on inadequate shortcuts. One of those is an over-emphasis on split-testing.

Split-testing provides objective facts about our product and customers, and this has strong appeal to the science-oriented among us. But the thing to remember about split-testing is that it is always retrospective – it can only give you facts about the past. Split-testing is completely useless in telling you what to do next. Now, to make good decisions, it’s helpful to have historical data about what has and hasn’t worked in the past. If you take it too far, though, you can lose the creative spark that is also key to learning.
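
To make the “retrospective” point concrete, here is a minimal sketch of the arithmetic a split test ultimately boils down to: comparing two conversion rates you have already observed. The numbers and the helper name below are made up for illustration:

    from math import sqrt

    def two_proportion_z(conv_a, n_a, conv_b, n_b):
        """z-score for the difference between two observed conversion rates."""
        p_a, p_b = conv_a / n_a, conv_b / n_b
        pooled = (conv_a + conv_b) / (n_a + n_b)
        se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
        return (p_b - p_a) / se

    # Hypothetical historical data: 2,400 visitors per branch.
    z = two_proportion_z(conv_a=120, n_a=2400, conv_b=150, n_b=2400)
    print(f"z = {z:.2f}  (|z| > 1.96 is roughly 95% confidence)")

Note what this does and doesn’t tell you: it says which of two ideas you already tried performed better, but nothing about what third idea might be worth trying next.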

For example, I have often fallen into the trap of wanting to optimize the heck out of one single variable in our business. One time, I became completely enamored with Influence: The Psychology of Persuasion (which is a great book, but that’s for another post). I managed to convince myself that the solution to all of our company’s problems was contained in that book, and that if we just faithfully executed a marketing campaign around the principles therein, we’d solve everything. I convinced a team to give this a try, and they ran dozens of split-test experiments, each around a different principle or combination of principles. We tried and tried to boost our conversion numbers, each time analyzing what worked and what didn’t, and iterating. We were excited by each new discovery, and with each iteration we managed to move the conversion needle a little bit more. Here was the problem: the total impact we were having was minuscule. It turns out that we were not really addressing the core problem (which had nothing to do with persuasion). So although we felt we were making progress, and even though we were moving numbers on a spreadsheet, it was all for nothing. Only when someone hit me over the head and said “this isn’t working, let’s try a radically new direction” did I realize what had happened. We’d forgotten to use all the tools in our toolbox, and lost sight of our overarching goal.

It’s important to be open to hearing new ideas, especially when the ideas you’re working on are split-testing poorly. That’s not to say you should give up right away, but always take a moment to step back and ask yourself if your current path is making progress. It might be time to reshuffle the deck and try again.

Just don’t forget to subject the radical new idea to split-testing too. It might be even worse than what you’re doing right now.


So, both split-testing and customer feedback have their drawbacks. What can you do about it? There are a few ideas I have found generally helpful:
  • Identify where the “learning block” is. Think of the phases of the synthesis framework: collecting feedback, processing and understanding it, and choosing a new course of action. If you’re not getting the results you want, it’s probably because one of those phases is blocked. For example, I’ve had the opportunity to work with a brilliant product person who had an incredible talent for rationalization. Once he got the “customer feedback” religion, I noticed this pattern: “Guys! I’ve just conducted three customer focus groups, and, incredibly, the customers really want us to build the feature I’ve been telling you about for a month.” No matter what the input, he’d come around to the same conclusion as before.

Or maybe you have someone on your team who’s just not processing: “Customers say they want X, so that’s what we’re building.” Each new customer that walks in the door wants a different X, so we keep changing direction.

Or consider my favorite of all: the “we have no choice but to stay the course” pessimist. For this person, there’s always some reason why what we’re learning about customers can’t help. We’re doomed! For example, we simply cannot make the changes we need because we’ve already promised something to partners. Or the press. Or to some passionate customers. Or to our team. Whoever it is, we just can’t go back on our promise, it’d be too painful. So we have to roll the dice with what we’re working on now, even if we all agree it’s not our best shot at success.

Wherever the blockage is happening, identifying it is the first step toward fixing it.

  • Focus on “minimum feature set” whenever processing feedback. It’s all too easy to put together a spec that contains every feature that every customer has ever asked for. That’s not a challenge. The hard part is figuring out the smallest set of features that could possibly accomplish your company’s goals. If you ever have the opportunity to remove a feature without impacting the customer experience or business metrics – do it. If you need help determining which features are truly essential, pay special attention to the Customer Validation phase of Customer Development.

  • Consider whether the company is experiencing a phase-change that makes what’s worked for you in the past obsolete. The most famous of these phase-change theories is Crossing the Chasm, which gives very clear guidance about what to do when you can’t seem to make any more progress with the early-adopter customers you have. That’s a good time to change course. One possibility: try segmenting your customers into a few archetypes and see whether any of them looks more promising than the others, as in the sketch below. Even if one archetype currently dominates your customer base, would it be more promising to pursue a different one?
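
Here is the kind of side-by-side comparison that segmentation exercise might produce, as a minimal sketch; the archetype names, fields, and numbers are entirely hypothetical:

    from collections import defaultdict

    # Hypothetical customer records, each labeled with an archetype guess.
    customers = [
        {"archetype": "hobbyist",   "converted": False, "monthly_revenue": 0},
        {"archetype": "hobbyist",   "converted": True,  "monthly_revenue": 5},
        {"archetype": "small_biz",  "converted": True,  "monthly_revenue": 49},
        {"archetype": "small_biz",  "converted": True,  "monthly_revenue": 49},
        {"archetype": "enterprise", "converted": False, "monthly_revenue": 0},
    ]

    # Roll up conversion and revenue per archetype to compare segments.
    segments = defaultdict(lambda: {"count": 0, "conversions": 0, "revenue": 0})
    for c in customers:
        seg = segments[c["archetype"]]
        seg["count"] += 1
        seg["conversions"] += int(c["converted"])
        seg["revenue"] += c["monthly_revenue"]

    for name, seg in segments.items():
        rate = seg["conversions"] / seg["count"]
        print(f"{name}: {seg['count']} customers, {rate:.0%} conversion, ${seg['revenue']}/mo")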

As much as we try to incorporate scientific product development into our work, the fact remains that business is not a science. I think Drucker said it best. It’s pretty easy to deliver results in the short term or the long term. It’s pretty easy to optimize our business to serve just one group: employees, customers, or shareholders. But it’s incredibly hard to balance the needs of all three stakeholders over both the short- and long-term time horizons. That’s what business is designed to do. By learning to find a synthesis between our customers and our vision, we can make a meaningful contribution to that goal.

 


A version of this post originally appeared on Eric Ries’s website
