Early Customer Feedback Fallacies

At the center of a successful product launch are customers who are happy, excited, or relieved that something has been delivered to them.  Given that customers are such a key part of this equation, it's astonishing how often product development teams get so "visionary" that they forget the very people they are trying to serve.  Focusing on real people with real needs has a number of positive side effects, perhaps the most important of which is your company's survival.  According to CB Insights, 42% of startups fail because there is no sufficient market need for what they are building.  Here are some common fallacies that I have noticed through my 20+ years in software development:

Fallacy #1: No one will understand our vision until they see it!

Some people believe that in order for someone to understand what they are talking about, they have to show a working product.  This is symptomatic of confusing the problem with the solution.  While it is true that sometimes people need to see the solution, the solution cannot be built before actually learning about the problem.  When we were working on early versions of DocuSign, the founder, Tom, spent a lot of time figuring out what real problems people had without necessarily jumping to demos.  He found that real estate agents hated driving back for a forgotten form, salespeople wanted to close a deal while they still had someone on the phone, and out-of-state candidates wanted to see and sign their offer letters without waiting until they moved or visited.  These were real problems that customers agreed with when we confirmed them.

Fallacy #2: Customers won’t meet and waste their time unless they know we can help.

This is another common problem, and it usually stems from a weak value proposition.  Have you ever had something that bothered you a lot?  If it's a priority for you and a vendor wants to hear you out and potentially solve the problem, I bet you would take that meeting!  If people don't want to take the time to talk to you about the problem you are trying to solve, how likely are they to take a sales call and actually pay money for your product?  If you are postponing these conversations, or you have reached out to some customers and they don't want to talk, you are headed in the wrong direction.

Fallacy #3: There are a lot of customers, we can always talk to them later!

If there is a large population of potential customers, it should be easy to pick out a few representative people to interview.  If it's a struggle to find such people, the market might not be as large as you think.  And even when the set of customers is large, the problem you are trying to solve might not be at the top of their priority list, in which case you will need to spend a lot of marketing and sales effort just to make them pay attention to it.

Fallacy #4: We can’t be a slave to one customer!

Sometimes product teams believe that listening to a particular customer turns you into a consulting company instead of one that creates a product usable by the masses.  This is a danger you cannot ignore.  The real solution here is to talk to *more* customers, not fewer.  It's through conversations with many different people, and analysis of what they tell you, that you will find the common themes that appeal to a large enough population.

Talking to customers early and gathering requirements from them has another side effect: product development teams rally to meet deadlines when they know they will be helping a real company instead of meeting a hypothetical corporate goal.

I believe that a worthy problem will earn customer engagement early and often, and the sacrifices a product development organization makes to ship something will be rewarded with real usage.  Getting caught up in the fallacies above puts you at risk of wasting your time.
