Customer Interview Script: Screenshare Test
This was written as part of the rough draft for Deploy Empathy, a practical guide to interviewing customers.
Like it? Order the book.
—————————————
The screenshare test is an essential part of your toolbox. Similar to usability testing,* it involves putting a "physical" product in front of someone for them to explore in a guided way.
It can be used for landing pages to see why people aren't converting, to understand difficulties people might have with using a tool, to see why people keep emailing support about something that they can do on your website without you, to test a prototype, to see them implement your service...
By putting a product in front of someone, we can also reveal deeper insights about our customers' processes and organizational dynamics that will help us identify unanticipated snags that might reduce conversions or successful use of the product.
I'll give you an example.
In late 2017, we began the process of making a HIPAA-compliant version of our service. For years, we'd had potential customers asking us if they could process patient addresses, and we decided to finally add it.
The data handling rules for health-related data in the US are quite strict, and this involved basically rebuilding our infrastructure from scratch to make it comply with HIPAA. Given the expense and time required on our end, we put a lot of research into this. We did surveys, interviews, competitor research, and market sizing. We even hired external consultants to ensure compliance.
But since people said they wanted the same process as our existing product, we figured we didn't need to do much on the usability testing side. We'd just mirror the UI, with different piping under the hood.
I continued regular conversations with potential customers and surveys through the development process, getting more clarity on things like pricing (down to which plans they intended to buy) and trying to effectively pre-sell the product.
Almost a year later, we launched. And emailed all of the people who'd said they'd definitely use it.
And waited.
And... nothing.
Not even on our lower-cost pay-as-you-go plan.
I got responses from a few people, and they said it was still working its way through internally.
Hmm.
A week went by.
Given the enthusiasm we'd received to that point, I was confused, and knew we'd missed something. So I went to Reddit, found the HealthIT subreddit, and told people I'd give them a $25 Amazon gift card for helping us test our site. I found developers, data analysts, and even hospital executives.
The first test went the way I'd expected during our building phase: positive feedback, nothing confusing on the site, and it was clear what the product was and why it was better than other options.
And then I asked: "Let's say you decided to use this product. What would you do next?" expecting them to click on the "sign up" button.
"Oh, I'd go talk to our Legal department."
"Oh?"
It wasn't news to us that Legal would be involved. We'd even designed a whole new onboarding flow specifically for uploading the required documents before using the product. We had a new section on the website devoted to security and the different measures we'd taken to ensure compliance.
But something in their tone of voice told me this wasn't their favorite step in the process.
[Pause]
"Yeah, I'd have to go talk to legal. And they'd probably have to do a security review, in addition to contract review. Man, sometimes that takes six months, or longer."
Oh shit.
We knew Legal would be involved. We knew security reviews would be involved. We did not realize it would be a six-month-plus process... even for a product that would cost them $500 a year.
Our pay-as-you-go tier only works because it's self-service. Doing that level of legwork for a lower-tier plan just wasn't feasible for us.
Surviving six months of reviews and negotiations for a plan that required us to keep very expensive servers running even when we didn't have any customers? That wouldn't work, either.
The next screenshare test revealed the same thing. And the one after. And the one after.
And the one after.
We gave it another week.
Product management expert Marty Cagan talks about how, in order to succeed, a product must be viable, usable, valuable, and feasible. We'd nailed most of that: it was valuable for the customer, usable, and feasible for us to build.
But was it viable for us as a business when the rubber hit the pavement? Turns out, no. A sales cycle of six months would kill us.
A month in, we decided to scrap the pay-as-you-go tier and only offer the product as an enterprise option. (The benefit was that, because of the way the enterprise plan is built, we only incur costs when we have a customer.) For months, the product was effectively dormant: just a landing page with no customers.
Then one person signed up... and another a few months later.
It took about a year for the product to find its footing. It's now our fastest-growing product and contributed significantly to our 56% growth rate last year.
The vast majority of those customers did indeed require months of discussions and reviews. We had one take a year and a half. (Another took just six hours -- but that's an anomaly.)
The moral of the story?
Get your product in front of people. And don't just ask about the product. Ask about the other parts of the decision process too -- the parts that have nothing to do with technology.
(Now, without further ado, let's get to the script.)
The Script: Screenshare Test
As with the other scripts, this is intended as a jumping-off point. I suggest copy/pasting this into a document and removing/adapting sections to fit your product.
Unlike other scenarios, you'll need video call software for this one so the person can share their screen. I suggest setting the call as audio-only and granting the person screen share permissions, which reduces the need for visual interaction. I also strongly suggest recording these sessions to give your brain the most space to absorb what the person is doing, where they hesitate, and so forth. Remember to always ask for permission before recording.
I suggest scheduling these sessions for half an hour. An hour is the absolute maximum, as longer sessions can be tiring for both you and the participant. (I have observed that screenshare tests can be more mentally taxing than interviews.)
A key way this differs from other types of interviews is that it's more observational, with the person driving the flow. Most of the session is spent listening to their narration and asking things like "Is that what you expected to happen?" and "Can you tell me more about why you'd want it to do that/why you thought it would do that?"
If the person asks you what will happen, you'll need to deflect it back to them. For example:
Person: "Will this let me do [X]?
You: "Can you tell me more why you'd want to do that?"
The hardest part about this kind of interview is watching someone struggle through something without giving them hints or prompts, and not giving them answers when they ask how to do things.
It's hard to resist, but you can do it!
You will learn so much from this kind of test if you can manage to tamp down your desire to rescue them from confusion in the moment. Remember... the confusion you're seeing on screen is probably replicated by dozens or hundreds of users you've never been able to observe but who have shown up in your data as drop-offs. Being patient and not jumping in to help in the moment will help many more people in the future.
(And if you fix something later, you should email this person to tell them that they helped improve it for others. That kind of follow-up will go a long way.)
Alright. Here we go.
Introduction
Thank you so much for taking the time to do this with me today. I've planned that this will take about half an hour. [After that, I'll send you the gift card as promised.]
Before we start, do you have any questions for me? [pause]
Do you mind if I record this session? [quickly] That'll just make it easier for me to review it later, and it won't be shared externally. [pause]
So, before we get started, I want to make sure it's super clear that we're testing the site, not you. You can't do anything wrong. The more honest you can be, the more helpful it is for us. If something doesn't make sense to you, there's a good chance it wouldn't make sense to someone else, and we'd rather figure that out now than after we launch it and accidentally confuse a whole bunch of people. So your honesty is super helpful. [pause]
As you're going through the site, please narrate out loud everything you think as you think it. Remember, you can't offend me. [pause]
Acclimating
Ok cool, let's get started. I've just given you screenshare permissions. Can you try to share your screen? Feel free to close any tabs or anything that you wouldn't want someone else to see.
Ok, looks like I can see your screen. Great.
First Questions
[If you want to see how they find your product] If you wanted to go to [our website], how would you do that?
Can you take a look at this page and tell me what you think you can do here?
What do you think this product does?
Thinking big picture, where might this product fit into your process? What might you need to do beforehand, and what would come after?
Expectations
You will likely need to repeat this cycle several times -- think key CTAs, key actions, and so forth.
If you clicked on that button, what would you expect to happen?
Go ahead and click it.
Was that what you thought would happen?
Task Analysis
This may be repeated in several loops to analyze key actions. For example, for Geocodio, this might be "create an API key" or "geocode a spreadsheet."
Let's say you wanted to [do thing your product does]. Without doing it, can you tell me where you might go to do that?
Ok, can you show me where you think you would do that?
Before you click, can you tell me what you would expect to happen?
Is that what you expected to happen?
Can you go ahead and [do the thing]?
[Observe them going through the process. If they hesitate, ask them why.]
Decision Process
What would you want to know about this product before deciding to use it?
How would you find that information? [go through the 'Expectations' process above]
Let's say you decided you want to use this product. What would your next step be?
Would you need to talk to anyone else in your organization before using it?
Purchase Process
Let's say you have approval to buy this product. [I'm going to give you a fake credit card so you can do that.] What would you do first?
[Watch them go through the purchasing process.]
[Before they click to buy] What would you expect to happen once you click 'buy'?
[After they purchase] Was that what you expected to happen?
If you wanted to then use the product, what would you do?
Magic Wand
I have a bit of a lighter question, so bear with me here. If you had a magic wand, and you could change anything about this site or product or anything else, what would it be? It can be multiple things.
Wrap-Up
Thank you so much for taking the time to do this today. I just have one more question. Is there anything else you think I should know about what you saw today? [note: this usually elicits fewer reactions than in an interview because of the Magic Wand question, but it's still worth asking.]
Thanks again for doing this. You can stop sharing your screen now, and I'm going to go ahead and send you that gift card if you can stay with me for a moment.
[email] is your email, right? Okay, it's on its way now. I'll just hang here with you for a moment while we wait for it to be delivered. [pause, let them offer any other thoughts that may have come to them]
Did it arrive? Great. Okay. Thanks again!
A note on language
A key goal of this is to figure out what is confusing about what we've built. (I'd say "what if anything," but in my years of being in this space, I've learned there's always something.)
However...
I strongly advise you not to use the word "confusing," unless the person themselves uses it.
In the same way that some people might be offended by the question "How do you feel about that?" and react better to "What do you think about that?", some people may find it off-putting for someone else to imply they were confused, as if it implies something they did wrong.
Instead, I have gotten better reactions to phrases like "makes sense." For example,
"Is there anything on this page that doesn't make sense?"
"What part of that process didn't really seem to make sense?"
"I notice you hesitating. Is there something that doesn't make sense?"
I have noticed that "confusion" feels like something inflicted upon us by an external force, whereas asking whether something "makes sense" assumes our inner concept of what is sensible is the correct one.
The exception to this is if the person uses the word confusing themselves and you use it in a mirroring context. For example,
Person: "Jeez that form is confusing!"
You: "Can you say more about what was confusing about the form?"
Incentives for Screenshare Tests
If this is a project where you've specifically recruited non-customers or customers, offer a monetary incentive. Reviewing a product, even a digital and intangible one, feels different to people than a conversation does -- it feels more like work. The social dynamic is that they're doing you a favor, so you want to offer something that matches that feeling. A $10-25 Amazon gift card is usually good. I wouldn't offer swag in this scenario.
By contrast, it's different when someone (customer or not) asks you for a feature or raises an issue with how something works. For example, a few weeks ago, a potential customer asked us if we had an integration with a specific ad platform. I'm unfamiliar with ad platforms, so I asked them to walk me through what they were trying to do so we could consider what the scope of it would even be. Since there was a strong incentive on their end, the dynamic was more that we were doing them a favor than the other way around. It would have been awkward to then offer them a gift card.
It might be business, but these sorts of social dynamics are important.
*Point of clarification: What I discuss here is more akin to what JTBD and Lean UX literature would discuss as prototype testing, except applied to new and existing experiences alike. This article is more focused on business validation/solution evaluation cases, with a sprinkle of usability. Usability testing is generally focused on user interface factors and whether the mental model of the product matches the user's mental model of a specific problem, but less so on discovering overarching goals (jobs) and processes. For example, usability testing encompasses whether people can locate a button (a signifier), whether a dashboard permits someone to delete their account (affordances), the kind of content a user might expect to find in the navigation, and many other scenarios. Related is the topic of accessibility design and testing, which includes but is in no way limited to, say, testing with screen readers or on slow internet connections. Accessibility-related and usability-related issues will often come up in this type of session, even though it is not the primary purpose. For more on accessible design, please see here.