Studies show that there is a direct correlation between exposure to user feedback and design or product improvements. To put it simply – you will build better products by watching other people using them!
Interestingly enough, you’ll improve even if you don’t immediately act on a single finding.
Seeing just a single user struggle with a part of a product you designed or built will leave an impression you'll remember vividly the next time you touch that part of the product.
This should of course be no excuse for not fixing usability problems you detected, but it shows why exposure to real users can be such a powerful tool – you’re building empathy with your audience on an ongoing basis.
Step 1: Measure user exposure hours
According to UserTesting’s 2015 Industry Survey report, more than half (59.94%) of respondents said their company would conduct tests more frequently in 2016.
Usability Testing Industry Report 2016 from UserTesting.com
But how do you know whether you reach your goal of increasing the frequency of user feedback in 2016?
You can’t improve what you can’t measure.
– Darren Hardy
One simple idea would be to count the number of usability tests done or the number of participants tested within a year. But this number would (hopefully) increase continuously over the year, so it wouldn’t tell you how deeply your team is actually involved in the research.
Don’t spend time chasing vanity metrics.
So, instead of counting the number of usability tests done or participants tested, measure the hours and frequency your team spends watching real users interact with your design or a competitor’s.
There are great and easy ways of displaying metrics like these to keep your team on track and ensure ongoing progress.
This is a low-fidelity dashboard David Travis suggests in his presentation, and it can be set up in a few minutes:
Usability Dashboard to track the exposure hours of your team. Image by David Travis
Exposure hours to usability testing are quite easy to measure if your team is already tracking their time and effort.
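If your team already logs its observation sessions, the dashboard numbers fall out of a few lines of code. A minimal sketch of that calculation (the session log format, names, and dates are hypothetical, invented here for illustration):

```python
from datetime import date, timedelta

# Hypothetical session log: (team member, session date, minutes spent observing live)
sessions = [
    ("alice", date(2016, 3, 1), 60),
    ("alice", date(2016, 3, 15), 40),
    ("bob",   date(2016, 2, 1), 120),  # older than the 6-week window, won't count
    ("bob",   date(2016, 3, 20), 20),
]

def exposure_hours(sessions, today, window_weeks=6):
    """Sum each person's observation time within the last `window_weeks` weeks."""
    cutoff = today - timedelta(weeks=window_weeks)
    totals = {}
    for person, day, minutes in sessions:
        if day >= cutoff:
            totals[person] = totals.get(person, 0) + minutes
    return {p: m / 60 for p, m in totals.items()}  # minutes -> hours

print(exposure_hours(sessions, today=date(2016, 3, 28)))
```

A rolling window matters here: exposure from months ago has faded, so only recent hours count toward keeping the team on track.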
So, how should your team increase exposure hours?
A great way to do this is with unmoderated or moderated usability testing, or field studies: any method that involves watching people who are unfamiliar with your product or service actually use it while trying to accomplish relevant goals.
What counts as user exposure?
Nielsen has put up a fantastic list of what doesn’t count as exposure to user feedback:
- Talking to users or interviewing them (nice, but not exposure to actual use)
- Focus groups (ditto)
- Surveys (ditto)
- Using products yourself (yes, this is real use, but you are not an average user)
- Reading the report from a usability study (a great learning resource, but a summary is not the same as your own direct observation of live user behavior)
- Summative findings from a quantitative user study, such as card sorting, tree testing, and so on (only counts as user exposure if you watched each individual user while they carried out the test)
Observe more than just your own design!
Don’t just observe people using your own product or service.
You can learn a lot from other people who are trying to solve the same problems as you are, and you can then compare strengths and weaknesses of your product to those of your competitors.
“The way to learn is simple: watch what users do. Preferably with as many diverse user interfaces as possible: not just your own design, but the competitors’ designs as well.”
There are many great ways of testing competitors’ designs with remote usability testing.
Step 2: Get everyone in your team on board
Everything you design is meant to be used by people, and the more usability tests you watch, the better you’ll become at understanding how people think and WHY they do what they do.
User researcher’s fallacy: “My job is to learn about users”.
Truth: “My job is to help my team learn about users”. #ux
— Caroline Jarrett (@cjforms) July 4, 2014
Usability testing will make you an expert in understanding human behavior, something that can be beneficial for all teams in your company – from marketing to development!
While in an ideal world all of your team members would participate in all feedback sessions, this isn’t always feasible.
This is why it’s important to provide video clips of the tests’ findings for people to watch on their own time.
Jakob Nielsen proposes different annual user exposure hours for each job position:
Image from NNgroup
“Each team member has to be exposed directly to the users themselves. Teams that have dedicated user research professionals, who watch the users and then, in turn, report the results through documents or videos, don’t deliver the same benefits.”
There are a couple of great ways of making it easy for your team to experience the power of observing users:
- Make sure everyone in your team knows when your usability testing sessions take place and invite everyone in advance to add questions and join setting up the tasks and scenarios.
- Schedule a weekly Usability-Testing-Watching-Event. Bring some popcorn and make watching videos of people interacting with your design a fun activity (which it really is).
- If you’re using a remote usability service to gather your insights, share the usability videos with your whole team using asynchronous media like Slack, so that everyone can watch the videos on their own time.
That said, how much time should your team invest in user exposure?
Everyone in your team should spend at least two hours every six weeks watching user feedback and learning from it.
There seems to be a general agreement about this amount of time in the UX and Usability sector:
“Two hours gives you enough time to see the subtleties and nuances of how people use products. It has to be recently (last six weeks) or else it is forgotten. Once you start to see the same problems over and over again, you focus and fix them. The best organizations do this weekly.” – Jared Spool
This is what your schedule would look like following this process:
While I definitely agree with the amount of time and the approach of having a continuous schedule of user exposure, I see a few downsides to this 6-week cycle:
- It might take up to 6 weeks to detect a problem introduced by a previous fix of another bug.
- It’s hard to get into the habit of watching usability tests on an ongoing basis with a 6-week interval.
- Discussion only starts after the full 2 hours of watching, so insights arrive in one big batch.
Step 3: Follow a weekly schedule
Our experience with Userbrain, our remote usability testing subscription, has shown us that it’s much easier to convince people to invest time in user feedback if some kind of weekly automation engages them in a continuous feedback loop.
So this is the kind of schedule I would suggest, spending the same amount of time but distributing it differently:
This weekly exposure to user feedback offers a few benefits :
- You can spot problems much faster and therefore move quicker through the design and development cycles.
- Instead of waiting 5 weeks for new results to get in, you’ll have new user feedback every week.
- It’s much easier to convince people to free up just 20 minutes than 2 hours at a stretch.
- Watching usability tests every week lets team members get into the habit of doing so, and it builds up a shared understanding of your users’ behavior and the WHY behind their clicks.
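The arithmetic behind “same amount of time, distributed differently” is easy to sanity-check. The session lengths below come from the article; the variable names are mine:

```python
# Jared Spool's recommendation: a 2-hour exposure session every 6 weeks.
six_week_block_minutes = 2 * 60        # one 2-hour session per 6-week cycle

# The weekly alternative: one short session every week.
weekly_session_minutes = 20
weekly_total_minutes = 6 * weekly_session_minutes  # over the same 6 weeks

# Same total exposure time either way...
assert six_week_block_minutes == weekly_total_minutes == 120

# ...but the longest you ever wait for fresh feedback shrinks from 6 weeks to 1.
print(f"{weekly_total_minutes} minutes per cycle, fresh feedback every week")
```

In other words, the weekly schedule costs nothing extra; it only changes how long you wait between rounds of feedback.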
The best way of building a useful and fulfilling product or service for your customers is building it with the help of their feedback!
Watching what users do and learning WHY they do it – on an ongoing basis – builds up empathy and lets you act on your insights very quickly.
You will ultimately be able to build better products just by watching other people using them.
So have you set a specific goal for the number of hours of user feedback you are expecting this year?
Share your experiences in the comments so we can all grow from them.