Need to run lots of user tests? Introducing Solid’s auto-moderated studies: Artificial Interviewer Anna guides your users and reminds them to think out loud.
Hey everyone!
Jonas, Founder of Solid User Tests here ✌️
I published my first website in 1998 (when I was 12 years old) and learned early that you have to show your interface to real people if you want to understand how they perceive your concepts and designs.
A few years ago, while still working as a freelance UX designer, I had to conduct 40+ moderated user tests. It was a huge amount of work, and after about 10 sessions we weren't learning much new. But sometimes you just want that extra level of confidence – or your stakeholders request it.
I was never happy with the existing solutions for unmoderated tests. People rush through them, and when things get complicated, they stop thinking out loud. In a moderated setup you can remind participants to keep thinking out loud; in an unmoderated setup you can't.
With Solid, you can now create auto-moderated studies, where Anna does the moderation for you. She helps users set up their device (screen sharing, microphone, camera) and talks to them – which feels much more natural for a tester than just reading text. Anna also reminds testers to think out loud if they go silent.
We have already run 500+ sessions and can say with confidence that you should run all your unmoderated user tests with Solid – auto-moderated.
I'm not sure which podcast it was, but I heard an argument on some show about moderated vs. unmoderated user tests. Do you think Solid could replace moderated user testing?
Not all moderated tests should be replaced with Solid. I would say that currently 10-20% of moderated tests can be done auto-moderated.
Anna's capabilities will improve, and in a few years she might be able to conduct 50%+ of moderated user tests.
Today, while you are still exploring, you should probably run moderated user tests and add 10-20 auto-moderated tests on top to get a bigger sample size.
You can replace all unmoderated user tests with Solid's auto-moderated tests!
Thanks @dominikg
It’s Anna, our avatar. With other tools, testers read instruction text and are asked to think out loud. This feels weird.
Anna talks to testers, helps them set up their device (screen sharing etc.), and walks them through the test session. Anna even reminds testers to think out loud – testers tend to go silent when they are confused.
This results in more high-quality test sessions, and therefore less work for UX teams.
I had the chance to conduct a couple of unmoderated user tests with Solid. For us it was both a test of our prototype and a test of the tool Solid and unmoderated user tests themselves. Regarding the latter, I was positively surprised: we gained valuable feedback from those tests. On top of that, it was easy to organize the tests and to evaluate them afterwards. In my opinion, those are the two best arguments for giving it a try yourself!
Curious to see how the journey of Solid will continue! :)
Solid User Tests