“I’d normally be doing this as a two-person team, so if some parts feel strained, then that’s why”. Rachel Littlefair needn’t have worried. In a comprehensively researched and informative talk, she laid out the lay of the land for the current state of remote usability testing.
For those who don’t know, remote usability testing means using software to run usability tests with participants regardless of their location. With moderated testing, a facilitator still runs the session live, teleconference style. With unmoderated testing, you set up the test (usually with written instructions) and the participants work through it on their own.
Littlefair’s work at The UX Agency brought a particularly in-depth perspective. The company specialises in usability testing with difficult-to-recruit niche audiences such as stockbrokers and scientists, as well as participants in countries where English may not be spoken fluently. This meant she was able to pinpoint concerns about particular tools that might not be picked up by researchers testing with more general, UK-based audiences.
She provided impressively comprehensive breakdowns of the various tools available in each of the two categories.
I was interested to hear that her recommended remote moderated tool was … that conference call stalwart, GotoMeeting. The main reasons: it didn’t have too many issues with being downloaded, worked well internationally, could be used for testing products over a VPN, and was OK (if not perfect) to set up with people who weren’t overly confident speaking English.
Some other interesting options included the no-download appear.in – “though the fact that it has animal rooms means that it’s not suitable for use with stockbrokers or other corporate people” – and join.me. Webex was the former gold standard but has slipped, and also tends to trip up a lot of corporate networks. Skype is usable but has bandwidth problems, and as for Google Hangouts: “we tried. I don’t want to talk about it”.
Overall, the market for remote moderated usability testing on mobile is still pretty wide open, as most of the existing tools aren’t very good.
When it came to unmoderated tools, there were a lot more options … but also a lot more variability in terms of results.
She did single out UserTesting.com as particularly bad for participants who have been trained to give answers in a particular way, but also noted that tools generally charge a premium should you – shock horror! – want to use your own participant bank rather than a general pool.