I recently carried out a day of moderated open card sort testing as part of a website refresh project, which provided an opportunity to experiment with both physical and digital card sorts.
In the spirit of reflecting on Lagom’s research practice, this post covers some of the things I learned through the process and how I think they will shape our practice going forward.
Physical card sorts
For the physical card sort sessions, I printed the site content onto a set of index cards, and participants used post-it notes to write down their group headings as the sort progressed.
In the sessions themselves, the physical cards felt interactive and engaging for participants. The post-it notes allowed them to reflect on and refine their group headings as they went, and the index cards gave them a place to write down new content terminology where they felt a term was incorrect or ill-defined.
To analyse the results alongside a larger number of unmoderated card sorts, I needed to upload the physical responses to our card sorting software (Optimal Sort). This adds extra time and introduces a manual transcription step where detail can be lost.
One of the benefits of the physical approach is that participants can create and articulate sub-groups. Whilst Optimal Sort generally works well, it offers no equivalent way to create sub-groups or to explore new content terminology, so there is a risk of losing the thinking behind those sub-groups when importing the findings into the software.
Digital card sorts
The digital card sort sessions had some advantages over the physical approach. Notably, the process was more efficient: participants entered their data directly into the analysis software, so there was nothing to transcribe afterwards. Optimal Sort works well for this. However, as it lacks options to create and record sub-groups, I had to capture this information through conversation rather than letting participants map it out themselves.
Another benefit of the digital process was that clients could observe the sessions without physically being in the room, using screen sharing software (whereby.com), and hear feedback first hand throughout the day. This proved particularly valuable when analysing the results together, as the clients were better able to understand the nuances of the terminology participants used on the day.
How will we deliver card sorts in the future?
Like most things, the answer to this question feels particularly context dependent. Both physical and digital options have their advantages and disadvantages. However, all things being equal, I prefer the physical option: it lets participants create sub-groups and rename cards as they work through the activity, and it feels more engaging and fun. I believe this ultimately results in better insights, which at the end of the day is what matters most in research.