DM7917 Week 6 – Usability Test 2 Report and Video

A video and audio recording of Usability Test 2

Log

  • 00:00 – Begin by reading introduction to tester
  • 02:51 – Log-in screen shown to tester. “It is clear, and easy to understand what I need to do”
  • 03:20 – Premise of app is explained to tester (I almost forgot this)
  • 03:45 – First task given “to register as a new user”
  • 04:10 – Tester highlights that there is no option to reset login details “There isn’t an option to say that I’ve forgotten my login details”
  • 05:15 – Open questions asked:
    • “Is that [screen] what you would expect to see, or not?” 
    • “What would you expect to do on this screen?” 
    • “Could you show me how you would interact with that screen?”
  • 05:45 – I’ve started including an overlaid keyboard, as there is potential for this keyboard to obscure important aspects of the UI
  • 06:25 – I explain a privacy motivation for why passwords are represented as stars / hashed
  • 06:45 – Tester explains that they wouldn’t expect to be able to copy their new password to the ‘confirm password’ field. I ask “Why?” to clarify the tester’s reasoning
  • 09:15 – Tester asks to clarify the purpose of the Instruction screen. “So is this the home page? Is this the first thing you’d see every time?” I explain that a new user would see this, and elaborate for further clarity. “It feels like you’re getting straight into performing the test without having selected to do it?”
    • I ask the tester to volunteer what they’d expect to see. “Maybe a Home Screen? A menu that allows you to click on ‘Start a test’ or ‘Start an activity’… So you know what you’re doing”. I feel that they, just like the previous tester, have misunderstood the step-by-step instruction process. I need to reconsider the onboarding process for the next iteration.
  • Tester continues, wanting to act out the instruction process rather than just swiping and reading. They expect the pages to advance automatically once they’ve carried out the written actions
  • 12:00 – Tester would expect the second instructions screen to have a video that can be played on click, which they would then interact with as they would the actual test
  • 13:16 – Tester would expect analysis on their performance after completing the test, as well as explanations of hazards that they missed. Further H&S tips would be appreciated, to explain the outcome of missing hazards
  • 15:05 – Tester feels that the reference to the “Studio rules” section in the instruction process is unclear… They’d rather be told where to find that section. “In the menu…”
  • “What’s the purpose of revision?” I need to make this clearer
  • 17:55 – Tester assumes that the “Help” button in the top bar is unique to each page. This is my intention. At this point, I explain to the tester that they are following the instructions and I appreciate their thoughts.
  • 19:20 – Tester correctly identifies that they can scroll on the “Start a Test” screen, but only due to the paper prototype’s form. They would prefer a scrollbar on the UI to make this clear
  • 19:50 – Tester would prefer an explanation of the content of each video, rather than a thumbnail – I’m unsure of this as I do not want to give the answer/allude to the hazard of each test
  • 21:09 – “One video [per line] in the list and the explanation to the side of it, to make it clearer”
  • 21:39 – Tester would have expected to press a play icon to start a video when ready, rather than autoplaying once the test is selected
  • 22:45 – Tester feels unclear about where on the screen to tap when a hazard occurs, and whether they must tap the screen on the first run-through of the video, or watch it first and then tap on the next run-through.
    • Tester also sees the play button screen as an opportunity to give final instruction on what to do
  • 24:25 – Tester was unaware that they needed to rotate the phone; this is likely due to the wireframe being visible and no imagery being present
  • 26:55 – Tester expected feedback to be given to them in landscape mode following the video, as they were not prompted to rotate their phone back to portrait mode
  • 28:10 – Tester is unclear about the meaning of the stars. “Are they a reflection of how many I got right?” The numeric aspect of these stars could be misleading.
    • Tester is also uncertain about whether this is the best possible performance… 5 out of what?
  • 29:40 – “What goes into the 5 star performance… if there’s only one hazard to spot?”  “1 star for speed, 1 star for accuracy” “I don’t see what [the 5 stars] are based on” 
  • 31:10 – “What’s going to be here?” Tester would like a still/screenshot of the video to be used in the results, rather than a generic image. “It cements in my mind what I’ve spotted”
  • 33:09 – Tester expected to return to the “Start a Test” screen after having finished reading her test results
  • 33:55 – Tester liked being able to see which tests they have already completed. A percentage of completion could also be good.
  • 35:05 – Tester correctly identified and used the hamburger menu
  • 35:41 – Tester would expect the menu to fill the screen rather than half of it. Accessibility point raised – filling the screen is “clear because there’s less content in the background”. I push for further explanation. “If you’re on a phone, if you’ve got a small screen… You think you’re clicking on instructions but the screen thinks you’re clicking on what’s behind it.” In that case the app would not function as the user intends
  • 37:10 – Clarity wanted on the type of studio: TV studio, music studio, etc.?
  • 38:04 – “Studio Rules” screen appears as tester expects
    • Page controls correctly tell Tester that they can swipe on the page
    • Tester prefers this layout for the “Start a Test” screen as it can include explanation about content of each test
    • Tester asks whether they can click on each image on the Studio Rules screen. If so, they could look more closely at each image to see more detail.
    • Tester contemplates whether it could be useful to have an explanation in the Studio Rules menu, to read before the user reads the rules
  • 42:42 – Settings screen appears as tester expects, including the “sliders”
  • 43:27 – Tester would expect current “View” section to read as “Display”
  • 43:50 – “Help” section is confusing because “Help” already appears in top bar. “Controls” could be better

Summary

  • I must shorten my usability tests to focus on short task flows. Watching the recording back and making notes takes a considerable amount of time, so I can understand why professional user testing includes a facilitator, ‘computers’, and observers
    • Usability test was much longer than I had anticipated as tester had many views to share and I often asked for expansion upon them
    • I was also keen to stick to the tasks that I had written down, as this would be an improvement on my first usability test. I subsequently realised how long it would take to complete each task flow
  • Seemingly obvious interactions are not necessarily obvious in a paper prototype, such as the need to rotate the device. They are not as ‘intuitive’ as I thought, and perhaps this is due to the lack of rendering (imagery) in the prototype I used
  • I’m beginning to realise that testers will cast light on a great many potential issues. I must develop a method of prioritising them in order to progress to a high-fidelity prototype and proof of concept. Mapping them to the Design Hierarchy of Needs could be useful here
  • Sliders, scroll bars, and page controls – all part of Apple’s iOS Human Interface Guidelines – are very recognisable to this tester. For the sake of completing this project, perhaps I should make the same assumptions about the Android developer guidelines (https://developer.android.com) and the Android community

Actions

  • Include an option/screen for users to reset their login details/password
  • Iterate on the instruction screen so that users can clearly see that it is a set of instructions, and not pages to interact with
  • Map suggested improvements to the Design Hierarchy of Needs, as many improvements are being suggested and I must prioritise
  • Rewrite instructions to clarify the purpose of revising in the Studio Rules section
  • Include scrollbar in UI to show that “Start a Test” screen is scrollable
  • Include Autoplay as an option
    • Switch Autoplay off by default
  • The meaning of the stars is unclear (the number of stars is problematic – what’s the best possible score, and how was it achieved?)
    • Explanation of how users are assessed is required
    • Consider switching stars to Platinum, Gold, Silver, Bronze medals?
  • Menu to be made full screen in the interest of accessibility
  • Keep type of studio consistent. Name “TV Studio” everywhere
  • Change “View” to “Display” in settings
  • Change “Help” to “Controls” in settings, perhaps? Is there a better name? Research other apps
