DM7903 Week 5 – Usability Test 1 Report

Above: A video recording of Usability Test 1 of DM7903

Report

  • 00:00 – Briefing starts
  • 01:30 – Briefing ends
  • 01:30 – Splash / Loading screen presented to tester
  • 02:04 – Augmented Reality Experience screen presented to user
  • 02:21 – Confirmation that placement ring/target is misunderstood as an object (could be due to drawing)
  • 02:40 – Tester reads instruction and presses placement ring to place first object
  • 03:10 – Tester would like to press the ‘freeze’ button to move the object. Voices concern about putting their finger over the object they’d like to move because they’d be ‘covering the object’
  • 03:45 – Tester realises that a swiping gesture can be applied to move the object around the screen
  • 04:15 – Tester understands how rotation functionality works
  • 04:55 – Tester confirms they understand that they ‘can hold [their] phone and walk around the object’ to view it
  • 05:15 – Tester confirms they understand the ‘Freeze Frame’ functionality
  • 05:47 – Tester recognises expected behaviour of toggle switch, which they call a ‘bar’, not a button
  • 06:40 – Tester is revising the briefing given to them on a computer screen (out of shot)
  • 06:50 – Tester confirms understanding that they can visit the Object Selection screen by pressing the respective button in the tab bar
  • 08:00 – Object Selection screen presented to tester
  • 08:10 – Tester’s expectations of the screen’s behaviour match my intentions
  • 08:35 – Tester recognises repositioning of task bar and lack of ‘Freeze Frame’ functionality and camera on the screen
  • 09:12 – Tester interacts with Object Selection screen as expected to select and place a new camera
  • 09:47 – Augmented Reality Experience screen presented to Tester again
  • 09:58 – Tester confirms that they expected to revisit this screen after selecting a camera on the Object Selection screen
  • 10:05 – Tester explains they would expect the application to behave as previously when following the interactive walkthrough
  • 10:29 – Interactive walkthrough was required again to prompt tester to continue to try reviewing a Freeze Frame. They did not continue to explore the functionality without prompting
  • 10:42 – Unclear whether Tester is aware that their first ‘Freeze Frame’, taken during the walkthrough process, has already been stored in the app, but isn’t ‘saved’ to the OS’s Camera Roll… 
    • Perhaps there is room for confusion here. The ‘Freeze Frame’ has actually been saved in the app, but not saved to the OS’s Camera Roll. Maybe clearer terminology and explanation needed? ‘Save’ vs ‘Export’?
  • 11:32 – Freeze Frames screen presented to Tester
  • 11:32 – Tester confirms that the screen matches their expectations
  • 11:49 – Tester selects first ‘Freeze Frame’ as expected
  • 12:11 – Tester presented with Freeze Frame #01 screen
  • 12:19 – Tester selects the ‘Save’ option as expected from the tab bar

Summary

Watching the usability test back, I feel that I could improve some of my verbal communication. At times I am unclear or appear unconfident, when really I am taking a moment to think about what I’m saying as I try not to ask any leading questions. As I complete more usability testing and develop my techniques, I am sure this will improve.

I am very satisfied with the outcomes of this usability test. Many of the behaviours that I’ve incorporated into the design were recognised, and I think this is largely because I borrowed some recognisable artefacts from Apple’s iOS. Such artefacts include a toggle switch, a tab bar, and a drawer for images (much like common photo album applications). The Tester’s sudden recognition of the toggle switch’s change in state at 05:47 is testament to how recognisable UI artefacts and their resulting behaviours ease the user into a new experience. Recognition of gestures such as tapping to make selections, as well as swiping around the screen to move the object, also aligns with my earlier research: “standardised interaction schemes are required to overcome any limitation in user understanding” (Craig, 2013).

The Tester embraced the interactive walkthrough, although there was some reliance on it, as evidenced by the Tester not proceeding to the Freeze Frames screen without the walkthrough being reinstated. This raises an interesting debate between interactive walkthroughs and other forms of onboarding, which I haven’t explored in this project – a potential research idea for the future.

Actions:

There are three issues identified within the session that may require remedial work as I produce the medium-fidelity prototype. These are:

  • Confirmation that placement ring/target is misunderstood as an object. It is very likely that the ‘target’ (Andaluz et al., 2019) is confused as an object due to my drawing abilities and the paper prototype as a medium.
  • Tester would like to press the ‘Freeze’ button to move the object. Voices concern about putting their finger over the object they’d like to move because they’d be ‘covering the object’.
    To some extent this may be due to the presentation of a paper prototype. However, from my research of other augmented reality experiences, I have noticed that small, translucent handles often appear around objects to give the user something to hold onto without covering the object itself.
  • Onboarding process was required again to prompt tester to continue to try reviewing a Freeze Frame. They did not continue to explore the functionality without prompting
    I suspect this may be a pitfall of instructional walkthroughs, i.e. how can the user be sure that the walkthrough has finished and is not still waiting for them to continue? However, having not yet thoroughly researched onboarding, there may be methods of producing walkthroughs that prevent such reliance. I will explore this further!
  • Unclear whether Tester is aware that their first ‘Freeze Frame’, taken during the onboarding process, has already been stored in the app, but isn’t ‘saved’ to the OS’s Camera Roll… 
    • Perhaps there is room for confusion here. The ‘Freeze Frame’ has actually been saved in the app, but not saved to the OS’s Camera Roll. Maybe clearer terminology and breadcrumb explanation needed? ‘Save’ vs ‘Export’?

There isn’t yet any confirmation for the user that freezing the screen, i.e. creating a freeze frame, actually saves that frozen image anywhere. This will need to be explained in the onboarding process. I do feel that the terminology also needs to be clarified.

Saving should happen automatically within the app, facilitating a review opportunity on the Freeze Frames screen. An option to ‘Export’ or delete each Freeze Frame should be given later, when each ‘Freeze Frame’ is being reviewed. This keeps to the behaviour described by Herskovitz et al. (2020).

Informed Consent Form


References

Andaluz, V., Mora-Aguilar, J., Sarzosa, D., Santana, J., Acosta, A. and Naranjo, C. (2019). Augmented Reality in Laboratory’s Instruments, Teaching and Interaction Learning. Augmented Reality, Virtual Reality, and Computer Graphics : 6th International Conference, [online] 11614. Available at: https://ebookcentral.proquest.com/lib/winchester/detail.action?pq-origsite=primo&docID=5923384#goto_toc [Accessed 30 Sep. 2021].

Craig, A.B. (2013). Understanding Augmented Reality: Concepts and Applications. San Diego: Elsevier Science & Technology Books.

Herskovitz, J., Wu, J., White, S., Pavel, A., Reyes, G., Guo, A. and Bigham, J. (2020). Making Mobile Augmented Reality Applications Accessible. ASSETS ’20: International ACM SIGACCESS Conference on Computers and Accessibility. [online] Available at: https://dl.acm.org/doi/10.1145/3373625.3417006 [Accessed 26 Sep. 2021].
