DM7908 Week 10 – Experimenting with 3D Objects and Blender

This week my focus has been on creating 3D virtual versions of my client’s products (a makeup bag) and pulling these together to create a short video to place on the product page and to emulate an AR experience. In DM7903 I experimented with photogrammetry to create virtual 3D models, importing them into Apple’s Reality Capture software and screen-recording the outcomes; during this time I also dabbled with combining Adobe After Effects and Cinema 4D Lite with the 3D models to emulate an AR experience, but this resulted in very large render times.

Working within Constraints

Using Adobe After Effects and Cinema 4D Lite for this project would not be suitable given the project constraints. An Adobe subscription costs between £20 and £30 per month, and the learning curve for mastering both pieces of software would be too steep. Instead, I have opted to use the 3D software package Blender for this project, which is free, has a gentler learning curve, and offers a wealth of free tutorial material online to foster learning.

Having never used Blender before, I sat down with lecturer Rob Blofield for a tutorial. I had already created three photogrammetry models of a make-up bag at different stages of being unzipped (see below). The tutorial was very beneficial in acquainting me with the user interface and the possibilities available. The outcome produced in the session was very impressive: the make-up bag was animated so that it effectively zipped itself up during the animation; however, this was very complicated and advanced to achieve, so it would not be suitable for my client.

Above: Three photogrammetry scans of a make-up bag

First Experimental Render

I decided to use Blender to create a stop-motion photogrammetry animation of the product rotating, tilting, and unzipping. My first attempt is shown below:

Above: My first experimental render in Blender

Step-By-Step Recorded Process and Troubleshooting

I have included a bullet-pointed description of my creative process below:

  • Convert the .USDZ files to .USDC, simply by treating each .USDZ file as a zip archive and unzipping it. This separates the textures from the model, in a format that is compatible with Blender
  • Once each model was imported, I assigned the correct texture to each one. I made sure to set the surface parameter to ‘Principled BSDF’, as this would support transparency in the Cycles render engine
  • Then, I adjusted the positioning and scale data of each model so that they were all similarly aligned. Later, when transitioning between each model, their similar positioning should allow each transition to appear smooth
  • Next, using keyframes and the timeline, I began to animate the rotation, tilt, and visibility of each model. The timing of each movement needed to reflect the movement of the user interface in the high-fidelity prototype, created in Figma. I noted the timings below with respect to working at 25 frames per second, and worked to those
  • Finally, I set the background for the render as plain white (sRGB colour space, default vector, strength = 2000). I decided to do this so that the resulting render would match the white background of Blossom & Easel’s product page, creating a seamless experience between video and page background.
Above: Three imported photogrammetry models
Above: A working document, noting animations and timings (at 25fps)
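To keep the working document and the Blender timeline in sync, a tiny helper for converting noted timings into frame numbers at 25fps is useful. This is just a sketch; the example timings are illustrative, not taken from the actual working document:

```python
FPS = 25  # project frame rate, matching the working document

def to_frame(seconds, fps=FPS):
    """Convert a timing in seconds to the nearest whole frame number."""
    return round(seconds * fps)

# e.g. a hypothetical movement noted to start at 1.2s and end at 3.0s
start, end = to_frame(1.2), to_frame(3.0)  # frames 30 and 75
```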

The one troubleshooting aspect that I needed to address involved the visibility/transparency of objects. I decided to use the Cycles render engine rather than the default, Eevee, as it handled transparency much better; when using Eevee, transparent areas were rendered as solid black objects. Some mysterious black outlines still appeared when using Cycles, however I was able to address this by increasing the number of Transparent bounces in the Light Paths tab.
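For future reference, the same two settings can be applied through Blender's Python API. This is a sketch that only runs inside Blender's bundled Python (property names as found in Blender 3.x; the bounce value is an assumption, not the exact figure I used):

```python
import bpy  # Blender's Python API; only available inside Blender

scene = bpy.context.scene
scene.render.engine = 'CYCLES'             # Eevee rendered transparent areas as solid black
scene.cycles.transparent_max_bounces = 32  # Light Paths > Max Bounces > Transparent;
                                           # raising this removed the black outlines
```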

Lighting and Camera

When setting up the lighting and camera for the renders, I was able to use my background knowledge from studying a degree in Photography.

For lighting, I used three area lights, one placed above the 3D virtual object (set to 80W), and the other two placed either side of the object (set to 120W). The light above the object would lighten any dark shadows, allowing details to be seen, while the bright lights either side would emphasise the 3D qualities of the object.

The camera would be placed in front of the object, with the object filling the frame. A 50mm lens would be used to reduce any chance of lens distortion, and no bokeh (depth-of-field) effect would be applied; although this would provide depth, it could be self-defeating by impacting the user’s view of the object.

When creating the second render, I needed to slightly tilt the camera during the animation so that the object would stay in frame. This was achieved simply using keyframes.

Second Experimental Render

Below is my second experimental attempt:

Above: A YouTube video depicting the second experimental render

To create these previews, I’ve been experimenting with outputting the renders at different resolutions and quality settings. I’m keeping both as low as possible to save rendering time; however, the final render will be optimised for the dimensions required by Figma for the high-fidelity prototype.

Third Experimental Render

For my final experimental render, I had only a few adjustments to make. Firstly, I wanted the product’s first impression to be a side view, as this would best showcase the design and artwork to the consumer, so I adjusted its datum rotation angle on the Y-axis to 43 degrees (see below).

I also felt that the object was being slightly over-exposed by the light above it, causing it to appear ‘bleached’ and reducing the consumer’s ability to observe the design, artwork, and zip. Parts of the object nearest the camera were also a little darker than those further away, which felt jarring. To rectify this I reduced the light’s power to 70W, and moved it slightly nearer the camera’s position. Below you can see a side-by-side comparison.

Above: The 3D virtual object rotated to 43 degrees on its Y-axis

Above: A YouTube video depicting the third experimental render

Next Steps

My next step will be to create more photogrammetry models of the make-up bag (approximately five, representing different stages of being unzipped), as well as some make-up paraphernalia such as lipsticks, which will be introduced to the final render for scale. I will then complete the above process again, and import the outcomes into the product page of my final prototype.

The AR experience will be created using Apple’s Reality Capture, as this is a free and user-friendly solution.

DM7908 Week 10 – Blender Tutorials

In this blog post I’m sharing links to YouTube tutorials that I’ve watched to supplement the tutorial that I’d organised with my lecturer, Rob Blofield.

Many photogrammetry applications on Apple’s platforms produce virtual models as .USDZ files, which is Apple’s flavour of Pixar’s Universal Scene Description format. .USDZ files are not compatible with Blender and must be converted using the simple method outlined in the video above. Once converted, importing the models and their textures is a simple process.

From the above two videos, I was able to learn how to adjust the transparency of models in Blender to produce a fade-in/fade-out effect. This effect would be an essential part of producing stop motion using my photogrammetry scans. The method in the lowermost video produced the most desirable effect when using the Cycles render engine and the ‘Principled BSDF’ surface parameter for the objects.

Regarding lighting and cameras, these behave very similarly to their counterparts in Maxon’s Cinema 4D Lite, so the learning curve here was shorter for me. I referred to the links below to ascertain the types of lighting and camera controls available.

Lighting: https://renderguide.com/blender-lighting-tutorial/

Cameras: https://docs.blender.org/manual/en/latest/render/cameras.html

DM7908 Week 9 – Figma Tutorial Notes

I began by watching a video of a UX designer demonstrating the same design process in both Adobe XD and Figma. Her methods allowed me to see that, on the whole, the applications are similar; however, they take slightly different approaches to some tasks.

Video link: https://youtu.be/r1alNWC2ZlU

Notes:

  • Adobe XD and Figma shortcuts are different
  • Adobe XD – “Icons for Design” plug in, might be useful for future mockups
  • Idea for ‘Design Selector’ on my prototype – overlap circles, but move one slightly to the side, then use the intersect tool. Set the resulting shape to a blend of each circle’s colour (see Fig.1)
  • Figma – Scrolling may be inverted (could be Youtuber’s personalised setup)
  • Layout between applications is very similar, but Figma’s is simplified with advanced options in OS’ toolbar menu
  • In Figma, font size does not adjust with text box size, unlike Adobe XD
  • Figma – ‘Auto layout’ on buttons keeps padding size the same, regardless of button size
  • Icon plug-ins function similarly to Adobe XD’s, but the selection is potentially smaller
Fig 1: Intersected circles

Having been reassured of the surface-level similarities between the applications, such as the functionality and layout, I decided to dive in a little further. I found a YouTube tutorial that claimed to teach Figma in 24 minutes, and I was hoping to learn more about the advantages of Figma over Adobe XD, especially as I have heard that Figma is quite widely used by in-house designers at retailers such as New Look.

Video link: https://youtu.be/FTFaQWZBqQ8

Notes:

  • Figma has good collaborative functionality and can work in-browser (useful should the user be working on a computer that doesn’t support the downloadable application)
  • Figmaresources.com – Lots of free resources and templates available
  • “Evericons” = Resource pack with a lot of common icons
  • ‘Duplicate to your drafts’ function allows you to copy other designers’ graphics and files for a head start on prototyping or collaborating
  • Shortcuts
    • R = Rectangle
    • Option (Mac), Alt (Windows) = Show spacing to nearest objects on X/Y planes

It appears that Figma’s collaborative qualities, as well as its ability to work on a large variety of machines, are potentially why it is favoured so much in the UX industry. I’m really pleased to learn of its similarities to Adobe XD, and I’m looking forward to using it in this project to create my high-fidelity prototypes.

As I will be including 3D models in my prototypes (scanned using photogrammetry), they will likely need to be imported as video files. So, I have also watched the below tutorial on how to use the ‘Anima’ plugin with Figma to achieve this. I have some prior knowledge of using this plugin with Adobe XD for the DM7903 project.

Video link: https://youtu.be/gpAJ6hJ3eFk

References:

AJ&Smart (2020). Figma UI Design Tutorial: Get Started in Just 24 Minutes! (2021). [online] Youtube. Available at: https://youtu.be/FTFaQWZBqQ8 [Accessed 10 Aug. 2022].

Beard, M. (2022). Figma vs. Adobe Xd Design with Me | How Different Are they? [online] Youtube. Available at: https://youtu.be/r1alNWC2ZlU [Accessed 10 Aug. 2022].

Tech Phoenix Media (2021). Add Videos to Your Designs in Figma Using Anima Plugin. [online] Youtube. Available at: https://youtu.be/gpAJ6hJ3eFk [Accessed 10 Aug. 2022].

DM7908 Week 9 – Mockups and Medium Fidelity Prototyping

This week I’ve been adjusting my low-fidelity prototype in response to the usability test, and interpreting it as several iterative medium-fidelity mockups.

Please note: so I can allocate more time in this module to learning high-fidelity software such as InVision or Figma, I have decided to shorten the medium-fidelity prototyping stage by producing non-interactive mock-ups in Apple’s Keynote software.

At medium-fidelity level, I am paying closer attention to the colours, graphics, typefaces and font sizes, as well as the flow between the product page and the augmented reality experience. The prototype will be produced at some speed in order to maximise the amount of time I spend on producing a high-fidelity prototype using professional software, hence the use of Apple Keynote software, which I am very accustomed to using.

Stage 1

Above: A Keynote slide featuring screenshots of the first development stage at Medium Fidelity

Initially, I spent about 30 minutes interpreting my low-fidelity paper prototype using graphics available in both Keynote and Apple’s iOS 13 design resources (the latest available on their website). This permitted me to produce a convincingly realistic user interface (at operating-system level, at least) in a short space of time. I paid no attention to the Blossom & Easel corporate image (graphics, typeface, imagery, etc.) at this stage, but did consider layout, particularly implementing gestalt theory by linking and dividing sections using proximity and similarity; however, this was still very rough.

Above: Designing of the heading bar in Apple’s Keynote

Some elements, such as the Blossom & Easel webpage header, needed to be created from scratch and were informed by my research of the client’s current mobile experience. I used a placeholder typeface for convenience, and borrowed icons from Apple’s ‘SF Symbols’ application, which catalogues all symbols available on iOS, macOS, tvOS, and watchOS as vector graphics. I did need to create the hamburger menu icon myself, as these menus do not sit within Apple’s approach to UX design.

Above: Safe area guidance from Apple’s Human Interface Guidelines being applied to the Medium Fidelity Prototype

As a final consideration at this stage, I applied Apple’s safe area guidance as instructed for an iPhone 11. Due to the variation in pixel density available on the smartphone market, I am designing at 1x scale, which would be interpreted as 2x on my iPhone 11 (Belinski, n.d.; Apple, n.d.; karthikeyan, 2017). As I am designing with vector graphics, I do not need to be concerned with interpolation that may come with upscaling of raster graphics (Malewicz, 2021).

Stage 2

Above: Two Keynote slides with screenshots of the second development stage at Medium Fidelity

The following day I revisited my initial medium fidelity mock up, taking it to the second stage by adding placeholder graphics and imagery, amending typefaces to reflect the Blossom & Easel branding guidelines, and making further refinements regarding the gestalt principles.

A particularly interesting debate I had with myself was whether to left-justify or centrally-justify the ‘Seam’ and ‘Zip’ text. Centrally justifying the text would continue the corporate image established throughout the design and maintain visual symmetry, however left-justification would closely associate the text with the image adjacent to it (Buninux, 2021).

Another update to the experience was the inclusion of an onboarding prompt, “Drag to Rotate”. Without this prompt, it may not be clear to the user that they can swipe across the 3D model to rotate it and inspect its properties. My intention was for the prompt to display for a few seconds and then fade away, unless the user follows the prompt immediately, in which case the prompt will fade straight away.

Stage 3

Above: A comparison of two contrast ratio options

In the final stage of medium-fidelity prototyping, I focused on critiquing and improving some of the visual design decisions. One example was my critique of colour contrast, with the aim of addressing accessibility for users living with vision impairments. In the above image I adjust the background colour of the product page’s footer, increasing its colour contrast ratio from 3.11:1 to 7:1 and meeting WCAG AAA guidelines (WebAIM, 2021).
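That contrast check can be reproduced with the WCAG 2 formula. A minimal sketch follows; my actual footer colours aren’t reproduced here, so the worked example uses black on white (ratio 21:1, the maximum):

```python
def relative_luminance(rgb):
    """WCAG 2 relative luminance from an (R, G, B) tuple of 0-255 values."""
    def linearise(c):
        c = c / 255
        return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4
    r, g, b = (linearise(c) for c in rgb)
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def contrast_ratio(colour_a, colour_b):
    """Contrast ratio between two colours, always >= 1:1; AAA body text needs >= 7:1."""
    lighter, darker = sorted(
        (relative_luminance(colour_a), relative_luminance(colour_b)), reverse=True
    )
    return (lighter + 0.05) / (darker + 0.05)

print(contrast_ratio((0, 0, 0), (255, 255, 255)))  # 21.0
```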

Above: Two Keynote slides with screenshots of the third development stage at Medium Fidelity

Further adjustments involved updating the design selector with real product design patterns, and amending the onboarding prompts to reflect the OS-level prompt graphics. By aligning the appearance of website prompts with the OS-level prompts, I aim to leverage the familiarity the user has with them, and so I can anticipate that the user will read and understand the prompts’ behaviour.

Although animations are not usually considered at medium-fidelity level, I made an exception for this module; as the interactivity between the 3D photogrammetry models and the page content is key to the experience, it seemed necessary to visualise this in case further layout amendments were required (which must be completed before high-fidelity prototyping). This visualisation process was successful and showed that only one amendment was required, relating to the design selector (see below).

When considering the animations, there was also an opportunity to apply the parallax effect. As well as serving a pleasing aesthetic, the effect also enables the separation of page elements and the production of depth. A good example of this is the behaviour of the 3D object and the nail-polish graphic: as both graphics scroll upwards, the nail polish moves faster, separating the graphics and providing a depth that could not be created if these elements were presented as one image.

In the above comparison video, two behaviours regarding the interaction between the 3D model and the design selector are visible. I decided that the left-most option would be most appropriate, as down-sizing the design selector would reduce the size of its tap targets, resulting in functional issues that would negatively impact the user experience (Harley, 2019; Parhi, Karlson and Bederson, 2006).

AR Experience

Above: The AR Experience UI after the first stage of medium-fidelity prototyping
Above: The AR Experience UI after the second stage of medium-fidelity prototyping

When creating the medium-fidelity prototype of the AR experience, I was able to efficiently import many page elements from the product page, including the ‘Purchase bar’ and design selector. Some elements, such as the shutter button, needed to be created from scratch, so I opted for vector graphics so that they would scale cleanly across many screen sizes.

In addition to the design selector that I created on the product page, I have added a stroke around the outside of each design, and a small underline below the selected design. It seemed important to indicate which design had been selected, and although a coloured stroke alone could do this, it would be inaccessible for some users, so I decided to include a separate underline too (Guy, 2014). As this seems to be a good improvement, I will also work it into the high-fidelity version of the product page.

This has been a successful and productive week of prototyping. In a real-world context, I would now be looking to complete usability testing in relation to the areas of interest (AOIs) introduced at this stage (colour, typeface, graphics, and imagery), to ascertain whether formative feedback could provide insight into further improvements. Now that the design process has reached a digital stage, there is the possibility for technologies such as eye-tracking to be used, permitting a deeper understanding of how users will interact with page elements (Bergstrom and Schall, 2014).

Next week I plan to focus on creating photogrammetry scans on Blossom & Easel’s make-up bag(s), and experimenting with animating them, ready for the creation of a high-fidelity prototype.

References

Apple (n.d.). Layout – Foundations – Human Interface Guidelines – Design – Apple Developer. [online] developer.apple.com. Available at: https://developer.apple.com/design/human-interface-guidelines/foundations/layout [Accessed 12 Aug. 2022].

Belinski, E. (n.d.). Resolution by iOS device — iOS Ref. [online] iosref.com. Available at: https://iosref.com/res [Accessed 12 Aug. 2022].

Bergstrom, J.R. and Schall, A.J. eds., (2014). Eye Tracking in User Experience Design. Morgan Kaufman. doi:10.1016/c2012-0-06867-6.

Buninux (2021). Text Alignment Best Practises. [online] Medium. Available at: https://blog.prototypr.io/text-alignment-best-practises-c4114daf1a9b [Accessed 7 Aug. 2022].

Guy, T. (2014). Usability Tip: Don’t Rely on Color to Convey Your Message. [online] UX Magazine. Available at: https://uxmag.com/articles/usability-tip-dont-rely-on-color-to-convey-your-message?rate=ijTgGDWgA0pQifcW0TxUqd_wtNxkg8Jug4a0Z_cAolM [Accessed 10 Aug. 2022].

Harley, A. (2019). Touch Targets on Touchscreens. [online] Nielsen Norman Group. Available at: https://www.nngroup.com/articles/touch-target-size/ [Accessed 11 Aug. 2022].

karthikeyan (2017). Autolayout – iOS 11 Layout Guidance about Safe Area for iPhone X. [online] Stack Overflow. Available at: https://stackoverflow.com/questions/46344381/ios-11-layout-guidance-about-safe-area-for-iphone-x [Accessed 12 Aug. 2022].

Malewicz, M. (2021). UI Design Basics: Screens. [online] Medium. Available at: https://uxdesign.cc/ui-design-basics-screens-734bfbeffca9 [Accessed 12 Aug. 2022].

Parhi, P., Karlson, A.K. and Bederson, B.B. (2006). Target Size Study for one-handed Thumb Use on Small Touchscreen Devices. Proceedings of the 8th Conference on Human-computer Interaction with Mobile Devices and Services – MobileHCI ’06, [online] pp.203, 210. doi:10.1145/1152215.1152260.

WebAIM (2021). WebAIM: Contrast and Color Accessibility – Understanding WCAG 2 Contrast and Color Requirements. [online] WebAIM. Available at: https://webaim.org/articles/contrast/#ratio [Accessed 8 Aug. 2022].‌

DM7908 Week 5.3 – Wireframes

Satisfied with my early sketches, I’ve now interpreted these as scaled wireframes. As the MA programme has progressed I have come to appreciate the iterative development process – I have found that the opportunity to review designs and interpret them as higher-fidelity versions permits me to critique those designs, find flaws, and suggest improvements. This presentation documents some of the design debates that I encountered, mostly concerning the placement of user interface elements, which required me to find a balance between accessibility and aesthetics.

Wireframes Presentation

ISSUU Presentation

Transcript

References

Babu, R. (2019). Inclusivity guide: Usability Design for Left Handedness 101. [online] Medium. Available at: https://uxdesign.cc/inclusivity-guide-usability-design-for-left-handedness-101-2bc0265ae21e [Accessed 24 Jul. 2022].

Nadir, M. (2021). Left Handedness and Mobile User Experience. [online] Medium. Available at: https://bootcamp.uxdesign.cc/left-handedness-and-mobile-user-experience-a3728c72f880 [Accessed 25 Jul. 2022].

Nielsen, J. (2005). Scrolling and Scrollbars. [online] Nielsen Norman Group. Available at: https://www.nngroup.com/articles/scrolling-and-scrollbars/ [Accessed 24 Jul. 2022].

Sherwin, K. (2019). UX Guidelines for Ecommerce Product Pages. [online] Nielsen Norman Group. Available at: https://www.nngroup.com/articles/ecommerce-product-pages/.‌

DM7908 Week 5.2 – Early Sketches

Since my positive meeting with Christopher earlier this week, and learning that he’s interested to see the visual development of the project, I have moved forward with producing some early sketches.

In the below presentation, uploaded to YouTube, I discuss some of the sketches and my design ideas further. Please note that the presentation contains some key points about accessibility, functionality, and links to theory in my DM7915 proposal.

Early Sketches Presentation

ISSUU Presentation

Transcript

References

Heller, J., Chylinski, M., de Ruyter, K., Mahr, D. and Keeling, D.I. (2019). Let Me Imagine That for You: Transforming the Retail Frontline through Augmenting Customer Mental Imagery Ability. Journal of Retailing. doi:10.1016/j.jretai.2019.03.005.

Løkke‐Andersen, C.B., Wang, Q.J. and Giacalone, D. (2021). User Experience Design Approaches for Accommodating High ‘Need for Touch’ Consumers in Ecommerce. Journal of Sensory Studies. [online] doi:10.1111/joss.12727.

Peck, J., Barger, V.A. and Webb, A. (2013). In Search of a Surrogate for touch: the Effect of Haptic Imagery on Perceived Ownership. Journal of Consumer Psychology, [online] 23(2), pp.189–196. doi:10.1016/j.jcps.2012.09.001.

DM7908 Week 5.1 – Meeting with Client

Yesterday, I met with the Founding Director of Edwards of England/Blossom & Easel, Christopher Johnson, to discuss some of the outcomes of the research to date. He was particularly interested in the results of my competitive audit and very keen to learn how ‘digitising’ the products via techniques such as photogrammetry could improve sales and possibly allow him to place products in the metaverse.

Below I have listed my notes from the meeting as well as the action points.

Meeting Notes

  • Christopher was very interested in the results of my competitive audit, and agreed with some of the outcomes. Requested a copy and intends to action some of the areas where B&E does not align with its competitors
  • Very constructive discussion about how product qualities (weight, capacity, colour/design, texture, build quality, etc.) can be captured in media. John Lewis examples contrasted with Sainsbury’s provided a clear explanation and demonstration
  • Christopher understood that much of this research would be theoretical, and attaining quantitative/qualitative feedback from customers was beyond the scope of this project
  • Christopher agreed that providing many sources of information (images, videos, text description etc) could be overwhelming for some customers due to cognitive load. Good discussion around how one visual form of media could suffice (e.g.: teleshopping)
  • Christopher mentioned an interest in ‘digitising’ some of his products for sale in the metaverse. Just something he’d like to explore
  • Overall, Chris was very supportive and remains interested in the project and has agreed to the actions listed below

Actions:
Message Christopher Johnson regarding:

  • Gaining access to analytics data
  • Requesting a makeup bag for prototyping
  • Typeface and font files for B&E branding

DM7908 Week 4.2 – Augmented Reality Research

Alongside my competitive audit, this week I’ve also been analysing a few similar augmented reality deployments alongside the required technologies to create them. Along the way, I’ve been highlighting elements of inspiring AR deployments that my client could benefit from having in their experience.

I have compiled this research into a Keynote presentation on ISSUU, available below, and have presented it as a short YouTube presentation too.

Presentation Transcript

References

Adobe (n.d.). Create Augmented Reality | Adobe Aero. [online] Adobe.com. Available at: https://www.adobe.com/uk/products/aero.html [Accessed 9 Jul. 2022].

Alper Guler (2021). Converse Skate Park – AR Virtual Try on. [online] Youtube. Available at: https://www.youtube.com/shorts/5kRAHM3L1F4 [Accessed 10 Aug. 2022].

Apple (2020). Apple Developer Documentation – ARKit. [online] Apple.com. Available at: https://developer.apple.com/documentation/arkit/ [Accessed 16 Jul. 2022].

Apple (2020). ARKit | Apple Developer Documentation. [online] Apple.com. Available at: https://developer.apple.com/documentation/arkit/ [Accessed 2 Aug. 2022].

Apple (n.d.). Apple Developer Documentation – Adding Realistic Reflections to an AR Experience. [online] Apple Developer. Available at: https://developer.apple.com/documentation/arkit/camera_lighting_and_effects/adding_realistic_reflections_to_an_ar_experience [Accessed 16 Jul. 2022].

Bulgari (2016). Serpenti Forever Top Handle. [online] Bvlgari. Available at: https://www.bulgari.com/en-gb/bags-and-accessories/womens/bags/top-handle-bags/serpenti-forever-top-handle-calf-leather-green-290569 [Accessed 8 Jul. 2022].

Bulgari (2019). BVLGARI TOVCH. Tap to Dive into Our world. [online] Youtube. Available at: https://www.youtube.com/watch?v=eMn5kNrZHsM [Accessed 10 Jul. 2022].

Design Hubz (n.d.). Designhubz 3D & AR | Web-based Augmented Reality (AR) for ecommerce. [online] Designhubz. Available at: https://designhubz.com [Accessed 5 Jul. 2022].

Facebook (n.d.). Spark AR Studio – Create Augmented Reality Experiences | Spark AR Studio. [online] Spark AR Studio. Available at: https://sparkar.facebook.com/ar-studio [Accessed 10 Jul. 2022].

Genius Ventures (2021). Luxury Handbags AR Commerce by Genius Ventures Inc | Augmented Reality for Fashion Brands | Spark AR. [online] Youtube. Available at: https://www.youtube.com/watch?v=8MAWU2vpNvk&list=PLfCFtdiO4_nyPtl4izl3yvLGUUpYIGtyR&index=3 [Accessed 10 Aug. 2022].

Genius Ventures (n.d.). Augmented Reality Business Solutions by Genius Ventures Inc | AR | XR. [online] Genius Ventures Inc. Available at: https://geniusventuresinc.com/services/ [Accessed 9 Jul. 2022].

Online Campaigns (2012). Converse the Sampler iPhone App. [online] Youtube. Available at: https://www.youtube.com/watch?v=lBQzXi04JpE [Accessed 10 Jul. 2022].

Snapchat (n.d.). Lens Studio Marketing. [online] Snapchat Lens Studio. Available at: https://ar.snap.com/lens-studio [Accessed 10 Jul. 2022].

Tech Magic (2021). Bulgari Iconic Bag in Augmented Reality. [online] Youtube. Available at: https://www.youtube.com/shorts/OWMYi7QmcWU [Accessed 9 Jul. 2022].

DM7908 Week 4.1 – Competitive Audit

In order to better understand my client’s position in the retail market and learn how their online presence compares to key competitors, I have carried out a competitive audit. My intention is that a competitive audit will inform some of the upcoming design decisions in a redesign of the Blossom & Easel website. I will also highlight any potential usability problems in competitors’ websites and review gaps in the market where there is opportunity to innovate. Overall, this process aims to improve the efficiency of the forthcoming design process by maximising the return on time, money and energy (Google, n.d.).

During the previous module I produced a research proposal that listed four large-scale competitors of Blossom & Easel: Sainsbury’s, Next, John Lewis, and ASOS. In addition to these, Christopher Johnson has informed me of a further three competitors that are similar in size to Blossom & Easel.

All of the competitors analysed are in direct competition for a very similar audience. I have been able to analyse each competitor’s strengths and weaknesses regarding their online presence, including aspects such as accessibility, user flow and navigation (across a product page), media (e.g. product images and video), and the tone/descriptiveness of written product descriptions.

All visuals that were inspected for the competitive audit, as well as the outcomes, have been presented as a Keynote presentation document here:

I have compiled the data from the competitive audit into a Microsoft Excel document, which is available here:

Summary

Above: A SWOT analysis comprising results from the Competitive Audit

To summarise the competitive audit, it appears that the inclusion of emerging media could be a unique opportunity to aid the confidence of high-NFT (need for touch) consumers, allowing Blossom & Easel to increase the endowment effect among their customers in a manner that their competitors do not currently achieve. However, any experience must address the salient qualities of the product (Peck, Barger and Webb, 2013); some similarly priced competitors demonstrate product capacity in their imagery, providing consumers with confidence by alluding to scale and build quality using methods that Blossom & Easel currently do not.

I’m very much looking forward to presenting this data to the client at a meeting next week.

References

Google (n.d.). Start the UX Design Process: Empathize, Define, and Ideate: Week 5 – Competitive Audits. [online] Coursera. Available at: https://www.coursera.org/learn/start-ux-design-process/home/ [Accessed 28 Jun. 2022].

Peck, J., Barger, V.A. and Webb, A. (2013). In Search of a Surrogate for touch: the Effect of Haptic Imagery on Perceived Ownership. Journal of Consumer Psychology, [online] 23(2), pp.189–196. doi:10.1016/j.jcps.2012.09.001.

DM7908 Week 2.3 – Content Directory, Site Structure, and User Flow

Content Inventory / Directory

Above: A Content Directory of the Blossom & Easel website, displayed as a mind map

As my project would involve rebuilding the product page on my client’s website several times (as part of an iterative prototyping process), it was necessary for me to catalogue all of the page’s content. Above, I have recorded every page element and allocated it to a section, e.g. “Image gallery section” or “Product description”. Not all page elements may find their way into my final prototype, as the inventory process has allowed me to visualise and audit the content before implementing features back into a redesign (Cronenwett, 2019). It may be the case that content is outdated or replaceable with newer technology (i.e. photogrammetry scans may remove the need for an image gallery).

Site Structure Diagram

Above: A keynote slide recording the site structure of Blossom & Easel’s current mobile experience

My next task was to create a site structure diagram, which would assist me in understanding the information architecture of the Blossom & Easel website. Although this project will not see me alter the entire site structure, my adjustments to the product pages may impact where information can be found. Based on my findings, it appears that the product pages would be the most suitable area of focus for my project, as this is the only area of the website that addresses specific products; the ‘Story’, ‘About Us’, and ‘Contact’ sections address the company and its ethos, but not the products directly.

User Flow

Above: A flow diagram that captures the product purchase flow on the current Blossom & Easel mobile experience

This led me to explore how the user may interact with the Blossom & Easel website. Until I am permitted to view any analytics data from the website, I must work via assumptions; and that process starts by visualising a common flow that the consumer undertakes – purchasing a product.

Highlighted in red, I have identified the part of this flow that involves consumer decision-making. In this process the consumer is required to decide whether to purchase the product, and this is the area that may become problematic for high-need-for-touch consumers, as outlined in my DM7915 project proposal. It is this specific part of the process that I intend to address via an iterative prototyping process, to ascertain how adaptations could be made to an e-commerce mobile platform to address high-need-for-touch consumers.

ISSUU Presentation

Please note, the above screenshots are available for viewing in a larger format via the ISSUU link below:

References

Cronenwett, D. (2019). Interaction Design: Structure. [online] LinkedIn. Available at: https://www.linkedin.com/learning/interaction-design-structure?u=89462170 [Accessed 5 Jul. 2022].‌