Estée Lauder applies AI, AR for cosmetics accessibility
The original design had the user take a selfie, which the algorithm analyzed to assess the uniformity of application before offering guidance. The team soon moved to real-time video instead, letting the app scan the user’s face continuously. If the video shows that foundation or lipstick has been applied unevenly, the app gives verbal descriptions of the specific areas that need a touchup and guidance on how to correct the issue. The user can then make adjustments and rescan, and the app confirms when everything is correctly applied.
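The scan-and-feedback loop described above can be sketched roughly as follows. This is a minimal illustration of the general pattern, not ELC's actual implementation: the region names, evenness scores, and threshold are all assumptions made for the example.

```python
# Hypothetical sketch of a scan-and-feedback loop like the one the
# article describes. Region names, scores, and the threshold are
# illustrative assumptions, not Estée Lauder's actual code.

EVENNESS_THRESHOLD = 0.85  # assumed cutoff for "evenly applied"

def feedback_for_scan(region_scores):
    """Return verbal touch-up prompts for each face region whose
    evenness score falls below the threshold."""
    return [
        f"Touch up needed near your {region}."
        for region, score in region_scores.items()
        if score < EVENNESS_THRESHOLD
    ]

def assistant_prompt(region_scores):
    """Simulate one scan cycle: read out touch-up guidance, or
    confirm that everything is correctly applied."""
    issues = feedback_for_scan(region_scores)
    if not issues:
        return "Looks great. Your makeup is evenly applied."
    return " ".join(issues)

# One cycle: scores would come from the video analysis model.
print(assistant_prompt({"left cheek": 0.62, "forehead": 0.93}))
print(assistant_prompt({"left cheek": 0.91, "forehead": 0.93}))
```

In the real app these prompts would be spoken through the device's screen reader rather than printed, which is consistent with the user-research finding below that users preferred their own familiar voice settings.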
First priority, however, was for Aidan and his team to directly engage with the visually impaired community. “We really wanted to gain an understanding of what their unique needs were, their pain points and preferences, and what they desired from our products,” he says. “We pulled together focus groups and asked questions, but mostly listened to them about their personal experiences with makeup and technology.”
Importantly, he says, some focus group members were completely blind, some experienced varying degrees of low vision, and some had excellent peripheral vision. This enabled his team to gain insight from a variety of individual experiences and to question their own assumptions.
“We assumed that a natural, more humanistic sounding voice would be the preference for the implementation, but the user research confirmed that familiarity was actually most important to our users,” Aidan says. “Whatever they had set up on their device is what they wanted to experience.”
The team also partnered with internal advocacy groups at ELC, external advocacy groups, and experts in accessibility and inclusivity, combining their insights with feedback from the focus groups to gather requirements for VMA. The team then drew on that user research for everything from naming the application to getting the tone of voice just right.
“Throughout the design, build, and test phases, their feedback was informing our decisions, even the small features like being able to adjust the speed of the virtual assistant’s speech,” Aidan says.