
AI & UX in the digital age

Insight Published on 02 June 2024

In the digital age, user experience (UX) is paramount for the success of any product or service. It ensures that users can achieve tasks easily and stay engaged rather than getting frustrated, making a complaint, or worse, moving to a different service altogether. 

UX is a process that requires close collaboration between creative, analytical and technical engineering skills. It is delivered by skilled professionals such as product designers, service designers, researchers, copywriters and software developers - all working closely together.

With Artificial Intelligence (AI), is all that about to change? Are we headed for a world where our applications, communications and websites are designed and built by 'UX bots'? The answer to that, as designers will often tell you, is 'it depends!'

In this article, we explore ways in which AI is helping to transform, assist, and inspire designers, developers and technologists who create UX, whilst highlighting some of the challenges and considerations when using AI for product development. 

How AI could impact UX

AI has two major potential impacts on UX (as well as some lesser ones). The first is that UX professionals can use AI to carry out tasks more quickly than they would ever be able to themselves. AI excels at generation and iteration, so using AI to create multiple variations of designs, code, tests, and other material makes a lot of logical and economic sense. 

The second major impact is in the opportunities for AI to be directly integrated into UX itself, thereby providing a way for apps and websites to feel 'predictive'. 

AI supporting UX practitioners

Content curation and creation 

AI-powered tools can assist with creating content. Using 'prompts', creatives can rapidly produce articles, infographics, images, page layouts, user interfaces, videos and even music and audio that are high quality, original, and 'directable' in style. Commonly used platforms include ChatGPT, Midjourney, Pi, Adobe Firefly, and Canva's Magic Studio.

While the speed, availability and ongoing improvement of these tools is impressive, they are not infallible. From my own experience, they currently shift more time onto curating and fine-tuning the generated assets, as the results are rarely exactly what a creative has in mind on the first iteration. In particular, AI is not very good at evaluating and choosing the best option from an emotional or aesthetic point of view. If you consider the complexity of human value judgement about the things we see, it's not hard to understand why AI can't do it.
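To make the prompt-driven workflow above a little more concrete, here is a minimal Python sketch that asks a text-generation API for several headline variations in one call. It assumes the OpenAI Python client is installed and an API key is configured; the model name and prompt are placeholders, and any of the platforms mentioned above could play a similar role.

# Sketch: generating several copy variations from one prompt for human review.
# Assumes the `openai` package is installed and OPENAI_API_KEY is set;
# the model name and prompt are illustrative placeholders.
from openai import OpenAI

client = OpenAI()

prompt = (
    "Write a short, friendly headline for a page that helps patients "
    "book a home visit from a healthcare professional."
)

response = client.chat.completions.create(
    model="gpt-4o-mini",  # placeholder model name
    messages=[{"role": "user", "content": prompt}],
    n=3,  # request three independent variations to compare
)

# A human still curates: print each option for review rather than auto-publishing.
for i, choice in enumerate(response.choices, start=1):
    print(f"Option {i}: {choice.message.content}")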

 

AI Generated images showing variations of a healthcare professional visiting a male patient
AI Generated imagery by Firefly using the prompt 'Healthcare professional visiting a male patient who is uncomfortable'

 

Consider the details of these two images. While both were created with similar prompts and are superficially similar in composition and content, a closer look reveals a sense of awkwardness within the left image compared to the right. Even in the right image the patient's facial expression is not clear to us. Is he angry? Determined? In pain? 

Another major caution with AI-generated content is bias. Bias is the result of AI being 'trained' on data sources that are flawed or unrepresentative, and it is a major problem - some say an intractable one - that skews AI's output and contributes to 'hallucinations'. According to IBM:

"The models upon which AI efforts are based absorb the biases of society that can be quietly embedded in the mountains of data they're trained on. Historically biased data collection that reflects societal inequity can result in harm to historically marginalized groups in use cases including hiring, policing, credit scoring and many others.”

Bias, and the avoidance of it, could form the basis of an entirely new article in itself. Suffice to say, it needs to be understood and carefully considered. It's another reason why AI can't replace the clarity and emotional intelligence of human review, reflection, and curation. 

User behaviour analysis 

By analysing large volumes of user data more quickly and accurately than a human could, AI tools can identify patterns and trends in behaviour. This can, in turn, highlight pain points, areas of improvement, and opportunities to enhance the user experience. 

For example, surfacing unexpected spikes in bounce rates, bursts of unexpected behaviour like rage clicks, or declines in engagement can rapidly alert service operators and help designers identify and resolve potential issues with the user experience.

Assisted by AI, designers can make data-informed decisions and create a more user-friendly and efficient interaction, improving overall user satisfaction. 
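As a rough illustration of the kind of pattern-spotting described above, the Python sketch below flags days where the bounce rate jumps well above its recent average using a simple rolling z-score. The figures, window and threshold are invented; real analytics tools use far richer models.

# Sketch: flagging unusual spikes in daily bounce rate with a rolling z-score.
# The values, window and threshold are invented for illustration.
import pandas as pd

bounce_rate = pd.Series(
    [0.32, 0.31, 0.33, 0.30, 0.34, 0.61, 0.33],  # hypothetical daily bounce rates
    index=pd.date_range("2024-05-01", periods=7, freq="D"),
)

rolling_mean = bounce_rate.rolling(window=3).mean()
rolling_std = bounce_rate.rolling(window=3).std()

# Compare each day with the statistics of the *previous* three days.
z_score = (bounce_rate - rolling_mean.shift(1)) / rolling_std.shift(1)

# Alert when a day sits more than three standard deviations above recent values.
alerts = bounce_rate[z_score > 3]
print(alerts)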

Accessibility 

There’s no doubt that AI has the potential to help make digital products and services more accessible for people with disabilities, but it’s early days.  

Pre-existing consumer software like screen readers, image recognition tools, speech-to-text and predictive auto-complete already help users with disabilities to interact with websites and apps. In theory, these tools can be improved and made even more useful with AI. 

While there are clear opportunities for AI to help make the digital world more inclusive, there are clearly articulated limitations to the existing tech: the diversity and complexity of individual needs being the most significant. 

There are also risks around the security of the sensitive personal information that may be needed to tailor the experience to those diverse, individual needs.

According to Ran Ronen, CEO of Equally AI, the theoretical emergence of Artificial General Intelligence (AGI) may provide some answers to the current limitations of conventional AI, as he discusses on Forbes:

"AGI promises to bring about revolution in the accessibility industry. Unlike conventional AI, which excels in specific tasks, AGI has the potential to think, learn, adapt and apply intelligence across a wide range of functions, much like a human, with the added capability of being inherently proactive in its approach.”

Embedding AI in the design and function of UX 

Beyond directly assisting designers, developers and other digital professionals, one of the most evident uses of AI in a design context is personalisation.

AI can help analyse and process the detail of a user's behaviour, preferences and interactions to create individually tailored experiences faster and more effectively. But again, it's only as good as the training data it can consume and learn from.

We're already seeing this in streaming platforms such as Netflix's recommendations and TikTok's 'For You' feed, and other industries such as personal fitness, education and health also stand to benefit.

Personalisation, when done right, feeds the emotional part of our exchange, resulting in a richer, more engaging experience that keeps users returning for more.  
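To show the mechanics of personalisation in miniature, here is a Python sketch of a simple content-based recommendation: items are ranked by cosine similarity between a user's preference profile and each item's feature vector. The genres, titles and weights are hypothetical.

# Sketch: a tiny content-based recommender. Items are ranked by cosine
# similarity to a user's preference profile. All data here is hypothetical.
import numpy as np

# Genres: [documentary, comedy, drama, fitness]
user_profile = np.array([0.9, 0.1, 0.4, 0.0])  # e.g. derived from watch history

catalogue = {
    "Deep Sea Worlds":  np.array([1.0, 0.0, 0.2, 0.0]),
    "Stand-up Special": np.array([0.0, 1.0, 0.1, 0.0]),
    "Courtroom Saga":   np.array([0.1, 0.0, 1.0, 0.0]),
}

def cosine(a, b):
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

# Recommend the closest matches first.
for title, features in sorted(catalogue.items(),
                              key=lambda kv: cosine(user_profile, kv[1]),
                              reverse=True):
    print(f"{title}: {cosine(user_profile, features):.2f}")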

Chatbots, personal assistants and voice

AI-powered chatbots and virtual assistants have fast become a staple in the digital world and can be found on many websites. They offer users immediate assistance, answering queries and guiding them through websites and applications, while offering organisations a tool to free up customer services staff - albeit with mixed results as DPD found out.

The use of chatbots can potentially improve user satisfaction when well thought out and implemented, but clearly this is not always the case. 

With advances in AI, and theoretically AGI, previously methodical but sterile chatbots could begin to understand natural language, humour, idioms and more. This more human context could lead to smoother and more intuitive user experiences.

AI chatbots like Pi (Personal AI) have evolved the chatbot genre. Pi can be used in a multitude of contexts, from practising job interviews and writing speeches to simply keeping people company in lonely moments.

While it is not perfect, Pi's interface allows users to conduct both spoken and written conversations and uses more informal and 'emotionally intelligent' language. 

Pi’s voice generator is particularly natural and emotive. As a result, users can have human-like and conversational back-and-forth rather than feeling like they are talking to an impersonal robot. To achieve these results, Pi relies on a branch of AI known as… 

Natural Language Processing (NLP)

Natural Language Processing or 'NLP' refers to the branch of AI that gives computers the ability to understand text and spoken words in a similar way to humans, resulting in a more human-like and engaging user experience.

This area of AI attempts to bridge the gap between humans and machines by enabling sentiment analysis, voice recognition, and language translation. It makes interactions feel more natural and accessible, whether you are conversing with a virtual assistant such as Siri or Alexa - giving them instructions through speech-to-text technology - or talking to a chatbot.

If a customer support chatbot knows the user's emotional state, issues can be prioritised and responses tailored accordingly. For example, if a user sounds frustrated, the chatbot could recognise this and respond apologetically. It could even hand the issue over to a human customer service agent for further support, and log the interaction to inform future improvements to the service design.
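As a simple sketch of that routing idea, the Python below uses an off-the-shelf sentiment model from the open-source transformers library to decide whether a message should be escalated to a human agent. The escalation rule and threshold are illustrative rather than a production policy.

# Sketch: routing a support message based on detected sentiment.
# Assumes the Hugging Face `transformers` package is installed; the default
# English sentiment model is downloaded on first use.
from transformers import pipeline

sentiment = pipeline("sentiment-analysis")

def handle_message(text: str) -> str:
    result = sentiment(text)[0]  # e.g. {"label": "NEGATIVE", "score": 0.99}
    if result["label"] == "NEGATIVE" and result["score"] > 0.9:
        # Respond apologetically, hand off to a human agent and log the case
        # so the service design can be improved later.
        return "escalate_to_human"
    return "continue_with_chatbot"

print(handle_message("I've been waiting three days and nobody has replied!"))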

Amazon recently implemented AI-generated product review summaries, which use NLP to assimilate customer reviews into a concise soundbite:

An image of a laptop with an Amazon customer review on the screen

NLP can also be used for translation services in various settings such as healthcare and the judicial system to provide information and support in multiple languages, thus catering to the linguistic diversity of the population. Likewise, NLP could also assist healthcare professionals with automated note taking, transcribing and disseminating clinical notes from spoken input. This would not only reduce the time spent on documentation but also allow the healthcare professional to give more focus to patient care.

Clearly there are precautions and careful planning and testing needed. But as the tech continues to evolve, it may provide a great way to make care more efficient by freeing up valuable time for professionals by carrying out administrative tasks behind the scenes. 
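To show how that kind of summarisation might look in practice, here is a minimal Python sketch that condenses a handful of customer reviews into a short summary using an off-the-shelf model from the transformers library; the same approach could, in principle, be applied to dictated clinical notes. The reviews are invented for illustration.

# Sketch: condensing several reviews into a short summary with an
# off-the-shelf summarisation model. Assumes `transformers` is installed;
# the default summarisation model is downloaded on first use.
from transformers import pipeline

summarizer = pipeline("summarization")

reviews = [
    "Battery life is excellent and the screen is bright, but the fan gets loud.",
    "Fast delivery. The keyboard feels great, though the speakers are tinny.",
    "Good value laptop overall; I just wish it were a little lighter.",
]

summary = summarizer(" ".join(reviews), max_length=40, min_length=10, do_sample=False)
print(summary[0]["summary_text"])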

Using AI for product development and UX

The challenges 

As we've outlined above, AI isn't a panacea. 

It is still prone to inaccuracies and mistakes, and it struggles to 'discern' emotionally between similar but subtly different outputs across the written word, audio and imagery.

Therefore, when using AI-driven data, the outputs must be fact-checked and sense-checked by humans. This is especially important if AI is being used to make decisions, being applied in a medical setting, communicating publicly, or creating material that relies to some extent on emotional engagement.

As musician Jacob Collier states: 

"AI as a tool in music-making is fine, but it's always going to be the humanity in music that makes people want to listen to it."

In other words, AI can't currently be used as an easy replacement for human feeling or judgement in creative disciplines - and that includes design, motion, audio and user interaction.

Recent research suggests there are at least 27 distinct human emotions, all of which can be triggered subjectively and differently at any time by our lived experiences. How do we teach AI which of two almost identical things is more emotionally engaging from multiple people's perspectives?

Limited originality is also a challenge. Because AI analyses and generates content based on patterns and previous examples, it struggles with creativity, innovation, uniqueness and subtlety. For example, when using AI to write articles, the content may sound more robotic than if it had a human touch. And where 'originality' is requested, the result can come across as odd, out of place or humorous - but not in the right way!

Another concern when using AI for any purpose is that these systems rely on vast amounts of often quite personal and specific data. This could include identity, engagement, biometric, behavioural and attitudinal data. It is therefore paramount that data privacy laws such as GDPR keep step with AI developments to prevent unauthorised access or potential misuse.

A final consideration is that AI can perpetuate pre-existing societal biases. Because it learns from the training data fed into it, if that data contains biases, the AI may make discriminatory decisions that exacerbate existing inequities. It is therefore our responsibility to take steps to mitigate this, for example by using diverse training sets and by testing outputs for bias.
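As a simple example of what 'testing for the presence of bias' can look like, the Python sketch below compares a model's positive-outcome rate across two demographic groups (a basic demographic parity check). The predictions and group labels are invented.

# Sketch: a basic demographic parity check on a model's decisions.
# The group labels and predictions are invented for illustration.
import pandas as pd

results = pd.DataFrame({
    "group":    ["A", "A", "A", "A", "B", "B", "B", "B"],
    "approved": [1,   0,   1,   1,   0,   0,   1,   0],
})

rates = results.groupby("group")["approved"].mean()
print(rates)  # approval rate per group

# A large gap is a signal to investigate the training data and features.
if rates.max() - rates.min() > 0.2:
    print("Warning: approval rates differ noticeably between groups - review for bias.")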

The opportunities 

Although the comments above describe challenges and limitations in the current use of AI in digital design, AI clearly has the potential to improve all sorts of experiences. A range of possibilities for how designers and developers approach user interactions is now readily available - from personalisation and chatbots to content curation and user behaviour analysis.

As designers and developers continue to explore the potential of AI, users can perhaps expect more personalised, predictive, seamless digital experiences in the future. For creatives, planning ways to use AI to automate tasks that are tedious or time-consuming is a strategic move that can free up time to focus on higher value work. 

Maybe that's how future products and services will be differentiated in an increasingly competitive digital landscape. Only time will tell!

Topics

  • User Experience
  • UX Design
  • Artificial Intelligence