By Maher Ali Rusho

Rapid innovations in digital technology and computing are constantly redefining the art of the possible in human-computer interaction (HCI). Computer interfaces have evolved beyond traditional input tools and touchscreens to wearable devices replete with on-board sensors. This has unlocked a new class of data that can be used to infer user attributes from digital activity.
Until recently, personalisation was largely limited to commercial use-cases such as recommender systems for online shopping and content platforms. Today, however, the digital footprints of user activity across novel human-machine interfaces (HMIs) can be used to infer more complex characteristics of users – for instance, their emotional states or mental health markers.
Personalised HCI finds application across a spectrum of use-cases today. In enterprise settings, it can help address employee burnout or improve customer engagement; in educational systems, it can tailor learning experiences for students with cognitive or sensory disabilities. It can also drive early mental health interventions by detecting signs of depression and anxiety in users.

Current approaches to personalising HCI

The personalisation of HCI happens at the intersection of new HMIs and the AI techniques used to infer user attributes from the data these interfaces generate. Currently, there are two broad approaches: user modelling, which is typically studied under the larger ambit of affective computing, and personalised adaptation.

The democratisation of new computing devices such as AR and VR glasses, smartwatches, and hand-held VR controllers has led to the generation of new types of data. This data offers new ways to build user models and adapt computing interfaces for individual users. For example, eye-tracking data captured while users interact with virtual avatars can be used to detect depression with significant confidence. Similarly, by harnessing users' electrophysiological responses, VR environments can be adapted in real time to improve anxiety treatment outcomes. Eye-gaze monitoring data represents a particularly significant opportunity at the moment, as VR technology is on the verge of consumerisation. Likewise, facial gesture-activated interfaces generate large volumes of contextual data that can help detect user emotion with greater accuracy.
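To make the idea concrete, here is a minimal sketch of how gaze summaries might feed a screening classifier. The features (mean fixation duration, saccade rate, blink rate), the synthetic data, and the simple label rule are illustrative assumptions only, not a clinical or published model.

```python
# Minimal sketch: a classifier over hypothetical per-session gaze features.
# All data below is synthetic; feature names and the label rule are assumptions.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)

# Synthetic stand-in for per-session gaze features:
# [mean fixation duration (ms), saccade rate (/s), blink rate (/min)]
n = 400
X = rng.normal(loc=[250.0, 3.0, 15.0], scale=[60.0, 0.8, 5.0], size=(n, 3))

# Toy labelling rule: longer fixations and fewer saccades -> higher "risk"
risk = 0.01 * (X[:, 0] - 250.0) - 0.8 * (X[:, 1] - 3.0) + rng.normal(0, 1, n)
y = (risk > 0).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=0)
clf = LogisticRegression().fit(X_train, y_train)
print("AUC:", roc_auc_score(y_test, clf.predict_proba(X_test)[:, 1]))
```

In practice, such a pipeline would start from real, consented eye-tracking recordings and clinically validated labels rather than a synthetic rule.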

Beyond user modelling, personalisation is also valuable for building computing interfaces with the right attributes. For instance, the gender and personality of a chatbot have a strong influence on how engaged users feel when interacting with it.

From an implementation perspective, personalisation techniques can be targeted at a single user, or a separate model can be built for each group of users. Group-specific techniques are better suited to socially oriented use-cases such as engagement detection, whereas user-specific models are preferred for detecting mental states or personalising learning.
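As a rough illustration of the distinction, the sketch below trains one model per user and one pooled model per group on synthetic interaction data. The features, the grouping, and the choice of a Ridge regressor are placeholder assumptions, not a prescribed recipe.

```python
# Minimal sketch: user-specific vs. group-specific personalisation models.
# Data, features, and user/group identifiers are hypothetical placeholders.
import numpy as np
from sklearn.linear_model import Ridge

rng = np.random.default_rng(1)

def make_sessions(n, bias):
    """Synthetic interaction features and an engagement score with a per-entity bias."""
    X = rng.normal(size=(n, 4))
    y = X @ np.array([0.5, -0.2, 0.1, 0.3]) + bias + rng.normal(0, 0.1, n)
    return X, y

# User-specific: one model per user, trained only on that user's sessions.
user_models = {}
for user_id, bias in [("u1", 0.8), ("u2", -0.4)]:
    X, y = make_sessions(50, bias)
    user_models[user_id] = Ridge().fit(X, y)

# Group-specific: one model per cohort, pooling sessions across its members.
X_a, y_a = make_sessions(50, 0.8)
X_b, y_b = make_sessions(50, -0.4)
group_model = Ridge().fit(np.vstack([X_a, X_b]), np.concatenate([y_a, y_b]))

new_session = rng.normal(size=(1, 4))
print("user-specific prediction:", user_models["u1"].predict(new_session))
print("group-specific prediction:", group_model.predict(new_session))
```

The trade-off mirrors the one described above: per-user models capture individual baselines but need enough data per person, while pooled group models generalise from fewer sessions per user at the cost of individual fidelity.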

Key challenges

However, a number of challenges persist in HCI personalisation. One of the foremost issues is ensuring data privacy while simultaneously leveraging individual data for training user models. In the context of existing data privacy laws, security gaps can prove disastrous, especially in an enterprise setting.

In addition, personalising HCI for specific users requires user-specific datasets, and data scarcity remains a key limitation to achieving the desired outcomes. Moreover, because personalisation relies on AI for user modelling, it is not immune to the issues already affecting AI initiatives: bias in the collected data and the personalised models, the fairness of their outcomes, and the explainability of the models used.

What’s next for personalised HCI?

The existing limitations and challenges of personalised HCI are technical boundaries that can be pushed back with further technological progress. For example, user privacy can be protected by applying confidential cloud computing to multi-party data collaborations. Similarly, psychological models offer a more explainable approach to understanding user behaviour.
With personalised HCI, computing applications are set to evolve towards a new paradigm of user-centricity. Beyond user-friendliness, these systems will be defined by user-intimacy, enabling better outcomes for each individual in every domain.

(The author is the Founding Partner at UntieAI, and a Pre-Doctoral Research Associate in Human-Computer Interaction (HCI) at Brain-Station 23 PLC; views are personal)