
Private Personalisation: Brands’ Duty of Care in the Age of AI Digital Mirrors


In the age of hyper-personalisation, algorithms are becoming increasingly adept at reflecting our deepest selves back to us. These AI ‘digital mirrors’, built from our interactions and behaviours, can reveal aspects of our personality that we may not even be aware of. While this can be a powerful tool for personal growth and self-awareness, it also raises significant ethical considerations for brands. Two contrasting examples, Netflix’s recommendation system and a Stanford University experiment, highlight the emerging need for ‘Private Personalisation’.

Netflix AI: A Digital Mirror

Netflix’s recommendation system serves as a digital mirror by reflecting users’ preferences and interests based on their viewing habits. Unlike algorithms that have explicitly attempted to identify personal traits, Netflix’s system inadvertently revealed a BBC reporter’s sexuality to her through its content suggestions.

The reporter noticed that Netflix had been recommending series with lesbian or bisexual characters several months before she consciously realised her own sexual orientation. These recommendations were unique to her, even though she had friends whose streaming histories she believed were similar.

Netflix’s algorithms did not seek to identify the journalist’s sexuality. Instead, they acted as a digital mirror, reflecting her unconscious preferences and interests. This unintentional revelation underscores the power and potential risks of algorithms that learn from our behaviours.

Stanford’s Experiment: An Explicit Prediction

In contrast to Netflix’s inadvertent revelation, a Stanford University experiment explicitly attempted to predict people’s sexuality from social media profile pictures. The experiment, detailed in The Guardian, used artificial intelligence to analyse facial features, and its predictions of sexual orientation were correct in a majority of cases.

This deliberate attempt to identify personal traits raises serious ethical and privacy concerns, highlighting the fine line between insightful personalisation and intrusive prediction.

The Challenge for Brands: Private Personalisation

As algorithms become more powerful and capable of hyper-personalisation, keeping the outcomes private becomes paramount. Brands must navigate the delicate balance between providing personalised experiences and protecting individual privacy. This leads to the concept of ‘Private Personalisation’, where data is processed locally and a customer’s own data remains on their own devices. The insights gained from that processing are then co-owned, used responsibly, and kept confidential.
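
To make the idea concrete, here is a minimal sketch in Python. Everything in it is illustrative rather than any brand’s actual implementation: the function names, the tag-based profile, and the data structures are all assumptions. The architectural point is that the raw behavioural history and the inferred profile never leave the device; only an insight the customer explicitly approves would ever be shared.

```python
# Illustrative sketch of 'Private Personalisation': all processing happens
# on the customer's device, using only locally stored data. Names and data
# structures are hypothetical, not any real product's API.

from collections import Counter

def local_preference_profile(viewing_history: list[str],
                             tags_by_title: dict[str, list[str]]) -> Counter:
    """Runs entirely on-device: tallies content tags from the local history."""
    profile = Counter()
    for title in viewing_history:
        profile.update(tags_by_title.get(title, []))
    return profile

def recommend(catalogue: dict[str, list[str]], profile: Counter,
              k: int = 3) -> list[str]:
    """Scores catalogue items against the local profile; nothing is uploaded."""
    scores = {title: sum(profile[tag] for tag in tags)
              for title, tags in catalogue.items()}
    return sorted(scores, key=scores.get, reverse=True)[:k]

# The history and the profile stay on the device. Only a coarse,
# customer-approved summary (e.g. a favourite genre) would ever be
# shared with the brand, and only with explicit consent.
```

In this model the brand ships catalogue metadata and scoring logic to the device, rather than shipping the customer’s behaviour to the brand: the digital mirror is assembled, and viewed, only by its owner.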

Transparency and Consent

Brands must be transparent about the data they collect and how it is used. Customers should have control over their data and the ability to opt in or out of personalised recommendations.

Choice and Control

Just as we each have a favourite mirror for trying on new clothes, customers will increasingly need to be able to ‘tune’ the algorithm, so that it reflects an image of themselves that is both truthful and psychologically comfortable. This will require easy, low-friction, and ideally automated, controls that adapt recommendations to a customer’s personality traits, mood and context, as in the sketch below.
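
As a rough illustration (the dial, the weights and the names are assumptions, not a real product control), a single ‘mirror dial’ could blend faithful reflection of past behaviour with gentler exploration:

```python
# Hypothetical 'mirror dial': one low-friction control that blends faithful
# reflection of existing tastes with novelty, so the image the algorithm
# shows is truthful but psychologically comfortable.

def tuned_score(reflection_score: float, exploration_score: float,
                mirror_dial: float) -> float:
    """mirror_dial in [0, 1]: 1.0 = pure reflection of past behaviour,
    0.0 = pure exploration. It could be set by the customer directly,
    or adapted automatically to mood and context."""
    if not 0.0 <= mirror_dial <= 1.0:
        raise ValueError("mirror_dial must be between 0 and 1")
    return mirror_dial * reflection_score + (1.0 - mirror_dial) * exploration_score
```

The design choice that matters is who holds the dial: the customer, not the brand. Recommendations become something the customer adjusts, like the angle of a mirror, rather than something done to them.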

Ethical Use of Data

Brands must ensure that personal data is used ethically and does not infringe on individual privacy. This includes not making explicit predictions about sensitive aspects of a person’s life. Storing the most sensitive data on the customer’s own device will also help to mitigate potential harms and risks for the customer, and therefore for the brand.
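
One plausible pattern, sketched below using Python’s widely used cryptography package, is to encrypt the full profile at rest with a device-held key and to strip sensitive inferred fields from anything that syncs to a server. The field names and the shareable/sensitive split are illustrative assumptions, not a prescribed schema.

```python
# Sketch: sensitive inferences stay on-device, encrypted at rest; only
# non-sensitive, customer-approved fields ever leave the device.
# Field names and the sensitivity split are illustrative.

import json
from cryptography.fernet import Fernet

SENSITIVE_FIELDS = {"inferred_orientation", "inferred_health_signals"}

def store_profile_locally(profile: dict, key: bytes,
                          path: str = "profile.enc") -> None:
    """Encrypts the whole profile with a key held on the device."""
    token = Fernet(key).encrypt(json.dumps(profile).encode())
    with open(path, "wb") as fh:
        fh.write(token)

def sync_payload(profile: dict) -> dict:
    """Builds the only data that may leave the device: sensitive
    fields are removed before any network call."""
    return {k: v for k, v in profile.items() if k not in SENSITIVE_FIELDS}

key = Fernet.generate_key()  # in practice, generated and kept by the OS keystore
```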

Security

Given the potential risks of revealing unconscious traits, brands must ensure that data is handled securely and that algorithms do not inadvertently put customers in danger.

Conclusion

The contrasting examples of Netflix’s digital mirror and Stanford’s explicit prediction model illustrate the complex landscape of personalisation in the digital age. Brands have a duty of care to ensure that personalisation is private, ethical, and responsible.

‘Private Personalisation’ is not just a marketing strategy; it’s a commitment to respecting individual autonomy and privacy. As technology continues to evolve, brands must be vigilant in upholding these principles, recognising the power of algorithms as both tools for connection and potential sources of intrusion. The challenge lies in harnessing the power of digital mirrors without crossing the line into the private sanctum of individual identity.

Source: MidJourney. Prompt: Private Personalisation infinite regress

StJohn Deakins

StJohn founded CitizenMe with the aim of taking on the biggest challenge in the Information Age: helping digital citizens gain control of their digital identity. Personal data has meaning and value to everyone, but there is an absence of digital tools to help people realise that value. With CitizenMe, StJohn aims to fix that. With deep experience digitising and mobilising businesses, StJohn aims for positive change in the personal information economy. Oh… and he loves liquorice.
