Always an engaging speaker, Botsman centred her talks on her book ‘Who Can You Trust?: How Technology Brought Us Together and Why It Might Drive Us Apart’. The audience was hooked from the start by an anecdote about the time her parents, taken in by the woman’s manner with the children and her fake Salvation Army uniform, accidentally hired a drug dealer as her nanny.

Botsman’s parents used established, learned ‘trust signals’ to make their decision, but this approach was not insightful enough to see through the pretence. The episode highlighted two key messages of her talks: ‘Trust has two enemies, not just one: bad character and poor information’, and ‘the illusion of information is worse than ignorance.’

Botsman’s take on the changing shape of trust is less positive than in her previous work ‘What’s Mine Is Yours: How Collaborative Consumption is Changing the Way We Live’. Her story starts with ‘Local Trust’ – the earliest model, where trust was created between people who knew each other personally.

The second trust model, that of ‘Hierarchical Trust’, developed between individuals and established institutions and their associated officials – such as bankers, politicians and teachers. As this model has crumbled, it is being replaced by ‘Distributed Trust’.

Using a metaphor of connectivity and power, Botsman described hierarchical trust as waves of ‘trust energy’ flowing up from individuals to established institutions. In a distributed model this energy flows laterally, between people, and between people and contemporary institutions and channels. What we are seeing is the ‘twilight of the elites’ and ‘the inversion of influence.’

I liked Botsman’s perspective that trust is a process (not a static entity, in the sense of a belief or conviction) and an active agent that helps us bridge the ‘trust gap’ between the unknown and the known. Trust, then, can allow a confident relationship with the unknown.

So how is this change in the ecosystem of trust affecting consumers and the way they interact with brands, and with digital platforms in particular?

Trusting our digital platforms

Developments over the last few years are having an impact on digital platforms, how we perceive them and how they operate. The presidential election in the US and the furore over fake news, allied with the rescinding of Uber’s licence in London, have encouraged different expectations of the role and responsibilities of companies like Facebook and Uber.

Traditionally, they have positioned themselves as neutral pathways, or ‘dumb pipes’, that connect people with each other and a service. Events such as these have led to pressure on them to shift their accountability positioning from ‘Reactive – within reason, be there when things go wrong’ to ‘Proactive – be responsible for the risks of bad things happening.’

We increasingly outsource trust to machines and algorithms, searching Facebook for news, booking a cab via Uber and asking Alexa what we should do today.

From a commercial perspective, this reliance creates new and different challenges for brands and how they interact with their customers – ‘with great power comes great responsibility.’

Marketeers need to be aware of this changing face of trust. We already know that considerable distributed trust is placed in a wide range of celebrities active across social media, especially on Instagram. This model of distributed trust also extends to digital brands, platforms, channels and now algorithms. Consumers rely on the brands and channels that align with them in terms of proposition and delivery, but the price to be paid for breaking this trust can be high, as Ryanair is finding out to its cost.

Technology is playing a big role in developing this new model of distributed trust, and a key area is that of automation and artificial intelligence. I touched on this in my piece ‘The Future (and eternal truth) of Marketing’.

‘The relationship between brand and consumer, and the transparency with which it is conducted, risks being further confused by the growing influence of bots. “Choice architecture” is changing with the rise of automation, robotics and AI. Bots will refine choices presented, and even make choices on behalf of consumers. Some argue that the intervention of bots will mean that matters of ethics, which are nuanced not binary decisions, will get side-lined. In reality this places even more responsibility on the brand to uphold ethics. Bots may ignore it in the moment of choice, but ultimately any brand that cannot meet the requirement for transparent ethics will risk a consumer backlash.’
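To make the ‘choice architecture’ point concrete, here is a minimal, purely illustrative sketch in Python. The Offer class, its ethics_score field and the bot_choose function are hypothetical names invented for this example, not anything from Botsman’s work: the point is simply that a bot optimising on price alone drops ethics out of the choice, while an ethical attribute only survives automated selection if it is explicitly scored.

```python
from dataclasses import dataclass

@dataclass
class Offer:
    brand: str
    price: float
    ethics_score: float  # hypothetical 0-1 rating of the brand's ethical standing

def bot_choose(offers: list[Offer], ethics_floor: float = 0.0) -> Offer:
    """Pick an offer on the consumer's behalf.

    With the default ethics_floor of 0.0 the bot optimises on price alone,
    ignoring ethics 'in the moment of choice'; raising the floor forces
    ethical attributes back into the choice architecture.
    """
    candidates = [o for o in offers if o.ethics_score >= ethics_floor]
    if not candidates:  # no brand clears the bar; fall back to all offers
        candidates = offers
    return min(candidates, key=lambda o: o.price)

offers = [Offer("BrandA", 9.99, 0.4), Offer("BrandB", 11.50, 0.9)]
print(bot_choose(offers).brand)                     # BrandA - cheapest wins
print(bot_choose(offers, ethics_floor=0.5).brand)   # BrandB - ethics now counts
```

The detail worth noticing is that ethics only influences the outcome if it is encoded as data the bot can see – which is exactly why the responsibility shifts to the brand to make its ethics transparent and, increasingly, machine-readable.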

Of course, there is the opportunity for brands to leverage trust positively. Philanthropic gestures are a powerful way to build trust, and a good example of this was Target’s $1 billion pledge to support students in need of financial support to pursue their education.

Trust can also be built by encouraging positive interaction around a brand. An example of this was Burberry’s ‘Art of the Trench’ campaign, where users could share and comment on everyday pictures of people wearing Burberry products. First Direct has built trust through fabulous customer service since its launch in 1989, in spite of not offering the most competitive financial products. Airbnb, perhaps the perfect brand for the world of distributed trust, has shown that travellers can just as easily trust ordinary people offering accommodation as they can established institutions such as hotels.


An associated threat to an effective trust process is the reduction of friction. Trust needs friction – ‘time and consideration’ in which to operate – and with a faster pace of life, there is increasing pressure on this space. I covered this dynamic on the Econsultancy blog in the piece ‘Why increasingly efficient UX might not always be a good thing’.

‘For brands, the question of how to provide the right amount of friction to unlock reflection but not to hamper experience is critical in building a world that, in addition to doing things, thinks about what it is doing.’

Steve Selzer from Airbnb believes that immediacy and the absence of friction are creating a less tolerant, less self-aware world – ‘This is why designers of intelligent, immersive experiences need to build in meaningful friction, encouraging reflection and awareness of the actions themselves as well as their consequences.’
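As a thought experiment, what might ‘meaningful friction’ look like in practice? The sketch below is a hypothetical Python illustration of my own (the confirm_with_friction helper is invented for this example, not a pattern from Selzer or Airbnb): before an impactful action, the flow deliberately inserts a pause and an explicit confirmation, trading a little speed for a moment of reflection.

```python
import time

def confirm_with_friction(action: str, pause_seconds: float = 3.0) -> bool:
    """Deliberately slow confirmation: insert a short pause for reflection
    before an impactful action, rather than allowing a one-tap commit."""
    print(f"You are about to: {action}")
    print("Take a moment to consider the consequences...")
    time.sleep(pause_seconds)  # the friction: time and consideration
    answer = input("Type 'yes' to proceed: ")
    return answer.strip().lower() == "yes"

if confirm_with_friction("leave a one-star review"):
    print("Review submitted.")
else:
    print("Action cancelled.")
```

The balance described above is visible even in this toy: too long a pause and the friction hampers the experience; too short and it unlocks no reflection at all.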

The advent of other realities, augmented and virtual, in tandem with reduced friction, may also cause problems. A piece from VentureBeat observes: ‘…reflection is even more important in immersive environments, where you don’t so much “watch” or “use” experiences as really “live” through them. VR experiences are perceived by the brain as actually happening to the user, so their transformative potential — toward self-development or rapture — is quite powerful.’

Botsman is not optimistic about the future. Her rather dystopian perspective asks where much-needed control or moderation can come from. Could there be a role for a digital ombudsman, a trust kite-mark or an ethical Alexa? The reality is that we will always choose utility over morality, and regulation is never likely to be popular – witness the outcry over the Uber decision in London.

The more we defer responsibility and abdicate the need for channel accountability, the more likely it is that our trust will be abused.