Balancing Technology and Human Insight: Avoiding the Pitfalls of Blind Trust in AI

We recently shared our views on bridging the gap between AI and human understanding, and why we believe it’s crucial that we don’t get dazzled by the hype of AI. We looked at the inherent biases and flaws that can be baked into algorithms – and the ability to keep those biases in check – as well as the importance of context and nuance.

Professor Hannah Fry was the keynote speaker at the Richmond Market Insight Forum and indeed made many of the same points – what great validation! Her talk created a lot of energy and buzz, so, building on our previous piece, we explore what technology (and AI) can do, what it can’t, and what the risks are for the insight industry.

We know that technology is great at processing large volumes of data and at following rules. And whilst we know that humans make mistakes, we tend to assume that technology is 100% accurate – so much so that we’re often willing to hand over power to technology without question. Our trust is absolute.

Hannah Fry gave some fantastic, funny and slightly disturbing examples, all of which perfectly illustrated people’s willingness to follow technology without question – including the Japanese tourists in Australia who, driving to North Stradbroke Island and blindly following the sat nav, drove straight into the water! Google it if you haven’t seen it.

And whilst we may laugh and think ‘that won’t happen to me’, we can probably all recall instances of being ‘misled’ by the sat nav – hopefully not to this degree!

As we talked about in our previous article, we need to recognise what technology can do well and, perhaps more importantly, what it can’t.

Within the world of insights, we may look to AI to collect responses or to analyse them, helping us gather more data and cut down on timeframes. However, we still need human instinct, emotional intelligence, and an understanding of context to ensure that we don’t drive into the ocean.

If we go back to the sat nav, it’s clear that for many people it has completely replaced maps. Who has an atlas in their car anymore? Or an A-Z in their bag? Yet by placing all our faith in the sat nav, we may not pay attention to what else is around us. We may end up in a blind alley. We may miss interesting and fascinating places that are right under our nose, and that we would have seen if we’d looked at the map. And without the sat nav, we’d be lost, struggling to get from A to B.

Is the same true of insights? By relying on technology to analyse for us do we lose the ability to navigate and uncover rich insights? How do we ensure that those skills are maintained?  

I’m sure many of us in research and insights remember that when we first started out, we had to learn how to do analysis: we had reams of transcripts and had to somehow summarise them, pull out key themes and ultimately make sense of the data. It could be a painful process, trying to piece together all the different elements until it fell into place – and then wondering why we couldn’t see it before!

Yet if these ‘foundational’ tasks that would previously have been carried out by a real person are conducted by technology, how do those starting out learn the skills?

And of course, good research is not just about learning how to interpret data, but also how to make it meaningful – to pull out nuance and create a story that resonates. AI can be good at pulling together key information, yet the summaries it produces can be a little, well, flat, and rather black and white. Nuance and human connection are required. Stories need to connect with our audience to be remembered.

So, if AI is used to do the groundwork analysis and produces outputs that lack depth and rich understanding, how are those translated into something more meaningful and engaging? Surely what is required from research is strategic insight – insight based on understanding, context, and the ability to see things from a different perspective. Yet how can we develop and hone these skills if we haven’t mastered the basics?

Are those coming into the industry at risk of being unable to deliver strategic insight, because the skills haven’t been developed? Or do we need different ways of learning?