Nature vs. Nurture: When a Chatbot Goes Bad
Can a bot really be a psychopath? The folks at MIT think so. Well, sort of. MIT researchers recently conducted a study using AI to create a digital psychopath named Norman (loving the Alfred Hitchcock reference). But this wasn’t just a study to see if we can create monsters with technology; it was really designed to educate us as we enter the age of AI. The lesson? If you feed a bot bad or disturbing data, you’ll get bad or disturbing outcomes. It’s very much a nature vs. nurture conversation. By nature, AI is nothing much at all: just a series of 0s and 1s. But as you nurture it by feeding data into the system and defining the outcomes you want to achieve, the AI engine will find data, identify patterns, and optimize around those outcomes.
AI is not inherently bad. A chatbot, for example, can’t come up with its own theories or hypotheses and test them; it seeks the outcomes the people behind it told it to seek. In this case, MIT researchers got the outcome they wanted: they created a Norman Bates character for the digital age. And while that may seem like a stunt, the lesson should not be ignored.
As companies begin to think about the business cases for AI, it’s good to remember that how we train and build AI-powered tools will be just as important as the technology behind them. Too often we see businesses get lost in the weeds of the tech, employing teams of people for years and still ending up with a less-than-optimal end state. AI is a broad topic, and not all of it is created equal. Know the use case you are looking to solve and optimize around that. Just because an AI system can work to cure cancer doesn’t mean it will be good for other use cases like customer experience. Why? Because while the tech is the same, the data sets they use are very different. In the end, know what you want to achieve and test multiple models thoroughly. Because as Norman Bates famously said, “people always mean well.” But as we know, sometimes that isn’t enough.