When AI Isn’t the One (and What We Learn From Failed Relationships)
I’m a huge fan of AI for its marketing potential, but I’m also a huge fan of learning from failure. AI is fresh-meat technology—sexy for sure, but still pretty mysterious. Those of us lusting over it may want to, you know, look into its turn-ons, turn-offs, and past relationships before we get serious.
#1: Just because you have the same initials doesn’t mean you belong together.
That’s a lesson Admiral Insurance learned the hard way. In late 2016, the UK’s largest auto insurer announced an AI application called “firstcarquote”. It planned to analyze first-time drivers’ Facebook activity and then offer discounts to those deemed likely to navigate the roads safely. Good idea, not-so-good due diligence. Admiral’s date with AI was cancelled day-of, when Facebook blocked the company from accessing user data. Hardly a surprise, given Facebook’s data use policy.
AI and Admiral might have made a nice couple, but Admiral forgot to get parental approval. Facebook is protective for good reason: Current AI is notoriously bias-prone. Until learning algorithms are trained on widely diverse data sets and designed with clear ethical standards, AI can’t be allowed to say “yes” to every suitor that comes knockin’.
(Note: in the event that AI manages to sneak out the back door, maybe don’t make a habit of posting skydiving pics or overusing exclamation marks (!) if you want to maintain those low, low insurance rates.)
#2: AI sleeps around. Always wear protection.
A good lover has learned from experience. Not a bad thing necessarily—but don’t forget that a partner’s experience might give you a little something more than a fun time.
Programmatic advertising is tantalizing because AI is more efficient when it comes to buying and selling ads—but it isn’t quite so discerning as its human counterparts. Recently, major marketers have pulled ads from YouTube and other sites because it turns out Google’s advertising algorithms have been placing ads for the likes of Walmart and Johnson & Johnson near content associated with dangerous and offensive ideologies.
Care for some baby wipes to go with your neo-Nazism? No thanks.
The lesson here: You never know who your AI may be messing around with on the side.
#3: Dating a greedy, social climber? Not a good look.
Some AI just makes you look bad. After all, nobody likes a gold-digger who doesn’t share.
A few weeks ago, ride-hailing giant Uber copped to fielding a new “route-based pricing system” that sets rates according to what an AI believes riders might be willing to pay—based on the affluence of their pick-up/drop-off locations. Tough luck, all you frugal Malibu–Beverly Hills commuters—Uber’s looking for profits.
The economic benefits of this Uber-AI pairing may be short-lived. Not only will it likely cost the company rider trust; drivers – who receive no additional income when they drive a higher-rate fare – are already ticked off.
Sure, it’s not the AI’s fault – it was raised that way – but if Uber wants its friends to keep coming around, it may want to find an AI with better values.
#4: Just because it looks good doesn’t mean it is.
A few sexy profile pics are usually enough to get us salivating. Careful: This AI may just be really good with “Facetune”.
Chatbots – supposedly ground-breaking AI-fueled interfaces that understand consumers’ desires, help them buy products, and respond to their service needs – have two problems.
The first is small: Chatbots are hard-to-find little guys. When UK-based PizzaExpress rolled out its chatbot last year, elusiveness was the primary criticism levied against it. That chatbot is not the only one in hiding. Of course, with better UI visibility and promotion, the problem is easily fixed.
The second is bigger: Chatbots promise more than they deliver. Yep, hot profile pic… not so hot in person. Bottom line, current language-processing AI isn’t too great at interpreting nuance in human communication. When you invite the chatbot up for a drink, it will probably assume you’re offering just a drink.
Frustrating consumers is not a good marketing strategy—and a number of major brands have dropped their chatbots as a result.
#5: Be clear upfront: You don’t always go all the way on the first date.
You don’t want to be a letdown. That’s exactly what BMW became when it began selling vehicles that promised to integrate Amazon’s Alexa intelligent personal assistant app. It turns out that, for now, the “BMW Connected Skill” offers only a limited array of capabilities and is not available in all new BMW models. Consumer response has been tepid: 55% of reviews give the Skill only one star. Alexa may be an impressive AI, but BMW’s hardly impressive early-stage integration has clearly left some buyers wondering why they bothered ordering that second bottle of wine.
Future early AI adopters should be mindful of this wet-blanket effect when arousing consumers’ lust for new tech.
Now all you “AI-haters” out there rubbing your Luddite hands with glee need to hold your horses. Fact is, these kinds of missteps – uh, learning experiences – are rare, and each was rooted in the AI implementation rather than in the technology itself. AI’s marketing charm is real. To learn exactly how good a partner it can be, companies – those discussed included – should continue to court innovation. Sometimes the cutting edge draws a little blood, but who doesn’t like a little danger with their romance?