About Chatbots · August 2, 2019 · Written by Alex Debecker

[Podcast] A Chatbot Is for Life Not Just for Christmas with Sean Clark

I recently had the pleasure of being invited to talk about chatbots, ubisend, and my work on Sean Clark's 'Click and Convert' podcast.

The episode resonated with Sean's audience, predominantly made up of marketers and data-driven professionals. Listen to the episode or read the transcript below to catch up.

In it, I touch on chatbot technology as a whole, insightful use cases, and never-before-shared case studies.


Note: I purposefully adapted the transcript a little bit to account for my sporadic rambling.

A chatbot is for life, not just for Christmas with Sean Clark

Sean Clark: Alex, welcome!

Alex Debecker: Thank you, it's great to be here!

Sean Clark: To start us off, can you tell us what chatbots are, what do they do?

Alex Debecker: They are computer software humans can talk to using natural language (English, French, etc.). It's a way to use natural language to access data. You can open a message box and get to the information you need.

The machine will also answer in a natural way as well. You can have a natural conversation with it.

They are powered by artificial intelligence, natural language processing engines among others.

They've been popping up for the last few years. We've been working on them for a long while, we're quite excited to see the industry finally catching up!

Sean Clark: Are Siri, Alexa, Google Home chatbots?

Alex Debecker: In a way, yes.

When we say chatbots, people think Alexa.

Siri is more a chatbot than the other two because Alexa and Google Home are more like channels (like Messenger, website, Viber, etc.) than chatbots. They're devices on which you can deploy a chatbot.

Siri is more a chatbot because you can't build apps on Siri, you just talk to it.

We've built Alexa and Google Home chatbots, which provide a new channel for companies to work with and reach their audience.

Sean Clark: What industries do they work really well in?

Alex Debecker: They work really well in a variety of sectors. A big part (about 70%) of our work is done within HR solutions.

This is probably because HR departments at large companies are usually overworked and under-budgeted. The money doesn't go to them; it goes towards growing the business, which, ironically, makes their work harder. You end up with thousands of employees being serviced by ten HR staff, which just doesn't work.

A big portion of HR is repetitive work. In our experience, we can take about 40% of HR enquiries away by having a chatbot answer them. Those 10 poor people struggling to reach the end of their inbox now have close to half of their day back. They can now invest this time doing work that's more human-centric, that requires their specific set of skills.

In service, we can automate a lot of the front-line questions (where is my package, my package is damaged, etc.) -- all the way to integrating with customer service platforms.

In sales, we can help the customer go through the journey, assist them in buying products or services.

Sean Clark: Are there any places where you wouldn't use one? Are there places where it's too complex to implement?

Alex Debecker: I wouldn't say 'complex', but there are sections and places where we wouldn't advise using chatbots.

Mainly places where human emotions are a requirement. If you're calling 999, for instance, you don't want a chatbot handling your case -- you want to talk to a human.

Within businesses, something similar happens where, if you have something sensitive to talk about, you'd rather reach the appropriate HR human.

We try to advise customers not to use chatbots in these situations. This is interesting because it means, again, that we're using our technology to take the boring away. People often worry about AI taking their jobs but really we're trying to make the work humans do about human skills.

We're putting an emphasis back on human work, making machines and humans work together. Part of that is making sure we're not having machines do the work a human would be much better at doing.

Sean Clark: How far has the technology come? I remember looking at your technology a few years ago; it must have come on leaps and bounds since then?

Alex Debecker: Yes, it has.

What is interesting though is the available technology right now is still quite... average. Even with this average technology, we're achieving amazing results. Our clients are getting huge results right now.

We expect that the next 10 years will see huge progress in this industry, we'll continue to build things we've never built before. Given what our clients are achieving with the limited technology we have right now, we're super excited to see what's going to be possible in the next 10 years.

[Image: 'the chatbot industry will boom' quote]

We're at a point right now where we're slowly coming out of the 'chatbot hype phase'. About three years ago, when you last looked at our work, we were right in the middle of this hype phase.

Everyone thought chatbots were amazing; people were building little crappy question/answer 'things' and calling them chatbots.

Today, we're slowly coming out of that phase. We're getting over the hangover brought by the realisation that those little crappy 'chatbots' weren't, in fact, amazing.

We're now coming out of that. We're more aware of the limitations of the technology, all the while aware of what we can achieve with it. At the same time, humans and society, in general, are getting a lot more comfortable talking to machines.

This combination creates the perfect storm that will make the industry boom for the next ten years.

It has been apparent, particularly in conversations I've had with clients over the past six months, that companies are now seeing the potential in chatbots but also the potential in getting there first -- before their competitors.

There's going to be a gold rush for being the first in your industry, vertical, role to have a chatbot.

Sean Clark: How in-depth can they get? Do they already have the capacity to hold long conversations, keep them in context? Or do they tend to fall off a cliff after a couple of interactions?

Alex Debecker: We've all experienced crappy chatbots.

Being able to hold context and manage long conversations is what differentiates the good from the bad.

By context, I mean you start talking to a chatbot about something and it remembers the topic of the conversation and the key variables you mentioned before.

For instance, let's say you talk to a concert booking chatbot to buy tickets for tomorrow's Nickelback concert (because you have terrible taste in music). For the next 4-5 interactions after that, the chatbot remembers that you are talking about booking tickets for tomorrow and Nickelback, not Pink in three weeks.

That's context, the ability for a chatbot to carry on the conversation without having to refer to the key variables at each interaction. This is entirely possible, it's what we do day in and day out.
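(Note from the editor: as a rough illustration of the idea — the class and slot names below are invented for this example, not ubisend's actual implementation — holding context amounts to keeping a small store of key variables that each new turn of the conversation can read, update, or clear.)

```python
# Minimal sketch of conversational context: a slot store that survives
# across turns. All names here are hypothetical, for illustration only.

class ConversationContext:
    def __init__(self):
        self.slots = {}  # e.g. {"intent": "book_tickets", "artist": "Nickelback"}

    def update(self, **new_slots):
        # Merge entities extracted from the latest user message
        self.slots.update(new_slots)

    def get(self, key, default=None):
        return self.slots.get(key, default)

    def clear(self):
        # 'Lose' the context when the task completes or the topic changes
        self.slots = {}


ctx = ConversationContext()
ctx.update(intent="book_tickets", artist="Nickelback", date="tomorrow")
# Several turns later, the user only says "make it two tickets" --
# the bot still knows who and when they are talking about:
ctx.update(quantity=2)
assert ctx.get("artist") == "Nickelback"
```

The point of the sketch is the `clear()` call as much as the `update()`: a good chatbot also knows when context should be dropped, not just kept.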

This feature is what separates the bad from the good chatbots. Most free chatbots you could create using a DIY platform would be bad at keeping context.

You could log in and create what we call an FAQ chatbot, where you input questions your users will ask and the answer you want your chatbot to give -- and that's it. That's great if it's what you need, but this sort of chatbot will never be able to do anything complex, including holding context.

What we do is build something that is highly targeted, highly specific, which allows us to keep context for as long as we need.

As an example, one of our clients' sales chatbots sells cars. The conversations it has to handle are very complex and very long; it has to stay aware of -- and sometimes update -- context for over 20 interactions.

Since we're very targeted and high-end in that regard, we're able to build that context in, update it when it needs to be updated, lose it when it should be lost.

This is why we try to steer clients from wanting a chatbot that does everything. A bit like you don't want a landing page that sells everything, you want a clear USP and a clear goal. It's the same for your chatbot. The more targeted it is, the better it will perform.

Sean Clark: Do chatbots have the ability to learn from the end-users and evolve?

Alex Debecker: Yes, absolutely. There are a few ways in which chatbots can learn. Obviously, they are AI-driven, so you'd expect they'd have some ways of learning from their interactions -- or else that would defeat the purpose!

[Image: evolution of machines that learn]

The first way for them to learn is from manual training. You'd release a chatbot, let loads of people talk to it, and review the conversations manually afterwards.

If you notice something you've obviously missed, you build this new topic into the chatbot's knowledge base and manually train it to know this new 'thing'.

The downside of that is it's a very manual process. We're not keen on manual processes, as we're all about efficiencies, so it would make us a bit sad to have to do that manually.

Another way for chatbots to learn is through feedback. During the first release of a chatbot, we implement a feedback feature that asks the users to rate whether the chatbot's answer was useful or not. We can then use this data to train the chatbot in answering better.

This also ties nicely with gap analysis, something SEOs would be familiar with. By analysing conversational data, you can figure out what your chatbot has been missing, what it's unclear about, what your users are not understanding -- and use this data to make it better, faster, stronger.
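(Note from the editor: a hedged sketch of what this gap analysis can look like in practice. The log format and intent names are made up; the idea is simply to tally the answers users marked unhelpful, grouped by the intent the bot matched, so the biggest counts show where to train next.)

```python
from collections import Counter

# Hypothetical feedback log: one entry per rated chatbot answer
feedback_log = [
    {"matched_intent": "book_leave", "helpful": True},
    {"matched_intent": "payslip_query", "helpful": False},
    {"matched_intent": None, "helpful": False},   # nothing matched at all
    {"matched_intent": "payslip_query", "helpful": False},
]

# Count unhelpful answers per matched intent
unhelpful = Counter(
    entry["matched_intent"] or "<no match>"
    for entry in feedback_log
    if not entry["helpful"]
)

# The most frequent failures point at gaps in the knowledge base
for intent, count in unhelpful.most_common():
    print(intent, count)
```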

Finally, there is learning at scale. Since ubisend services many clients across many industries, we are able to anonymise a lot of data and use it to train chatbots across the board.

For instance, let's say we have 55 HR chatbots that can handle conversations around booking annual leave. If a user on one of those 55 chatbots asks a question in a bit of an obscure way, we can use that interaction to train the other 54 chatbots to also understand that new sentence.

That's one of the advantages of coming to ubisend, a company with a large client base and vast experience.

Sean Clark: I'm particularly aware of AlphaGo-type artificial intelligence, where the AI is basically given the rules of a certain game then plays thousands of games against itself to learn and evolve. Is that the type of power that's available to us with chatbots?

Alex Debecker: There are a few things there.

Like you said, AIs like AlphaGo are given the rules of the games they're learning. Although there are billions upon billions of combinations of the game, it's still a confined environment.

Doing so with natural language is actually a lot harder, because humans talk in many different ways, bend the rules all the time, or even step outside of the conversation at any point. This makes it a lot harder to predict.

At ubisend, we built our own NLP engines internally. Unless the client specifically asks us to use Google or Amazon's NLP, we'll use ubiNLP -- our own AI engines. (Note from the editor: learn more about natural language processing).

Since we tend to work with larger companies, using our own NLP engines allows us to deploy the entire solution on our clients' infrastructure. This means they own 100% of their data.

Instead of sharing data with Google or Amazon, they retain the entirety of it. This is a massive value add, as it provides security (user data never leaves the business) and actual value (data is the commodity of the future). Businesses will be judged and evaluated on the data they own.

Sean Clark: We are so data-driven with everything we do in marketing. I'm assuming this brings an enormous amount of data, especially if you have control of this in house. What sort of insights would you get from a chatbot that would have been harder to get before?

Alex Debecker: There are two things worth discussing on this.

The first is, a tailored chatbot can collect the data that matters to your business. We don't ship a chatbot with a 'template' dashboard that means nothing to you.

Instead, before even writing a single line of code, we sit down with our clients and figure out what data and metrics matter to them. What will make your CEO, director, shareholders really excited about this chatbot?

  • Conversation-to-sale conversion rate?
  • Deflection rate?
  • Conversion per channel?

We'll build the chatbot solution with this goal in mind.

From a marketing point of view, this makes complete sense. We all have access to Google Analytics and its hundreds of reports. Really, any given business would (should) use four or five that are really targeted to their business.

This is what we do. We could collect all the data in the world, but it's more important we collect the data that matters to our clients. It's the power of a bespoke Google Analytics for chatbots!

[Image: example chatbot dashboard]

The second is, there is new data available to marketers through chatbots. It's one of the things that really excite me, particularly as a CMO. I don't often get the chance to geek out on this, so I'm glad I have this opportunity today.

Thanks to chatbots, marketers have, for the first time, access to qualitative data at scale. I know many out there will cringe hearing me say this, because marketers don't tend to trust anything other than quantitative data. But hear me out.

If you go on Google Analytics right now to see what keywords people typed into Google to find your website, you'll find that around 96% of them come up as 'not provided'. They're hidden by Google itself, leaving you to figure it out by yourself.

With chatbots, you have full access to entire conversations between your customers and your business. This allows you to understand words they use, words they don't understand, content that's clearly missing, your customers' feelings at every part of the journey.

[Image: chatbots' qualitative data for marketing]

It's a direct connection to your customers' brain.

This becomes a pool of knowledge for everything your business does. For instance, you can use this data to drive your SEO strategy. If you know the words your customers are using, you can use this to build your next pieces of content.

This is one of the most exciting aspects of chatbots to me: accessing data that matters and accessing your customers' brains.

Sean Clark: Social media gave us a little bit of this, for instance through Twitter conversations, but nothing to the extent you're talking about.

I'm assuming the solutions you build integrate with all sorts of third party applications. Are there any particular ones, for instance CRMs or HR platforms?

Alex Debecker: Yes, we integrate with pretty much anything.

Part of our ethos is making sure we don't give our clients another 'thing' to log into, another 'thing' to manage. If our solution meant they had to manually transfer data across to other platforms, we'd consider this a failure.

So, integrating is tremendously important to us.

Since we work custom, we can integrate with whatever it is the customers have. Most large platforms like HubSpot for marketing, SAP for HR, etc. have APIs available.

What tends to happen is most large companies use bespoke internal software. In this case, we often have to create the connections ourselves through webhooks or APIs.

For example, we recently worked on integrating a chatbot purchase event into Google Analytics report and allowing the client to report on it by channel. Their chatbot lives on Facebook Messenger, website widget, and website full page; so being able to filter by source was important.

Doing this allows our client's CMO to report on all the company's marketing channels from one application, instead of having to log into ubisend for 'just' the chatbot channel.
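(Note from the editor: a sketch of what such an integration might look like, using Google Analytics' Measurement Protocol, which accepts hits as simple key/value payloads. The tracking ID is a placeholder and the category/label values are invented; check the Measurement Protocol documentation before relying on specific parameter names.)

```python
# Illustrative only: build a chatbot 'purchase' event for the
# (Universal Analytics era) Measurement Protocol, tagged by channel
# so the CMO can filter by Messenger / website widget / full page.

def build_purchase_event(client_id: str, channel: str, value_pence: int) -> dict:
    return {
        "v": "1",                # protocol version
        "tid": "UA-XXXXXXX-1",   # placeholder GA property ID
        "cid": client_id,        # anonymous client identifier
        "t": "event",            # hit type
        "ec": "chatbot",         # event category (our choice, hypothetical)
        "ea": "purchase",        # event action
        "el": channel,           # event label carries the channel
        "ev": str(value_pence),  # event value
    }


payload = build_purchase_event("555", "messenger", 4999)
# A real integration would POST this payload, e.g.:
# requests.post("https://www.google-analytics.com/collect", data=payload)
```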

In HR, we often work with SAP, Workday -- all the big ones.

Customer service is an interesting one. Most clients that come to us with a customer service chatbot project already have a live chat integration in their system. Our job here is to make sure our chatbot can integrate into that live chat system instead of forcing not only the company but also their customers into using a different tool.

[Image: from live chat to chatbot in customer service]

Integrating is a massive part of the power of chatbots.

You, as a marketer, might find this interesting. We recently worked with Google to develop a piece of technology that would allow businesses to re-target chatbot users with ads.

Imagine going on an ecommerce website looking for a blue jacket. You talk to their chatbot about it and, sadly, it turns out they don't have it in your size. This little bit of technology would then allow the company to surface retargeting ads to you later on, based solely on your conversation with their chatbot.

This is a good example of using chatbot data and integrating it into a different service, in this instance Google Ads.

Sean Clark: I'm a big proponent of remarketing. Some people don't like it but I like it, and it's very effective. The more data you have on people the more accurate and relevant those ads can be, so I really like the idea of that.

Something else you said piqued my interest. You mentioned chatbots living on different channels. Have you seen people interact differently based on the channel they're using? Do people using Messenger behave differently than those using a bot on a website?

Alex Debecker: It's fairly similar.

I have noticed people talking on Facebook Messenger tend to talk with more of a customer service intent. I'm a bit older than the younger generation but I believe most young people these days, if they have a question, go straight to companies' Twitter or Facebook rather than their website (or email).

Because of that, we see customer service-related messages through social media channels.

On websites, however, users seem unsure about who they're talking to when they first make contact. It could be anyone. So we tend to see more of a business-y interaction on there.

Overall, though, they're quite similar.

What we have noticed is users are surprisingly open when talking to chatbots.

We encourage our clients to make their chatbot's first interaction with a user a clear message: I am a chatbot. It's one of our rules. Don't try to make your customers think they're talking to a human, because it's bound to fail at some point and make everyone angry.

Instead, have your chatbot announce clearly and up front that it is a chatbot.

'Hi, I'm so-and-so, I'm a chatbot, and I can help you with XYZ'.

What we've noticed then is, users tend to feel more comfortable sharing personal details knowing it's a machine on the other end. This is great because it means we're all getting used to talking to machines.

Sean Clark: Yeah, there is that 'creep factor' that's slowly coming in with things like robocalls. I believe Google got some harsh feedback when they looked into audio-based bots that were able to take restaurant bookings. They demoed that last year and people didn't seem too keen. They were too much like humans.

It's good to hear you have that rule in place. Do clients tend to push back on that?

Alex Debecker: No, they don't tend to.

We've been in the game for a while and we know the impact of not following that rule.

Imagine talking to a chatbot that doesn't introduce itself as a chatbot. You talk to it, thinking it's a human, and at some point it struggles to answer a very simple question. As a customer, that would annoy you so much.

You just lost that customer right there and then.

On the flip-side, if your chatbot introduces itself as a chatbot and explains it can help with XYZ, the situation is vastly different. The customer will understand that it cannot ask about anything outside of XYZ. There's a lot more acceptance in the chatbot failing.

There is no positive in pretending your chatbot is human. People don't like to feel tricked. They will figure it out at some point, because we're not building perfect machines. At some point, they'll fail. And when they do, your users will feel duped and insulted.

Sean Clark: My next question is about cost. I know it's about the return, so it's going to be hard to answer, but given all we've talked about I assume these solutions are not the cheapest things.

If there are people listening and thinking they could really use a chatbot, how do you price a solution?

Alex Debecker: We're obviously on the high end of the spectrum. You could go out and find a cheap, if not free, tool that will help you build a 'chatbot'.

But, we're enterprise-grade and build high-end solutions, which comes at a cost.

Pricing a chatbot is quite tough. It's a bit like pricing a website, which you're familiar with.

You could get a free website using WordPress or spend £20 on a theme from Themeforest. Or, you could go to an agency to build a massive multi-lingual ecommerce website with custom integrations, which would cost you tens of thousands of pounds.

Most clients come to us extremely excited by the potential of the technology. They realise they can do loads with it and want to go off in many directions.

Ironically, our first job is to calm them down. Wanting to do everything at once will never work, and it will cost you millions of pounds.

Instead, we encourage them to focus on one use case. What is the one thing you want your chatbot to achieve? This is what we call a chatbot's one true goal.

Once we have that, we tend to build a minimum viable product (MVP). We take 4-6 weeks and build a smaller prototype version of the solution at a reasonable cost.

This allows them to release something, test it in the real world with customers, and prove the concept. It's very important to make sure users actually enjoy the channel, that they don't get annoyed or ignore the chatbot. These are all insights our clients can get before a full investment.

Once this proof of concept has been running for a little while, we can use the data it collected to inform the next phases of the project.

To give you an idea, a final product could take two to six months to build. Some projects take much longer; we're now towards the end of a two-year project.

As far as indicative costs, our solutions start around £15,000 and can go into the hundreds of thousands depending on the scope of the project.

Sean Clark: What do you think is the future for chatbots?

Alex Debecker: I think we have about ten years ahead of us where chatbots will become more and more mainstream.

A few years ago, if I went out and talked to people about what I do, no one would even know what I was talking about. Today, already, people are a lot more aware of this technology. This is probably thanks to things like Alexa and Google Home.

We'll get to a point where most businesses will have a chatbot. It'll be like having a website. You'll get onboarded into your new job using a chatbot, it's just going to be part of normal life.

Beyond that, I believe we'll see software and companies be more 'chatbot ready'.

At the moment, we need to build the chatbot and then look at software like SAP and figure out how things will connect. In the future, software will have that connection ready. Developers will know this is something they need to be ready for.

Finally, this might annoy some people, but I believe chatbots will at some point bypass websites. I know, this is a pretty bold statement.

At the moment, if I want to go to the Ber Street Kitchen (a fantastic little cafe we have here in Norwich) and I want to book a table, I'd have to go to their website. I don't know their website, so I'd first have to go to Google, type in Ber Street Kitchen, find their Facebook page, find their website, browse around to find their contact page, bypass broken links and missing information, etc.

Using websites presents many hoops and potential points of failure. One of our good friends, Tom, talks about that all the time as well. Navigating websites, even for tech-savvy people like us, can sometimes be complex.

Instead of all this complexity, I believe at some point we'll just go to a chat window and ask 'what time are you open on Saturday and can I book a table?'.

We will literally be talking to the business -- not having to understand buttons and colours and menus and navigating pages.

We will see something similar to what we've seen with companies' Facebook pages, where users now go there instead of the website. We'll see that, but at a larger scale.

Sean Clark: You're absolutely right. It's a bit like voice search. More and more people use voice to search while driving, for instance. My car has Apple CarPlay and I like the Waze navigation system, the app Google bought a few years ago. I don't have to type anymore, I can just talk. No webpage required.

The thing that concerns me about that is the narrow field of entry, because it's basically one result. With voice search, you're not presented with a choice.

Do you think we'll end up with chatbot search engines? Because if you ask something to Siri, it comes back with one result.

Alex Debecker: It's an interesting question.

We're currently working on a project with Archant under Google funding. Archant is a local publishing company. They're 150 years old and have a massive building in the centre of Norwich, with a basement full of old newspaper archives.

There are about four pages per day over 150 years, so that's many, many pages of content hidden away in the basement. We're turning all these pages into an interactive chatbot, available as text but also on Alexa and Google Home.

What you'll be able to do is ask the chatbot 'what happened today in 1945?' and hear a local story from the end of the war. Or, you'll be able to ask 'tell me a story about the suffragettes' and hear a story about that.

So, to get back to your question, if we ask this particular chatbot 'tell me a story about Jarrolds in 1967', there might have been many stories that year. Therefore, we present the user with five headlines and offer them the option to pick whichever one they want to hear about.

We might get to a similar point with voice search. It might not be ideal because, in some instances, you just want to get to the answer. But if you ask for a local dentist, you could be presented with a choice.

We could even get to a point where additional information will be given to the user to encourage a decision. For instance, dentist A, B, and C could have their online review rating as well. You'd get the results with a differentiator.

For articles, it gets more complex. If you search for 'the best SEO strategy', what would the differentiator be via voice?

It's certainly an interesting challenge to tackle in the future.

Sean Clark: Voice also gives search newfound accessibility.

Alex Debecker: Absolutely. Again, websites are very hard to use, even for us tech-savvy people.

Sean Clark: Alright, Alex, I could go on for days talking about this! Where could people find you if they want to know more?

Alex Debecker: You can find us on ubisend.com or on Twitter at @ubisend. If you want to reach me directly, Twitter is: @alexdebecker.

I hope you enjoyed this new format from us. I want to thank Sean for having me on and providing such a comfortable experience.

Finally, I am looking to contribute to more podcasts. If you (or an acquaintance) are hosting a show, feel free to reach out!