Data from chatbot conversations is valuable because it helps companies better understand their customers. If you don’t want this data to lose its worth, you need to secure it properly. After the recurring scandals involving personal data breaches, internet users pay close attention not only to whom they send their personal information but also why it is needed. In this article, I’ll show you how to increase data security so that users no longer have to worry about the information they share in conversations with your chatbot.
With a chatbot, you can check your account balance, download an invoice, or transfer money from your bank account. These are just a few simple examples, yet each of them raises security concerns in the broad sense of the word.
When deciding to work with a chatbot company, first make sure that it stores and processes the personal data of chatbot users in accordance with applicable law.
Even though interacting with a chatbot isn’t the same as filling out an online personal data form, the result is the same: the user’s data passes through a server and is stored in a database. That’s why chatbots should have the same safety features and security protocols required of other systems.
One of the major advantages of chatbots is that you can communicate with users through many different channels. Each channel has its own rules and is owned by a company with its own data privacy and security policies. Whenever you choose a specific channel, be aware of what you’re signing up for.
In terms of security, we can divide chatbot channels into two categories: public and private.
Most often, you’ll come across chatbots that run on public channels, such as Facebook Messenger, Google Assistant, or Slack. In these cases, you always share your personal data with a third party.
However, in the case of Facebook Messenger, you can design the chatbot so that sensitive data (like a phone number or insurance policy number) is transferred outside the messenger (and that’s what we advise our clients to do).
Private channels are used when sharing data with another company is simply unacceptable. The data is then accessible only through a web widget or mobile app. Moreover, access to this data can be limited, for instance, to the intranet.
The place where the chatbot is hosted affects the way data is stored. There are two main software implementation models: Cloud and On Premise.
In the Cloud model, the system is installed in the hosting provider’s infrastructure. Usually, each chatbot platform stores data in one chosen cloud. Well-known, large cloud computing services, such as Google Cloud, Amazon Web Services, or Microsoft Azure, hold EU-U.S. Privacy Shield certifications.
On Premise is a much more “traditional” model: the software is installed in the client’s infrastructure. The implementation process takes much longer, but this model is very often required in the banking and insurance industries.
It’s also possible to combine the two models above: sensitive data is stored in the client’s infrastructure (or any other chosen place), while the chatbot system runs in the cloud.
If your chatbot uses the services of other companies, for instance, providers of an NLP (Natural Language Processing) engine or an analytics system, pay attention to the same issues of hosting and data security. The more entities that process the data, the higher the risk of a leak.
Depending on your chatbot’s channel, you can use additional functions that gather data outside the chatbot ecosystem. One of them lets you create dedicated forms, to which data is transferred with the help of URL parameters.
You can find a similar solution in the chatbot of the beer brand Żywiec, where, because of legal requirements, we weren’t able to issue promotional codes directly from the chatbot.
We transferred the following data to the form:
Users gave all consents to the processing of their personal data and their email addresses through a secured form.
Remember, however, never to pass sensitive data to forms this way, as there’s a risk it will be intercepted by third parties. Placing information such as an email address or a PESEL identification number in a URL is a very bad idea.
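As a sketch of this idea (the URL and parameter names are hypothetical, not from the Żywiec project): harmless context can travel to the form as URL query parameters, while anything sensitive is entered inside the secured form itself. An allow-list makes it hard to leak a sensitive field into a URL by accident:

```python
from urllib.parse import urlencode

# Hypothetical form URL, for illustration only.
BASE_FORM_URL = "https://example.com/promo-form"

# Allow-list of keys that are safe to expose in a URL.
SAFE_PARAMS = {"campaign", "chat_session", "locale"}

def build_form_url(params: dict) -> str:
    """Build a form link, refusing to place non-allow-listed data in the URL."""
    unsafe = set(params) - SAFE_PARAMS
    if unsafe:
        raise ValueError(f"refusing to put these keys in a URL: {sorted(unsafe)}")
    return f"{BASE_FORM_URL}?{urlencode(params)}"
```

With this in place, `build_form_url({"campaign": "summer"})` produces a shareable link, while an attempt to smuggle an email address or PESEL into the query string raises an error.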
The best way to protect yourself is to avoid forms that require personal data 🙂 If your company only needs the postcode, don’t ask for the whole address. Similarly, if you want to refer a patient to another specialist, you only need to know their symptoms, not their entire medical record.
Limiting the amount of information you require from users not only decreases your responsibility for holding personal data but also improves the chatbot’s UX.
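Data minimization can even be encoded in your data model. A minimal sketch (the type names are hypothetical) that declares only the fields the examples above actually need, so there is no place to put the rest:

```python
from dataclasses import dataclass

@dataclass
class DeliveryEligibilityRequest:
    # Enough to check the service area; no street, house number, or name.
    postcode: str

@dataclass
class ReferralRequest:
    # Enough to route a patient to a specialist; no full medical record.
    symptoms: list
```

If a field doesn’t exist in the schema, nobody can be tempted to collect it.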
We know from experience that chatbot users often type their data directly into messages, even when we’ve shown them exactly where to enter it and provided a special form for this purpose. This is especially common among the happy winners of chatbot contests organized by our clients.
But what if we don’t want to store data entered directly into messages? In that case, data anonymization is the best option: the chatbot can replace the data with, for example, stars. We implemented this solution for clients whose chatbots, by design, weren’t supposed to save any data, even if the user wanted to share it at all costs 🙂
The data won’t be saved in the database and won’t appear in statistics or in your mailbox. Remember, however, that it will still be available in, for instance, the Facebook inbox if your chatbot runs on a public channel.
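A minimal sketch of this kind of masking, assuming simple regex patterns for an email address and a PESEL number (real deployments would tune the patterns per data type and per language):

```python
import re

# Hypothetical patterns; a production system would maintain one per data type.
PATTERNS = [
    re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+"),  # email address
    re.compile(r"\b\d{11}\b"),               # PESEL (Polish national ID number)
]

def anonymize(message: str) -> str:
    """Replace personal data in a chat message with stars before it is stored."""
    for pattern in PATTERNS:
        message = pattern.sub(lambda m: "*" * len(m.group()), message)
    return message
```

Running the masking step before anything is written to the database keeps the data out of statistics and mail notifications, exactly as described above.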
The admin dashboard is the basic work tool for anyone who administers the chatbot. It allows you to update the chatbot’s structure, add intents to the NLP system, review contest entries, analyze data, run campaigns, and even answer users’ questions.
In the context of security, it’s essential to control access to this information.
Access should be strictly controlled by a system based on roles and permissions. Each user should have access only to the parts for which they are responsible. For example, we can assign roles such as Administrator, Editor, Marketer, Mailbox Moderator, Contest Moderator, or Viewer.
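A minimal sketch of such a role-and-permission check, using the example roles above (the section names are hypothetical):

```python
from enum import Enum, auto

class Role(Enum):
    ADMINISTRATOR = auto()
    EDITOR = auto()
    MARKETER = auto()
    MAILBOX_MODERATOR = auto()
    CONTEST_MODERATOR = auto()
    VIEWER = auto()

# Hypothetical permission map: each role sees only the dashboard sections it needs.
PERMISSIONS = {
    Role.ADMINISTRATOR: {"structure", "nlp", "contests", "analytics", "campaigns", "mailbox"},
    Role.EDITOR: {"structure", "nlp"},
    Role.MARKETER: {"analytics", "campaigns"},
    Role.MAILBOX_MODERATOR: {"mailbox"},
    Role.CONTEST_MODERATOR: {"contests"},
    Role.VIEWER: set(),
}

def can_access(role: Role, section: str) -> bool:
    """Gate every dashboard request through a single permission check."""
    return section in PERMISSIONS.get(role, set())
```

Keeping the map in one place makes it easy to audit who can see what, and to grant each person the narrowest role that covers their job.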
Every action a user takes has to be registered in log files – files that record the application’s activity. Thanks to this, you’ll know which data specific users had access to. This is very important not only from a security perspective but also because of the GDPR requirements imposed on computer systems.
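With Python’s standard `logging` module, such an audit trail can be sketched like this (the logger name, file name, and field layout are assumptions, not a prescribed format):

```python
import logging

# Hypothetical audit logger: every dashboard action is appended to a log file
# so you can later reconstruct who accessed which data (useful for GDPR audits).
audit = logging.getLogger("chatbot.audit")
handler = logging.FileHandler("audit.log")
handler.setFormatter(logging.Formatter("%(asctime)s %(message)s"))
audit.addHandler(handler)
audit.setLevel(logging.INFO)

def log_action(user: str, action: str, resource: str) -> None:
    """Record one dashboard action in the audit trail."""
    audit.info("user=%s action=%s resource=%s", user, action, resource)

log_action("anna", "view", "contest_entries")
```

Writing one structured line per action makes it straightforward to answer the question GDPR auditors actually ask: which users touched which data, and when.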
Reusing a single password across many accounts is one of the biggest online security problems. After intercepting the password from one system, a hacker can use it to access the user’s other accounts.
The password to the admin dashboard should be strong (of appropriate length, with uppercase and lowercase letters, numbers, and symbols) and changed regularly; the system can remind users so they don’t forget to do this. Two-factor verification, for instance via SMS, is another way to increase online security and minimize the risk of third parties intercepting your data. It’s worth considering whenever your system supports it.
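Both ideas can be sketched in a few lines, assuming a hypothetical policy of at least 12 characters with mixed case, digits, and symbols, and a six-digit SMS code generated with a cryptographically secure random source:

```python
import re
import secrets

def is_strong(password: str) -> bool:
    """Check a password against an assumed policy: length, case mix, digit, symbol."""
    return (
        len(password) >= 12
        and re.search(r"[a-z]", password) is not None
        and re.search(r"[A-Z]", password) is not None
        and re.search(r"\d", password) is not None
        and re.search(r"[^\w\s]", password) is not None
    )

def make_sms_code() -> str:
    """Generate a one-time code for SMS-based two-factor verification."""
    return f"{secrets.randbelow(10**6):06d}"
```

Note the use of `secrets` rather than `random`: one-time codes must come from a cryptographically secure generator, or they become guessable.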
As you can see, securing data in your chatbot means paying attention to several important issues. Make them a priority, and you’ll be prepared for any eventuality. Remember that we live in a time when data security raises many doubts, so take the necessary steps to avoid worst-case scenarios. A chatbot is supposed to be a tool that helps you keep in touch with your clients, not a reason to worry about data breaches.