
Robert de Snoo on the Ethical Framework


"Ethics cannot be reduced to right or wrong"

He has an IT background and knows the ropes in the insurance sector. Robert de Snoo takes a positively critical view of the Ethical Framework, which was approved this summer by the members' meeting of the Covenant. "An Ethical Framework is a good starting point, but it's mainly about practice."

The interview has to take place via Teams, but De Snoo takes that in his stride. He quickly proves to be an easy talker, and someone at home in an online environment. "Well, what do you expect from a former ICT professional," he says with a laugh.
After a long career in ICT, De Snoo became more involved with what he calls "the front end of innovation". In the spring of 2017, for example, he founded the Innovation & Trust Think Tank to explore social and strategic topics and make them tangible. "We live in a time of thinking, but I believe we should also act. I want to combine both." In daily practice, this comes down to helping organizations and companies, including insurers, put data ethics on the agenda.

In conversation with ...


This is the eleventh conversation in a series of interviews with an important stakeholder on a current theme. In this In conversation with ..., Robert de Snoo speaks about ethics.

De Snoo is one of the founders of the Human & Tech Institute, which started a few years ago as a Think Tank Innovation & Trust, a collaborative project of Achmea, KPN, Rabobank and Royal Schiphol Group. In the meantime, the institute has established the Human & Tech Ethics Community (HTEC), which consists of independent thinkers from business, marketing, philosophy, ethics, technology and regulation. This HTEC helps to test data-oriented innovation projects for possible socio-ethical risks and opportunities.

 

Let's start at the beginning. What is ethics for you?

"You could talk about that at length, but very briefly summarized it is 'the good conversation' that an organization must have with its employees, environment and society. What do we stand for? What do we want, and also: what do we not want?"

On the website of the Human & Tech Institute I read that ethics "is not a book with the right answers, but is mainly about asking the right questions"?

"That's right, and those questions can vary a lot, depending on the participants in a conversation, or on their role. A lawyer asks completely different questions than a marketer. And don't forget that the questions (and answers) are constantly changing on the basis of new insights. What we as a society consider normal can change quickly, through technical progress, but also through social debate. I only have to mention the Black Lives Matter or Zwarte Piet discussions and you know enough. All this makes ethics incredibly difficult. Especially for insurers, who are already buried in laws and regulations."

Is it too much?

"I don't know. I cannot judge the laws and regulations for insurers, but having an Ethical Framework, which is focused on the use of data, does give direction to the much-needed discussion. Both internally and externally. After all, technological developments are far ahead of laws and regulations. That is why we also believe that every organization should have a data ethics committee."

Case 1: housing fraud

Everyone knows that there are people living in social housing who are not actually entitled to it. How can a housing association look for ways to detect this housing fraud in an ethical manner? One way is to search for online clues, for example on social media. To achieve results, however, the personal data of all residents must be used in addition to the address details. Legally this is possible and allowed, because the housing association has arranged the use of this data. The question is therefore mainly an ethical one: how far do you go in profiling and searching for traces? And how do you weigh the different interests professionally?

Ethical behaviour cannot be legislated.

How can insurers do it right? There are no hard frameworks like with legislation, so who decides what ethical behavior is?

"Ethical behaviour cannot be put into legislation, but we have set the standard for good behaviour in law. If you break the law, you do not meet the norm. But the reverse does not automatically hold: following the rules does not mean you behave well. Just look at privacy regulation. There are companies that think they comply with the law if they ask permission for everything, but that is far too short-sighted. Such a company may comply with the law, but it is certainly not an example of good ethical behavior."

What is good ethical behavior for you?

"There are plenty of companies doing things well, but nobody talks about it. That is what I find such a shame about the ethical discussion: you're either doing it right or doing it wrong. You can't simply flatten ethics into good or bad, because before you know it, you're on the wrong side. We had already set up our Innovation & Trust Think Tank before Facebook's privacy scandal, in which a campaign agency allegedly gained access to the personal data of millions of Facebook users. At one point I wanted to publish the ideas and plans of that think tank, but especially the larger companies pushed back. They all thought: what happens to Facebook can happen to us. And to be honest: if a small startup screws up and something really goes wrong, in the public perception it is not that small startup but NN or Achmea that did it."

And now? So what should insurers do?

"Taking a stand and consistently defending it. Or, as we always say in the data ethics committee: building moresprudence. You will have to show time and again that you have thought carefully about your actions and about the trade-offs you make. Insurers are quite far along in this, including in the area of discrimination. But it also means that insurers have to make (technological) choices. Where is the line? It is easy to tell what you do; also tell what you do not do. Take facial recognition, which is currently the subject of much debate in the context of Black Lives Matter. A major supplier in America says it does not hand over facial recognition technology to the police, but then does so to the FBI. I'm really flabbergasted by that. Just as flabbergasted as when I heard that another major U.S. company shares data with more than two hundred police departments. Through a camera at your door you can see who is at the door when you are not at home. Handy, right? But not if that company then forwards that footage to the police. I never gave permission for that, did I?"

Case 2: facial recognition

Who doesn't know them? The long queues at Schiphol at border control. In total, you have to show your passport at least four times: at check-in, at the border, ten meters further on, and before you board the plane. There must be a better way, Schiphol thought. In the Seamless Flow pilot, the airport links your passport and boarding pass to your face: once you have shown your passport, facial recognition lets you pass everywhere. It increases convenience for the traveler, and it is safer and more efficient. Win-win.
Or not? The biggest questions are whether the technology works well and whether travelers have sufficient confidence in the organizations involved and in the system. Sensitive personal data is stored. What happens to that data? How long is it kept? And can a traveler refuse? The queues are gone for now, but the message is clear. The most important task facing an organization when introducing new technology is communication. Explain, and (always) give people a choice! They understood that well at Schiphol.

You just said that insurers are well aware of discrimination?

"Absolutely. Although insurers de facto make a lot of distinctions in order to differentiate their premiums, they seem well aware of real discrimination. But they also have to be vigilant that it does not creep into the data anyway."

So how does that awareness manifest itself?

"Setting up an Ethical Framework that is mandatory for all members of the Covenant does indicate that you take society and ethical decisions quite seriously. In addition, I know that insurers talk a lot about ethics, organize debates and jointly create frameworks for a clear and unambiguous policy. Rightly so, because their impact is great, and that is precisely why the Ethical Framework is so important."

What do you mean by that?

"Insurers have a lot of impact on individual people, but also on society. We had water damage in our kitchen this spring and had to leave our house for four months, with the whole family. I thought that had a considerable impact on my life, especially in corona times. But of course it goes much further than that. Insurers ultimately help determine what we as a society tolerate with regard to data use."

"Insurers help determine what we tolerate"


You're referring to the eternal dilemma between data and ethics? Can they actually go hand in hand?

"That duality is always exciting. There is definitely a positive side to technology and data use, but there is also another side. If you find out you're being treated differently online than I am, can we still have a good conversation? Moreover, you cannot always foresee everything. Take Airbnb. The home exchange started out so nicely, but did anyone think at its founding that at some point quarrels would break out in homeowners' associations? I don't blame them, of course, but I do wonder whether the platform is capable of resolving such issues. Can they take a stand? Do they dare to show what they stand for?"

Ethics is also a matter of moving along, then?

"That's right. Things go wrong. That is and will remain the case, but the key question is whether the monitoring is always on. When Radar, the consumer affairs TV programme, is on the doorstep, every insurer is wide awake, but what if the cameras stay away for a while? That is precisely why it is so good that data brings new reflection to the industry. Look at a privacy statement, for example. Who reads it? Only the people who draft it. Insurers must offer much more customization and serve customers according to their wishes. Am I a privacy freak? Are they allowed to know 'something' about me? Or do I automatically click yes everywhere? That first group, the one to five percent of complainers, is known to insurers, but what about the rest?"

So more customization, is that the key?

"I think so. And not so much in premium differentiation, but more in accommodating customers in their (desired) data use. This does not only apply to insurers; all companies must take responsibility for increasing data awareness among customers and employees. In training sessions and workshops I often ask people for their personal opinion: would you want that yourself? Usually the answer is: no, I wouldn't. So why do it to someone else? My advice is always to at least start a conversation with the other person."

Case 3: the care robot

Robots are increasingly being used in healthcare. This was also the case at a healthcare institution that ran a pilot with people with forms of fear of commitment. The robot was intended as support and as a kind of buddy. A nice thought, but what had not been thought through beforehand was that a robot can break down. No alternative had been devised. In fact, when the pilot was over, the robots had to go back. Not exactly helpful for people with a fear of commitment. It is one of many examples of good technology where, during development and implementation, insufficient thought is given to possible negative consequences, such as the impact on the well-being of patients or clients, or the prevention of emotional and psychological damage.

You are suggesting that insurers should be much more vulnerable?

"I would indeed love it if insurers dared to be more open. At the same time, I understand how difficult it is to tell customers that you cannot always see all the consequences yourself. That is not what people want to hear, but it is precisely this openness that testifies to a sense of reality and commitment. Maybe there is a role for the Covenant there?"

The Covenant has developed an Ethical Framework for its members, and the insurance industry is the first to rein itself in. Isn't that vulnerable enough?

"An Ethical Framework is a very nice first step, but it is also just a start. Insurers now have to get on with it and get to work. In the context of the Directive on the Protection of Online Consumers, the first lawsuit against the telecom industry is now pending. That can also happen to insurers."

Is that a real danger?

"Absolutely. The Dutch Data Protection Authority is already looking at insurers' cookie policies. And rightly so, because I also just want to be able to say no. Think back to the moresprudence I mentioned: explain what you do, what you don't do and why."

The first lawsuit against the telecom industry is pending. That can also happen to insurers.

Ethics is mainly about 'what do we want' rather than 'what can we do'?

"And about a good conversation. We all know that technology is advancing and technological gadgets are applied more and more. You have to set up an internal dialogue about that, for example between your ICT and marketing departments. A good conversation always starts at home. We once guided a very diverse team at a company where we turned things around in a fun way, with the question: how can we hack things, or act as unethically as possible? That produced the most amazing ideas. It gave the organization a complete list of low-hanging fruit: the gaps in its services. Much more important, however, was that the legal employees felt fully involved instead of, once again, being left holding the blame. You have to have the right conversation, with a diverse team."

Ethics should belong to all of us?

"Exactly. It concerns us all. That is why, after the think tank, we set up an independent platform, the Human & Tech Ethics Community (HTEC), to support organizations and companies in putting ethics on the agenda. Many organizations are searching. They need reflection on decision-making, but end up in a political game. Who should be in the working group? What will its mandate be? Internal scheming, while it should be about sincere reflection and shaping next steps. In the beginning we also thought that organizations would throw their cases over the fence to our HTEC, but we have learned that organizations have to do it themselves. We only help them organize that reflection; our goal is to make ourselves redundant as soon as possible. That can happen gradually, or after just a single session. The nice thing about ethics is that if you do nothing yourself, employees will 'automatically' get involved. They struggle with it just as much as management does. Just recently I asked during a workshop: if you run into an ethical dilemma, do you know where to go in your organization? Eighty percent had no idea. That is why I keep repeating it: an Ethical Framework is a good starting point, but ultimately it is about awareness throughout the company. Employees must dare to be vulnerable internally in order to make ethical dilemmas discussable. If it is not possible inside, how can you explain it to the outside world? Insurers really need to get to work."

Case 4: cookies

We all know them: cookies. Useful for companies, often irritating for individuals, but what does the law say? Unfortunately, the law is not entirely clear. Commercially there is an opportunity, and valuable online data can be collected, but it also presents a few dilemmas. Question 1: do you offer a choice, and can people really refuse your cookies? Question 2: which variant of the default option do you choose? Many organizations work with a number of options that briefly explain the benefits. A typical example of not thinking from the customer's point of view, because the customer is mostly annoyed by it. Why do you make that choice as a company? Have you thought about it and made a conscious choice? Or has the online department decided and simply followed the market standard?

Photography: Ivar Pel

Insurers really need to get to work.

