Dr. Gemma Galdon-Clavell is a leading expert on the legal, social, and ethical impact of data and data technologies. As founding partner of Eticas Research & Consulting, she navigates this world every day, working with innovators, businesses, and governments who are considering the ethical and societal ramifications of implementing new technology in our world.


We continue our discussion with Gemma. In this segment, she points out the significant contribution Volvo made when they opened their seat belt patent. Their aim was to build trust and security with drivers and passengers.


Gemma also points out that we should be mindful of the long-term drawbacks of ever encountering a data breach or a trust issue: unfortunately, you're going to lose credibility as well.


The Business Urgency to Be First
Gemma Galdon-Clavell: My name is Gemma Galdon Clavell and I work on the legal, social, and ethical impact of data and data technologies. I started a company six years ago now that works precisely on this. And so, we're one of the very few companies that have been helping the public and the private sector over the last six years develop better technologies, but also understand better how technologies have impacted societies. So, taking the point of view of the consumer and the citizen into the design of technology, and avoiding the bad data practices we see all the time, unfortunately.

Cindy Ng: Welcome, Gemma. When companies come to you with a new product or service, they understand that going to market and dominating the entire space is almost everything. There's a huge tension between that organizational urgency and the regulatory pull of making sure that you meet the legal requirements. Companies are trying to bring products to market as fast as possible. That's an industry problem.


Gemma Galdon-Clavell: Well, that tension exists and it will continue to exist. I think that we are currently working with pioneers and we're very aware of that. We don't hope to work with everyone tomorrow. We need to work with the ones that are gonna change the rules, and that's what we find fascinating about our work. We don't wanna mass produce ethical impact assessments. We wanna help the world come up with better technological solutions to its problems. So, of course, we experience that tension. We are contacted sometimes by people that don't really believe in what we do. They have been told by somebody else who does see their problem that they should work with us, but maybe that person was higher up, and the person who contacted us is in legal management. And they're very skeptical about our work.


I think that in all of our projects, in the end, they do realize that there's value in what we bring. But again, we are working with the ones that wanna shape the future; the rest is just not what we're interested in. I mean, think about Volvo, for instance, and cars. I usually use the analogy of cars because cars were not conceived with seat belts, for instance, or speed limits. These are things that, as a society, we agreed over time were the necessary precautions we wanted in order to make the most of cars and vehicles while at the same time protecting society. And when society started thinking about what the limits on cars should be, seat belts were not immediately on the table.


And then there was a company, Volvo, that came up with this innovation and thought, "Well, if we offer seat belts in our cars, then we can create more trust and provide more security to our customers." And what they did was they released the patent. They did not just put seat belts in their cars, but they said, "We actually want the industry to adopt this. We want this to be the standard." And they gave it away for free. Today, all cars have seat belts. No one would dream of buying a car without a seat belt and Volvo is still seen as a company that sells security. So, these are the people we wanna work with. We wanna work with people that are willing to be disruptive in their industries, not the people that just wanna do same old, same old.


The Breach of Trust
Cindy Ng: When companies do engage you after they've experienced enormous embarrassment in the media (not necessarily a data breach, but certainly a breach of trust with the public), has anyone done the research to quantify the cost of that?

Gemma Galdon-Clavell: I think they've tried. I wouldn't be able to tell you whether they were right. We have seen some clients be very clear in saying that they realize how much they've lost. That they have lost a lot of money by not doing things well. And not just money in terms of what I said before, you know; coming up with a pilot that doesn't sell is hugely costly. It might not be as visible as a data breach, but if you produce something that in the end no one wants because you didn't take into account people's trust or acceptability, then you're gonna lose a lot of money. And if there's a data breach or a trust issue, then you're going to lose credibility as well. So, you're gonna have a reputational problem. So, I think it is about dollars and cents, but it's also about the long-term effect.


So, I think that the companies that we work with are increasingly incorporating privacy and data ethics as part of their risk assessment. And that's what we would like to see. We'd like to see privacy and ethics being mainstreamed in the usual processes of any large corporation that deals with technology.


Cindy Ng: And alternatively, have your researchers been able to quantify how much companies would potentially save when they work with you and seek out your counsel?


Gemma Galdon-Clavell: It really depends on the project. What we have often done, when we were asked to contribute to developing a specific piece of new technology with a company, is take their economic forecasts and ask: what if it went wrong? What if you spent all this money creating and developing this new product, and then you couldn't sell it? So, our estimates of savings are based on their estimates of potential profit. That's how we do it, and sometimes we have told them, "Well, imagine if you were not able to sell it." That's one scenario: there's an acceptability issue and so you only sell a few hundred, and it doesn't become a viable market alternative. But there's another scenario, where there's a data breach or a trust issue. And then it's not only about the money that you've spent, but also the money you need to spend in the future to resolve or redress that problem with your existing or potential clients.


So, there are these different scenarios depending on when things go wrong. But it's very likely that if you're not careful with your data processes, at some point or another you will run into problems with the regulator, with your own clients, or with society as well, since you're gonna end up making the headlines for those bad data practices.


Cindy Ng: The Guardian published an article the other day about the Australian government and how they released anonymized data sets: lots of medical records, including prescriptions and surgeries, for millions of people. And researchers have been able to re-identify those people. And I'm wondering, if they were to come to you after the media announcement, is that when you take on a client?


Gemma Galdon-Clavell: We usually take them on earlier, before they have such a huge issue. We work with neighboring countries, but not with Australia. But in this case, it's clearly a case of not having a specific procedure for doing open data. I mean, clearly, the government's gonna have all this information, and clearly, the government needs to have all this information, because you do wanna make sure that your doctor is aware of the procedures you had before and your condition. You should also have a right to access your medical records. So, that data has to exist and it needs to be somewhere.


But then you need the appropriate encryption, and you need to make sure that that information can only be accessed by the right people. And if you do open data because you want that data to be aggregated and you want universities and private partners to make the most of that data, then you need to go through the appropriate safeguards. And we have specific methodologies for doing open data, like how to anonymize in a way that you can still derive value from that data, but the data does not include all those small pieces of private information that you really don't want out there.


So, clearly, the Australian government did not have an open data policy or the appropriate profiles of people in place to make sure that that was done responsibly. And it's really terrible that, in 2018, you have major governments not being aware of these issues and not having procedures in place, before data goes out there, to make sure that this doesn't happen.


I think that this is changing to a large extent in Europe. We are working a lot with Latin America as well, and I think that governments there realize that if they wanna make the most of the data revolution, they need to do it responsibly, because otherwise the trust and liability issues are too great. But unfortunately, it doesn't seem like the government in Australia was aware of that or had the procedures. We've seen that in other countries as well, but I think that there's more and more of an awareness of the need to undertake these things before disaster strikes.


Privacy Assurances Shouldn't Be a PR Gimmick
Cindy Ng: When companies offer privacy assurances, for instance, if your browser can go into incognito mode, which erases your search history, or Snapchat, where your messages disappear after a certain period of time, how accurate are these features? And is it a good way of doing business when everybody who is even tinkering in this space knows that they might not necessarily be 100% true? Is that a messaging problem?

Gemma Galdon-Clavell: This is one of our greatest concerns. There's a lot of people...well, not a lot, but there's some people who use privacy just as a PR thing. And they don't really change their practices. And that is clearly a matter of concern. And that's why we need standards, because otherwise it's gonna become a PR thing. And your technical standards and your privacy safeguards should not be a PR thing. They should be part of your core business and your core specifications.


So, one of the things we try and do is...we're trying to develop a certificate, a way of certifying those companies that say what they do and do what they say, so to speak, to provide consumers and their customers with more assurances as to what it is that they're actually being offered. And making sure that it's not just a cosmetic thing, or a PR strategy that has no relationship with actual data practices. So, we think that certification is the way to go, and we're gonna be very active in the coming months precisely in providing certification for the companies that wanna sell privacy, or that say they use privacy and responsible data processes in their products.


Cindy Ng: And finally, I know you work with a lot of pioneers. So, I don't know how open you can be about your projects, but I'd love to hear about a successful project that you've been able to deploy.


Gemma Galdon-Clavell: Our contributions are usually part of larger projects. So, for instance, in Europe, in the development of what are called ABC gates (Automated Border Control). I guess everyone is familiar with them by now: the kiosks that check your passport when you go through an airport. You may have seen that sometimes there's no border guard anymore, but there's a machine that checks your passport and your biometric data, and decides whether you can enter a country or not. When the European Commission initially started developing that, they realized that it was important to incorporate ethics and responsible data processes into it. And so, we've been helping the industry for the last five years in making sure that the way your biometrics are taken and the way identification happens is responsible, and that the data processes that are in place are responsible and accountable. So, I think that's one of the successes, for instance, that we have.


We've also been working with a lot of public administrations on improving their procurement practices. Buying technology is a crucial part of doing technology responsibly: making sure that when you buy technology, you buy the best technology out there, technologies that incorporate data ethics, cybersecurity, and privacy concerns, and improving the procurement processes to protect the public administration, but also to buy better technologies that are gonna be better integrated into your existing data processes. So, I think that we have several success cases there, and there are several administrations that are currently buying better technology, doing it better, and having a more informed team of staff that is more aware of the risks of incorporating new technologies into their processes.


We’re also currently...and these are ongoing things, so we don't really have results yet, but we are working with several international organizations that fund technology development on making sure that ethics and responsible data processes are part of the things they assess when they provide funding for innovation. So, anyone that would come to those international organizations would need to prove that they're aware of the social impact of the technology they're trying to develop, and that they are building the necessary safeguards to make sure that that technology is responsible.


There are quite a lot of examples out there of things that can be done in practice to improve the way that we do technology and the way that technology impacts society, and we're very proud to have been part of that. And we hope to continue to be part of that story for a long time.


Cindy Ng: Thank you so much. I'm so inspired by what you do. So, I wish you much success.


Gemma Galdon-Clavell: Thank you so much.