Can data and ethics live together?

Designing digital products can be treacherous ground when it comes to deciding how to treat users’ information and sensitive data. Sketch’s Design Advocate Matteo Gratton discusses how to design, and how to choose, more ethical digital products.


Giorgia Lombardo

23/06/2021 · 7 min read · Design Dogs

With more and more people using digital products, terms like data tracking, data collection, GDPR, cookie preferences, encryption, and multi-factor authentication have become part of our everyday vocabulary. There is a digital territory where we exist and that we need to navigate. This not only implies that we have digital identities, it also means that we need to learn how to take care of and protect them.

The companies creating the digital products we use every day have a responsibility to design products that respect and protect users’ identities, data, and sensitive information. From this discussion stems Data Ethics, defined by Oxford professors and philosophers Luciano Floridi and Mariarosaria Taddeo as:

“A new branch of Ethics that studies and evaluates moral problems related to data (including generation, recording, curation, processing, dissemination, sharing and use), algorithms (including artificial intelligence, artificial agents, machine learning and robots) and corresponding practices (including responsible innovation, programming, hacking and professional codes), in order to formulate and support morally good solutions (e.g. right conducts or right values)” 

Matteo Gratton is a Design Advocate at Sketch who focuses on finding better ways to design with ethics, privacy, and data in mind. Sketch itself has successfully built collaboration features that respect users’ data and design ethics. Matteo provides insight into how designers can create more ethical products and how users can choose digital products with more awareness.

Matteo will also speak at Design Matters 21, where he’ll take the audience through practical examples and cases from Sketch and other design products, and discuss how designers can look at their process from a privacy perspective. Join the conference in Copenhagen or online on Sep 29–30, 2021. Get your ticket now!

What are the main ethical problems that we see in relation to data collection and treatment today?

I believe the issue has several layers: the first at a company level, the second at a platform level, and the third at a user level. Each of these layers presents its own ethical and privacy issues around data management, and I’d say that many of them occur because there’s a lack of understanding of the business models built on collected and stored data.

The main problem is that users don’t have enough knowledge, and the lack of a worldwide rule of law only makes it worse. This situation allows companies to keep doing whatever they want, because users passively accept any condition “so they can use the service.” In Europe we have the GDPR legislation in place, but it often gives users a false sense of protection, as data protection in practice is not as widespread or as thorough as we believe.

Example of a pop-up message at BBC.com. Only by clicking “Consent” do users get immediate access to the website’s content.

How do you think companies today are changing their policies and the way they design their digital products?

Some companies are fully aware of the problem and take it seriously. Even though these companies put the issue at the very top of their priority list, I believe they are still a minority. The majority just follow the rules as literally as possible while looking for ways to collect, store, and manage data: sharing (and analyzing) data is a very profitable business nowadays.

What is Sketch specifically doing to protect users’ data?

Sketch is fully aware of the problems we’re facing around this topic. We believe we must go beyond the current rules. We’re not collecting, storing, or sharing users’ data (except for the Sketch files users share within a Workspace, with its members). Everything we store is easily accessible to users, and we don’t track anything once you log into your Sketch account. Before you log in, we track the bare minimum, and we even shield the user’s IP address. We believe transparency must be a priority, as it nurtures a balanced interaction between users and ourselves as a company, as well as between users within a Workspace (for example, between managers and other employees). What is visible to one of the members is also visible to the others.
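The interview doesn’t go into implementation details, but IP shielding of the kind Matteo describes is commonly done by truncating the address before anything is logged, so the stored value can no longer identify an individual connection. A minimal sketch in TypeScript (the function name and truncation widths are illustrative assumptions, not Sketch’s actual code):

```typescript
// Illustrative sketch of IP shielding before anything is logged.
// Truncating the last octet (IPv4) or the host part (IPv6) keeps
// coarse geographic usefulness while discarding the bits that
// identify an individual connection.
function shieldIp(ip: string): string {
  if (ip.includes(":")) {
    // IPv6: keep only the first three groups (roughly a /48 prefix).
    return ip.split(":").slice(0, 3).join(":") + "::";
  }
  // IPv4: zero out the last octet.
  const octets = ip.split(".");
  octets[3] = "0";
  return octets.join(".");
}

// Only the shielded value ever reaches the log.
console.log(shieldIp("203.0.113.42")); // "203.0.113.0"
console.log(shieldIp("2001:db8:85a3:8d3:1319:8a2e:370:7348")); // "2001:db8:85a3::"
```

Truncation is often preferred over hashing here, because a hashed IPv4 address can still be recovered by simply hashing the entire (small) IPv4 address space.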

Moreover, we don’t store any of the activity happening in any Sketch file. We simply store saved versions of the file as a way to keep track of its updates. An update contains the work of all the collaborators, even if only one member manually saved it. Finally, you can always see the collaborator list in the Mac app, but you can’t see how long someone has been around: we only surface the last 15 minutes of a collaborator’s session, which simply lets you know whether someone is currently active in a document.
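Again, the mechanism isn’t spelled out in the interview, but the behavior described (exposing only whether a collaborator was active in the last 15 minutes, never a full history) can be modeled by keeping a single last-seen timestamp per collaborator and deriving a boolean from it. A hedged sketch, with illustrative names:

```typescript
// Illustrative sketch: store one timestamp per collaborator and
// expose only an "active right now?" boolean, never the history.
const PRESENCE_WINDOW_MS = 15 * 60 * 1000; // 15 minutes

const lastSeen = new Map<string, number>(); // collaboratorId -> epoch ms

// Called on any activity; overwrites rather than appends, so no
// activity log ever accumulates.
function touch(collaboratorId: string): void {
  lastSeen.set(collaboratorId, Date.now());
}

// The only question the UI can ask: active within the window?
function isActive(collaboratorId: string): boolean {
  const seen = lastSeen.get(collaboratorId);
  return seen !== undefined && Date.now() - seen <= PRESENCE_WINDOW_MS;
}
```

The notable design choice is that the timestamp is overwritten rather than appended, so the system is structurally unable to answer “how long has this person been working?”, which matches the guarantee described above.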

What can we, designers, do to make digital products more ethical?

I believe we can have a significant impact, as we are the ones designing the interactions and interfaces that ask for data. Inevitably, we are part of the discussion around which data we should gather. We can always raise our hands when we believe we are requesting, storing, or managing data that is not really necessary. I believe we should strive to keep the least amount of user data possible: only what is required for the system to work.

I would love to add a side note here: even if certain data is required for the system to work, we should always play devil’s advocate and re-examine it regularly to be sure that it’s still really necessary.
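One lightweight way to make that devil’s-advocate check routine is to force every persisted field to carry a written justification, so an unjustified field stands out in code review. A hypothetical TypeScript pattern (not something Sketch has described using):

```typescript
// Hypothetical pattern: every field we persist must declare why the
// system needs it. Fields without a justification don't type-check,
// which turns "is this really necessary?" into a routine question.
type Justified<T> = { value: T; neededBecause: string };

interface AccountRecord {
  email: Justified<string>;       // needed for sign-in and account recovery
  displayName: Justified<string>; // shown to collaborators in shared files
  // birthDate?: Justified<Date>; // no justification found -> don't store it
}

const account: AccountRecord = {
  email: { value: "ada@example.com", neededBecause: "login + account recovery" },
  displayName: { value: "Ada", neededBecause: "shown in collaborator list" },
};
```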

The other area where we can have a profound impact is the way we expose data to our users. We must be sure we present data properly: only when necessary, and with the right context. Finally, we need to be sure that we don’t leave any users behind in the process. Even if we fail to consider only a tiny fraction of users, they could be greatly impacted.

Last but not least, we can also ensure that the data we gather is disposed of when it’s no longer useful. For example, when a user leaves our service and cancels their account.
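Disposal is easy to promise and easy to forget. One way to guarantee it happens is to make deletion part of the cancellation path itself rather than a separate cleanup job. A minimal sketch, with hypothetical store interfaces:

```typescript
// Minimal sketch: deleting user data is part of the cancellation
// flow itself, not an afterthought. The interfaces are hypothetical.
interface UserDataStore {
  deleteFiles(userId: string): Promise<void>;
  deleteAnalytics(userId: string): Promise<void>;
  deleteProfile(userId: string): Promise<void>;
}

async function cancelAccount(userId: string, store: UserDataStore): Promise<void> {
  // Everything tied to the user goes at once; if any step fails,
  // the error surfaces instead of silently leaving data behind.
  await store.deleteFiles(userId);
  await store.deleteAnalytics(userId);
  await store.deleteProfile(userId);
  console.log(`All stored data for ${userId} has been removed.`);
}
```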

Can you name some examples of digital products that show transparency and respect for privacy?

To name a few examples, I’d love to go for something we all use daily: browsers and messaging apps.

For browsers, Firefox is a well-known example of transparency and respect for privacy. Mozilla has added a few native features that enhance privacy while browsing. For instance, Containers segregate different types of websites, which makes it harder to track you across them. They also have the HTTPS Everywhere and Do Not Track options enabled by default.

https://www.youtube.com/watch?v=Gy7lyvAfOSw
Firefox Containers explained.

If you are concerned about your privacy and how websites are collecting your data, you can choose from a long list of extensions that can help you better manage it. Finally, Firefox has a blog dedicated to helping users understand privacy and security online.

As for messaging apps, I’d like to mention Signal as an excellent example of an app built with privacy in mind. Signal does not store your data or require you to share it when you install the app. It also has many features that help you maintain your privacy and make sure that anything can be removed. For example, you can have a message disappear a set amount of time after it has been received and read. Personally, I find it pretty unbelievable how much information other, similar apps require when you compare them to Signal!
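Signal’s real implementation is more involved (the timer is enforced on every participant’s device), but the core idea, a message that carries its own expiry counted from the moment it’s read, can be sketched like this (types and names are illustrative):

```typescript
// Sketch of a disappearing-message check. In apps like Signal the
// countdown starts when the message is read, not when it's sent.
interface Message {
  body: string;
  readAt?: number;        // epoch ms, set when the recipient opens it
  expireAfterMs?: number; // disappearing-messages timer, if enabled
}

function isExpired(msg: Message, now: number = Date.now()): boolean {
  if (msg.readAt === undefined || msg.expireAfterMs === undefined) {
    return false; // unread or non-disappearing messages stay
  }
  return now - msg.readAt >= msg.expireAfterMs;
}

// Expired messages are dropped entirely rather than merely hidden.
function sweep(messages: Message[]): Message[] {
  return messages.filter((m) => !isExpired(m));
}
```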

Information that messaging apps collect. From left to right: WhatsApp, Facebook Messenger, Instagram, Signal, Snapchat, WeChat, Telegram. From Reddit.

Let’s look at this topic from a user’s perspective. Some say that when you don’t pay for a digital product (e.g. a social media app), you are the product, because the data gathered from you can be used (and even sold) to the benefit of a company. Do you agree? How can users navigate this?

I believe this saying is, unfortunately, true. I don’t think users should become the product just because a service or product is free. On the other hand, it is also true that a business needs money to stay alive, and it needs to find a way to monetize its services or products somehow. Still, there should be a limit that is never crossed, but that limit is tough to define.

We know that it is possible to work out who a user is by triangulating their data. What’s problematic nowadays, however, is that users’ data is often not only part of the service or product that asks for it, but is also shared with other products. Even worse, the data can be sold (as data itself, or because the company is acquired by another one) and merged with other databases.
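“Triangulating” here means joining datasets on quasi-identifiers: fields that are harmless alone but identifying in combination, with ZIP code, birth date, and gender being the classic trio. A toy illustration (the datasets and field names are invented for the example):

```typescript
// Toy illustration: neither dataset pairs a name with the sensitive
// value, yet joining them on quasi-identifiers (zip + birth date +
// gender) re-identifies the person behind the "anonymous" record.
interface HealthRow { zip: string; dob: string; gender: string; diagnosis: string }
interface VoterRow  { zip: string; dob: string; gender: string; name: string }

function reIdentify(health: HealthRow[], voters: VoterRow[]) {
  return health.flatMap((h) =>
    voters
      .filter((v) => v.zip === h.zip && v.dob === h.dob && v.gender === h.gender)
      .map((v) => ({ name: v.name, diagnosis: h.diagnosis }))
  );
}
```

This is why merging databases is the step that changes everything: each dataset may be defensible on its own, while the join is not.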

The report Identity in a Digital World provides a useful primer on the growing presence of digital identity in all our lives, with practical examples of the sectors and purposes in which we most commonly see it being put to use. From the World Economic Forum.

Finally, data will never disappear. The truth is that data is continuously being gathered and stored — if the company that owns the data decides not to delete it, there’s no way it’s going to be forgotten. I believe that we, as users, have the right to be forgotten. A simple “the data we collect is anonymous” sounds to me more like a polite way of saying “I don’t care” and I think it is definitely not enough.

Two possible “digital graveyard” scenarios. This study by Carl J. Öhman and David Watson sheds light on digital death. Picking Facebook due to its popularity, their approach focuses on two scenarios: one in which there’s a global freeze on people joining Facebook (left), and one in which Facebook maintains its current growth rate to reach 90% penetration of a given group (right). The first scenario leads to a maximum of nearly one in ten profiles belonging to someone who has passed away, while the second leads to a higher share of three in ten. In either case, the number of dead on Facebook doesn’t exceed that of the living within this century, but it will become a problem in the future. From Digit.

Users are becoming more and more aware of the risks associated with data “mistreatment”, which can lead to a lack of trust. At Sketch, how do you work to build users’ trust in the UX?

We try to explain why we choose not to display certain sensitive information, and, more to the point, why we don’t have that information at all. This means some users might feel we lack features they’d like us to offer, but we prefer to put their privacy before our feature set. We believe this is a unique feature, and we are proud of it. By educating our users and bringing more visibility to how important privacy is as a topic, I believe we can influence the way other designers are going to think and act on it. A sort of cross-pollination of an ethical approach, driven by our own example.

Looking ahead, are we moving in the right direction in terms of ethical product design? What other changes, features, or practices do you think we need to focus on?

I am not sure we’re heading in the right direction. We live in a world that doesn’t help us be sure of what we are doing. There are many differences around the world, and the tech (and web) industry operates across historical and political borders. We are moving into uncharted territory, with some areas of the world following stricter rules than others, which can lead to competitive imbalances.

What makes me feel more positive is that this topic is becoming more prominent, and more people are aware of the problems related to data. Still, we lack a general acknowledgment of the severity of the problem, as well as common, minimal, recognizable, and accepted rules. We, as designers, have the power to make an impact. I love to be a believer, and so I hope that we are making things better. We can design better.
