Research, Concept, Ideation, UX, UI & Prototyping
05-2019 – 02-2021
Madeleine Assadi, Pranamita Ray, Robbert de Graaf & Me
Digital Society School & Bits of Freedom
Starting the conversation around the ethical and privacy implications of AI in surveillance technology through counter-surveillance design.
HIVEMINDS is a project I started together with the Digital Society School Amsterdam and Bits of Freedom, working in an international and interdisciplinary team of designers, researchers and business innovators. The goal of the project is to start the conversation around the ethical and privacy concerns of AI in surveillance technology, using design to make these concerns tangible and to spark that conversation. We backed our design choices with research, which we published on Bits of Freedom’s website and on our own platform. With this project, we were invited to present our work at the Dutch Association for the Rights of AI and Robotics (NVAIR).
What are the current technological and societal implications that affect our physical cybersecurity and what can we learn from the values behind counter-surveillance culture?
With this question, we started off our project. We looked into local cases, such as city councils like Amsterdam’s, which have started to regulate the implementation of technology and big data and to work towards technological sovereignty in their cities through the ‘Cities Coalition for Digital Rights’. New policies prove that municipal governments are reinventing their policies to include digital technologies. These transformations address the rapid technological shifts in society and try to protect citizens’ privacy.
New identification technologies are implemented in cities without the consent of their citizens. Think, for instance, of the new facial recognition check at Schiphol, which travellers can neither accept nor decline when purchasing a ticket. Across the EU, predictive policing systems are being developed at a tremendous rate, while regulation that would address their risks still lags far behind. One of the countries at the forefront of predictive policing in actual practice is the Netherlands: the Dutch police have set up various predictive policing projects, which they refer to as ‘living labs’, in which they experiment with data and algorithms.
Technologies are innovating at a faster pace than the methods we have to protect ourselves. It is therefore important to identify the concerns citizens have about the Smart City and to develop the alternatives needed for the future(s). To identify the public’s opinions and concerns about their physical cybersecurity, we created a survey and published it online. To attract people’s attention, we designed fake mini cameras and placed them around the city of Amsterdam, each with a QR code leading to our survey, in places where people expect privacy, such as toilets and bathrooms. This resulted in a lot of responses, some negative and some positive. Nevertheless, it got the message out and started the conversation, which was our goal in the first place.
Publishing our own manifesto: 2030, A Vision on Privacy and Surveillance
After extensive research and experimentation, it became clear that this topic is a matter of digital human rights. Technology moves at such a rapid pace that regulations struggle to keep up, which makes it challenging to set a standard for our privacy. We wrote down our vision on privacy and surveillance to anchor an ethical perspective in our research and design process. The manifesto served as a guide when we had to make design-related decisions, and it also helped us communicate our motives and intentions to our stakeholders.
Designing and publishing our own website to share our research articles, and writing guest articles for Bits of Freedom.
Throughout the project, we shared our research findings in several articles. I designed and developed the website on which we published these articles, together with other information about our project. The articles were also shared on Bits of Freedom’s website.
Filtered Realities is an analogue webcam filter that makes you the director of your webcam feed, taking away the insecurity of your image being used for other purposes.
Skype, Zoom and Microsoft Teams are deeply integrated into our daily lives. COVID-19 brought our bosses, colleagues and teachers right into our bedrooms and living rooms. The line between work and ‘life’ was already thin; since we started working from home, it has evaporated. Most of these services already include blurring options, customizable backgrounds and face filters, but by using their filters we still give them full access to our webcams. We trust a service with our raw webcam feed. Whether that is the best way to go ‘private’ is debatable, as most of these services have already diminished our privacy to a bare minimum. Whether it’s a tech-savvy hacker or the infamous NSA, we’re never 100% sure who is watching and who might be.
This webcam filter bypasses these issues by filtering the feed directly at the source: you become the editor of your media stream. The analogue filter lets you enhance your image with colours, blur your surroundings or set up a background image. More importantly, it lets you hide. Your face is being used to improve facial recognition systems; your biometrics are used to improve surveillance software. By filtering the camera feed at the source, you can stop this and stay in control.
Designing counter-surveillance face masks using AI technology, to outwit facial recognition software.
Through our research, we discovered that the Dutch police are at the forefront of using AI technology for predictive policing. More cameras are being placed in public spaces to surveil people’s behaviour. When COVID-19 started to overrule our daily lives, I saw an opportunity to question the integrity of facial recognition systems, and of surveillance systems in general: the rise of people wearing face masks could be turned into an opportunity for counter-surveillance.
Lets Face It is a web tool I developed that enables people to make their own face mask that fools facial recognition. The tool uses AI to generate a non-existing face, which is printed on a face mask. You can choose from any ethnicity, face shape, gender and age to create the face mask that best matches your own face. When you wear the mask, facial recognition software will detect you as a person, but not as yourself.