Adam Harvey is a Berlin-based artist and engineer whose art subverts surveillance technology: camouflage against face-recognition cameras, anti-drone clothing, and a Faraday cage phone case. The AeroTime team met him in Vilnius, where he was giving his TEDx talk, to discuss the dystopian future that is already being built somewhere in a lab: mass surveillance mechanisms are ‘trained’ on pictures of babies, machines are taught to be biased, and biometrics can determine whether you are punished or rewarded. Are we doomed to live in the world of 1984, or do surveillance mechanisms actually make us safer? Read on to find out.


Could you tell our readers what brought you to projects that aim to subvert modern uses of surveillance technology?


At one point I started engineering, and then I started photography. I tried working as a photographer and realized that photography is not about art and beauty so much as the control and power that come from capturing and storing evidence. It was hard to ignore that part of photography, so I changed back to engineering.

Photography has become so easy that in many cases you don’t need a human; it’s fairly automated. So what is the role of a photographer? Some of the more interesting opportunities in imaging lie in trying not to appear, in trying to subvert imaging technologies, and in creating images by doing that.


Think Privacy, New Museum Store, NYC, Photo by Leon Eckert


A lot of my work lately has been about reverse engineering some of the technology used in computer imaging and drone surveillance, such as thermal imaging, trying to find vulnerabilities and exploit them through an art or fashion project.


What is the end game of exploiting these vulnerabilities? Do you seek to show that the systems are flawed and need changes?


That’s the short-term goal, but the long-term goal is to understand these technologies more, as in the beginning they are very unfamiliar to me. Not many people understand how millimeter-wave scanners work in airports, or that when you walk through Gatwick Airport in London, there is an array of face-recognition cameras tracking you. When you learn that, you would think they are tracking you for security, when in reality their primary use is to track the contractors who provide security, to hold them accountable for how quickly they move people through the security lines. So, in this case, we have surveillance on the people who are doing the surveillance with these face-recognition cameras. And the camera itself is quite distinctive, as it has a spinning LED pattern. It is very tricky, because you don’t know what it is until you look up at it, and once you look up at it, it has your face.


One of the most heated discussions happening right now is about drones and their interference with privacy and security. What is your opinion about drones?


Drones use multi-spectral imaging, one of the cameras being thermal. It is important to understand that thermal can see you at night. It can’t quite see through your clothes, but it can see the heat under them. It is an invasive technology with an asymmetry built into it: you can never see yourself in thermal unless you have a thermal camera. And when you do, you understand that what a drone can see is the activity in a house. It can see when someone is present. It can monitor and track people, and in the worst case it can lock onto people’s heat signatures. It’s a powerful technology for war.


Demo of Anti-Drone Burqa shot with a FLIR handheld camera on 29th St in NYC. ©Adam Harvey


The underlying thermal imaging technology has come a long way in the last five years, dropping in price from ten thousand dollars for a decent camera to under a thousand, meaning a lot of people can now have thermal imaging cameras.


Now, the discussion about legal requirements in the EU and the US is targeted at the user, the consumer. Do you think government agencies and contractors should be subject to the same rules regarding imaging and drones?


I think security is a different part of the same spectrum. It would be nice to think that they were subject to the same rules, but I see that as less of a reality. Thermal is one of many such technologies, including LIDAR (Light Detection and Ranging) and different parts of the infrared spectrum. This is an area where a lot of development can still happen. For example, with short-wave infrared you can detect the type of clothing someone is wearing, whether someone is wearing real silk or fake silk. So, if it’s not regulated, you can begin to use that in commercial ways for all sorts of horrible things. Imagine collecting information about people with urban, commercially piloted drones, not only doing facial recognition but also looking at skin quality in near infrared and looking at fabrics and textiles in short-wave infrared. Multi-spectral imaging can gather all sorts of information about people.


What sort of a future are we moving towards, in your opinion? Are we getting closer to a dystopian future?


My guess is that the future has already been created in a lab who knows where, and that future will trickle first into military technology and eventually into consumer technologies. Think of GPS, for example. What is happening in the future has already been established; we just don’t know about it, because it is behind a closed door, waiting until it finally trickles into consumer life, where we will have to deal with it. I’m not sure what it is, though. I’m sure it will involve data, AI, and many cameras.


We see this imaging technology, this capturing technology, everywhere. A skilled teenager from virtually anywhere can probably build their own drone with thermal sensors and use it for all the purposes drones are used for today. Maybe that levels out society in a way?


As a user, you understand what is happening, because you have access to it.

Opening some markets can play an important role in providing access to technology, so that you become educated and familiar with it. I’m very focused on digital imaging. It’s very expensive, and not many people can engage in it. Some people would consider my level of interest as bordering on paranoia, but I think you just have to answer those questions, and you need to engage directly with the technology. So in the end, when there is an open market, we can figure out how terrible it is, or create some new good uses for it, like food inspection.


Think Privacy, 2016, ©Adam Harvey


Oddly enough, I went to a defense industry show a few years ago where one of the hot products was HD thermal imaging. One of the talks there was about food inspection, because with thermal imaging, with short-wave, mid-wave, and near infrared, you can inspect the quality of fruit. You can see through it: the quality, the decay. You can analyze the quality of a piece of fruit or vegetable produce. And that was at a defense industry show. That is a new market that nobody had probably imagined when they were thinking about the use of this technology on drones.

It’s hard to say that any of these technologies are bad, because there are so many different uses. I guess I would like to see more US regulation of these things, and nobody really wants to get into it. What I am talking about here [editor’s note: Adam Harvey gave a talk at TEDx Vilnius in June 2017] is some of the problems that emerge from that lack of regulation, like pinning government surveillance outcomes on photos of kids…


Let’s get back to surveillance at places like airports. There are now people highly trained in behavioral psychology who can tell, “this person is not acting like he’s supposed to act, he’s not acting like a tourist.” Will facial recognition technology be trained to do that as well?


Yeah, it could. There are some researchers who are not willing to do that, but there are others who are willing to do it: racial profiling, criminal profiling. It’s a messy area. It’s resurrecting phrenology from the early 1900s, making an inference about one person based on someone else who looked like them in the past. If you happened to look like a criminal, that would increase your criminal score. That’s not a fair way at all to establish order in society.


So it’s teaching the machine to be biased?


Oh yeah, I think you can teach a machine to be biased, because you have to encode something, and that something is a decision somebody has to make to include or exclude certain things.
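To make this point concrete, here is a minimal, purely hypothetical Python sketch, not drawn from Harvey’s work or from any real system: a toy ‘criminal score’ computed by resemblance to a curated gallery of past offenders. All names and numbers are invented for illustration. The bias enters at the moment somebody decides which faces go into the gallery; everything downstream simply inherits that decision.

```python
# Hypothetical illustration only: how a human curation decision encodes bias.
# Faces are reduced to toy feature vectors; no real data or system is implied.

import math

def similarity(a, b):
    """Cosine similarity between two toy face-feature vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

# The biased step: somebody decides which past faces count as "criminal".
# Include or exclude different people here and every score below changes.
curated_gallery = [
    [0.9, 0.1, 0.3],  # features of past offender A (invented numbers)
    [0.8, 0.2, 0.4],  # features of past offender B (invented numbers)
]

def criminal_score(face):
    """Score a new face purely by resemblance to the curated gallery --
    the phrenology-style inference described above: looking like someone
    who offended in the past raises your score."""
    return max(similarity(face, g) for g in curated_gallery)

# Two people who have done nothing; one merely resembles the gallery.
print(criminal_score([0.85, 0.15, 0.35]))  # high score: resembles A and B
print(criminal_score([0.10, 0.90, 0.20]))  # low score: resembles neither
```

The score says nothing about what the scored person did; it depends entirely on who a human chose to put in `curated_gallery`, which is exactly the include-or-exclude decision described above.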