
Satellite Images at Risk of Deepfake Manipulation
You’re in the mood to take your family on a trip, but with this school year wrapping up, you don’t have time to plan anything elaborate. So why not take them down memory lane with a little nostalgia? You scour Google Earth for your old hometown, reminiscing about all the wonderful childhood memories it brings back—but something’s off. Some of the main buildings are different than they used to be, and the surrounding homes look unfamiliar as well. You zoom in closer and realize that your childhood home has been replaced with an entirely different house!
Geography professionals should be worried about deepfakes
This is a subject you might never have thought about before, but it’s worth taking the time to learn. With deepfake technology growing more advanced, professionals in the field of geography are advised to stay vigilant. Deepfakes could threaten not just satellite images and maps but our perception of reality as well.
Satellite imagery that can be manipulated by deepfakes poses a danger to any industry that relies on this data for research or media.
Deepfake AI tools first appeared in December 2017
Satellite imagery is particularly vulnerable to manipulation by deepfake AI tools. The technology has been progressing rapidly, with the latest deepfakes unveiled in December 2017 generating human-like eye movements and realistic blinking. Reports of altered satellite images in some regions of the world have already made headlines. For instance, in India on March 23rd, police were called after noticing that someone had uploaded a picture of the national flag bearing an unfamiliar symbol.
The research results were published recently
Satellite imagery is at risk of deepfake manipulation that distorts the truth, geographers warn. The research results were published recently by a team led by Professor Aishath Nizam at Victoria University of Wellington. The study analyzed how false information could be introduced into satellite photographs of Earth’s surface using 3D rendering software and digital editing programs to change facts or present an alternative reality.
Identifying populations from space is possible with deep learning
Satellite imagery has increasingly been used to identify where people live on the ground. For example, in a recent study of the Nigerian census, researchers at Stanford University and Yale University pinpointed the locations of 86% of Nigeria’s 140 million-strong population. The images were then processed with deep learning to extract further information, such as population density or whether a group is likely located in an urban area.
This technique has also allowed researchers to map how Ebola spread in Sierra Leone and Liberia, revealing insights such as how many cases occurred because victims traveled over poor roads, or that impoverished neighborhoods are at higher risk of infection due to limited access to clean water sources.
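To make the idea concrete, here is a minimal sketch of how a single satellite tile might be classified as urban or rural with deep learning. It assumes PyTorch and torchvision are installed; the tile path and the urban/rural labels are hypothetical, and the classification head would need to be fine-tuned on labeled tiles before its predictions mean anything.

```python
# Minimal sketch: classify a satellite image tile as urban vs. rural.
# Assumes PyTorch/torchvision; "tile.png" and the two labels are hypothetical,
# and the final layer is untrained until fine-tuned on labeled tiles.
import torch
from torchvision import models, transforms
from PIL import Image

LABELS = ["rural", "urban"]  # hypothetical label set

# Start from an ImageNet-pretrained backbone and swap in a 2-class head.
model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
model.fc = torch.nn.Linear(model.fc.in_features, len(LABELS))
model.eval()

preprocess = transforms.Compose([
    transforms.Resize(256),
    transforms.CenterCrop(224),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406],
                         std=[0.229, 0.224, 0.225]),
])

tile = Image.open("tile.png").convert("RGB")   # hypothetical tile path
batch = preprocess(tile).unsqueeze(0)          # shape: (1, 3, 224, 224)

with torch.no_grad():
    probs = torch.softmax(model(batch), dim=1)[0]

for label, p in zip(LABELS, probs.tolist()):
    print(f"{label}: {p:.2f}")
```

Run over thousands of tiles, this kind of classifier is what lets researchers estimate population density and settlement patterns across an entire country.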
Google Earth images are vulnerable to manipulation
Deepfake-manipulated satellite images show how far this manipulation can go and cast doubt on platforms like Google Earth. Geographers around the world warn about the high risks of this technology.
A method borrowed from filmmaking, motion capture, is being used to manipulate and animate images from remote sensing systems, fabricating geo-information and superimposing false imagery onto photographs or video footage. These deepfake-manipulated satellite images could potentially be used to promote false narratives, or to mislead or deceive governments, commercial enterprises, business partners, or private citizens.
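Superimposing content onto a satellite tile takes only a few lines of image-editing code. The sketch below, which assumes the Pillow library and two hypothetical input files, simply pastes a fabricated building footprint onto a genuine tile; real deepfake pipelines use generative models, but the end result, a composite that looks authentic, is the same idea.

```python
# Minimal sketch: superimpose fabricated content onto a genuine satellite tile.
# Assumes Pillow; "genuine_tile.png" and "fake_building.png" are hypothetical files,
# the latter with a transparent (RGBA) background.
from PIL import Image

tile = Image.open("genuine_tile.png").convert("RGBA")
overlay = Image.open("fake_building.png").convert("RGBA")

# Scale the fabricated patch and paste it at chosen pixel coordinates,
# using its alpha channel as the mask so only the patch itself is copied.
overlay = overlay.resize((120, 120))
tile.paste(overlay, (340, 210), overlay)

tile.convert("RGB").save("composite_tile.png")
```

The point is not that this crude cut-and-paste is convincing on its own, but that the barrier to altering geo-imagery is already low and falling.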
Geography students can learn from this!
Geographers are witnessing the possible emergence of a new type of deepfake manipulation in which satellite imagery is digitally altered to suggest events and conditions that never occurred.
Geographers have already identified key vulnerabilities in satellite imagery data: threats to privacy, the potential for manipulation through user-generated content, the tendency of traditional maps to support false conclusions, and a decline in the authenticity of spatial data.
Ultimately, this poses a serious threat to how geographers work with satellite imagery, because it undermines their ability to conduct research via space-based images. The more time we spend verifying conditions on the ground rather than relying on satellites overhead, the greater the accuracy of our findings will be.
The phenomenon is not yet widespread but could take off quickly.
The deepfake phenomenon, in which a subject’s face is swapped with another person’s or with a digital image, has been popularized by applications like FaceApp. However, deepfakes do not only threaten the privacy and accuracy of social media users. Recently, we at GIS4All had the opportunity to interview Professor Thomas Madsen about the risks geographers face when working with satellite imagery.
Satellite images are already frequently manipulated for several reasons: detection, identification, and intelligence analysis for military purposes; risk prevention; law enforcement; disaster response; and even recreational aerial photography. Because geographers rely on satellite images to document change in their environments, such edits, even unintentional ones, can distort the record of what is actually on the ground.
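One practical safeguard is to compare a tile against an independent or archival copy of the same area and flag large discrepancies. Here is a minimal sketch using scikit-image’s structural similarity (SSIM) metric; the two file names and the 0.9 threshold are hypothetical, and the tiles are assumed to be co-registered and the same size.

```python
# Minimal sketch: flag a satellite tile that diverges from a trusted reference.
# Assumes scikit-image and Pillow; file names and the 0.9 threshold are hypothetical,
# and both tiles must cover the same area at the same pixel dimensions.
import numpy as np
from PIL import Image
from skimage.metrics import structural_similarity

def load_gray(path):
    """Load an image as a grayscale numpy array."""
    return np.asarray(Image.open(path).convert("L"))

reference = load_gray("archival_tile.png")    # trusted copy of the area
candidate = load_gray("downloaded_tile.png")  # tile whose integrity is in question

score, diff = structural_similarity(reference, candidate, full=True)
print(f"SSIM: {score:.3f}")

if score < 0.9:
    # Low similarity: something in the candidate tile has changed.
    # 'diff' highlights where; its lowest values mark the most dissimilar regions.
    print("Warning: tile differs substantially from the archival reference.")
```

Genuine change, such as new construction or seasonal variation, will also lower the score, so a flag like this is a prompt for human review rather than proof of tampering.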