Overview
Protecting against face scanning involves disrupting infrared (IR) cameras and hiding facial features. Effective methods include wearing infrared-blocking glasses, specialized masks, or clothing with retroreflective materials that blind cameras. Other techniques include using face coverings (masks/scarves), hats to block overhead cameras, and using software to scramble photo data.
Physical Anti-Surveillance Methods
IR Blocking Glasses: These prevent cameras from seeing your eyes and can break smartphone Face ID systems. Some specialized glasses, like those from Reflecticles, can block infrared light.
Reflective Hats/Clothing: Clothing or hats made of retroreflective materials reflect light back at cameras, causing bright spots that disrupt algorithms.
Face Masks/Scarves: Covering the nose and mouth removes crucial identification points for AI systems.
"Invisible" Hats: Specialized hats (e.g., Invisible Mask) can project infrared light to disrupt camera sensors.
A New Mask Can Block AI Facial Recognition From Every Angle
A Dutch designer, Jip van Leeuwenstein, created a transparent, lens-like mask that blocks AI facial recognition from all angles by distorting key biometric markers like jawlines and eye spacing, while remaining transparent to human vision. Developed as part of the “Surveillance Exclusion” project, this tool protects privacy in public spaces by causing cameras to misidentify wearers, acting as a form of protest against AI surveillance.
Key Aspects of the Anti-Surveillance Mask:
Transparent Design: The mask is clear, allowing people to see facial expressions and interact normally, unlike opaque masks.
AI Confusion: It bends and refracts light, disrupting the specific geometric mapping points (nose, jaw, eye spacing) used by AI cameras to identify individuals.
Versatility: The mask is designed to disrupt recognition from multiple angles, including side and partial views.
Context: The project, which has gained attention through tech and privacy forums, aims to highlight the growing invasiveness of automatic surveillance in urban environments.
This mask is one example of a growing field of "privacy tech" designed to challenge the ubiquity of AI surveillance systems in public infrastructure.
Created as part of his Surveillance Exclusion project at the Utrecht School of the Arts, the mask gently alters facial contours enough to confuse algorithms — all while keeping real human expressions clearly visible.
The innovative design quickly drew attention from tech enthusiasts, researchers, and media outlets, becoming a striking symbol of resistance against growing digital surveillance.
Digital and Tactical Protection
Digital Cloaking (Fawkes): The Fawkes app creates "invisible" modifications to your photos, which, when uploaded, scramble how facial recognition models identify you.
Makeup/Patterns: Using high-contrast or asymmetrical makeup/patterns can confuse computer vision models (sometimes called "Computer Vision Dazzle").
Avoid Over-Sharing: Limiting the number of high-quality, clear photos of your face online reduces the ability of services like Clearview AI to scrape your data.
Techniques to Avoid
Standard sunglasses: Many do not block the infrared light used by modern cameras.
Simple scarves: A flimsy scarf that does not change the structural appearance of the face may not work.
Small reflective stickers: Often ineffective against advanced surveillance cameras.
Facial Recognition: How it Works
Now that you know some basics about biometrics in general, let’s focus on facial recognition in particular. Facial recognition is composed of two major phases: the first is facial identification and the second is facial recognition. Identification is determining “do I see a face?” and recognition is determining “do I know who this face belongs to?”
Facial identification requires cameras and a good mathematical model of what a face is or isn’t. Traditionally these cameras were visible light cameras, but the need to function in low-light, especially in surveillance purposes, as well as the need to get high-grade contrast, means many operate in the near-IR spectrum, like night vision devices do.
Facial recognition requires the system to have an enrolled dataset. How exhaustive the dataset needs to be is dependent on the application.
Understanding the two-phase process and their components is key to developing countermeasures, which we will discuss later. But why would people want countermeasures in the first place?
The widespread deployment of facial recognition technologies, driven by machine learning (ML) and “artificial intelligence” (AI) systems, is, in terms of threats to a free society, second only to the adoption of a Central Bank Digital Currency (CBDC). And, just like CBDC, the masses have been conditioned over time to accept aspects of it, or its forerunner, in their lives under the guise of “cool” or “convenient.” Some examples include:
Facial identification helping auto-focus the camera on your smartphone when you’re taking photos.
Facial recognition helping to automatically tag “friends” when you upload your photos to social media.
Facial recognition being used to unlock phones and computers.
Making it cool, fun, and convenient creates a situation where people actively, willingly, participate in feeding the data model. For years, people have been uploading photographs of themselves, friends, and family, to social media sites. These sites then introduced facial identification and allowed you to tag the face with who it belongs to. Eventually social media started offering to tag photos for you, which is fun and convenient, right? Well, it can do that because machine learning models were built and trained by people tagging photos.
Millions of photos are uploaded to social media and tagged every day. Facial recognition models are being perfected through this massive influx of data.
Think of all those photos of people, at different angles, in different lighting conditions, at different ages. If you wanted to build the perfect dataset for an automated facial recognition system, you couldn’t ask for a better one — certainly not the state DMV database or State Department’s database of passport photos.
But what about the model? Just having photos of individuals isn’t enough. Each image needs to be analyzed in order to build a mathematical model of that person’s face. These days, system designers are increasingly relying on technologies like convolutional neural networks to automate the creation of these models via processes which are opaque even to the designers. In general, however, facial recognition models are going to be based on the geometric relations between facial landmarks, such as:
Distance between eyes, ears, etc.
Breadth and length of the nose.
Bone structure of the face (cheek bones, brow ridge, etc.)
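The landmark geometry just described can be sketched in a few lines. The coordinates, point names, and ratios below are invented for illustration; real systems use dozens of landmarks and learned features, but the core idea of turning landmark positions into scale-invariant measurements is the same:

```python
import numpy as np

# Hypothetical 2-D landmark coordinates (in pixels) for one detected face.
# Real systems use 68+ points; five are enough to show the idea.
landmarks = {
    "left_eye":  np.array([120.0, 140.0]),
    "right_eye": np.array([180.0, 140.0]),
    "nose_tip":  np.array([150.0, 180.0]),
    "mouth_l":   np.array([130.0, 210.0]),
    "mouth_r":   np.array([170.0, 210.0]),
}

def dist(a, b):
    """Euclidean distance between two named landmarks."""
    return float(np.linalg.norm(landmarks[a] - landmarks[b]))

# Raw pixel distances depend on how far the subject is from the camera,
# so models use ratios, which are scale-invariant.
interocular = dist("left_eye", "right_eye")
signature = {
    "nose_ratio":  dist("left_eye", "nose_tip") / interocular,
    "mouth_ratio": dist("mouth_l", "mouth_r") / interocular,
}
print(signature)
```

A person's "faceprint" in such a scheme is just this vector of ratios; countermeasures work by making the measured ratios wrong or unmeasurable.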
Additionally, some of these measurements will be based on measured or inferred depth. These measurements require good lighting and contrast to assess, which has led to no small amount of controversy: in recent years, facial recognition technologies deployed for various purposes have been accused of being “racist,” either because the sensors have trouble with darker-skinned subjects, or because the training dataset for the machine learning algorithms is predominantly white or Asian.
Because this causes issues with applications that people want to work, such as device security or social media apps, these complaints tend to drive the state of the art forward, advancing facial modeling in the general case.
The Facial Recognition Threat Right Now
Whether you’ve noticed it or not, facial recognition technology is already being used extensively in crowded public places, especially within countries that favor authoritarian control over citizens’ right to privacy.
In China, the future is now. Mass deployment of surveillance cameras hooked up to high-performance computing clouds, with massive datasets, provide an all-seeing eye. People caught merely jaywalking are identified and then put on digital billboards to humiliate them and force social conformity, all in keeping with their social credit system. Facial recognition is tied to digital ID and payment systems. You can go into a fast-food restaurant, walk up to the kiosk, be served, and have your account debited, all from facial recognition. Fun, cool, and convenient, right?
The much darker side is that while social justice warriors in the U.S. and Europe are misguidedly pushing to help make facial recognition technology better at identifying minorities, in China they have developed data models and algorithms which can identify, with a great deal of accuracy, the ethnicity of a person. This technology is being used specifically to target the frequently persecuted Uighur minority population in Western China’s Xinjiang province.
In the U.S., we have protections that China doesn’t have. When the first publicly documented case of police using facial recognition technology en masse came to light in 2001 at the Tampa-hosted Super Bowl, there was a widespread outcry about how it was a 4th Amendment violation. Of course, this was pre-Sept. 11, pre-PATRIOT Act, and before Snowden’s revelations that would make this seem like a blip. In the U.S. now, some cities have created ordinances banning the use of facial recognition technology, sometimes due to privacy implications, other times at least in part because of the seemingly disproportionately high false positive rate for minorities, which leads to incorrect identification and false arrests. (Boston, Massachusetts, and Portland, Oregon, for instance, rolled out their ordinances against facial recognition in 2020 so that police could not use it during the ongoing riots and protests.)
However, there are areas of the U.S. where the rules don’t always apply. Borders and checkpoints are one. Automated immigration checkpoints comparing on-site snapshots to your passport photo are becoming well established in the U.S. and other rich countries, offering convenience in exchange for greater acceptance of the tech. There can be no doubt that facial recognition technology is being deployed in the surveillance systems of major airports as well.
The same mobile technology that was pioneered and rebuked two decades ago will continue to make appearances at major events and especially at protests. And even when real-time facial recognition isn’t in play, surveillance photographs can be compared to government and open-source data sets (all those photos you put on the internet) for identification. This tactic was heavily leveraged by both government employees and private-sector open-source intelligence (OSINT) analysts and digital sleuths after the events of January 6, 2021, for instance.
The Threat in the Future
In the U.S., we’re highly suspicious of three-letter agencies hoarding and manipulating our sensitive data, but many of us hand that same data to social media and tech corporations without blinking an eye. Here, the threat of invasive facial recognition is less likely to come directly from the government and more likely to be privatized. As China is doing now, and as The Minority Report showed, we’re likely headed to a future where the profit-fueled surveillance we have long known in the online world will move to the real world. You’ll walk into a store, be identified, and then based on your likes and internet history, will be offered products in real time. Or, based on your Personal ESG score — a measurement of how environmentally friendly and socially conscious your lifestyle is perceived to be — you might even be told you can’t spend money there.
As the social acceptance of the technology grows until it becomes basic background noise like flushing toilets and flipping light switches, there’ll be fewer and fewer legal challenges, and eventually government surveillance will step up as well. Via “public-private partnerships” in the name of “public safety,” we’ll find the lines increasingly blurred.
What About Countermeasures?
Some systems are going to be harder to trick than others. How hard is going to be a function of how good the hardware is, how exhaustive the database is, and how sophisticated the model is. Saying for sure what will or won’t work is therefore hard. However, with some experimentation and research, I have a few things that I know will defeat some systems and may have success against others.
The Android application ObscuraCam can quickly edit out faces and other distinguishing marks. This is perfect for countering facial recognition and methods OSINT analysts might use.
Online Countermeasures
With regards to online countermeasures, the goal is to deny the creation of a good data model of your face. This can basically be broken down into two tactics:
The first is adversarial modeling. In machine learning, this essentially means spoiling the dataset with lies. You operate an account as yourself, or otherwise upload photos, but the photos are not of you. You then tag those photos as yourself, so the data model doesn’t associate your face with your person.
The second tactic, and one that’ll bring you much joy in your life, is to simply avoid playing the game. Get off social media. Spoil all your data, then delete your account. If you never had social media, all the better. Ask your family and friends not to upload photos of you. If they must, blur out the photos.
If you must swap photos online, use secure or covert communications applications to do it, and spoil the photos directly. Applications like ObscuraCam can take advantage of facial identification and pixelate or otherwise redact the photo when you take it. You can also use it to quickly obscure any other identifying information.
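ObscuraCam's exact implementation isn't shown here, but the familiar pixelation effect can be sketched as block averaging over the detected face region. The image, region, and block size below are made up for illustration:

```python
import numpy as np

def pixelate(img, box, block=8):
    """Redact a region by replacing each block x block tile with its mean,
    similar in spirit to the pixelation redaction tools like ObscuraCam apply."""
    x0, y0, x1, y1 = box
    region = img[y0:y1, x0:x1].astype(float)
    h, w = region.shape
    for y in range(0, h, block):
        for x in range(0, w, block):
            tile = region[y:y + block, x:x + block]
            region[y:y + block, x:x + block] = tile.mean()
    out = img.copy()
    out[y0:y1, x0:x1] = region.astype(img.dtype)
    return out

# Toy 32x32 grayscale "photo" with a high-contrast 16x16 "face" region.
img = np.zeros((32, 32), dtype=np.uint8)
img[8:24, 8:24] = np.tile([0, 255], (16, 8)).astype(np.uint8)  # stripes stand in for facial detail
redacted = pixelate(img, (8, 8, 24, 24), block=8)
print(redacted[8:24, 8:24].std())  # near zero: the detail is destroyed
```

Averaging throws the high-frequency detail away rather than merely hiding it, which is why pixelation at a coarse enough block size cannot be reversed.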
Despite obscuring a large portion of my face with this mug, a match was still made at an average distance of 0.53, which is near the threshold but still a match. A more sophisticated model would likely defeat this countermeasure.
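The match decision in the caption above boils down to a simple threshold test. The 0.6 cutoff below is the commonly cited default for dlib's 128-dimensional face embeddings; other systems tune their own thresholds:

```python
# Minimal sketch of a distance-based match decision. Smaller distance means
# more similar; anything below the threshold counts as a match.
THRESHOLD = 0.6  # dlib's commonly cited default; systems vary

def is_match(distance, threshold=THRESHOLD):
    """Return True if the embedding distance is close enough to count as a match."""
    return distance < threshold

print(is_match(0.53))  # True: near the cutoff, but still identified
print(is_match(0.65))  # False: treated as a different person
```

This is why partial occlusion can fail as a countermeasure: it only needs to push the distance a little, and 0.53 is not far enough past 0.6.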
Real-World Countermeasures
Broadly speaking, there are three different categories of countermeasure we can use against facial recognition systems in the wild:
The first type of countermeasure attacks the ability of a system to detect a face in the first place. This is going to include anything from simple face coverings to purpose-driven clothing.
The second type of countermeasure is going to cause a false negative with facial recognition, after facial detection has occurred.
The third type of countermeasure is going to attempt to cause a false positive, making the system think that we’re someone else entirely. We’ll call this the “Mission Impossible” countermeasure.
Facial recognition: How to protect yourself from surveillance
With advances in facial recognition, surveillance culture is rising and remaining anonymous in public is getting difficult. In this in-depth guide, we explore how to avoid surveillance from face recognition and ways to protect yourself.
When you think of face paint and glitter, protecting yourself from facial recognition is probably not the first thing that comes to mind; events like Halloween and kids’ parties are. And when you think about a face mask, you might have flashbacks to the COVID-19 pandemic. But believe it or not, these creative techniques, even something as simple as a face mask, could help you improve your privacy through anti-facial recognition. In this deep dive, we explore different ways you can throw off facial recognition technologies to remain anonymous in public and protect yourself, whether you’re at a protest, a music festival, or walking down the street.
Surveillance through facial recognition systems is becoming increasingly smart and is used globally. It’s not only present at protests or airports: when you enter a shop, fill up your car, or commute to work, it’s likely that CCTV and facial recognition technologies have captured your face.
In the UK, for example, the police scanned almost 4.7 million faces with live facial recognition cameras in 2024, which is more than twice as many as in 2023. This is worrying, especially considering that the UK isn’t recognized as a top surveillance state like China, for example.
While some might not be concerned if they live in a surveillance state, for others like minority groups, activists, protesters, or people who care about having privacy, ensuring they protect their identity against cameras and facial recognition software is necessary.
Even though recognition software is becoming smarter and harder to evade, there are still different ways to remain anonymous and steer clear of surveillance from this technology.
When it comes to protecting yourself, you need to be protected in two ways:
Protecting yourself from the surveillance of face recognition software.
Protecting your digital identity, which can be secured with these digital security tips that are especially useful for activists.
We have many in-depth guides which give the best tips to protect your digital identity and jack up the security of your tech devices – be sure to check out the links above and make use of the tips which also improve your privacy and security. Now, let’s deep dive into how to protect yourself from the surveillance of face recognition systems.
What is face recognition, and how does it work?
Research on facial recognition dates back to the 1960s, and since then the technology has spread globally. While some countries have stricter laws regarding who can use it and how it can be used, in other countries like China (think of the Chinese social credit system) and increasingly in the US and the UK, surveillance through facial recognition tech is becoming the new normal.
Simply put, facial recognition is a biometric identification system that uses machine learning to identify faces in photos, video footage, and even in real-time. This type of identification software is able to convert an image of one’s face into data, and then run it against already existing facial data in databases.
While the top, most advanced recognition tools can reach 99.97% accuracy in ideal conditions, there are techniques to bypass these systems by feeding them faulty data: from painting your face in specific ways to wearing a COVID mask, glasses, and a hat.
If you decide to make use of the methods we explore below, we’d recommend doing further research into the type of recognition technology being used and its algorithm to ensure that you have a good chance of not being recognized. This is especially important because there’s not one anti-recognition method that’s bulletproof. Additionally, some techniques are creative and fun but could make you stand out, compared to others which will make you blend in more.
Computer Vision Dazzle: face paint, makeup & hair
Painting your face and applying makeup is not only for parties - you can even use anti-facial recognition makeup to camouflage your face and potentially trick facial recognition or facial detection algorithms in some cases.
The idea of using makeup as camouflage to avoid facial detection was introduced by artist Adam Harvey, who also created an open source anti-facial recognition toolkit. Harvey coined the term Computer Vision Dazzle (CV Dazzle for short) when he discovered it was an effective way to bypass the Viola-Jones face detection algorithm, which was commonly used at the time but has since been deprecated.
Detection tools rely heavily on the dark, shadowed areas of the face: the dark areas under your eyes, your nose bridge, facial symmetry, and the shadow under your nose. By strategically applying makeup and styling hair over the dark and light areas of the face, you can change the apparent shape of your facial features. If done correctly, this creates asymmetries that can block facial detection tools. And because detection is blocked, facial analysis, recognition, and emotional analysis cannot be carried out either.
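To see why flipping the expected light/dark areas matters, consider Haar-like features, the building blocks of the Viola-Jones detector associated with CV Dazzle: each feature is a difference between the pixel sums of adjacent rectangles (for example, a dark eye band over a light cheek band). A toy sketch with invented patch values:

```python
import numpy as np

def haar_feature(img, rect_a, rect_b):
    """Haar-like feature: pixel sum of rect_b minus pixel sum of rect_a.
    Each rect is (row0, row1, col0, col1). Viola-Jones-style detectors
    threshold thousands of such responses to decide 'face' or 'not face'."""
    a = img[rect_a[0]:rect_a[1], rect_a[2]:rect_a[3]].sum()
    b = img[rect_b[0]:rect_b[1], rect_b[2]:rect_b[3]].sum()
    return float(b - a)

# Toy 8x8 face patch: a dark eye band (top) over a light cheek band (bottom).
face = np.vstack([np.full((4, 8), 40), np.full((4, 8), 200)])
# CV-Dazzle-style makeup: lighten the eye band, darken the cheek.
dazzled = np.vstack([np.full((4, 8), 200), np.full((4, 8), 40)])

eye, cheek = (0, 4, 0, 8), (4, 8, 0, 8)
resp_face = haar_feature(face, eye, cheek)        # strongly positive: looks like a face
resp_dazzled = haar_feature(dazzled, eye, cheek)  # sign flipped: learned threshold won't fire
print(resp_face, resp_dazzled)
```

Inverting the contrast doesn't just weaken the feature response, it flips its sign, which is why well-placed light and dark makeup can stop a contrast-based detector from ever reaching the recognition stage.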
Inspiring examples of CV Dazzle can be seen in The Dazzle Club, a collaborative project by artists Evie Price, Emily Roderick, Georgina Rowlands, and Anna Hart that ran from 2019 to 2021. The project was started after Argent was forced to admit that it had deployed facial recognition software on the King’s Cross Estate in London, where the four artists had been making art.
The idea behind The Dazzle Club, as explained by Emily Roderick, “Making embodied research into public space surveillance - drawing awareness of technologies and building new expressions of community and trust. Walks, films, texts and workshops pay attention to publicness and being seen.” Through interventions in public areas, like their silent walks through London while wearing CV Dazzle, the club highlighted and raised concerns around technology, data, and human rights, while also creating new expressions of collective creativity and trust.
Note: If you decide to embrace CV Dazzle, we’d recommend doing in-depth research to ensure that the way you paint your face, and the light and dark areas you camouflage, actually reduce the probability of detection. It’s important to remember that different algorithms are used for facial detection and facial recognition, which may require different techniques to hide yourself.
Accessories: sunglasses, caps & masks
There are also many accessories you can use, like a baseball cap, sunglasses, bandannas, or a COVID mask. Think along the lines of accessories that cover large portions of the face.
Face masks & bandannas
Using a face mask is still one of the easiest and most affordable ways to conceal your identity, while also not standing out from the crowd. If you’re protesting and would like to protect your identity from facial recognition and police surveillance, a simple face mask worn with sunglasses or a cap is a good solution. It’s important to be aware that face masks are not allowed to be worn in some locations or at specific events; this is dependent on the laws in your area. So before you opt for this, we’d recommend doing some research to check if you’re legally allowed to be wearing one.
If you’re tired of wearing a face mask (which is understandable) or if it’s illegal in your country, you could wear a bandanna instead – what’s nice with the bandanna is that you can choose to wear one that’s plain or one that’s fun and bright depending on what your needs are.
Caps & hats
Like the face mask, a cap or wide-brim hat is a quick and easy way to help conceal your identity, with the added benefit of protecting you from the sun. If you choose to wear one, make sure the brim is pulled low and avoid wearing caps with logos.
Anti-facial recognition glasses
You can opt for a regular pair of sunglasses, preferably large ones that also cover your eyebrows, or you can buy a pair of anti-CCTV glasses. Anti-CCTV glasses work in different ways, either blocking or reflecting the infrared (IR) light used by CCTV cameras, which in turn disrupts face recognition technology.
Creative frames
Today, there are even creative masks/frames you can wear. A beautiful concept is the Incognito Mask created by Ewa Nowak. This futuristic-looking frame, worn around the face, is designed to prevent facial recognition systems from measuring the distance between, and height of, your facial features.
Clothing
Avoid logos
Avoid clothing with big logos and slogans, and make sure your clothing covers up any tattoos or birthmarks as these can be tracked by the police. This is especially important if you attend a protest, as police are known to identify protesters based on their clothing.
Consider wearing all black
What you choose to wear depends on the place or event. Often all black clothing is a good choice because it helps you blend in, and if you’re attending a solidarity protest, this is usually the suggested color as well.
Anti-surveillance fashion
Yes, anti-surveillance fashion is a thing – it’s like modern-day camouflage! There are many clothing designs and projects that use special fabrics, designs, and patterns to confuse cameras, obscure key facial features and even block signals from tracking devices.
For example, you can get scarves and clothing made of retroreflective materials, which bounce a camera’s flash straight back at the lens; in dark or dim lighting, this overexposes the wearer and makes everything else captured in the image go dark.
Take a look at these:
Stealth Wear and Hyperface by Adam Harvey
Anti-flash fashion by Ishu
Facial Weaponization Suite by Zach Blas
While anti-surveillance fashion is innovative and fun, it is often more on the pricey side, and it isn’t a bulletproof solution to remain anonymous from facial recognition.
Privacy isn’t easy, but it’s possible
When you think about privacy, there are many aspects to it: privacy online to protect our digital identities, and privacy in the physical world to remain anonymous and private while doing our daily tasks. Unfortunately, because there are so many aspects to privacy, actively remaining private, or even knowing where and how to start, can feel overwhelming.
While there is not one bulletproof method to ensure privacy and avoid surveillance through facial recognition software, there are things you can do as we’ve discussed above and it’s amazing to see that they can be creative and fun. Before adopting one of these, always do additional research about the best way to protect yourself from being identified, for example by checking out the excellent guide by the EFF. It’s also important to ensure that you not only protect yourself physically but also digitally. For example, by not taking your mobile device to a protest and using anonymous email like Tuta Mail.
At Tuta Mail we believe that everyone deserves privacy, whether it be when emailing a friend, walking down the street, or protesting for your rights. As an end-to-end encrypted email provider with no ads and no tracking, we are committed not only to offering free email that’s anonymous and private by design, but also to educating and informing everyone on the importance of privacy and security.
Together we can make the world a better place.
Europe and US states impose tough restrictions on facial recognition, while the US federal government hesitates. China moves full steam ahead.
When Milan’s Linate Airport introduced a “Faceboarding” facial recognition system — no boarding pass required, just a quick scan and you’re through — Italy’s data protection authorities reacted with horror. They suspended the system, citing “insufficient safeguards” for passengers who had not chosen to participate.
Both European and American approaches to the technology face a common challenge: how to move fast enough to stay competitive with China and other authoritarian states while moving carefully enough to preserve civil liberties.
The technology is advancing fast, driven by artificial intelligence. The market for facial recognition could reach $18 billion (€15.4 billion) by 2030, up from $8 billion today. Scanning a face is convenient compared to collecting fingerprints or encoding an iris, and many consumers like that convenience, whether for unlocking a device or boarding a plane where ID was previously required. But biometric data can also be abused, and governments and companies could use facial recognition to conduct mass surveillance. Studies show that facial recognition algorithms can be biased and inaccurate, more likely to misidentify people of color — and in particular, women of color.
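As a quick sanity check on those market figures: growing from $8 billion to $18 billion by 2030 implies, assuming a 2024 baseline (six years of growth, an assumption since the article doesn't date "today"), a compound annual growth rate of roughly 14-15%:

```python
# Implied compound annual growth rate (CAGR) for the figures quoted above:
# $8B today to $18B by 2030, assuming a 2024 baseline.
start, end, years = 8.0, 18.0, 6
cagr = (end / start) ** (1 / years) - 1
print(f"{cagr:.1%}")  # roughly 14-15% per year
```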
In Europe, biometric systems must meet strict privacy requirements before they can operate. The US has no comprehensive federal law on facial recognition, though states are rushing to fill the void. Eighteen states have enacted statewide facial recognition technology regulations for law enforcement or broad public use, while three states have pending legislation under consideration or scheduled to take effect. The remaining twenty-nine states have neither active statewide FRT laws nor pending statewide FRT legislation.
China, for its part, has no qualms. It operates surveillance systems at a massive scale through projects like “Skynet” and “Sharp Eyes.” Beijing deploys more than 700 million cameras. In Shanghai, authorities are tripling the number of facial recognition cameras to allow tracking of 50 million citizens. A new National Identity Authentication Law went into effect on July 15; it encourages Chinese citizens to submit their real name and a facial scan, after which they are issued a unique ID code used for all online accounts.
Both the US and Europe want to avoid such mass surveillance.
Europe is tightening regulations. The new AI Act imposes strict obligations on “high-risk” systems, including biometric identification. Data extraction from CCTV or the internet to expand biometric databases is prohibited, unless it is targeted and users consent.
Even voluntary facial recognition schemes face tough European scrutiny. Airport operators in the EU often position biometric boarding as an “optional opt-in” for EU citizens, but data protection authorities and regulators push back. Authorities require robust consent flows, data minimization, retention limits, and data protection impact assessments before these systems can operate.
Penalties are stiff. When US facial recognition company Clearview AI scraped billions of images from social media, breaching European GDPR privacy law, Dutch regulators imposed a €30.5 million fine and ordered the company to delete EU citizens’ images. Austrian privacy group noyb filed a criminal complaint in Austria last month that could subject Clearview executives to personal liability and even imprisonment.
The US approach to facial recognition technology is market first, regulation later. No comprehensive federal law governs commercial or law enforcement use of facial recognition. Regulation remains a patchwork of federal proposals and state and local statutes.
Surveillance use is increasing in the US. Immigration and Customs Enforcement officials reportedly employ facial recognition technology to scan motorists’ photos to identify undocumented immigrants. The FBI compares driver’s license and visa photos against the faces of suspected criminals, according to a Government Accountability Office report.
Federal attention is growing. In Congress, various facial recognition bills have been introduced, including a recent proposal requiring the Transportation Security Administration to inform passengers of their right to opt out of face screenings. So far, though, all have stalled.
America’s fragmented approach allows experimentation and fast deployment but creates inconsistent protections. A facial recognition system legal in one state may be banned in another. A practice permitted for private companies may be restricted for the police. This patchwork creates uncertainty for both technology developers and citizens.
Courts are giving broad leeway to roll out facial recognition. When spectators sued New York’s Madison Square Garden for using facial recognition to block entry, a federal court dismissed the case in 2024.
Major US technology companies have implemented some voluntary limitations. Microsoft’s “Responsible AI” standards and product controls include strict access requirements for biometric features and documented risk assessments. Amazon announced a one-year moratorium in 2020 on police use of its “Rekognition” facial recognition product. The company later extended those restrictions indefinitely amid public pressure.
The Milan airport case arguably shows democratic oversight working. An independent regulator reviewed a biometric system, found it lacking in safeguards, and shut it down. No one was arrested. That model — transparent rules, independent enforcement, and accountable decision-making — is slow. It is messy. It creates compliance costs. But it separates the West from China.
Facial Recognition and Ghost Glasses
Facial recognition is a powerful tool used by many organizations globally, from governments and tech giants to casinos and fast food joints. It was designed to identify a person through digitally mapped facial features.
The growing use and development of facial recognition technology has raised some major concerns. Some people are worried about being identified without their permission and the risks that may pose to their privacy. There's also no guarantee that data collected by the software will be kept secure or used in an ethical manner.
Accuracy is another big issue. Some facial recognition tools have trouble processing the faces of women, older people, and/or those with darker skin. Wearing a mask or certain cosmetics can also make the software less accurate. This can lead to potential issues with false identification, racial bias, and discrimination.
Anti-facial recognition glasses (or ghost glasses) can make it harder for cameras to get clear photos of the wearer's face. If you want to protect your privacy and avoid having your image captured, ghost glasses may be worth a look.
Read on to learn more about facial recognition technology and the camera-blocking eyewear that may help thwart it.
What Is Facial Recognition?
Facial recognition software uses biometric data — such as the distance between the eyes and nose — to create a unique profile for an individual face. This profile is then compared to all the other profiles in a given database to determine if there is a match that can be used for identification.
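The comparison step can be sketched in a few lines. This is a hypothetical toy model, not a real system: production software derives high-dimensional embeddings with a neural network, but the nearest-neighbor-with-threshold logic is the same. All names and numbers below are invented for illustration.

```python
import math

# Hypothetical, simplified face "profiles": real systems use
# high-dimensional neural-network embeddings; here each profile is a
# short vector of made-up geometric ratios (e.g. eye spacing, nose
# length) purely for illustration.
DATABASE = {
    "alice": [0.42, 0.31, 0.77, 0.55],
    "bob":   [0.61, 0.29, 0.52, 0.48],
}

def distance(a, b):
    """Euclidean distance between two profiles."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def identify(probe, database, threshold=0.1):
    """Return the closest database identity, or None if no stored
    profile lies within the match threshold."""
    best_name, best_dist = None, float("inf")
    for name, profile in database.items():
        d = distance(probe, profile)
        if d < best_dist:
            best_name, best_dist = name, d
    return best_name if best_dist <= threshold else None

print(identify([0.43, 0.30, 0.76, 0.56], DATABASE))  # close to alice
print(identify([0.10, 0.90, 0.10, 0.90], DATABASE))  # matches no one
```

The threshold is the practical knob: set it too loose and strangers match (false positives); too tight and the real person fails to match, which is exactly the failure mode countermeasure eyewear tries to force.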
Businesses and law enforcement organizations use facial recognition tech to bolster security and aid in investigations. Some companies also use it to better understand customer demographics and behavior.
You can see facial recognition software in action at airports and shopping centers, and when you use Face ID to unlock your smartphone. And your face has probably been tagged automatically in at least one photo on social media.
How Do Anti-Facial Recognition Glasses Work?
Ghost glasses use a combination of techniques to confuse facial recognition software, including:
Infrared-blocking lenses – Some privacy glasses have special lenses that reflect or block the infrared light that cameras use to capture images.
Reflective materials – Some glasses use reflective materials on the frames to disrupt facial recognition technology.
These blocking techniques make it harder for the camera to detect a person’s facial features. Without a clear image or biometric markers to map, it may not be possible for the software to create a profile for comparison.
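The effect these techniques aim for can be modeled in a toy sketch (hypothetical names and values; real landmark extraction is far more involved): if any key landmark cannot be resolved, say because a reflective flare washed out the eye region, no complete biometric profile exists to compare.

```python
# Toy model of why blocking key features defeats matching: landmarks
# the camera could not resolve are marked None (e.g. eyes hidden
# behind an IR flare), so no complete profile can be built and there
# is nothing to compare against a database.
def build_profile(landmarks):
    """Return a profile only if every landmark was captured clearly."""
    if any(v is None for v in landmarks.values()):
        return None  # incomplete capture: matching is impossible
    return [landmarks["eye_gap"], landmarks["nose_len"], landmarks["jaw_width"]]

clear = {"eye_gap": 0.42, "nose_len": 0.31, "jaw_width": 0.77}
flared = {"eye_gap": None, "nose_len": 0.31, "jaw_width": 0.77}  # eyes blinded by reflection

print(build_profile(clear))   # full profile
print(build_profile(flared))  # None: no biometric map possible
```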
Camera-blocking glasses sometimes come with different options for tinted lenses, frames, and styles. Some are designed to look like regular sunglasses while others have a more futuristic look. The best place to find them is usually through an online retailer.
Prescription options may also be available if you need vision correction.
Choosing the Right Pair
When shopping for anti-facial recognition glasses, you should consider the following factors:
Material – The lenses and frames should be made from quality materials that won't easily break or scratch. Look for brands with good reviews and high rates of customer satisfaction.
Fit – Glasses that fit comfortably on your face will provide better coverage and make it harder for cameras to capture a clear image. Consider trying them on or comparing measurements with those of your current glasses.
Usage – Think about where you'll be using the glasses most often. Are they for everyday use or just for specific situations? This can help you determine what lens tint and other features will be best.
Price – The price range for camera-blocking glasses can vary. Keep in mind that more expensive glasses aren't always better quality.
Style – Ghost glasses come in many styles and colors, so choose a pair that suits your personal taste and face shape.
UV protection – Some anti-facial recognition glasses also provide UV protection. If this is an option, make sure you get 100% UVA-UVB lens protection. This is an extra bonus if you plan to use the glasses outdoors.
How Effective Is Privacy Eyewear?
While anti-facial recognition glasses can make it harder for cameras to capture a clear image, they are not 100% effective. Some cameras may still be able to capture your face. This depends on your distance from the device and the surrounding light.
It’s also important to remember that facial recognition software is constantly evolving. This means that glasses on the market now are designed to block existing technology. But they may not be effective against future software updates.
Try Privacy Glasses
Ghost glasses can be a useful tool in protecting your identity and privacy, but they’re not completely foolproof. Surveillance technology is constantly evolving, and so are the methods used to block it.
The decision to use anti-facial recognition glasses should be made after careful consideration. With the right pair, you may feel better protected from the less noble uses of facial recognition technology.
Are you concerned about facial recognition cameras monitoring your every move? Some large venues and arenas are using it as a security measure, claiming it ensures safety for guests and employees. However, the technology is also being used for surveillance and to block people from entering businesses.
That's what happened to New Jersey native Kelly Conlon. Conlon was passing through security at Radio City Music Hall for a Rockettes show when guards stopped her and refused to let her in because their facial recognition software had identified her as an attorney.
Although not involved in the case herself, Conlon works for a law firm engaged in personal injury litigation against a restaurant owned by MSG Entertainment, which also owns Radio City Music Hall. The company has decided that all attorneys at firms litigating against it are banned "from attending events at our venues until that litigation has been resolved." More of these facial recognition incidents are happening nationwide. So what do you do?
Well, one start-up may have the answer to how you can stop this from happening to you. The company is called Cap_able, and its mission is to create wearable fashion that will help you to make the choice of whether you want your face analyzed by facial recognition devices or not.
What is Cap_able, and what have they created?
Through their Manifesto collection, Cap_able has created a line of hoodies, pants, t-shirts, sweaters and dresses that are embedded with artificial intelligence algorithms that thwart facial recognition software.
Facial recognition cameras will either fail to identify you entirely or mistake you for an animal embedded in the clothing pattern, such as a giraffe, zebra or dog. This gives you more control over your own image without having to worry about being constantly monitored.
Model wears colorful dress during photoshoot
How are the designs made?
Co-founder and CEO Rachele Didero came up with the idea for these designs while studying for her master's at the Fashion Institute of Technology in New York. She had read a story about Brooklyn tenants who fought their landlord's plan to install a facial recognition device at the entrance of their building, and she was inspired to create something that would give people a choice.
Combining fashion and engineering, Cap_able designs each pattern and tests it against YOLO, an object detection system, to see whether it can beat facial recognition. The patterns are then produced on a computerized knitting machine, and the final garments are made from Egyptian cotton.
Do the designs really work?
Didero says the clothing currently works about 60% to 90% of the time in tests against YOLO. Cap_able's algorithms are improving, but they also have to keep pace with the fast-moving world of tech.
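A success rate like that comes from running a detector over many trial images and counting the failures. Here is a minimal Monte Carlo sketch of that measurement, with a made-up stand-in for the detector (a real test would run the actual YOLO model on photographs of each garment; the `pattern_strength` score is invented):

```python
import random

random.seed(0)  # reproducible trials

# Hypothetical stand-in for an object detector such as YOLO: the
# detector "sees" a person with a probability that shrinks as the
# adversarial pattern gets stronger. Purely illustrative.
def detect_person(pattern_strength):
    return random.random() > pattern_strength

def fooling_rate(pattern_strength, trials=1000):
    """Fraction of trials in which the detector fails to find a person."""
    fooled = sum(1 for _ in range(trials) if not detect_person(pattern_strength))
    return fooled / trials

# A strong pattern should fool the detector in most trials.
print(fooling_rate(0.75))  # roughly 0.75
```

The same counting logic applies whatever detector is plugged in, which is why a pattern's rate can drop when the underlying model is updated.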
Facial recognition systems are being improved every day, so the company has its work cut out for it. And with products starting at $300, they may be a hard sell. Still, Cap_able makes a valid point about the invasiveness of facial recognition and continues to improve its technology.
Model wears colorful sweater during photoshoot
Here in the U.S., there is currently no federal law requiring signs to be posted if facial recognition cameras are being used in a public or private place. However, some states and municipalities have laws and regulations regarding the use of facial recognition technology, and those laws may require signs to be posted. Check with your local authorities to determine what requirements, if any, apply to the use of facial recognition cameras in your area.
Also:
Stop Facial Recognition: Countermeasures for Mass Surveillance
How Safe is Your Face?
Anti-Facial Recognition Glasses
The fashion industry is helping fool face recognition software with new clothing designs