You Can’t Use This Google Photos Feature in 2 States. There’s a Hidden Reason for That

A new AI feature within Google Photos is notably missing for Texas and Illinois residents, two of the most populous states in the US. This is particularly odd, given the feature has seen a significant rollout across the country since its debut.

The feature lets anyone edit a photo by speaking or typing commands, without additional software or even knowing which specific adjustments will achieve the desired effect. That makes photo editing more accessible and approachable for people who are less inclined to dig into individual editing settings.

Conversational Editing in Google Photos debuted on the Pixel 10 series of phones. In September, Google rolled out Conversational Editing in its Photos app to all eligible Android users and more recently iOS users in the US.

But it wasn’t clear who was “eligible” to use the feature. In a help center page, Google said it wasn’t “available in all regions at this time.” It didn’t specify the regions, nor did it say why.

As it turns out, the restriction applies to both Texas and Illinois because of those states’ biometric privacy laws.

The ability to edit photos with your voice or through chat isn’t the issue. The problem is biometrics, specifically what’s known as facial geometry. One requirement for Conversational Editing is that another feature called Face Groups must be enabled. That’s likely the legal sticking point.

“The common thread in both laws is that they restrict how biometric identifiers such as face geometry or voiceprints can be stored, transmitted or retained,” said Frank Fagen, a professor at the South Texas College of Law.

The Houston Chronicle was first to report that the feature wasn’t available, noting that both states had sued the tech giant for data and biometrics collection.

Google didn’t respond to requests for comment.


What is the Face Groups feature in Google Photos?

Face Groups is a Google Photos feature that algorithmically groups together similar faces it believes belong to the same person and lets you label them with a name for your own use within the app. This makes it easier to quickly find photos of specific people.

To do this, Face Groups collects facial geometry, a biometric analysis of shapes, proportions and angles. It creates face models anytime a face is detected in a photo. When the algorithm predicts that one face is similar to a face in another photo, it groups them together.
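To give a rough sense of how this kind of similarity-based grouping can work in principle, here is a minimal sketch in Python. It is purely illustrative and not Google’s actual implementation: the numeric “face model” vectors, the cosine-similarity measure, the 0.9 threshold and the greedy matching are all assumptions made for the example.

```python
from math import sqrt

def cosine_similarity(a, b):
    """Cosine similarity between two equal-length embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = sqrt(sum(x * x for x in a))
    norm_b = sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def group_faces(embeddings, threshold=0.9):
    """Greedily assign each face embedding to the first group whose
    representative (the group's first member) is similar enough;
    otherwise start a new group. Returns a list of groups, each a
    list of indices into `embeddings`."""
    groups = []           # list of lists of indices
    representatives = []  # one representative embedding per group
    for idx, emb in enumerate(embeddings):
        for group, rep in zip(groups, representatives):
            if cosine_similarity(emb, rep) >= threshold:
                group.append(idx)
                break
        else:
            groups.append([idx])
            representatives.append(emb)
    return groups

# Toy example: three hypothetical "face models" as 3-D vectors.
# The first two are near-duplicates; the third is clearly different.
faces = [
    [0.90, 0.10, 0.00],
    [0.88, 0.12, 0.01],
    [0.00, 0.20, 0.95],
]
print(group_faces(faces))  # [[0, 1], [2]]
```

A real system would use high-dimensional embeddings produced by a face-recognition model rather than hand-written vectors, but the basic idea is the same: each new face is compared against existing groups and merged with the closest match above some similarity threshold.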

Face Groups is an optional feature that can be turned off at any time. Doing so will delete all face groups associated with your account, along with the face models and any labels you have added.

The problem is that this type of facial recognition technology isn’t legal everywhere, or at least requires some preliminary steps to be considered legal.

Texas and Illinois biometric laws

Consent is typically required before biometric data can be collected, and collecting it without that consent can violate biometric privacy laws. A Google Photos user may have accepted the app’s terms and conditions, thereby consenting to the collection of their own biometric data. But the other people you take photos of? Not so much.

One of the two relevant laws is Illinois’ Biometric Information Privacy Act, or BIPA, viewed by privacy experts as the “gold standard” because it gives individuals the right to sue the offending company.

According to a 2019 Illinois Supreme Court ruling, you don’t need to prove that a violation resulted in actual harm in order to sue. That “opened a flood of litigation,” according to David Morrison, a principal at the Illinois-based law firm Goldberg Kohn. Morrison noted that even technical violations carry penalties, which range from $1,000 to $5,000 per affected individual.

Google settled a $100 million Illinois lawsuit over the face grouping feature in 2022.

Texas has its own law, the Capture or Use of Biometric Identifier Act, or CUBI, but only the state attorney general can bring a lawsuit, not individuals. Biometrics covered by the act include eye scans, voice, finger and hand prints, and face geometry. A single CUBI violation can result in a fine of up to $25,000.

Texas sued Google in 2022 for collecting biometric data without consent. That case was settled in May 2025.

The Texas law states that biometrics must be destroyed within a «reasonable time» and ties the expiration date to the purpose for which the identifier was created, creating a conundrum for Google. Face Groups is an always-on and ongoing process, essentially waiting for you to snap a photo so it can check if any face in the image matches one of its facial models. That means its purpose never really expires.

“From a compliance standpoint, the simplest route for Google is just to disable the feature in Texas and Illinois,” said Fagen.

Fagen points out that conversation-style editing can be done within the Gemini app, which is available in both Texas and Illinois. That supports the conclusion that the feature itself isn’t the issue; the biometric collection required for Face Groups is.

Google isn’t alone in contending with these state laws. Meta has been hit with multiple lawsuits over tracking its users without their consent, including one that ended in a $650 million settlement for violating BIPA.

Why should these laws matter to you?

When your credit card is stolen, you can put a stop on the card and request a new one with a new number attached to it. When suspicious activity takes place in one of your accounts, you can change the password to lock it down.

What can you do when your fingerprints, voiceprint or facial geometry are stolen? Not much — once this data has leaked, it’s out there. There’s a permanence to having your biometric data stolen, so laws like BIPA and CUBI exist to make sure this type of data is handled with the care it deserves, along with the appropriate repercussions for mishandling.

Identity theft is a real threat on its own, but to a bad actor, access to someone’s biometric data can be the keys to the castle.

The cost of convenience

The smartphone in your pocket or in your hand is the ultimate compromise. It’s become an indispensable part of your everyday life and an addiction of its own. Imagine not having the option to tap on your screen a few times and have a brand new pair of headphones arrive at your door within an hour. When was the last time you had to ask a stranger for directions? That’s no longer the world we live in.

The convenience technology brings us makes it easier to be OK with leaving our data at the doorstep of anyone attempting to collect it. The biometric laws in place are at least an attempt to ensure that your most sensitive data is protected. Is the convenience of something like Google’s Conversational Editing worth potentially having your biometric data stolen?

While this is an instance of a single feature being unavailable in a single app in two states, the story is larger than that. BIPA and CUBI set a precedent for how sensitive data should be handled, and they shape how companies like Google design future features with these privacy laws in mind, at a national and even global level.
