MADISON - ST. CLAIR RECORD

Consumers: Lack of alleged injuries in BIPA class action against Amazon warrants remand to Madison County

Consumers who use Alexa-equipped Amazon Echo devices argue that the federal court lacks jurisdiction over their Illinois Biometric Information Privacy Act (BIPA) claims because they do not allege actual injuries. 

Attorney Kevin Green of Goldenberg Heller & Antognoli in Edwardsville filed a motion to remand to Madison County Circuit Court on Sept. 24 on behalf of the proposed class. 

Green argues that three recent Seventh Circuit BIPA cases establish that the federal court lacks subject matter jurisdiction. He relies on Bryant v. Compass Group USA Inc., Fox v. Dakkota Integrated Systems LLC, and Thornley v. Clearview AI Inc.

He wrote that to maintain federal jurisdiction, Amazon must prove Article III standing, which requires that the plaintiff suffered an injury caused by the defendant that is concrete, particularized, and actual or imminent, and that would likely be redressed by the requested judicial relief. 

“Accordingly, based on binding Seventh Circuit precedent, Amazon has not satisfied its burden of establishing that the public harm alleged is the type of concrete, particularized injury necessary to satisfy Article III’s requirements,” he wrote. 

Green argues that the complaint excludes claims related to the unlawful collection and retention of the class’ data and only asserts claims related to Amazon’s “bare procedural violations” causing the class to be “aggrieved” under BIPA. 

“That is, Amazon, an entity in possession of the biometric data of plaintiff and the class, had procedural requirements to satisfy under BIPA, and its failure to do so means plaintiff and the class are ‘aggrieved’ under BIPA,” Green wrote. “Plaintiff specifically alleges a narrow class of people who … ‘have suffered no injury from Amazon’s violations of BIPA … other than statutory aggrievement.’”

Green argues that the court in Thornley held that Illinois permits BIPA claims alleging bare statutory violations without any need to allege or show injuries. 

Green argues that because Amazon cannot satisfy its burden of establishing a concrete and particularized harm alleged by the class, the case should be remanded to state court. 

Green and attorney Thomas Rosenfeld filed the class action on behalf of plaintiff April Schaeffer on July 26 in the Madison County Circuit Court. Defendants Amazon.com Inc. and Amazon.com Services LLC removed the case to the U.S. District Court for the Southern District of Illinois on Aug. 31. 

The defendants filed a motion for extension of time to file an answer on Sept. 13 through attorney Elizabeth Herrington of Morgan Lewis & Bockius LLP in Chicago. U.S. District Judge Stephen McGlynn granted the request and gave the defendants until Oct. 25 to respond to the complaint. 

In the complaint, Green quoted Amazon.com senior editor James Marcus: “It was made clear from the beginning that data collection was also one of Amazon’s businesses. All customer behavior that flowed through the site was recorded and tracked. And that itself was a valuable commodity.” 

Green wrote that Amazon’s “voice-based virtual assistant,” Alexa, is embedded in many Amazon devices, including Echo speakers, Fire tablets, and others. Additionally, Alexa can be integrated into other devices such as phones, TVs, thermostats, appliances, lights, and more. Green alleges that after a user speaks to a device equipped with Alexa, “Amazon collects, captures, or otherwise obtains, and subsequently stores voiceprints of the user, and transcriptions of the voiceprints,” which constitute biometric identifiers or biometric information regulated by BIPA.

“Although plaintiff has not suffered a particularized harm from Amazon’s conduct, her statutory rights under BIPA have been violated, and she is ‘aggrieved’ under the statute,” Green wrote. 

The suit states that Alexa works by recording and responding to oral commands upon hearing its “wake word.” However, Green argues that reports and studies indicate that Alexa-enabled devices frequently capture conversations by accident, without being triggered by the wake word. 

“One group of researchers discovered more than 1,000 sequences of words that incorrectly trigger smart speakers, such as Alexa. For example, Alexa may inadvertently be activated by the words ‘unacceptable’ or ‘election,’” Green wrote. 

The suit states that after Alexa responds to a request, Amazon does not delete the voiceprint or transcription. The biometric information is collected and used by Amazon to improve its speech and voice recognition capabilities, Green argues. 

“For years, Amazon represented that the voiceprints were simply streamed to the cloud and used only to allow Alexa to respond to the command and help personalize Alexa’s response to a user.

“Amazon has more recently, however, indicated that it stores voiceprints, the transcriptions made from the voiceprints, and other information created from the voiceprints, including ‘acoustic models’ of the speaker’s voice characteristics, on multiple servers,” Green wrote. 

The suit states that all voiceprints captured from a customer are associated with that user’s Amazon account. Amazon allegedly employs thousands of individuals worldwide to listen to recordings and review transcriptions in order to “eliminate gaps in Alexa’s understanding of human speech and help it better respond to commands.” 

Green wrote that after a voiceprint is used for “machine learning,” the data is not deleted and is aggregated with data from other Amazon sources for that user, such as shopping history. 

“Amazon does not allow a user to stop it from collecting voiceprints,” the suit states. “The only way to stop Amazon’s collection of voiceprints is to mute the microphone or deactivate the device, both of which defeat the device’s utility.”

Green adds that in 2019 Amazon began giving users the ability to delete recordings, but it is unclear whether all of the voiceprints are deleted. 

“For instance, Amazon states that even if a customer deletes an audio recording, Amazon or third-party developers may retain ‘records of the interaction,’” he wrote. 

Green accuses Amazon of failing to notify users that their voiceprints are collected, stored, and used, and that their interactions are recorded. 

The suit was filed on behalf of “all Illinois residents who own an Alexa device located in Illinois from which, during the class period, Amazon has taken possession of the person’s voiceprint and/or a voiceprint transcription, acoustic model of voice characteristics, or other information created from a voiceprint that is linked to the person; and who have suffered no injury from Amazon’s violations of BIPA other than statutory aggrievement.”

Green seeks to enjoin Amazon from collecting, obtaining, storing, using, selling, leasing, trading, profiting from, disclosing or disseminating users’ biometric identifiers and information until it is done in compliance with BIPA. He also seeks an award of $5,000 for each willful violation of BIPA or $1,000 for each negligent violation, plus attorneys’ fees, court costs, and all other relief the court deems just. 