Historic $650 million biometric privacy suit | Business Law Today from ABA
One of the biggest legal stories of 2020 barely made headlines, as coverage of the COVID-19 pandemic and the presidential election understandably dominated the news cycle. Most of the attention paid to data protection focused on the implementation of the California Consumer Privacy Act (“CCPA”) and the subsequent ballot initiative to expand it, while the Schrems II decision invalidating the EU-U.S. Privacy Shield framework also received widespread coverage.
As a result, Facebook’s historic $650 million biometric privacy settlement under the Illinois Biometric Information Privacy Act (“BIPA”) attracted far less media fanfare. The settlement came just a year after Facebook was fined a record $5 billion by the Federal Trade Commission for deceiving users about their ability to control the privacy of their personal information.
Given the pace of current events, the lack of attention is not surprising, but it is unfortunate. Access to biometric data, including facial recognition technology, is one of the most important privacy issues lawyers will confront in the decades to come.
Facial recognition technology is no longer science fiction
Facial recognition technology uses data to create a biometric map of the human face. Once the data is collected, algorithms analyze incoming images for unique facial features and dimensions and match each new image to a person.
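For readers curious about the mechanics, the matching step described above can be sketched as a nearest-neighbor comparison over numeric feature vectors. Everything below is illustrative only: the names, vectors, and threshold are invented assumptions, not a description of Facebook’s or any vendor’s actual system.

```python
import numpy as np

# Hypothetical enrolled "biometric maps": each face reduced to a
# numeric feature vector. (Real systems use learned embeddings of
# 128+ dimensions; these tiny vectors are for illustration.)
enrolled = {
    "alice": np.array([0.9, 0.1, 0.3]),
    "bob":   np.array([0.2, 0.8, 0.5]),
}

def identify(probe, gallery, threshold=0.25):
    """Return the enrolled name whose vector is closest to the probe
    image's vector, or None if no enrolled face is close enough."""
    best_name, best_dist = None, float("inf")
    for name, vec in gallery.items():
        dist = np.linalg.norm(probe - vec)  # Euclidean distance
        if dist < best_dist:
            best_name, best_dist = name, dist
    return best_name if best_dist <= threshold else None

# A new photo whose features closely resemble Alice's template:
print(identify(np.array([0.88, 0.12, 0.28]), enrolled))  # alice
# A photo resembling no one enrolled:
print(identify(np.array([0.0, 0.0, 0.0]), enrolled))     # None
```

The distance threshold is the legally interesting knob: set it too loosely and the system produces the kind of misidentifications cited by city officials later in this article.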
Some estimates value the facial recognition market at nearly $9.6 billion by 2022. Facial recognition has already been tested at major U.S. sporting events, where attendees are admitted to the venue through facial authentication instead of paper tickets or other methods.
Facebook settles $650 million facial recognition class action lawsuit
In late January 2020, Facebook announced it would pay $550 million to settle a BIPA class action lawsuit over its use of facial recognition technology. In July 2020, the settlement was increased to $650 million.
Illinois was the first state to enact biometric privacy legislation, passing BIPA in 2008. Among other things, BIPA requires that consent be obtained before collecting a user’s biometric data. With BIPA providing statutory damages of $1,000 to $5,000 per violation, an adverse ruling could have exposed Facebook to billions of dollars in damages. The $650 million settlement is likely the largest in a facial recognition case to date.
The legal dispute arose from Facebook’s “Tag Suggestions” feature, essentially a photo-tagging service that suggests the names of people in photos. Facebook gathered this information through “tagging,” a practice in which users identify themselves and others in photos. The information was stored in a database, and eventually Facebook had enough data to automatically recognize users’ faces and suggest “tags” for new photos.
However, Facebook is not the only application on your smartphone that might collect your biometric information. Several other popular smartphone apps have been criticized for improperly using facial recognition technology. Most recently, the Clearview AI app drew criticism from privacy advocates. Clearview AI “scrapes” publicly available photos from social media accounts. Clearview AI signed contracts with law enforcement agencies and, until May 2020, also sold this information to private companies. After it too was sued under BIPA, Clearview promised to voluntarily terminate all contracts with Illinois-based entities. Apple is also facing several lawsuits under BIPA: one involves facial recognition, the other focuses on voice biometrics.
Technology companies promise reform
In response to these recent controversies, several of the largest technology companies have announced restrictions on the development of facial recognition technology. On June 8, 2020, IBM announced that it would no longer develop facial recognition technology.
On June 11, 2020, Microsoft announced that it would no longer allow law enforcement agencies to use its facial recognition technology.
Biometrics, including facial recognition, may be the next major legal front for privacy advocates. Biometric systems are increasingly common in everyday life: fingerprint sensors on many smartphones, retina scanning, and voice recognition, for example.
Regardless of the likelihood of future litigation and public policy proposals, individuals can take steps now to protect their privacy. Anyone concerned about protecting biometric data should avoid apps that require uploading a photo of their face. It also helps to avoid tagging themselves and others in social media posts. Finally, users can check their privacy settings and turn off most facial recognition features.
Unfortunately, curbing the use of facial recognition technology by government agencies is not always so straightforward. Other governments, particularly China’s, paint an alarming picture of how biometric data can be misused by authorities in the absence of adequate legal protections.
Several local jurisdictions already prohibit the use of facial recognition technology. Major cities such as San Francisco, Boston, and Oakland have passed such laws. In June 2020, the Boston City Council unanimously approved a ban on police use of facial recognition, with officials citing concerns including racial bias and misidentification. Other cities go further: effective January 1, 2021, Portland, Oregon’s facial recognition ban applies to both government agencies and private companies.
BIPA may attract only a fraction of the attention that better-known privacy laws like the CCPA and GDPR receive. However, with more lawsuits filed under BIPA each month, this often-overlooked Illinois law may become harder to ignore in 2021, and other states could follow by enacting similar laws in the near future.
Patrick McKnight is an associate at Klehr Harrison Harvey Branzburg LLP in Philadelphia, Pennsylvania. He is a member of the firm’s data privacy and cybersecurity practice group. The opinions expressed in this article are those of the author and not necessarily those of Klehr Harrison Harvey Branzburg LLP or its clients.