Facial recognition (FR) involves the automatic processing of digital images in which individuals are recognized by their unique facial features. By matching a name to a face, FR increases the risk of revealing one's sensitive information. This paper discusses the privacy and data-protection implications of the proliferation of facial recognition technology (FRT) and presents a harmonized approach to mitigating privacy violations arising from its use.
The Article 29 Working Party (2012) warned that the ability of facial recognition software (FRS) to capture biometric data without individual consent would end anonymity in public spaces. I believe that FRT makes it easier for public and private entities to identify and target people who choose to remain anonymous. In principle, this violates the right to freedom of expression, including the right to communicate anonymously, which is critical for expressing unpopular views without fear of retaliation (EFF, 2012a).
Facebook uses FRT to suggest whom users should tag in photos, a feature that is active by default unless the user opts out (Smith, 2012). This raises privacy concerns because it facilitates recognition in photos without a person's explicit permission, causing a loss of control over the disclosure of personal identity. Researchers at Carnegie Mellon University applied FRT to social-media profiles to identify strangers, gaining access to personal information including Social Security numbers (Acquisti, 2011).
To safeguard user privacy, I support Lommel's claim that tagging people in photos should require prior consent and should not be activated by default (Boulton, 2011). Moreover, this should be reflected in transparent Terms of Service (TOS) agreements. However, Sherman, Facebook's privacy and public-policy manager, defended Facebook's TOS by stating that Facebook utilizes FRT only so that users can identify their friends, not strangers (Federal Trade Commission, 2012).
Facebook is amassing a large database of biometric data, and governments may therefore want to regulate access to it. However, government access could increase government surveillance, which may erode basic civil liberties, undermine innovation, and increase cyber-security risks.
The application of FRT in law enforcement can be beneficial in solving crimes across multiple jurisdictions (EFF, 2012b). In the United States, the Federal Bureau of Investigation and the Department of Homeland Security are incorporating FR capabilities into their biometric databases. However, Senator Franken expressed concern that applying FRT without transparent legislation could result in unsuspecting civilians being falsely accused of crimes (Federal Trade Commission, 2012). Furthermore, the application of FRS in law enforcement could be insidious, as some applications can capture, within a crowd, an individual's personal possessions and associations, thereby violating the rights to privacy and freedom of association. Other implications of FRT may include racial and ethnic profiling, identity theft, stalking, and social stigma (EFF, 2012b). I am also concerned that the faceprints of innocent individuals could be placed in criminal databases, tarnishing their reputations.
Employers and school admissions offices utilize FRT to investigate potential employees and students (EPIC, 2012); misidentification could therefore have severe consequences for admission or employment. Additionally, children today use the Internet for many purposes and could, through FRT, become victims of covert tracking without their consent (CDD, 2012). I therefore agree with CDD's view that companies should adopt a clear opt-in structure before performing FR on teens.
Apple's iOS 5 and Google's Android 4.0 mobile operating systems include FR application programming interfaces. Coupled with geo-location and other smartphone technologies, FR thus enables marketers to profile individuals (CDD, 2012). Additionally, data controllers can covertly generate faceprints without the consent of their data subjects and sell them to third parties. This is dangerous because it threatens personal safety and exposes personal information such as demographics and credit scores.
I believe that this potentially invasive technology is being used without clearly defined policies. In the United States, EFF (2012b) reported that only three states have implemented laws concerning biometrics collection. In striking a balanced approach, self-regulation is important, but there is also a need for internationalized legislation governing the use of FRT. Specifically, the Data Protection Directive may serve as a template for internationalized laws addressing the retention of FR data, informed consent, and the security and accountability of FRS. In addition, media-literacy programmes should be implemented on a national scale to inform citizens of the dangers of posting personal information online, the implications of FRT, and their ability to opt in to or out of TOS agreements.
Although the application of FRT presents opportunities for invention and innovation, it may pose critical threats to privacy and other civil liberties. Consequently, to safeguard privacy in the use of FRT, I have proposed increased capacity-building initiatives and the formulation of internationalized legislation, coupled with transparent TOS agreements governing the use of the technology.