MyIdol, Your Avatar, Playful Surveillance, & Ubiquitous Facial Recognition

Post, Review · April 29, 2015, 1:15 am

Ever wanted to see what you’d look like with blue hair? In a bright red suit?

How about pole-dancing in a panda costume?

A new, free iPhone app called “MyIdol” has captivated the likes of Conan O’Brien, Miley Cyrus, and all of my friends. The app, by Chinese company Huanshi Ltd., generates a 3-D avatar from any facial image you give it. Following the instructions (which are in Chinese, although that hasn’t stopped thousands of users), you take a selfie, input the image, and the software matches vector points on your face to produce what many call a lifelike avatar of yourself. You can adjust the vector points and skin tones, and then start playing with hair, makeup, and clothing, as well as poses and activities for your avatar, from gyrating to “SexyBack” to cruising around on a motorcycle.
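For readers curious what “matching vector points” involves under the hood, here is a minimal sketch of facial landmark detection using the open-source dlib library. MyIdol’s actual pipeline is not public, so this is only an illustrative stand-in; the image file name and the standard 68-point model are assumptions.

```python
# Minimal sketch of facial landmark ("vector point") detection with dlib + OpenCV.
# This is an open-source stand-in for the idea, not MyIdol's actual pipeline;
# "selfie.jpg" and the 68-point model file are assumed to be on disk.
import cv2
import dlib

detector = dlib.get_frontal_face_detector()   # HOG-based frontal face detector
predictor = dlib.shape_predictor("shape_predictor_68_face_landmarks.dat")

img = cv2.imread("selfie.jpg")
gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)

for face in detector(gray):
    landmarks = predictor(gray, face)
    # 68 (x, y) points tracing the jaw, brows, eyes, nose, and mouth --
    # the kind of "vector points" an avatar generator could fit a 3-D mesh to.
    points = [(landmarks.part(i).x, landmarks.part(i).y) for i in range(68)]
    print(points[:5])
```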

UNCANNY VALLEY?

Some have noted that MyIdol often “misrecognizes” users’ chosen gender expression. Others have noted the “uncanny valley” quality of its eerily lifelike, photorealistic renderings of yourself, from the “hilariously monstrous” to the downright “creepy.”

You aren’t limited to just selfies. You can also upload any other photo of your choice — your sister, your puppy, Leonardo DiCaprio at Coachella — which is where things start to get interesting. I was watching my friend upload a photo of her father, who passed away some years ago, and started thinking beyond “this is some uncanny valley sh*t” to the real ubiquity of facial recognition apps.

Facial recognition software, of course, can be used for multiple applications, from the FBI’s “Next Gen” ID system to emotion sensors in the now-defunct Google Glass to Facebook photo tagging and Google image searches. Facebook’s automated photo tagging, in which it uses facial recognition to suggest tags for an image, has been widely criticized and was also ruled to violate German privacy law in 2013: “The dispute centers around Facebook’s requirement that users opt-out of its facial recognition database, rather than opting-in as mandated by European Union law.”

Many Facebook users outside of Europe, like myself, who continue to receive image suggestions and use Facebook, enjoy it when the algorithm goofs up, tagging that painting or poster behind you, or suggesting a tag for a friend when it finds face-like data points in the folds of a bedsheet. You can inform the algorithm that it has committed an error, enabling it to “learn” and adapt for increased accuracy in future matches, or you can let it be. Letting the error stand is a moment of deliberate misdirection, and a small, if often unintentional, form of resistance to what has quickly become the constant, pervasive use of facial recognition algorithms.
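To make that feedback loop concrete, here is a toy sketch of how accepted or rejected tag suggestions could be logged as training labels. Facebook’s real pipeline is not public, and every name in this snippet is invented; the point is simply that letting a bad suggestion stand means the model never receives the correcting label.

```python
# Toy sketch of the correction loop described above: each accept/reject of a
# suggested tag becomes a labeled example a face-matching model could retrain on.
# Facebook's actual system is not public; all names and structure here are invented.
from dataclasses import dataclass
from typing import List

@dataclass
class TagFeedback:
    photo_id: str
    suggested_person: str
    confirmed: bool   # True = user accepted the tag, False = user flagged an error

feedback_log: List[TagFeedback] = []

def record_feedback(photo_id: str, suggested_person: str, confirmed: bool) -> None:
    """Store a user's decision as a future training label."""
    feedback_log.append(TagFeedback(photo_id, suggested_person, confirmed))

# The user corrects the algorithm (it tagged the poster behind them as a friend)...
record_feedback("photo_123", "friend_anna", confirmed=False)
# ...or simply "lets it be", in which case no corrective label is ever logged.
```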

What does it mean for us to be constant, data-aggregating users of facial recognition software that is now omnipresent? What is the social life of surveillance technologies?

PLAYFUL SURVEILLANCE

MyIdol’s cheeky avatar-generating software is a great example of what Ariane Ellerbrok calls “Playful Biometrics.” As Ellerbrok argues, “play” is central to the mission creep of surveillance software from ‘hard’ biometrics to ‘soft’: “Alternatively mobilized as marketing logic as well as a form of cultural practice, ‘play’ has a fundamental role in the social life of technologies—even controversial or ‘serious’ technologies” (2011: 529). Through applications like Facebook photo tagging and MyIdol, facial recognition shifts from a controversial securitization technique, born of the post-9/11 surge in racialized policing strategies, to a softer, more benign platform for play.

She writes, “Historically, the social role of biometric technology is clearly linked with the domination of marginalized groups, by allowing authorities to filter individuals, thereby assigning them differential rights based on racial, ethnic, or socioeconomic categorizations” (532). Numerous STS scholars have described the role of biometric technologies, from fingerprinting in colonial India to airport surveillance cameras, in producing notions of racialized and sexualized criminalization and in the policing of deviance. As Ellerbrok argues, the seeping of biometrics into “playful” software helps produce momentary lapses or erasures of its insidious surveillance roots — in this way playful biometrics is a method of obfuscation.

PLAYFUL RESISTANCE

A feminist sociotechnical analysis of “play” stays attuned to the ways in which playfulness and playful activity have generally been feminized and diminished in importance, even as play remains fundamental to the construction of individual subjectivities. Soft biometrics, meanwhile, perpetuates the normalization of surveillance of marginalized people across a range of spaces.

Yet playful biometrics can also be productive and resistive. Small blips of resistance, such as messing with Facebook’s photo-tagging algorithm, offer new forms of playing back against biometric technologies. I have begun playing with the gender and shape of “my” avatar. Some users of MyIdol have generated avatars of Vladimir Putin, perhaps in response to the recent banning of Putin memes within Russia. Queer studies scholar and performance artist Zach Blas has designed a “Facial Weaponization Suite” of dramatically lumpy silicone masks people can wear to ‘hide’ from surveillance cameras seeking to acquire facial data points. Blas was motivated in part by reading about a research study claiming to be able to “see” a “gay face” in a crowd, a chilling capability in the wrong situation or the wrong country. Adam Harvey’s “How to Hide From Machines” similarly used simple open-source computer vision algorithms to design face props and wigs users can wear as ‘camouflage.’ Such forms of resistance are deliciously enticing, small arms against the function creep of facial recognition.
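As a rough illustration of what such camouflage is up against, here is a small sketch that runs OpenCV’s bundled Haar-cascade face detector, the kind of simple open-source detector these projects were designed around, over a portrait to see whether it still registers as a face. The image file name is hypothetical, and real recognition systems use many detectors far more robust than this one.

```python
# Sketch: test whether a styled/"camouflaged" portrait still triggers a classic
# open-source face detector (OpenCV's Viola-Jones Haar cascade). The file name is
# hypothetical; modern recognition systems use far more robust detectors than this.
import cv2

cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

img = cv2.imread("portrait_with_camouflage.jpg")
gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)
faces = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)

if len(faces) == 0:
    print("No face found: the styling fooled this particular detector.")
else:
    print(f"{len(faces)} face(s) found: this detector was not fooled.")
```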

1 Comment

  • Richard Fadok

    Fascinating post, Mitali! I’m curious to know more about the relationship between the algorithms used by MyIdol (and other playful apps or websites) and those employed for more overt forms of governance. As you pointed out, the data collected are used to improve the design of facial recognition software, immediately MyIdol’s, but I wonder whether it might perhaps improve others’ as well indirectly. In other words, does the consumer use of MyIdol unwittingly contribute to the increasing sophistication and spread of ‘hard’ biometrics? The concept of ‘playbor’ might be a direction to go with this if that’s the case.

