Sofie Elana Hodara, Zachary Kaiser, and Gabi Schaffzin
GENDr. Fluid. Like You.
There’s something sexy about the quantified self: numbers are easy to understand, comforting in their immutability. In this spirit, we bring you GENDr. A quasi-absurdist work of art, GENDr is a software application by Sofie Elana Hodara, Zachary Kaiser, and Gabi Schaffzin.
Designed for personal use, the application calculates your unique gender composition as a three-part ratio — male to female to other — based on a series of data points derived from your personal browsing history. Then, given your newly designated GENDr, it lets you shop.
GENDr displays nine products determined by your gender ratio. Product placement is driven not just by our sponsors (how else could such an application get funding?) but by the goods and services others with similar statistics have consumed within the last 24 hours.
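The mechanics described above, deriving a three-part ratio from browsing data and then ranking products by the habits of statistically similar users, can be sketched in code. This is a purely illustrative toy: the keyword weights, URLs, and product records are invented for the example, and GENDr's actual implementation is not described in this text.

```python
# Illustrative sketch only: hypothetical keyword weights standing in for
# whatever "seemingly meaningless web data" a real system would mine.
KEYWORD_WEIGHTS = {
    "razors":   (0.7, 0.2, 0.1),   # (male, female, other) weights, invented
    "mascara":  (0.1, 0.8, 0.1),
    "sneakers": (0.4, 0.4, 0.2),
}

def gender_ratio(history):
    """Sum keyword weights over a browsing history, normalize to a ratio."""
    totals = [0.0, 0.0, 0.0]
    for url in history:
        for keyword, weights in KEYWORD_WEIGHTS.items():
            if keyword in url:
                totals = [t + w for t, w in zip(totals, weights)]
    s = sum(totals) or 1.0  # avoid division by zero on an empty history
    return tuple(t / s for t in totals)

def similar_products(ratio, recent_purchases, n=9):
    """Rank goods bought in the last 24 hours by how close each buyer's
    ratio sits to the current user's, and return the top n products."""
    def distance(other):
        return sum((a - b) ** 2 for a, b in zip(ratio, other))
    ranked = sorted(recent_purchases, key=lambda p: distance(p["buyer_ratio"]))
    return [p["product"] for p in ranked[:n]]

history = ["shop.example/razors", "shop.example/sneakers"]
ratio = gender_ratio(history)  # a three-part ratio summing to 1
```

The point of the sketch is how little the arithmetic cares about the person: identity reduces to a normalized vector, and "people like you" means nothing more than nearby points in that space.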
The application is a response to John Cheney-Lippold’s 2011 article “A New Algorithmic Identity: Soft Biopolitics and the Modulation of Control.” Cheney-Lippold examines how algorithmic models of data-based surveillance define online categories of identification, specifically gender, “from seemingly meaningless web data [browsing histories]” (169). As a result, gender identification is transformed into “a completely digital and math-based association that defines the meaning of maleness, femaleness, or whatever other gender a marketer requires” (170).
“So what? We live in a post-Snowden era! To digital privacy,” we say as we toast our artisanal beers. But wait, says Cheney-Lippold, there are serious repercussions.
Cheney-Lippold is interested in how algorithmically defined categories “softly persuade users towards models of normalized behavior and identity…” For Cheney-Lippold, the algorithms offer marketers the power to tell “us who we are, what we want, and who we should be” (177).
How does this power of persuasion actually work? Do users of digital interfaces truly become slaves to the power of algorithmic inference? What does it mean when a system declares your gender? How does it inform how you behave and how others treat you?
We all know there are problems with digital privacy, as well as with gender-related treatment on the web. GENDr turns that knowing into an easily quantifiable encounter, one in which issues of algorithmic control and concerns about privacy are acted out by the user herself through engagement with the application.
Ongoing / Goes live Sept 9