Raz Schwartz Ph.D.

Senior Insights Manager
Innovation Product Area Lead @ Spotify

In the Name of Software Morality

Twitterville, the blogosphere, and many tech news outlets were gushing this week after a Cult of Mac write-up about an iPhone app with the quite literal name “Girls Around Me”. The app, which had been introduced to Apple’s App Store not long ago, was quickly removed after the press backlash. At first glance, this app does not offer a novel technology. It merely aggregates people’s publicly shared check-in information from Foursquare and matches it with users’ Facebook information to gather additional details. Sounds simple, right?
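To see just how little technology this takes, here is a minimal, purely illustrative sketch of that kind of aggregation. The data structures, field names, and the naive name-based join are all my assumptions for illustration; they are not the app’s actual code or the real Foursquare or Facebook APIs:

```python
# Illustrative sketch only: join publicly shared check-ins with public
# profile data by display name. All field names are hypothetical.

def nearby_profiles(checkins, profiles, venue):
    """Return enriched profile records for everyone checked in at `venue`."""
    matched = []
    for checkin in checkins:
        if checkin["venue"] != venue:
            continue
        # Naive join on display name; a real aggregator would use linked
        # account IDs, but the principle is the same.
        profile = profiles.get(checkin["name"])
        if profile is not None:
            matched.append({**profile, "venue": venue})
    return matched

# Toy data standing in for Foursquare check-ins and Facebook profiles.
checkins = [
    {"name": "Alice", "venue": "Cafe X"},
    {"name": "Bob", "venue": "Bar Y"},
]
profiles = {
    "Alice": {"name": "Alice", "photo": "alice.jpg", "hometown": "Austin"},
}

print(nearby_profiles(checkins, profiles, "Cafe X"))
```

Each piece of data here is individually public; the creepiness emerges only from the join and the framing.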

But as the app shows, when you retrieve social data from various sources and aggregate it in a certain manner, things become creepy. And indeed, the terms creepy, stalking, and hunting were among the common depictions that first came to the minds of most tweeters, bloggers, and reporters.

I think this is where the issue took an interesting turn. Although some criticism was pointed at the app developers, most of the admonishment was directed at the users, i.e. us, saying something to the effect of “Hey, you didn’t take care of your privacy settings, so it’s your fault. Take this as a sign you need to start worrying more about these things.”

And indeed, to an extent, they are right. Users’ awareness of privacy settings in social networks is a very important issue. We should take extra measures to educate users on how to be vigilant and to make sure they are aware of the different implications sharing this information may have.

Here is the thing though. At some point this argument becomes very similar to the “she had it coming” rhetoric usually directed at young girls walking around “looking” for male attention. These ‘victim blaming’ arguments are often used in rape cases.

But if this argument falls apart, in its sexism, in real-life situations, why should it apply to the virtual sphere? One of the first things you learn when you study the history of the Internet is the case of the “cyberrape” that took place in the early 1990s in the LambdaMOO MUD community. Back then, the software’s creator decided to change the design of the software to prevent similar instances in the future.

Similarly, what I ask here is for software developers to step up their game and be aware of the moral implications in the process of developing this kind of software. And when computer science schools do not teach even basic critical concepts like the Social Construction of Technology, it is not surprising we find ourselves with applications like “Girls Around Me.”

Giving the benefit of the doubt to developers, they may not have considered such negative consequences or critiques, or as the app developers’ response put it:

.. we believe it is unethical to pick a scapegoat to talk about the privacy concerns. We see this wave of negative as a serious misunderstanding of the apps’ goals, purpose, abilities and restrictions. Girls Around Me does not provide any data that is unavailable to user when he uses his or her social network account, nor does it reveal any data that users did not share with others.

Indeed, it’s true they didn’t do anything illegal with the data; they “just” aggregated it and displayed it in a “convenient” way. But when I read this I ask myself: Would this guy want his brother’s or sister’s profile to appear in an app like that? Would he want them displayed in the context “Girls Around Me” provides? If I were him, the answer would be no.

Foursquare, Facebook, and Twitter data are inherently invested with social and cultural meanings. Consequently, these companies try not to cross what Helen Nissenbaum calls context-relative informational norms (although they do sometimes try to expand them). But apps like “Girls Around Me” do not take these norms into consideration. That is why they expose the harmful outcomes that manipulating our social data can produce.

We cannot pin the blame only on the users. Let’s also point an inquisitive spotlight at the developers and ask them to stop, think, educate, and then do what is right, in the name of software morality.

What are your thoughts?