Rudder's post described a few of the experiments that the dating website had carried out. In one, OKCupid told people that they would be good matches with certain other people even though the site's algorithms had determined that they would be bad matches. That's right: The company deliberately lied to its users. OKCupid wanted to see whether people like each other because they make up their own minds about who they like, or because OKCupid tells them they should.
(The controversial post was Rudder's first in several years; he had taken time off to write a book about experimenting on people. Due out next month, the book is called Dataclysm: Who We Are (When We Think No One's Looking).)
The OKCupid post was in part a response to controversy over a recently discovered Facebook experiment, the results of which were published in an academic journal. Facebook wanted to see if people would post more negatively if their News Feeds contained more negative posts from their friends. In the experiment, Facebook removed some posts by family and friends from users' feeds because those posts were positive. In other words, the experiment deliberately made people sadder by censoring friends' more uplifting posts.
Don't like this kind of manipulation? Here's Rudder's response: "Guess what, everybody: if you use the Internet, you're the subject of hundreds of experiments at any given time, on every site. That's how websites work."
What's wrong here
Rudder's "everyone is doing it" rationalization for experimenting on users makes it clear that he doesn't understand the difference between what OKCupid and Facebook are doing and the routine A/B testing of design options that other sites conduct.
The difference is that OKCupid and Facebook are potentially changing or damaging the real relationships of real people. They are deliberately manipulating people's happiness.
These companies might argue that this damage to the mood and relationships of people is small to the point of being inconsequential. But what makes them think it's OK to deliberately do any damage at all?
The other glaring problem with these social science experiments is that the subjects don't know they're participating.
Yes, I'm sure company lawyers can argue in court that the Terms of Service that everyone agreed to (but almost nobody read) gives OKCupid and Facebook the right to do everything they do. And I'm sure the sites believe that they're working so hard and investing so much to provide free services that users owe them big time, and that makes it all OK.
Imagine a splash screen that pops up each month on these sites that says: "Hi. Just wanted to make sure you're aware that we do experiments on people, and we might do experiments on you. We might lie to you, meddle in your relationships and make you feel bad, just to see what you'll do."
No, you can't imagine it. The reason is that the business models of sites like OKCupid and Facebook are based on the assumption of user ignorance.
Why OKCupid and Facebook think it's OK to mess with people's relationships
The OKCupid admission and the revelations about the Facebook research were shocking to the public because we weren't aware of the evolving mindset behind social websites. No doubt the OKCupid people and the Facebook people arrived at their coldly cynical view of users as lab rats via a long, gradual slippery slope.
Let's imagine the process with Facebook. Zuckerberg drops out of Harvard, moves to Silicon Valley, gets funded and starts building Facebook into a social network. Zuck and the guys want to make Facebook super appealing, but they notice a contradiction in human behavior, one that is making heavy Facebook users unhappy.
You see, people want to follow and share and post a lot, and Facebook wants users to be active. But when everybody posts a lot, the incoming streams are overwhelming, and that makes Facebook users unhappy. What to do?
The solution is to use software algorithms to selectively choose which posts to let through and which to hold back. But what criteria do you use?
Facebook's current algorithm, which is no longer called EdgeRank (I guess if you get rid of the name, people won't talk about it), is the product of thousands of social experiments -- testing and tweaking and checking and refining until everyone is happy.
The result of those experiments is that Facebook changes your relationships. For example, let's say you follow 20 friends from high school. You feel confident that by following them -- and by them following you -- you have a reliable social connection to these people that replaces phone calls, emails and other forms of communication.
Let's say you have a good friend named Brian who doesn't post a lot of personal stuff. And you have another friend, Sophia, who is someone you don't care about but who is very active and posts funny stuff every day. After a period of several months during which you barely interact with Brian but occasionally like and comment on Sophia's posts, Facebook decides to cut Brian's posts out of your News Feed while maintaining the steady stream of Sophia posts. Facebook boldly ends your relationship with Brian, someone you care about. When Brian posts an emotional item about the birth of his child, you don't see it because Facebook has eliminated your connection to Brian.
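To make the Brian-and-Sophia scenario concrete, here is a deliberately simplified sketch of interaction-weighted feed filtering. This is not Facebook's actual algorithm -- the function, threshold and data shapes are all hypothetical -- but it shows how a handful of engagement signals can silently cut a real friend out of your feed:

```python
# Hypothetical sketch of engagement-based feed filtering.
# NOT Facebook's real ranking code; the names, scores and
# threshold below are invented for illustration only.

def feed_filter(posts, interactions, threshold=1):
    """Keep only posts from authors you've engaged with enough.

    posts        -- list of (author, text) tuples, newest first
    interactions -- dict mapping author -> recent likes/comments by you
    threshold    -- minimum engagement score to stay in the feed
    """
    visible = []
    for author, text in posts:
        score = interactions.get(author, 0)  # no engagement -> score 0
        if score >= threshold:
            visible.append((author, text))
    return visible

posts = [
    ("Sophia", "Another funny meme"),
    ("Brian", "Our baby was born today!"),
]
# You like Sophia's jokes daily but rarely interact with Brian.
interactions = {"Sophia": 5, "Brian": 0}

# Brian's big news never reaches you.
print(feed_filter(posts, interactions))
```

The point of the sketch is that nothing in the code "knows" Brian is a close friend; it only sees that you rarely click on him, so his most important post scores the same as spam.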
And don't get me started on OKCupid's algorithms and how they could affect the outcome of people's lives.
Not only do both companies experiment all the time; their experiments make huge changes to users' relationships.
The real danger with these experiments
You might think that the real problem is that social networks that lie to people, manipulate their relationships and regularly perform experiments on their users are succeeding. For example, when Facebook issued its financial report last month, it said revenue rose 61% to $2.91 billion, up from $1.81 billion in the same quarter a year ago. The company's stock soared after the report came out.
Twitter, which is currently a straightforward, honest, nonmanipulative social network, has apparently seen the error of its ways and is seriously considering the Facebook path to financial success. Twitter CEO Dick Costolo said in an interview this week that he "wouldn't rule out any kind of experiment we might be running there around algorithmically curated experiences or otherwise."
No, the real problem is that OKCupid and Facebook may take action based on the results of their research. In both cases, the companies say they're experimenting in order to improve their service.
In the case of OKCupid, the company found that connecting people who are incompatible ends up working out better than it thought. So based on that result, in the future it may match up more people it has identified as incompatible.
In the case of Facebook, it did find that mood is contagious. So maybe it will "improve" Facebook in the future to build in a bias for positive, happy posts in order to make users happier with Facebook than they are with networks that don't filter based on positivity.
What's the solution?
While Twitter may follow Facebook down the rabbit hole of user manipulation, there is a category of "social network" where what you see is what you get -- namely, messaging apps.
When you send a message via, say, WhatsApp or Snapchat or any of the dozens of new apps that have emerged recently, the other person gets it. WhatsApp and Snapchat don't have algorithms that choose not to deliver some of your messages. They don't try to make you happy or sad or connect you with incompatible people to see what happens. They just deliver your communication.
I suspect that's one of the reasons younger users are increasingly embracing these alternatives to the big social networks. They're straightforward and honest and do what they appear to do, rather than manipulating everything behind the scenes.
Still, I'd love to see at least one major social site embrace honesty and respect for users as a core principle. That would mean no lying to users, no doing experiments on them without their clear knowledge, and delivering by default all of the posts of the people they follow.
In other words, I'd love to see the founders of social sites write blog posts that brag: "We DON'T experiment on human beings."
Wouldn't that be nice?
Mike Elgan writes about technology and tech culture. You can contact Mike and learn more about him at http://Google.me/+MikeElgan. You can also see more articles by Mike Elgan on Computerworld.com.
This story, "In Search of a Social Site That Doesn't Lie" was originally published by Computerworld.