Privacy | TechnoTaste


Usually I appreciate the commentary on TechCrunch. Even though it's often short-sighted and hyperbolic, I usually think they get the big-picture ideas right and hit on the stuff that we really should be debating. But Paul Carr's recent article, called Facebook Breached My Privacy, And Other Things That Whiny, Entitled Dipshits Say, is so stupid, and at the same time so indicative of the ways that many tech folks are stupid, that I just have to point it out.

Usually, I know, I'd lay down 750 words on it. No one ever accused me of brevity. But this is actually pretty simple. I'll encapsulate Carr's argument in a few sentences, then present my own.

Carr: People who complain about privacy on the web should shut up. They are deluded about what today's social systems are really like. They shouldn't put anything about themselves on the internet that might be a problem, and they should control all others who might do the same. "Blaming Facebook’s flaky approach to privacy for the ills of the exhibitionist generation is just yelling at the stable door, long after the horse has bolted."

Me: Carr sounds like an ignorant elitist jackass calling all the rest of us "whiny, entitled dipshits" just because we don't want to live by the lowest common denominator of privacy, whatever Facebook decides is best for its bottom line. It's ridiculous for a geeky, tech-savvy internet journalist who spends all his waking hours trying to understand online social systems to crap on people who do other things with their time by calling them whiny and entitled. Get a clue, buddy. People like you might code the web, but it's people like us who make it work. Learn to live by our rules, not the other way around. Expecting people to learn how to 100% control all the content they share online, and then do the same for everyone else around them is pure fantasy. If the horse has bolted, then lock the fucking stable door and we'll just hang with the chickens and the pigs.

The last week has been full of news about Facebook's new moves. Expanded product offerings, rampant privacy violations and the like. The big question is whether Facebook can get away with statements like this:

"People have really gotten comfortable not only sharing more information and different kinds, but more openly and with more people," Zuckerberg said at a technology awards show in January. "That social norm is just something that has evolved." (via The LA Times)

FALSE. Objectively. Some people are comfortable, but many, if not most, are not. The question is, can Facebook dictate that norm to the web by making business-first decisions now and worrying about the consequences later? Increasingly I believe the answer is yes.

I hear many people say that Facebook is destined to go the way of MySpace, superseded by the next big thing in social networking. But I don't believe that anymore. There were Lycos and AltaVista and the crew, and then Google came along, and people thought the next thing would be along soon. Even in the last few years, there were people who predicted that Bing, or Cuil, or Powerset, or Wolfram, or whatever would be the next big thing. But no one's stealing Google's market share on search (although Bing is doing ok…). Google has become a standard, and it will be very hard to shake.

Well, I think Facebook is moving towards that same position. Facebook's idea this past week has been to explode its walls. Facebook wants to be the social graph that powers the web. There will still be new, cool sites for users to get involved in, but why re-invent the wheel? Facebook will allow these sites to slice off a part of the Facebook graph for their users and populate it with their own content. All the while, of course, Facebook is keeping track, expanding its own graph, making a mint. Facebook knows things are going this way, and so this week they slapped down their trump card and said "just you try and stop us!"

We've seen a pretty big backlash in internet terms, but nothing strong enough to lead to anything but minor concessions on Facebook's part. The only things that will stop them at this point might be action from Congress or the courts. At least a few folks in Washington seem to be paying attention.

In the meantime, I think the kind of protest and resistance we're seeing is useful and necessary. I'll be interested to see if Facebook really takes notice. I'm guessing no. So for most of us, the real decision is whether to accept a public life with Facebook, or log off for good. As for me, I'm not thinking of logging off yet, but only because I always assume information about me is public and widely shared without my knowledge. I decided long ago not to put anything on Facebook (or anywhere else) that I wouldn't want to share with the world. So Facebook's defaults happen to match the way I'd like to manage my privacy. But it should do the same for others too, rather than forcing them into potentially dangerous and uncomfortable choices.

I'm not going to rehash the Google Buzz fiasco. But by this point we've learned a few interesting tidbits about how this disaster happened:

  1. Google rushed the product to launch, bypassing their normal testing process.
  2. They tested Buzz internally with their 20,000 employees, but no one sounded the privacy alarm, or perhaps no one sounded it loudly enough.

It's point #2 that's fascinating to me. When it came out, Buzz was so obviously broken to so many people, not just researchers and geeks, but many in the general public. How did 20,000 Google employees miss that? And what does that say about Google's internal culture?

I began to think about this more when I noticed a news story about how Google and other top Silicon Valley firms are claiming that the demographics of their workforces are trade secrets, refusing to release them. Really? Seems like kind of an obvious cover-up there. Google is an engineering culture, and engineers tend to be overwhelmingly white and male. And what does a white, male engineering culture get you? Buzz, apparently, and a ridiculous inattention to common sense privacy concerns.

I don't mean to bash on Google – they're far from the only company that's predominantly white, male, and engineering-dominated. But until now I think Google and others have played that card as an asset. They're proud of the fact that they don't have any social scientists around. They think they don't need them. There are lots of computer scientists and engineers who are now creeping into social spaces, claiming they can use massive data and computing to solve the hard problems that social scientists haven't been able to solve. Well, I call BS (a thousand times BS!), and I use Buzz as Exhibit A. I'm no longer shocked that some computer scientists can be that naive and narrow-minded. But I still don't understand what's so hard about saying that we need each other. Smart at one thing != smart at everything.

So, do I think Google's internal culture will change? Not in the short term, and maybe not at all. Not unless they suddenly hire a slew of social scientists and put them in positions with real power over engineers and product direction. But I hope this Buzz experience could be the start of a slow realization that algorithms have no answers, and they have no whys. They have stunningly small amounts of nuance and subtlety, which is where I'd argue real wisdom lies. And apparently they don't have much common sense either.

In case you haven't heard, Google's Buzz service is the latest privacy apocalypse – check out a nice short summary here, or the details here, here, or here. Now Google has responded by tweaking its service to address some but not all of the privacy concerns. And yet there are still some fairly horrifying implications of Google's move.

I'll let other, more knowledgeable folks take a swing at the nature of the privacy debate. I think this whole debacle reveals a more fundamental flaw in the way web companies handle online privacy today: they treat it as a one-size-fits-all phenomenon. Google sat down to make decisions about how to share information for Buzz users. Undoubtedly they started with a list of outcomes that would be good for Google. Then they probably started to imagine the user and wanted to make the service easy to manage. They saw themselves as simplifying what some think is a tedious process of finding friends, managing connections, and sharing content. They thought their innovation would be to make all that happen auto-magically. And once they came up with a solution they liked, they shoved it down our collective throat. We revolted (vomited) at their presumption. So they made a few changes. But it's still pretty much like the Gap selling all its clothes in XXL.

I think the diversity of responses to Buzz and its privacy implications should encourage us to stop thinking of privacy as a unitary concept. Attitudes about privacy are personal and contextual. Some people will decide that Buzz is so brilliant, it shouldn't matter that there are some privacy hiccups. Some people are so used to transparently sharing their online lives that revealing all their contacts wouldn't make a difference to them. Others, of course, will have the opposite reaction and feel completely and utterly violated. I myself fall squarely in the middle. I won't be using Buzz, at least in the short term. And my primary reaction is to be angry at Google for having the gall to do this. They knew exactly what they were doing – this was not a privacy "accident" – but they decided it didn't matter. They decided to try and dictate the next privacy norm to us via their awesome power.

The single worst thing about the web right now is that it tries to squeeze all us irregular geometric shapes into the same round hole. There has been almost no effort to assess privacy attitudes and adapt to them. And I'm not talking about opt-in and opt-out, or the types of (seemingly but not really) fine-grained privacy and sharing choices that Facebook recently implemented. I think Google's impulse was probably right: it's a lot to ask of many users to manage all that themselves, especially as systems are so complex and the tendrils and traces of our content and behavior spread out across the web through APIs. But it wasn't right for everyone. In fact, it wasn't right for most people. The $10 billion question is: how can we tell the difference between users, and adapt the experience to what they want? The pace of innovation over the last 10 years has been accompanied by social norms that move so fast they can be easily pushed around by the behemoths of the web. I suspect that era is coming to a close, and companies like Google and Facebook will have to start responding to our attitudes about things like privacy, trust, and motivation rather than trying to dictate them to us.

In what was the least surprising and most self-serving statement of the weekend, Facebook CEO Mark Zuckerberg has proclaimed that privacy is no longer a social norm in our world. That's right. Privacy… GONE. Over. We're all now happy to put the most intimate and minute details of our lives on the internet, and we won't think twice about it. Thank goodness we have CEOs like Zuckerberg to tell us about our social norms.

RIP Privacy.

Now back to reality. Privacy is not dead. Far from it. Privacy is a bigger issue than it has ever been.

So how should we read Zuckerberg's statement? On the one hand, we can default to the most general implication of what he's saying: notions of privacy are in flux. True. But that's always been true. Is the internet changing privacy more fundamentally than radio or television did? It's interesting to think… It could be that the pace of evolving norms has been accelerated of late. Or, we could remember that Zuckerberg is the mouthpiece for the internet's most prominent and, arguably, most egregious privacy violator. It is squarely in his company's interest to argue that the default reaction of the Facebook-going public is to share everything with everyone. It saves him the hassle of having to deal with the violations that are increasingly occurring.

So, what can we say about privacy in the age of Facebook and Twitter? First of all, I think we should resist the urge to make blanket pronouncements. There is certainly a group of young people who have grown up with Facebook in their lives. For these people, privacy, like "friend," means something different than it does for many other people. But that's far from a common view. Yesterday's NYTimes Week in Review has a nice article on this subject. For many, arguably most, folks, privacy is still very real. And it's something that many people hold important and nuanced attitudes about.

With Coye Cheshire and Elizabeth Churchill, I have been looking into these attitudes. We've been finding that discretion – the ability or desire to suss out the nature of a specific situation and act accordingly, rather than applying a blanket attitude – is key. I suspect that many people exercise a huge amount of discretion about their online information. They differentiate between contexts, audiences, and types of information. After all, why do we assume that the same privacy attitudes would apply to information about, say, our bank accounts, our present geographical location, and our breakfast?

I think privacy is going to be the banner issue of 2010 and beyond. But the banner isn't going to read "Privacy is Dead." The challenge for sites like Facebook is going to be to build socially smart tools that don't apply blanket rules about privacy. Facebook's new privacy rules are organized around functions on the site. But I don't want to decide who can read all my status updates. I want different people to have access depending on what I'm writing about, when I'm writing, where I am, etc.

Dealing with privacy effectively will mean first doing some tough research. What aspects of individuals, of contexts, and of interactions bear on specific privacy attitudes? We need to be thinking of privacy as a whole range of attitudes, not simply a single standard. Then we need to design easy-to-use technologies that can give people the privacy they want based on what they're doing and who they're doing it with.
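To make the idea concrete, here's a toy sketch of what "privacy as a range of attitudes" might look like in code, as opposed to a single blanket setting. All the names and categories here are hypothetical illustrations, not any real site's API: each rule matches a post's context (topic, location) and yields an audience, with a safe default when nothing matches.

```python
# Toy sketch (hypothetical names throughout): context-dependent privacy
# rules instead of one blanket setting per feature.
from dataclasses import dataclass, field


@dataclass
class Post:
    topic: str      # e.g. "work", "family", "politics"
    location: str   # e.g. "home", "office", "travel"


@dataclass
class Rule:
    audience: str                                   # who may see matching posts
    topics: set = field(default_factory=set)        # empty set = match any topic
    locations: set = field(default_factory=set)     # empty set = match any place

    def matches(self, post: Post) -> bool:
        return ((not self.topics or post.topic in self.topics) and
                (not self.locations or post.location in self.locations))


def audience_for(post: Post, rules: list, default: str = "only me") -> str:
    """Return the audience of the first matching rule, else a private default."""
    for rule in rules:
        if rule.matches(post):
            return rule.audience
    return default


rules = [
    Rule(audience="coworkers", topics={"work"}),
    Rule(audience="close friends", locations={"travel"}),
]

print(audience_for(Post("work", "office"), rules))      # coworkers
print(audience_for(Post("politics", "home"), rules))    # only me
```

The point of the sketch is the shape of the problem: the audience is a function of the post's context, not a property of the feature, and the default errs toward privacy rather than publicity.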

Facebook can't solve the privacy issue by wishing it away or declaring it gone. If Zuckerberg's comment is indicative of their stance, I'm seeing the chink in Facebook's armor. Some wily start-up is going to come along with a beautiful and flexible technology that will allow people to share the way they want to and they're going to eat Facebook's lunch.

Via Joe, I just read this NY Times article on Jen King and colleagues' new study on Americans' attitudes about privacy and behavioral targeting: Two-Thirds of Americans Object to Online Tracking.

The article brought up some thoughts I started knocking around last summer while working at Yahoo! with Elizabeth Churchill. Namely this contradiction, the form of which I borrow from Larry Downes' Law of Disruption:

Attitudes about privacy change incrementally, but revenue models for free online services change exponentially.

Result: what companies need to do to support free services clashes with privacy attitudes. Note that I'm not making any statement about the goodness or badness of behavioral targeting. My only point is that the expectation of free online services sometimes doesn't jibe that well with the fact that the companies that provide those services have to find a way to make money.

As concerns about privacy, control of personal data, and behavioral targeting escalate, I predict we're going to find at least some companies crying foul that consumers want to have their cake and eat it too. This could lead to a move towards subscription services that relieve companies from the burden of having to make money from targeted ads, market research, and the like.

Update: Here's the original study.


OMG, I'm the gajillionth blogger to write a post about Facebook's change to their terms of service. In a nutshell, they now reserve the right to keep and use your data even when you stop using Facebook and cancel your account. Eek! Fodder for privacy debate!

From my point of view, the interesting part is the metaphor we use to think about it. In Mark Zuckerberg's blog post defending the change, he says this:

When a person shares something like a message with a friend, two copies of that information are created—one in the person’s sent messages box and the other in their friend’s inbox. Even if the person deactivates their account, their friend still has a copy of that message. We think this is the right way for Facebook to work, and it is consistent with how other services like email work.

Touché, Mr. Zuckerberg. I see what you're getting at, but this is not the right metaphor. The real question is, when I send you a letter in the mail, does the postal service get to keep it and use it however they see fit? In the world of email, do the mail servers my message passes through get to log all the traffic and mine it for profit? Let's hope not.

But that metaphor doesn't quite work either, depending on how you see the issue of what services Facebook provides. So let's try a different one. If you make a bunch of copies of a photo of yourself – let's say you, wearing a funny hat, marker on your face, drunk as hell – and post them on public bulletin boards all over town, what rights do you have to your photo if you want to go back on the whole thing?

To Zuckerberg's credit, I think he rightly points out that there's nothing easy about all this, and the privacy issues are far from clear cut. I also think these little scuffles, which from my POV have few direct implications, are most important because they force us to reconcile these issues, which will only be more common as time goes on. What expectations is it reasonable for us to have when we put our data out into the cloud? What rights does a company that provides us with a free, valuable service have to the information we pass through it? Is Facebook more like a postal service or a bulletin board? The answer is likely to be both. Oh boy.