February 2010

Here's something I never thought I'd have to do, but a recent journal submission requires that tables be submitted in .doc format — despite the fact that they fire everything back into LaTeX when they're typesetting the final journal anyway! Well, I guess there's just no arguing, so I had to figure out how to convert a LaTeX table to Word format. It turned out to be surprisingly easy. Here's the simplest process I found:

  1. Get Latex2RTF, which is an open source converter for both Windows and Linux. Install it.
  2. The GUI front-end that came with the software didn't work, so I skipped directly to the command line. I navigated to the proper directory – for me that's C:\Program Files\latex2rtf. The basic command worked great for me: "latex2rt {path to table}\table1.tex" Remember that on Windows, if your directory names contain spaces, you may need to put the whole path to the file in quotes. Windows will add the quotes automatically if you use tab autocomplete when typing the path.
  3. Latex2RTF seems pretty complete, but there are still one or two LaTeX commands that it just won't handle. For example, I had to remove the @{} format from my table declaration, which was no big deal. Finding the commands that bonk is easy: just use the debugging option at level 4, like "latex2rt -d4 {path to table}\table1.tex"
  4. I got an almost perfect RTF out of this process, but make sure to check the table carefully. I found, for example, that it didn't convert my text daggers, so I had to add them back. After a minimum of fiddling with font sizes, row spacing, etc., save as .doc, CELEBRATE!
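To make step 3 concrete, here's a hypothetical before-and-after of the @{} fix — the column types and data below are made up for illustration, not from my actual table:

```latex
% Before: the @{} markers (which suppress padding at the table edges)
% are one of the constructs Latex2RTF won't handle.
\begin{tabular}{@{}lrr@{}}
\hline
Condition & Mean & SD \\
\hline
Control   & 4.2  & 1.1 \\
Treatment & 5.7  & 0.9 \\
\hline
\end{tabular}

% After: drop the @{} markers and the plain declaration converts fine.
% The only cost is a little extra horizontal padding at the edges,
% which you can tidy up in Word afterward anyway.
\begin{tabular}{lrr}
\hline
Condition & Mean & SD \\
\hline
Control   & 4.2  & 1.1 \\
Treatment & 5.7  & 0.9 \\
\hline
\end{tabular}
```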

It's not the statistics, it's what you say about them.

I'm not going to rehash the Google Buzz fiasco. But by this point we've learned a few interesting tidbits about how this disaster happened:

  1. Google rushed the product to launch, bypassing their normal testing process.
  2. They tested Buzz internally with their 20,000 employees, but no one sounded the privacy alarm, or perhaps no one sounded it loudly enough.

It's point #2 that's fascinating to me. When it came out, Buzz was so obviously broken to so many people, not just researchers and geeks, but many in the general public. How did 20,000 Google employees miss that? And what does that say about Google's internal culture?

I began to think about this more when I noticed a news story about how Google and other top Silicon Valley firms are claiming that the demographics of their workforces are trade secrets, refusing to release them. Really? Seems like kind of an obvious cover-up there. Google is an engineering culture, and engineers tend to be overwhelmingly white and male. And what does a white, male engineering culture get you? Buzz, apparently, and a ridiculous inattention to common sense privacy concerns.

I don't mean to bash on Google – they're far from the only company that's predominantly white, male, and engineering dominated. But until now I think Google and others have played that card as an asset. They're proud of the fact that they don't have any social scientists around. They think they don't need them. There are lots of computer scientists and engineers who are now creeping into social spaces, claiming they can use massive data and computing to solve the hard problems that social scientists haven't been able to solve. Well, I call BS (a thousand times BS!), and I use Buzz as Exhibit A. I'm no longer shocked that some computer scientists can be that naive and narrow-minded. But I still don't understand what's so hard about saying that we need each other. Smart at one thing != smart at everything.

So, do I think Google's internal culture will change? Not in the short term, and maybe not at all. Not unless they suddenly hire a slew of social scientists and put them in positions with real power over engineers and product direction. But I hope this Buzz experience could be the start of a slow realization that algorithms have no answers; they have no whys. They have stunningly small amounts of nuance and subtlety, which is where I'd argue real wisdom lies. And apparently they don't have much common sense either.

I've mentioned my borderline unhealthy interest in Team Fortress 2 before. I'm also interested in the genres of video that have sprung up around the game – frag videos, griefing, machinima. And now an inspired and hilarious cartoon that I think you'll appreciate even if you've never played the game, but especially if you have:

In case you haven't heard, Google's Buzz service is the latest privacy apocalypse – check out a nice short summary here, or the details here, here, or here. Now Google has responded by tweaking its service to address some but not all of the privacy concerns. And yet there are still some fairly horrifying implications of Google's move.

I'll let other more knowledgeable folks take a swing at the nature of the privacy debate. I think this whole debacle reveals a more fundamental flaw in the way web companies handle online privacy today: they treat it as a one-size-fits-all phenomenon. Google sat down to make decisions about how to share information for Buzz users. Undoubtedly they started with a list of outcomes that would be good for Google. Then they probably started to imagine the user, and they wanted to make it easy to manage the service. They saw themselves as simplifying what some think is a tedious process of finding friends, managing connections, sharing content. They thought their innovation would be to make all that happen auto-magically. And once they came up with a solution they liked, they shoved it down our collective throat. We revolted (vomited) at their presumption. So they made a few changes. But it's still pretty much like the Gap selling all its clothes in XXL.

I think the diversity of responses to Buzz and its privacy implications should encourage us to stop thinking of privacy as a unitary concept. Attitudes about privacy are personal and contextual. Some people will decide that Buzz is so brilliant, it shouldn't matter that there are some privacy hiccups. Some people are so used to transparently sharing their online lives that revealing all their contacts wouldn't make a difference to them. Others, of course, will have the opposite reaction and feel completely and utterly violated. I myself fall squarely in the middle. I won't be using Buzz, at least in the short term. And my primary reaction is to be angry at Google for having the gall to do this. They knew exactly what they were doing – this was not a privacy "accident" – but they decided it didn't matter. They decided to try and dictate the next privacy norm to us via their awesome power.

The single worst thing about the web right now is that it tries to squeeze all us irregular geometric shapes into the same round hole. There has been almost no effort to assess privacy attitudes and adapt to them. And I'm not talking about opt-in and opt-out, or the types of (seemingly but not really) fine-grained privacy and sharing choices that Facebook recently implemented. I think Google's impulse was probably right: it's a lot to ask of many users to manage all that themselves, especially as systems are so complex and the tendrils and traces of our content and behavior spread out across the web through APIs. But it wasn't right for everyone. In fact, it wasn't right for most people. The $10 billion question is: how can we tell the difference between users, and adapt the experience to what they want? The pace of innovation over the last 10 years has been accompanied by social norms that move so fast they can be easily pushed around by the behemoths of the web. I suspect that era is coming to a close, and companies like Google and Facebook will have to start responding to our attitudes about things like privacy, trust, and motivation rather than trying to dictate them to us.