The Wall Street Journal did a pretty serious investigation into the cookies and other bits and pieces that websites put on our computers as we surf our way around the internet. From reading that article I learnt that the process of selling ad space based on our profile is an extremely quick one and can happen even as the page we are viewing is loading. I also learnt that the companies building our profiles can offer incredibly precise segmentations, but that they do not actually know who we are.
Here, though, is the rub. Let’s say they don’t know our names, but I’m pretty sure that they could work out who I am based on the sports, technology and health pages I look at. The sites I look at to check movie times would give you a pretty good idea of where in the world I live. On the other hand, all of the data processing is done automatically – no one at these companies is actually trying to work out who anyone is based on their profiles. Frankly, it would be a waste of their time.
But what if the data got out? When AOL released hundreds of thousands of searches in what it thought was a generous gesture to the research community, zealous investigators managed to track down individual users. This suggests that the main privacy issue is one of data protection. Unlike with a bank, though, whose security measures you can at least assess, we don't really have a clue what is going on inside these companies. We know that they will have to pay hefty fines for data breaches, which, in the US at least, they are legally obliged to report, and that this should give them good reason to secure their (our?) data – but perhaps by then the damage will already be done.
An article in Friday’s New York Times discusses a new start-up, Bynamite. What the people at Bynamite want is for us to take back control over the information that advertising networks have about us. As they say on their home page:
You should always be in control over what advertisers know about you – you should be able to see it, change it, and delete it. If they won’t give you control, they shouldn’t use your information.
So they tell you what the ad networks know about you, giving you the chance to change that. This all seems very in keeping with recent developments surrounding privacy – being able to know what others know about you and correct errors in that information.
But I’m wondering about the direction of this. One of the features of online advertising is that it commodifies our online behavior – the links we click, the searches we do – and turns it into commercially valuable information. What Bynamite is doing is saying that this is an irreversible process, so you might as well have some input into it. Some input may be better than none, of course, but there is a sense here in which this new start-up is encouraging us to be active players in our own commodification and to help advertisers target us even more accurately.
The truth is, then, that Bynamite is not a company that has anything to do with privacy, except in the rather loose sense of controlling the information that marketers possess about us. The benefit they offer us is that we might see fewer “irrelevant” adverts. Excuse me if I’m underwhelmed…
Update: One of the Bynamite founders took the time to comment on this post. He points out that:
Bynamite opts you out of ad networks that *don’t* give you enough transparency and control. If they won’t show you what they know about you, and give you the power to change their profile, then they can’t use your information. That give-and-take is built into the product, so that if we are successful, it should mean an overall increase in consumer power over the ad industry.
I’m doing some research into privacy and technology. I’ve probably mentioned that already. Yesterday, as part of my research, I interviewed a man who has been involved in the internet in Israel since 1994. We had a very interesting chat over a lovely cup of coffee in a very beautiful corner of Tel Aviv (Cafe Ben Ami, if you were wondering).
Our conversation was very interesting, but then it occurred to me that we hadn’t really explicitly defined which aspects of privacy we were talking about. We were mostly talking about how people put more and more stuff online, more and more of which is publicly accessible by other people. In other words, we were talking about privacy in terms of the stuff other people know about you.
This is no doubt interesting, but I’m not sure it’s the main point at all. I think that of more interest is what machines know about you, and how this enables them to target you with certain adverts. For instance, Google’s search logs have been called a “database of intentions”. Put very simply, your past behavior (and your searches are in some ways a proxy for your behavior) might predict your future behavior. If, every Friday, you search for a nice place to go out for dinner that night, how complicated would it be to give you an advert on Friday morning for a restaurant?
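To see how little machinery that kind of targeting would actually require, here is a toy sketch in Python. The search log, the queries, and the keyword list are all made up for illustration – no real data or real ad-network API is involved – but the logic is the whole trick: count how often dining-related queries fall on each weekday, and flag the weekday where the pattern repeats.

```python
from collections import Counter
from datetime import datetime

# Hypothetical search log for one user: (timestamp, query) pairs.
# All entries here are invented for illustration.
log = [
    ("2010-07-02 19:05", "restaurants tel aviv"),
    ("2010-07-09 18:40", "good sushi near me"),
    ("2010-07-16 19:20", "restaurant reviews"),
    ("2010-07-12 10:00", "python tutorial"),
]

# A crude, hand-picked vocabulary standing in for a real classifier.
DINING_TERMS = {"restaurant", "restaurants", "sushi", "dinner"}

def dining_searches_by_weekday(log):
    """Count how often dining-related queries land on each weekday."""
    counts = Counter()
    for stamp, query in log:
        if set(query.lower().split()) & DINING_TERMS:
            day = datetime.strptime(stamp, "%Y-%m-%d %H:%M").strftime("%A")
            counts[day] += 1
    return counts

def should_target(log, weekday, threshold=2):
    """Targeting rule: show a restaurant ad on a weekday where the
    user has repeatedly searched for dining in the past."""
    return dining_searches_by_weekday(log)[weekday] >= threshold

print(dining_searches_by_weekday(log))   # Counter({'Friday': 3})
print(should_target(log, "Friday"))      # True
print(should_target(log, "Monday"))      # False
```

A dozen lines of counting, in other words, is enough to turn a "database of intentions" into a Friday-morning restaurant advert – which is exactly why the interesting question is what the machines infer, not what other people can see.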
This is why Elliot Schrage, vice president for public policy at Facebook, gets it completely wrong in his really quite awkward questions and answers piece in the New York Times. As far as he is concerned, there’s no problem with Facebook sharing your data with other companies because they never share your name or other personally identifiable information. (The issue of de-anonymization is one for another post, so I’ll just put that aside for now.) The point, of course, isn’t that they have my name. The concern isn’t what other people know about me (or at least, that’s not the only concern). The concern is about how knowledge of my past behavior and interests might enable a commercial entity to have too much influence over my future behavior and consumption decisions.