Deepsec 2011: Are Companies “Evil” When it Comes to Privacy?

Last month I attended the In-Depth Security Conference (better known as Deepsec) 2011.  This was my first security conference (outside of Microsoft’s Bluehat), so I’m not sure exactly how to characterize it compared to the better known and larger Black Hat and RSA conferences.  I had a few favorite sessions, such as Duncan Campbell’s keynote on terrorists’ use of encryption (bottom line: contrary to the fear mongering we’ve been hearing since the Cold War, they aren’t using encryption very much at all), and the sheer fun of Kizz Myanthia’s pursuit of toys using technology from the James Bond films.  But perhaps the session that made the biggest impression on me was Christopher Soghoian’s session titled “Why the software we use is designed to violate our privacy”.

As it turns out Chris and I almost completely agree on what companies should be doing.  But I found his explanations of why they don’t rather one-sided, and his characterizations of them as “evil” mostly off base.   Chris looks at these issues from the standpoint of a researcher and advocate without apparent benefit of having had to deliver and support a high-volume software product or service.  As a result he tends to put the blame for companies’ actions on lawyers and bean counters (i.e., “the suits”) while ignoring the engineering reasons for their actions.

Last January I wrote a blog post on the topic of full-drive encryption that, like Chris, called for full-drive encryption to be the default.  However, I drilled into the practical difficulties of delivering this to consumers.  In his talk Chris not only ignored those aspects, but when I challenged him it was clear he didn’t have a full grasp of them.  I only worked specifically in the security arena at Microsoft for 18 months, but if I’d thought bringing full-drive encryption to consumers was easy I would have pushed like hell to make it happen.  Instead my investigation of the topic led me to believe more ducks needed to be lined up.  See my earlier post on the topic for why this is so hard, but to put a fine point on it: we are still struggling to make this foolproof for enterprises, and we need a lot more evolution across the entire ecosystem for it to become practical for the average consumer.  BTW, another interesting development in this area is the failure to date of self-encrypting hard drives to catch on.  In other words, this is a complex area where the parties’ motives are good but the path to nirvana is a maze of twisty little passages.

Another part of Chris’ talk focused on Microsoft’s decision to drop the “InPrivate Filtering” feature in Internet Explorer 8 (IE8).  Chris focused on the Wall Street Journal’s coverage of Microsoft’s internal debate, which described the conflict inherent in Microsoft being in the advertising business and how that caused InPrivate Filtering to effectively be removed from IE8.  Chris called what happened “evil”, again a characterization I disagree with.  The technology that Dean (Hachamovitch) and company came up with was both delightfully and problematically simple.  They counted the number of different websites that referenced a third-party cookie, and if it exceeded a certain number (10 by default, if I recall correctly) they assumed it was a tracking cookie and blocked further access.  That had two bad side-effects.  The first is that in a growing world of mashups, web sites would suddenly stop working correctly because popular services they shared would have their cookies blocked even though they weren’t engaged in tracking.  The second is that there was no way to discriminate between tracking cookies that the user might want (e.g., something they explicitly opted in to) and those they definitely would not want (e.g., a tracker that had weak privacy protections).
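To make the mashup problem concrete, here is a minimal sketch of a frequency-based heuristic like the one described above.  The names and structure are my own invention for illustration; the actual IE8 implementation was never published:

```python
from collections import defaultdict

# Hypothetical sketch of IE8-style frequency-based filtering:
# count how many distinct first-party sites reference each
# third-party domain, and block once a threshold is crossed.

THRESHOLD = 10  # reportedly IE8's default


class TrackingFilter:
    def __init__(self, threshold=THRESHOLD):
        self.threshold = threshold
        # third-party domain -> set of first-party sites that referenced it
        self.seen_on = defaultdict(set)

    def observe(self, first_party, third_party):
        """Record that `first_party` embedded content from `third_party`."""
        if first_party != third_party:
            self.seen_on[third_party].add(first_party)

    def is_blocked(self, third_party):
        """Block once enough distinct sites reference the third party."""
        return len(self.seen_on[third_party]) >= self.threshold
```

Note that the heuristic cannot tell a tracker from a widely shared, perfectly benign service (a CDN, a map widget, a login provider): both show up on many sites, so both cross the threshold and get blocked, which is exactly the mashup breakage described above.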

Looking at how InPrivate Filtering was “removed” tells you a lot about the state of the discussions inside Microsoft.  They obviously occurred shortly before IE8 was to ship, because Microsoft didn’t really remove InPrivate Filtering; all they did was remove the check box for persisting it across browsing sessions.  This was the absolute smallest change they could make so as to avoid potentially breaking IE8 and delaying its shipment.  And, in fact, anyone with a little knowledge (or the ability to do a search with Bing or Google) could find and modify the registry entry to turn on InPrivate Filtering all the time.  I did this and ran that way on my own PCs until IE9 became available.  What I found is that the feature broke so many websites that I refused to turn it on for other members of my family.  Putting myself in the place of the Microsoft executives debating whether to ship the full InPrivate Filtering or not, taking into account both the input from the advertising community and the evidence that the feature had usability issues, and being so late in the shipment cycle for IE8, I would have pulled InPrivate Filtering from the release and made it a goal to design a better solution for the next release.  And that’s exactly what Dean (Hachamovitch) and company did in IE9.  Buried in IE9’s Tracking Protection List (TPL) feature is the old InPrivate Filtering (in all its glory).  It is now the “Your Personalized List” feature within TPL.  However, you are far better off, and will have a much better browser experience, selecting a list developed by third parties that apply some intelligence and criteria to the selection process.  You get to choose which list, from those that block aggressively to others that allow tracking by sites that conform to a set of industry privacy principles.  The latter addresses the objections the advertising industry raised about the IE8 feature, but it is your choice.
The only thing left for Microsoft to do is enable TPL by default, something they’ll likely do once there is industry momentum behind a particular list.
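The design shift from IE8 to IE9 can be sketched in a few lines: instead of a statistical guess, a curated list of block patterns decides, with explicit allow entries taking precedence.  This is an illustration of the idea only, not the actual TPL file format, and the pattern matching is deliberately simplified:

```python
# Hedged sketch of list-based tracking protection: human-curated
# block patterns replace the IE8 frequency heuristic, and allow
# entries (e.g., for services the user opted in to) win over blocks.

def build_filter(block_patterns, allow_patterns):
    """Return a predicate deciding whether a request URL is blocked."""
    def should_block(url):
        if any(p in url for p in allow_patterns):
            return False  # allow rules take precedence
        return any(p in url for p in block_patterns)
    return should_block
```

Because a human curator decides what goes on the list, a widely shared mashup service no longer gets blocked just for being popular, and an opted-in tracker can be explicitly allowed, addressing both IE8 side-effects.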

So I don’t agree with Chris’ characterization of what happened around IE8 InPrivate Filtering as “evil”.  They clearly would have been better off had the debate occurred before introducing the feature into beta.  And I really wish they had been more forthcoming in explaining why the feature was removed and what they were going to do about it.  I think that is really Chris’ objection to what happened.  Had Microsoft come out and clearly said “InPrivate Filtering needs to be redesigned to address concerns raised by the advertising industry, including our internal people, as well as usability concerns discovered during beta”, he would have been disappointed but understanding.  This is one area where I agree with Chris on the topic of lawyers and other spin doctors.  Companies have let the lawyers et al. control how candid they are in public communications, usually to their detriment.

A last area is that of turning on SSL by default for website access, something I addressed last April.  Here Chris and I are in almost 100% agreement.  So far Gmail remains the only major site to have done this.  Facebook, Twitter, and Hotmail now support always using SSL, but they don’t do so by default, and Hotmail and Facebook hide the feature rather deeply in their Settings where typical users will have trouble finding it.  Again I understand some of the usability, engineering, and cost reasons for not having gone straight to turning on SSL by default as Google did for Gmail.  Cost-wise, of course, it requires many more web servers to handle the same number of users with SSL (https) as without (http).  And it takes time to roll out all those new servers.  But there are other considerations.  For example, in the case of Hotmail, turning on SSL for the website broke access by Outlook, Windows Live Mail, Outlook Express, etc.  Microsoft has to wait for a high enough percentage of users to upgrade to newer versions that work with SSL turned on before it would want to turn on SSL by default.  But it is important that having SSL turned on by default become the norm ASAP.  Earlier this week I noticed that my cousin’s ex-wife’s Facebook account had been compromised.  She kept changing her password but the account was soon compromised again.  I discovered she was using public WiFi hotspots for most of her access and knew nothing about the SSL feature in Facebook.  She also didn’t know what a strong password was.  I told her how to turn on SSL in Facebook and explained strong passwords.  Then I asked her what email provider she uses, and it is Yahoo.  Sigh, the only one of the big 3 that doesn’t offer SSL at all (except for login).  I’m afraid her privacy will continually be at risk because there is no practical way to get her, or most other consumers, into a mode of communicating securely all (or even most) of the time.
Unlike the other two areas this is one where control of the situation is nearly completely in the hands of the web sites and doesn’t require changes to the rest of the ecosystem or major changes in user behavior.  Every web site should have SSL on either by default, or with a very easy to find (“in your face”) way to enable it if there are mitigating circumstances (like Hotmail’s Outlook issue).  So I agree with Chris that this is an area where the industry’s behavior is just inexcusable.
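What “SSL on by default” means server-side is simple enough to sketch.  This is a framework-agnostic illustration in Python (the function name and shape are my own, not any particular site’s code): redirect plain-HTTP requests to their HTTPS equivalents, and on secure connections send the Strict-Transport-Security header so the browser keeps using HTTPS on its own:

```python
# Minimal sketch of server-side HTTPS enforcement (hypothetical
# handler shape, not a real framework API): upgrade insecure
# requests via redirect, and pin secure ones with HSTS.

def enforce_https(scheme, host, path):
    """Return an HTTP (status, headers) pair forcing the client onto HTTPS."""
    if scheme != "https":
        # Permanent redirect to the same resource over HTTPS.
        return 301, {"Location": f"https://{host}{path}"}
    # Already secure: tell the browser to stay on HTTPS for a year.
    return 200, {"Strict-Transport-Security": "max-age=31536000"}
```

The operational cost is in the TLS handshakes and certificate management behind this, not in the redirect logic itself, which is why the server-capacity argument above, not engineering difficulty, is the usual excuse.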

Finally I want to address Chris’ belief that the lawyers are in charge.  Sorry, no.  But they do have a lot more influence than they used to.  You can blame governments.  When I joined Microsoft in the mid-90s it was rather hard to find a lawyer.  Basically if you were doing a contract, filing a patent, buying a company, or doing some other legal process you went and contacted them.  But lawyers didn’t come to product design meetings, marketing meetings, staff meetings, etc.  After the U.S. DoJ filed its anti-trust suit against Microsoft you almost couldn’t have a meeting without a lawyer present (and how much do you want to bet that the increased scrutiny of Google is leading to the same thing).  It’s more balanced at Microsoft these days, but lawyers have far more visibility and input than they used to.  Still, the General Manager, CVP, or more senior executives get to make the actual decisions.  The lawyers are advisors making sure you understand the risks of your decisions, not actual decision makers.  And rarely is their advice black and white.  If they are good and you are smart then you pay a lot of attention to their input, but in the end one can’t abdicate decisions to them.  A company that does is doomed.  I think what Chris is seeing is that companies are now tending to hide behind lawyers more in public.  How much time does an executive have to spend working with governments, regulators, etc.?  I will talk, candidly, to customers and analysts until I’m blue in the face.  I’ll be a little more guarded with the press, but I’m happy to talk to them at appropriate times.  But talking to government bodies holds little interest for me (or most other technology executives).  You need specialists to do that job, and one of those specialties is Attorney at Law.

Chris and I agree on what we want to see happen, and I even agree with him that companies often live in complex environments with conflicting pressures (though I attribute more of these to practical considerations while he attributes them to “the suits”).  Where we mostly differ is on the hyperbole.



1 Response to Deepsec 2011: Are Companies “Evil” When it Comes to Privacy?

  1. Bob says:

    I think part of the problem is that not everyone has the same definition of ‘evil’. For example, when you go to your pharmacy’s web site and sign up to be able to order prescription refills online, they default to checked the checkbox that says, “Send me info on specials…”. Is this evil because they defaulted the checkbox to checked? What about the ISP that logs every site you visit and sells that information to marketing companies without your knowledge or approval? I suspect many people would say ‘No’ to the first and ‘Yes’ to the second example.
