Archive for March, 2008

how we love meaningless ‘facts’ about security

March 31st, 2008

Consumers like simple answers. In fact, they insist on them. When I was shopping around for an espresso maker, I knew that I knew nothing about the subject. I also didn't care to learn about it just so I could pick out the right machine; it was unlikely to be a worthwhile investment. So when I went to the store and started glancing over all the different machines, and the salesman came up to me, all I really wanted to know was: which one is the best? This is the way consumers think. You can give them the whole rundown of specifications and they will still want you to tell them which one is best. The guy will dance around the issue a little, "well, it depends on what you want", but eventually he will converge on your viewpoint, because he knows what you want to hear. You want him to tell you which one to pick. It doesn't even matter if he tells you the truth; you just want an excuse so that you don't have to think about it. If it later turns out that he was lying, well, I guess I'll have to bite the bullet and do my own research next time.

This attitude demonstrates that we want simple answers to complex questions. Just get the answer and have your peace of mind; it doesn't matter how accurate it really is.

As technologists, one of our favorite issues is security. People get passionate about security, they have long discussions about it and they're keen on the latest in security developments - in the magazines, on the blogs, everywhere. But it's really just entertainment. They don't actually understand the issues or even want to learn about them, they just want the simple answer.

Ironically, security is particularly badly suited to such a black-and-white view of reality, as it is one of the most complicated aspects of our technology. Nevertheless, you will often see stories like this, about cracking 3 laptops running OS X, Vista and Linux respectively. Apparently, the Mac was popped first. Now, the reporter of this story will not declare that this test makes Linux the safest platform; such a conclusion would be completely unfounded. But not saying it is actually not very far from saying it, because if that wasn't the point of the exercise, then what was?

The fact is that the issue of security is much more complicated than most people want to deal with. They just want a smiling salesman to pat them on the back for making the right choice. Consumer IT security is the car sales of the industry.

The other thing is that security is a very delicate issue in and of itself. It isn't about the general quality of a system; it's about finding the one weakness in an otherwise perfect system, and that can be enough to compromise the whole thing. This makes it distinct from many other facets of computer systems, where 95% is a great score (on, say, ease of use). Security is about 100% coverage, non-negotiable. The way to achieve that is to run as few applications as possible, allow as little incoming communication as possible, and keep a close watch on everything on a day-to-day basis. Which is exactly what desktop users want out of their systems, right?

Ultimately, the stakes aren't high enough to have secure desktops. There is actually software out there that literally does not break, does not crash, never misbehaves at all. The first place to look for something like that would be NASA, where software bugs have enormous financial consequences. Companies also have much better security than you and I do. Companies are conservative; they will stick with a system for 10 years if it runs reliably, no matter how ugly or annoying it may be. But then again, they are liable for losing/leaking/corrupting lots of important data other than their own, so they like to be careful about it. We don't have that burden. The worst we can do is lose our own data, which typically means copying it back from a USB drive or something, no big deal.

People get riled up about desktop security when there is no desktop security. All you have is pockets of time where no exploits are found, but then the next one comes along. Servers are a lot more secure, because they obey strict guidelines on when to upgrade software and under what conditions. Tinkering is kept to an absolute minimum. Servers also have strict policies on what types of access they allow, and to whom. This is why servers get compromised far less often than desktops.

The report mentioned above indicates that the Mac was hacked immediately, which probably means there's a glaring exploit out there right now. The Vista box then survived 3 more days before it was brought down. Run that test next month and the results are likely to be completely different. Treat these tests statistically and all you get is a bunch of exploits moving around, such that at any given moment there are a number of exploits available on every platform. If hacking a Linux box is 10 times harder than hacking a Windows machine, that doesn't really do much for us, as the past decade has shown that compromising Windows machines is a no-brainer for a motivated person. It means that instead of 2 days you'll survive for 2 weeks, little comfort. Now if it were 100 or 1000 times more difficult, that could actually make you feel better, but the difference is unlikely to be that big.

The desktop is practically the most insecure platform in use today. You can run anything you want on it and change your entire software stack every day if you want to. Who's gonna stop you? From a security point of view, this is completely untenable. You cannot give people the freedom to run whatever they want while at the same time enforcing a strict security policy; those two are mutually exclusive. If you want better security, you'll have to put up with more pain, and you'll have less freedom. The quoted article says that the Vista box was compromised through a bug in the flash plugin. Now can you really blame Microsoft for bad code in Adobe's godawful flash plugin? Flash is the perpetrator here, not Vista. Okay, so they could be stricter about what they allow and what they don't. But that would either partially break the flash functionality (or other software) or render it entirely incompatible. Would you prefer that? Of course not.

The truth of the matter is that technology advances at a fast pace, and we love to be part of that. We will happily run beta code as long as it's easy to install (Firefox) and doesn't burn us too much. But you can't combine the incessant thirst for the newest software with any kind of reasonable security model. It's easy enough to tell the server: only these 5 applications are allowed to run here. But you can't do that on a desktop, because the user might want to do anything and everything. There is no security, because there isn't nearly enough security auditing happening. qmail hasn't had a verified security hole in over a decade, but then it's hardly received any updates in years either. Would you be satisfied running Firefox 0.8 today? I doubt that. But that sort of long-term and meticulous verification is what it takes to examine a piece of software in detail and make absolutely sure that it doesn't have any issues.

Granted, there are very different attitudes toward security in various places. Unix was designed as a multi-user system, and therefore security was one of the guiding principles. No user should be able to mess with another user's data, and no user should be able to bring down the system. Microsoft never designed their stuff for security (apparently it didn't seem to matter) and suffered a nasty backlash when they found out, to their astonishment, that people did care about the issue. In recent years, they have tried to take the matter seriously, but it seems like they're still perplexed by it.

Now that the media seems poised to regularly stack up Windows, OS X and Linux on "security", this topic isn't going away. There will be many more "head to head" tests, which based on their criteria (sometimes more sound, sometimes less) will indicate a so-called winner, in a contest that ultimately doesn't measure anything useful. If the test were to pit the kernels of each operating system against each other, that could be insightful. But who apart from kernel developers cares about the kernel? Kernel security bugs are a class of bugs to be taken seriously. But far more security holes are uncovered every day in common applications that we actually want to use, applications that get nothing like the scrutiny of a kernel.

Our systems are insecure (some more, some less, but ultimately they all have weak points) because we care a lot more about new software than about security. A system that gets cracked in a week is not a secure system. One that can withstand years of attacks, now that's more like it. Our desktops are in the former category.

God is not great

March 30th, 2008

Christopher Hitchens is not just an author, he's also a journalist and speaker. His style is notable for its intellectual depth and eloquence. There is an interesting debate on Google Video among four critics of religion in which he takes part.

God is not Great: How Religion Poisons Everything is one of the best books I've read about religion. It is also a rather deep book. Unlike Sam Harris, who apart from presenting examples also includes a lot of his own reasoning, Hitchens tries to ground his arguments by finding quotes and references for everything. This may make the argument more convincing, but it also makes it a bit hard to follow amid the flurry of examples he draws upon.

Hitchens makes many points in his book, but one of his central and probably most interesting arguments is that religion is man-made. And from there he finds it not excessively difficult to explain the many atrocities associated with religious groups over the centuries.

gdm sloppiness

March 28th, 2008

Today's example is sponsored by gdm. Say you have a certain session (gnome, kde, fluxbox, whatever) and you're experimenting with another one that isn't working quite smoothly yet. Then you'll be stuck going back and forth a few times, and you'll probably see this dialog:

gdm1.png

The Ubuntu gdm theme is nice and clean and it's easy to figure out how to change the session. This dialog does the job without much ado. But then you find this:

gdm2.png

After you've changed the session, assumed the change succeeded, stopped thinking about it, and moved on to start the session by logging in, you get this idiotic dialog.

This is horrifying in several ways. First of all, the gdm login screen is completely free of dialogs, so there is no hint that you should expect a popup. Secondly, once you've set everything using the secondary controls at the bottom of the screen, you just want to log in and be on your way. When I'm in that mode, I've basically learned to hit Enter as many times as it takes to get me through, so I'm very likely to accept the dialog by accident, since I don't know it's coming.

And finally, the question of whether to make the session the default is completely cut off from the menu for changing the session, which shows a complete lack of consistency. I'm done doing something, and later on I have to answer an unexpected question about something I already finished.

Not to mention that the "unsafe" choice is selected by default: I might accidentally change my default session just by hitting Enter twice after typing in my password.

Worst of all, even when I know that the popup is coming, I absolutely do not want to have to answer it again and again just because someone couldn't figure out a better place to put that option. Make it a checkbox on the previous dialog, that's what everyone else does, why must you be so special?

I'll be nice and I'll just call this sloppiness.

EDIT: Bug filed.

UPDATE: Bug fixed in gdm 2.21.

buh-rilliant!!!

March 27th, 2008

Remember how Ubuntu came out of nowhere and, just like that, made everyone else feel like they were lagging behind? That's really what makes Ubuntu stand out: there is a real understanding of user needs in their leadership. Over the past couple of years they have empowered so many people who were interested in Linux but just didn't know how to get started or how to fix common annoyances (like the lack of media codecs, say). And that policy hasn't gone unnoticed; I certainly feel like Fedora, for example, is doing a much better job of embracing a wide audience than it did before Ubuntu came along.

The latest in Ubuntu reiterates their ability to empower their users. If you're an Ubuntu user you probably know about something called 'Personal Package Archives' (ppa). It is currently the designated method of installing kde4, until it goes mainline. Well, guess what: Launchpad now offers a ppa to every user! How's them apples.

This means you now get your own little apt repository to use, and you can offer your packages through the same mechanism as any officially supported package, without resorting to loose .debs and custom "here's how you install it" instructions. Fabulous!

Here's my shiny new PPA:
https://launchpad.net/~numerodix/+archive

For the time being I'll only be keeping undvd packages there.
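
Under the hood, a ppa is just an ordinary apt repository hosted on Launchpad. As a rough sketch (the exact URL layout and the 'hardy' release name are my assumptions, so check the PPA page itself for the real lines), the sources.list entries look something like this:

```
# /etc/apt/sources.list - hypothetical entries for this ppa
deb http://ppa.launchpad.net/numerodix/ubuntu hardy main
deb-src http://ppa.launchpad.net/numerodix/ubuntu hardy main
```

After an apt-get update, anything published in the ppa installs and upgrades like any other package.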

Unfortunately, debian packaging is something of a cult and not the easiest thing to get involved with. They are fanatical about following the guidelines to a T, and therefore wrapping up a .deb takes considerably more time than writing an .ebuild or building an .rpm. I appreciate the care that goes into it, but I wish they would find a more efficient mechanism for it. The debian/ directory should be more of an abstraction; having to go and hand-edit the files in there is silly.
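
To give an idea of what that hand-editing involves, a minimal debian/ directory typically contains at least something like the following (roughly, from memory; debhelper conventions vary):

```
debian/
  changelog    # release history in a rigid, machine-parsed format
  control      # package metadata, dependencies, descriptions
  copyright    # licensing details for the packaged files
  rules        # an executable makefile that drives the build
  compat       # the debhelper compatibility level
```

Every one of these gets checked against policy by tools like lintian, which is where a lot of the time goes.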

ui is all about ideas

March 23rd, 2008

User interfaces take a whole lot of effort to get right, and that's the main reason I'm not particularly inclined to write gui apps. There are so many examples of bad user interfaces that I could spend my life writing about nothing else. Ui is hard mostly because what seems correct to one person gets forced on everyone else. It's also because good ui takes good ideas, and those are not as common as you might think. A lot of the bad ui we have to put up with comes from one group of people copying not so much the ideas as the results of another group, without understanding them equally well or realizing them the same way. There's a lot of ranting in ui circles about how we're all using decade-old paradigms in user interfaces, but where is the rich vein of fresh supplies?

Well, sometimes ideas do actually surface. Jensen Harris talks about the process of redesigning the gui for MS Office. It's an insightful talk that approaches the problem of the well-known gui (one that has caused us all a lot of pain) with the appropriate humility towards the frustrated user. It also shows off some of the improvements (which are major!) that make the 2007-series gui a lot more intuitive, by making commands whose names we know appear visually and by offering dynamic previews. Obviously, these ideas are specific to the domain of formatting and don't carry over to just any application, which happens to be exactly why they represent a big step forward: they enrich the experience in the domain they were made for.

As you might expect, I'm not particularly interested in Office or in products coming out of Microsoft in general, as they do not address my needs. But I sure do wish projects like OpenOffice (which, again, we're stuck with) would stop trying to clone the bad gui of old Microsoft products and do a little brainstorming of their own. Other examples that live on in infamy include the gimp (how many thousands of dialogs have you clicked through today?), vlc (great technical performance, horrendous gui) and gvim (the gui offers next to nothing over the keyboard ui).