Archive for the ‘reviews’ Category

latex sucks! (for presentations)

November 28th, 2007

Hold your fire.

Got your attention there, didn't I? :D

Latex is good at what it was built to do - marking up big documents. It's fine for writing reports, articles, books. The problem is that we have no other tools nearly as good, so we end up using latex for things it wasn't meant for. Like presentation slides.

Why is this? Well, let's see. There's Powerpoint. It's a toy. It's one step up from drawing each slide separately in Paint: it has no structure, it makes you duplicate all common elements, there's no indexing, no outline. What does it offer you? Oh yeah, a spell checker. It sucks. Then there's latex. You reuse everything you already know about it, it has a ton of helpful packages, and with beamer you can actually write slides without hurting too much. Except... it's still latex.

Here is the problem. Presentation slides are not the same as prose. Everything latex is good at has to do with handling text. Everything presentations are supposed to be is not centered on text. Giving a presentation is just as much about showing as it is about telling. And you're already telling when you're talking, so you should compose your slides so that you can show people what you mean.

We have horrid tool support for writing presentations. No wonder almost every talk you see is frame after frame of bullet points that no one will remember anyway. People actually paste parts of their text from articles into slides, as if that is supposed to produce a good visual device! :googly:

I have given 8 presentations in the last 12 months, and I'm set to do 4 more by the end of the year. Almost all of them written in latex with beamer. And you know what? I didn't enjoy myself. On the surface it looks promising. The beamer class has a ton of features, you use regular latex, and you output a pdf. But when you get down to the details...

The ratio of illustrations, diagrams, source code and other example materials is much higher than it is in prose. It's supposed to be that way. This is what latex is awkward at. Presentations are visual, so the first thing people think about is the slide design. Well, guess what: if you have the perfect design already, good for you, but if you don't, you're stuck editing latex styles, which no one understands. Strike one.

Then there is importing images into your slides. Raster images work fairly well, but obviously you have the resize problem - a raster image displayed at a non-native resolution will look jagged. You don't want that. So you hack up a vector image in something like dia. But you can't use that directly in latex, you have to convert it to either eps or pdf. I have found the latter the least painful of late. So now not only do you have to know latex and dia, you have to author Makefiles to get a sane build process out of this. That is, if you can remember to include just the right latex packages in your document, and make sure they are installed both locally and in any other place where you may want to do a last-minute recompile. Strike two.
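For what it's worth, the build glue usually ends up looking something like this. A hypothetical sketch with made-up filenames, assuming dia, epstopdf and pdflatex are on the PATH - your figure names and directory layout will differ:

```makefile
# Hypothetical sketch: turn every .dia figure into a .pdf that
# \includegraphics can use, then rebuild the slides.
FIGS := $(patsubst %.dia,%.pdf,$(wildcard figures/*.dia))

slides.pdf: slides.tex $(FIGS)
	pdflatex slides.tex

# dia exports to eps from the command line...
figures/%.eps: figures/%.dia
	dia -e $@ -t eps $<

# ...and epstopdf gets us the pdf that pdflatex wants.
figures/%.pdf: figures/%.eps
	epstopdf --outfile=$@ $<

clean:
	rm -f figures/*.eps figures/*.pdf slides.pdf
```

And that's one more file to carry around with every talk.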

Speaking of Makefiles, everyone knows you have to recompile a latex document to see the changes you've made. But with presentations these changes are much more often concerned with layout than with content. And it's a huge pain to recompile again and again to debug something as simple as a two-column layout (use the minipage environment and set the widths in absolute units just right so the boxes fit next to each other). So now I have a compile loop running every second that invokes make, and I see the updates in kpdf. All because I want to tweak the layout of my content. Speaking of compiling, have you seen those latex error messages lately? They're not meant for humans. And as far as languages go, latex isn't exactly the most... robust of them all. Make one tiny mistake and your compile detonates. Strike three.
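For reference, the kind of two-column frame I mean looks roughly like this in beamer. A sketch with made-up content - and to be fair, beamer also ships a columns environment that can save you from raw minipages:

```latex
% Sketch: a two-column beamer frame with minipages.
% The widths must sum to less than \textwidth or the boxes wrap.
\begin{frame}{Two columns, the hard way}
  \begin{minipage}[t]{0.48\textwidth}
    \begin{itemize}
      \item first point
      \item second point
    \end{itemize}
  \end{minipage}% the comment sign eats the newline, or the boxes won't sit side by side
  \hfill
  \begin{minipage}[t]{0.48\textwidth}
    % \linewidth here is the minipage width, not the page width
    \includegraphics[width=\linewidth]{figures/diagram}
  \end{minipage}
\end{frame}
```

Get one of those widths slightly wrong and you're back in the recompile loop.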

It is a pain. It is a pain to have to recheck old documents to remind myself of how I included a piece of source code, or how I handled a particular image. Latex is extremely hackish. It makes it friggin' hard to generalize anything so that it becomes reusable. I find myself copy-pasting code all the time from one document to another. There is nothing sensible about how things are done; it's just a matter of writing something that works and keeping a copy of it somewhere. The more presentations I write, the harder it becomes to remember which trick I used where. An unmaintainable mess.

And then there's time pressure. When you're writing a technical document, you have the time to work out the little quirks you will encounter. It's a permanent document, it will continue to exist for years after the fact, so it's worth the effort. But presentation slides are not. You just want to write them fairly quickly so that you can give the talk and move on. You're not going to be investing time in this, it's a one-time thing. And yet writing slides requires a lot more patience with latex than writing prose does.

Never mind that; writing slides is a creative process. It is not filling in predefined sections of a document, like an article for a conference is. You want to be able to experiment, to try things just for the hell of it. Latex is horrid for this. Let's see if I can fit this figure next to the text here. Okay, let me first look up how to do that. (How do you actually find this out? I already covered that earlier.) Then try it, debug the layout, and finally I can see if I got what I wanted. No, that didn't work out, let's switch it back to the way it was. Compile error. Goddamn it, now what? Okay, let me gradually undo back to a state that does compile so I can figure out what's causing it.

Not. how. it. was. meant. to. be. done.

KVM: the state of the game

November 16th, 2007

KVM is one of those things I've kept an eye open for ever since I heard about it. First of all, I should mention that there is currently a great deal happening on the virtualization front; here's a list of virtual machines to look over. What is more, it is not so much a wealth of competing products as a collection of different approaches: it seems that everyone has their own little twist on the matter, and none of these are completely equivalent solutions. There is a host of different terms to describe these techniques, to boot.

I've known about VMware for 5-6 years, and that is how I originally started out playing around with linux in a safe and easy environment. Since then there have been many linux-centric projects that have made waves at some time or another, not all of which I've tried.

I think User Mode Linux was the first one I heard about. It ran a modified host kernel and could host multiple guest machines that would also be running a modified kernel. You could use it to host Virtual Private Servers, which has been common now for a while, but wasn't all that common back then. You could also set up honeypots with it.

There was also a VMware competitor called Win4Lin (for Win9x), which was like a poor man's VMware. It would run fairly well, but it was far from being equally polished and feature-complete. It did have the advantage of being strictly an application, with no kernel hooks.

A couple of years later there was a new project that was very loud, namely Xen. Xen is one of those odd projects that had a lot of potential and broad support, and yet didn't quite get the kind of adoption one would expect. It was quite a complicated piece of code that required a lot of setting up and so on, but Fedora shipped it by default (I'm not sure if they still do) and it worked out of the box. Again it was a modified host kernel, but the guests could run completely unmodified. I'm not sure how recent this is, but nowadays Xen also runs various other operating systems, although in some cases they need to be modified.

Then there was CoLinux, which approached the issue from the opposite side of the table: Linux as guest. I never got around to trying it, as I never had to use Windows for long stretches of time, but a lot of people (those poor souls trapped on Windows) were very enthusiastic about it.

So what about KVM? Well, KVM is interesting because of how "close to the metal" it is. Obviously, any kind of virtualization adds some overhead due to the unavoidable indirection. But KVM does not position itself above the kernel, as most virtual machines do; it uses kernel primitives to host guests (in fact it adds a whole new kernel execution mode, guest mode, for this). It also uses the cpu virtualization extensions in current Intel and AMD chips to gain speed. All of this is fairly recent stuff; KVM was merged into the kernel for 2.6.20. So, of course, the major advantage of KVM is that it's well supported and well tested in the mainline kernel.

However, KVM is not a virtual machine in itself, in the sense of being a complete application. It's more like an access layer which exposes a /dev/kvm interface. The way to run KVM is through QEMU, which is actually an emulator. I haven't mentioned emulators so far, but they are in a sense the second type of virtualizer/emulator software. The difference is that a virtualizer creates a virtual machine with virtual hardware (eg. a network adapter) for the guest operating system to run on, but the cpu is still that of the host. An emulator instead emulates the guest cpu, and therefore has to translate every machine instruction into a different one. This means that you can run, say, a powerpc guest on an x86 host. It sounds very cool, but it's orders of magnitude slower, so it's not as much of a hot topic.

But back to KVM. As I said, the application is still QEMU, but it's accelerated through KVM. I had been looking forward to taking it for a spin, especially now that I have an Intel chip with the virtualization extensions. On Ubuntu it's all ready to go: just install the userland applications from the repo (qemu and kvm) and you're set. Qemu is quite nice and simple to use, but the combo is temperamental and unforgiving. You can set up your guest machine in 2 minutes, but with Windows I had quite a number of fatal crashes (Exception 13) that aren't that obvious to figure out. Furthermore, it seems that even linux distros aren't trivial to run on KVM/QEMU. In terms of complete machine virtualization it leaves something to be desired. Notably, the video flickers quite visibly while running on a vesa driver. That isn't to say that other virtual machines cannot or will not use KVM as part of their solution.

But for the time being VirtualBox (something of a lightweight VMware, which is also free software) is more convenient.

Disclaimer: This entry is something of a historical account of my exposure to virtual machines. I did not do any fact checking here, and the statements only reflect what I recall about the particular products.

does ogg suck or does ogg support suck?

July 5th, 2007

So ogg is well known as the open format for audio, and people seem to love it; it has broad support. To be precise, ogg is just a container format (the audio codec inside is typically Vorbis), so the fact that a file is an ogg file doesn't tell you much more about it than a zip file tells you about its contents.

On the video side of things, no one seems to use ogg. Even though it is an open and by all accounts free format, I very rarely come across ogg videos. And sadly, when I do, they are often broken. Since Theora seems to be the most used codec for ogg video, I'm assuming they're all encoded in that format (I know I've never seen an ogg video not in theora, although I don't make a point of checking).

Common symptoms:

  • Timer is completely bonkers, saying a 20 minute video is several hours long and whatnot.
  • Video stream has lots of glitches and frame freezes.
  • Audio/video sync can be way off.

Case in point, download this video from Akademy. In mplayer and vlc alike the timer is confused, and the video stream freezes as well. This is the kind of problem I've seen a lot of times with ogg videos.

Now it may be a production mishap for this particular video. But I've seen these bugs with ogg a lot more often than with other formats. In mplayer, even if you reindex the stream the timer still doesn't get it right. It tells me the video is 412:29:15 long.
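To see where a bogus timer likely comes from: an ogg stream is a sequence of pages, each stamped with a "granule position", and players typically derive the total duration from the granule position of the last page. If that field is garbage, the timer is garbage. Here's a toy sketch of the idea - my own illustration, not mplayer's actual code; it ignores checksums, multiplexed streams and Theora's split keyframe/offset granule encoding:

```python
import struct

def last_granule_position(data: bytes) -> int:
    """Scan an ogg byte stream for pages (the 'OggS' capture pattern)
    and return the granule position of the last page found.  For a
    Vorbis stream that's the total sample count, so
    duration = granulepos / sample_rate.  Returns -1 if no page found."""
    pos = 0
    granule = -1
    while True:
        pos = data.find(b"OggS", pos)
        if pos == -1 or pos + 27 > len(data):
            break
        # 27-byte page header: version, type, granulepos, serial, seq, crc, nsegs
        version, htype, granule, serial, seq, crc, nsegs = struct.unpack_from(
            "<BBqIIIB", data, pos + 4)
        # skip the segment table and payload to reach the next page
        lacing = data[pos + 27 : pos + 27 + nsegs]
        pos += 27 + nsegs + sum(lacing)
    return granule

def make_page(granulepos: int, payload: bytes) -> bytes:
    """Build a minimal (checksum-less) ogg page, just for demonstration."""
    header = b"OggS" + struct.pack("<BBqIIIB", 0, 0, granulepos, 1, 0, 0, 1)
    return header + bytes([len(payload)]) + payload

# one minute of 44.1 kHz audio ends at granule position 44100 * 60
stream = make_page(0, b"\x01vorbis") + make_page(44100 * 60, b"audio")
print(last_granule_position(stream) / 44100)  # → 60.0 seconds
```

So a muxer that writes nonsense granule positions produces a perfectly decodable stream with an absurd length - which would explain a 20-minute video reporting itself as hundreds of hours.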

So what is it? Is ogg/theora busted or is the implementation busted? Is the encoder so bad that it's impossible to produce a video without bugs? Or is the decoder buggy?

remember Prince of Persia?

May 26th, 2007

Yes, yes, that fantastic game we played in the early 90s. Remember it? Of course, who could forget. :cap: Getting that prince unscathed through all those tunnels and traps, and past all his enemies, was great fun.

Ah, the days when MS-DOS was our operating system of, erm... choice, and we gave the keyboard a good workout.

Prince of Persia kept reappearing in new releases, but as far as I'm concerned, once you kill the classic 2D gameplay feel, it's all downhill. The storylines for the later versions were also tediously complicated, nothing like the elegant simplicity of the classic. The original and the sequel, those are the two I played back then.

And now you can re-live the experience yourself. The Unofficial Prince of Persia Website has all the scoop, with plenty of extras. The oldest versions (1 & 2) are considered abandonware (which means no one is around to collect) and are up for download from the site. There are also cheats and walkthroughs if you get stuck (ah, how much easier it is to play these games nowadays, when you don't have to figure it all out yourself :D ).

Not only that, the walkthroughs for the sequel even have captures on google video, so you don't even have to play it yourself. :D

"But wait a minute", you say, "didn't you say MS-DOS? How am I going to play these games? I've moved on from DOS by now." Funny you should ask. There is a DOS emulator called DOSBox, which gives you a window into DOS, if you will. Inside there you can play any DOS game, and DOSBox has a pretty long list of supported games.

Enjoy Prince of Persia! :cool:

the state of RAW support in linux

May 11th, 2007

This only affects you if you have some source of RAW images; typically a camera would be that source. The RAW images then need to be post-processed (something that's already done for you if you extract JPEGs instead of RAW images from the camera) and converted to a target format, like JPEG.
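To give an idea of what that post-processing involves: the sensor records only one color per pixel, laid out in a Bayer mosaic, so the converter has to interpolate the two missing channels at every pixel (demosaicing), then apply white balance, gamma and so on. A toy sketch of the demosaicing step alone - nearest-neighbour, far cruder than what any real converter does:

```python
def demosaic_rggb(bayer):
    """Naive nearest-neighbour demosaic of an RGGB Bayer mosaic.
    bayer: 2-D list of raw sensor values with even dimensions.
    Returns a 2-D list of (r, g, b) triples.  Real converters use
    bilinear or much fancier interpolation, plus white balance,
    gamma correction, etc. -- this just shows the basic idea."""
    h, w = len(bayer), len(bayer[0])
    out = [[None] * w for _ in range(h)]
    for y in range(0, h, 2):
        for x in range(0, w, 2):
            # each 2x2 cell holds:  R G
            #                       G B
            r = bayer[y][x]
            g = (bayer[y][x + 1] + bayer[y + 1][x]) / 2  # average both greens
            b = bayer[y + 1][x + 1]
            # assign the same full-color value to all four pixels in the cell
            for dy in (0, 1):
                for dx in (0, 1):
                    out[y + dy][x + dx] = (r, g, b)
    return out

mosaic = [
    [200, 60],
    [60, 30],
]
print(demosaic_rggb(mosaic)[0][0])  # → (200, 60.0, 30)
```

Multiply that by white balance, exposure and curve adjustments and you can see why a good converter matters.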

Viewers/browsers

The best one I know of so far is showfoto, a component of digikam. digikam itself is fussy about images having to be part of albums, but showfoto has an adequate image browser with exif data display and some statistics about the image. It's also worth noting that digikam has been given a lot of attention, and has recently developed into a much better and more useful program than it was a few years ago.

Rawstudio also has a rudimentary image browser.

Converters

For this I would advocate ufraw. It's a standalone program, but it's also a plugin for the gimp. The interface is straightforward and quite handy.

showfoto/digikam also have features for conversion, but they are somewhat tucked away in the menus and harder to find.

Rawstudio aims to be the tool of choice for this, but for the moment it seems rather immature, and the interface could use work.

I think I read somewhere that Krita is supposed to switch its internal colorspace to 16-bit, which would make editing RAW images native, without needing to convert them first. That would be awesome. For the time being I can't say anything about Krita, because it crashes the moment I start it (probably a bug in the koffice ebuilds).

Status

So the support for RAW images is quite encouraging. It's not as nice as in Photoshop CS3 (principally in the conversion options and the types of adjustments that can be made), but it's decent all the same.