Drawing Finite Automata and State Machines

I had to draw a couple of Finite Automata and Turing Machines for some university assignments. Normally I would have done it in Inkscape (my favorite tool for creating figures for my LaTeX documents), but doing it manually is pretty tedious work. Inkscape's diagram tool is currently subpar, so everything has to be done by hand. That's OK if you need to draw one State Machine once in a while, but it's not suitable for larger quantities. I also tried Dia, but it too required lots of manual tweaking and tuning.

To my surprise, Graphviz (and especially the dot utility) turned out to be the (almost) perfect tool for the job. It lets you describe the graph in a simple text-based way and handles the layout by itself. It's somewhat like LaTeX, but for graphs: you concentrate on the content, not the layout.

My Finite Automata needed no manual tweaking and resulted in very nice graphs. For more complicated State Machines, some manual tuning is occasionally necessary. The commands I found most useful for tweaking the graph were the following (a small example follows the list):

  • Grouping nodes on the same level – { rank="same"; "q1"; "q2"; "q3" }. The other values for rank affect how the group is positioned relative to the other nodes in the graph (source puts it above them all, sink below them all).
  • Adding weight to edges – q1 -> q2 [weight="10"]. The weight is the cost of stretching the edge: the higher the weight, the straighter the edge will be.
  • Adding invisible edges – q1 -> q3 [style="invis"]. This allowed me to control the order of the nodes within the same rank (level).
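
To give a concrete feel for how these options fit together, here is a small example for a made-up automaton (the states and transitions are purely illustrative):

digraph automaton {
    rankdir="LR";                        // lay the machine out left to right
    node [shape="circle"];
    q3 [shape="doublecircle"];           // accepting state
    q0 -> q1 [label="a"];
    q0 -> q2 [label="b"];
    q1 -> q3 [label="b", weight="10"];   // keep this edge as straight as possible
    q2 -> q3 [label="a"];
    { rank="same"; "q1"; "q2"; }         // put q1 and q2 on the same level
    q1 -> q2 [style="invis"];            // nudge q1 to appear before q2 within that rank
}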

Last but not least, Graphviz can generate the graphs in a variety of formats, including EPS, PDF and SVG (which allows post-processing with Inkscape).
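
For example, assuming the snippet above is saved as automaton.dot, rendering it is a single command per format:

dot -Tsvg automaton.dot -o automaton.svg
dot -Tpdf automaton.dot -o automaton.pdf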

tarsum – Calculate Checksum for Files inside Tar Archive

Update: I’ve released tarsum-0.2, a new version of tarsum.

Some time ago, I got a hard disk back from data recovery. One of the annoying issues I encountered with the recovered data was corrupted files: some files looked like they were recovered successfully, but their content was corrupted. The corrupted configuration files were usually easy to detect, as they raised errors in the programs that tried to use them. But when such corruption occurs in a general text file (or inside the data of an SQL dump), the file may seem perfectly fine unless closely inspected.

I have a habit of storing old backups on CDs (they are initially made to online storage) in order to reduce backup costs. But the recovered/corrupted data issue raised some concerns about my ability to recover using these discs. Assuming I have a disk failure and I can't recover from my online backups for some reason, how can I check the integrity of my CD backups?

Storing and comparing a hash signature for the whole archive is almost useless. It lets you check whether all the files are probably fine, but it can't tell one corrupted file in the archive apart from a completely corrupted archive. My idea was to calculate a checksum (hash) for each file in the data and store the signatures in a way that would allow me to see which individual files are corrupted.
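
Conceptually, such a signature file can be produced by checksumming each member of the archive on the fly. The loop below is only a rough sketch of that idea – it is not how tarsum is actually implemented, it re-reads the archive once per file, and it makes no attempt to handle directories or special entries sensibly:

# Rough sketch only: list the archive members, checksum each one,
# and print the signatures sorted by file name.
tar tf backup.tar | sort | while read -r f; do
    sum=$(tar xOf backup.tar "$f" | md5sum | cut -d' ' -f1)
    echo "$sum  $f"
done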

This is where tarsum comes to the rescue. As its name implies, it calculates a checksum for each file in the archive. You can download tarsum from here.

Using tarsum is pretty straightforward:

tarsum backup.tar > backup.tar.md5

This calculates the MD5 checksums of the files. You can use other hash functions as well by passing a tool that calculates them (it must behave like md5sum):

tarsum --checksum=sha256sum backup.tar > backup.tar.sha256

To verify the integrity of the files inside the archive, we use the diff command:

tarsum backup.tar | diff backup.tar.md5 -

where backup.tar.md5 is the original signature file we created. This is possible because the signatures are sorted alphabetically by the file name inside the archive, so the order of the files is always the same.

Note that if you use a sufficiently recent version of GNU tar, tarsum can also operate directly on compressed archives (e.g. tar.bz2, tar.gz).

s3backup – Easy backups of Folders to Amazon S3

This is an updated version of my previous backup script – Backup Directories to Amazon S3 Script. The new script works much better and is safer. Unlike the old script, the new one creates the tarballs as temporary files under /tmp and allows more control over the backup process.

Continue reading s3backup – Easy backups of Folders to Amazon S3

Kernel Configuration for acpid Issue

I’ve installed acpid on my system some time ago (Gentoo package: sys-power/acpid. However each time I tried to start it it complained:

acpid: can't open /proc/acpi/event: No such file or directory

Apparently acpid requires you to enable ACPI_PROC_EVENT, whose label reads "Deprecated /proc/acpi/event support". I really wonder why such a tool only supports the deprecated way of receiving ACPI events.
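
If you hit the same error, the fix is to enable that option and rebuild the kernel. The menuconfig location below is approximate and may vary between kernel versions:

# In make menuconfig, roughly under
# Power management and ACPI options -> ACPI -> "Deprecated /proc/acpi/event support",
# or directly in the kernel .config:
CONFIG_ACPI_PROC_EVENT=y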

LaTeX Error: Command \textquotedbl unavailable in encoding HE8

Today I was testing the SVN versions of LyX 1.6.0 and 1.5.7. Due to a change in the way the double quotation mark (“) is handled, adding one to Hebrew text resulted in the following LaTeX error:

LaTeX Error: Command \textquotedbl unavailable in encoding HE8

Continue reading LaTeX Error: Command \textquotedbl unavailable in encoding HE8

WordPress Backup to Amazon S3 Script

This is an updated version of my WordPress Backup Script. The new version basically does the same thing: backing up a WordPress blog (actually, any site that consists of files and a MySQL database). What's new is that instead of only saving the backup locally, the script also uploads it to Amazon S3.
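
The actual script is in the full post; the snippet below is only a minimal sketch of the general idea, with made-up paths, database credentials and bucket name, and it assumes s3cmd is installed and configured:

# Minimal sketch of the idea, not the actual script.
DATE=$(date +%Y%m%d)
tar czf /tmp/blog-files-$DATE.tar.gz /var/www/blog
mysqldump --user=backup --password=secret blog_db | gzip > /tmp/blog-db-$DATE.sql.gz
s3cmd put /tmp/blog-files-$DATE.tar.gz /tmp/blog-db-$DATE.sql.gz s3://my-backup-bucket/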

Continue reading WordPress Backup to Amazon S3 Script

Alpha Channel Problems When Creating .ico Files Using ImageMagick

I’ve tried to use ImageMagick to create .ico files for Open Yahtzee, out of PNGs of various sizes. The command as it should have been:

convert openyahtzee16.png openyahtzee32.png openyahtzee64.png openyahtzee.ico

Instead, it resulted in the alpha channel being reversed. I used ImageMagick 6.4.0, and I don't remember this misbehavior happening in previous versions.

While this was annoying and happened for no apparent reason, it could easily be solved using ImageMagick's switches for reversing the alpha channel:

-channel Alpha -negate

So the command that produced a correct .ico file was:

convert openyahtzee16.png openyahtzee32.png openyahtzee64.png -channel Alpha -negate openyahtzee.ico

Retrieving Google’s Cache for a Whole Website

Some time ago, as some of you noticed, the web server that hosts my blog went down. Unfortunately, some of the sites had no proper backup, so something had to be done in case the hard disk couldn't be recovered. My efforts turned to Google's cache. Google keeps a copy of the text of web pages in its cache, something that is usually useful when a website is temporarily unavailable. The basic idea is to retrieve a copy of all the pages of a certain site that Google has cached.
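
As a taste of what is involved, fetching the cached copy of one known URL can be done with something like the command below. The cache: query form and the browser-like user agent are assumptions about how Google's cache is commonly accessed, and Google may throttle or block automated requests, so treat it purely as an illustration:

# Illustration only: fetch Google's cached copy of a single page.
wget -U "Mozilla/5.0" -O cached-page.html \
    "http://www.google.com/search?q=cache:www.example.com/some/page.html"
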
Continue reading Retrieving Google’s Cache for a Whole Website

Generating URL List from Access Log (access_log)

I had to parse a website's access_log in order to generate a sitemap – more precisely, a list of all the URLs in the site. After playing around, I found a solution using sed, grep, sort and uniq. The good thing is that each of these tools is available by default on most Linux distributions.
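
The full pipeline is in the post itself; a minimal sketch of the idea, assuming the common Apache log format and keeping only successful GET requests, looks roughly like this:

# Rough sketch: extract the request path of every successful GET,
# then sort and de-duplicate. Assumes the common Apache log format.
grep ' 200 ' access_log \
    | sed -n 's/.*"GET \([^ ]*\) HTTP[^"]*".*/\1/p' \
    | sort | uniq
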
Continue reading Generating URL List from Access Log (access_log)

NVRM: not using NVAGP, kernel was compiled with GART_IOMMU support

For the past several weeks I've had a strange problem. Sometimes when I booted my computer, it would refuse to start the X server and would give the following error in dmesg:

NVRM: not using NVAGP, kernel was compiled with GART_IOMMU support!!
NVRM: failed to allocate stack!

The weird thing about it is that normally, if I rebooted the computer, it would magically work again. So this error only showed up once in a while and seemed to disappear at will. Today it happened again, so I decided to fix it.
Continue reading NVRM: not using NVAGP, kernel was compiled with GART_IOMMU support