Lens calibration workflow for the Canon PowerShot SX710 HS

CHDK adds a RAW shooting option to many point-and-shoot cameras, including the Canon PowerShot SX710 HS. As the camera does not apply lens correction to RAWs automatically, one has to fix lens distortion when developing them. Lensfun is the go-to place for FOSS lens correction; however, it did not have lens correction data for the SX710 HS, so I decided to calibrate the lens myself.

The Lensfun tutorial for lens calibration is quite cumbersome. It directs you to go out and shoot some buildings with straight lines, and then manually trace those lines in Hugin and extract the distortion parameters. After searching a bit, I came up with what I think is a better workflow.

I started by creating a page with regularly spaced horizontal lines in Inkscape (using the Cartesian grid tool) and printing it. I printed it on A3 paper, but in hindsight A4 would have worked just the same and is more widely available. I placed the paper on the floor, added some simple lighting and placed a paperclip in the middle just to make focusing easier.

Next I took several shots at each focal length, keeping the printed lines parallel to the long side of the image. To make sure I was covering the entire zoom range consistently, I used my set_zoom script. I took calibration shots at zoom steps 0 (minimal focal length), 5, 10, 20, …, 100, 110 and 111 (maximal focal length). After taking all the shots, I copied the DNGs (CHDK saves RAWs as DNGs) from the camera, and used exiftool to sort them into directories by focal length:

$ exiftool '-directory<${FocalLength;}' *.DNG

As we will use Hugin for the calibration, and Hugin does not support DNGs, we have to convert them to TIFFs:

$ find -name "*.DNG" | xargs --verbose -P6 -I{} dcraw -t 0 -T {}

The -t 0 is used to remove the orientation info from the TIFFs. As I shot the pictures with the camera pointing straight down, that info is often incorrect and messes up the calibration in Hugin.

Load each set of TIFFs of a given focal length into the Hugin Lens Calibration GUI. Make sure the focal length and focal length multiplier are correct (for some reason Hugin was off by 0.01-0.02 sometimes), and use the “Find lines” button to automatically detect straight lines. Next, optimize the radial distortion parameters. Those a, b and c parameters are the numbers Lensfun needs.

Hugin Lens Calibration GUI, after line detection and optimization.

The next step is to create a Lensfun profile for the camera. The basic blocks are the <camera> tag and the <lens> tag. Both are mandatory, even though the camera has a fixed lens. For each focal length we calibrated, we add a <distortion> tag, setting the focal attribute and the a, b and c parameters according to the values from the calibration GUI.

<lensdatabase version="1">
    <camera>
        <maker>Canon</maker>
        <model>Canon PowerShot SX710 HS</model>
        <model lang="en">PowerShot SX710 HS</model>
        <mount>canonSX710HS</mount>
        <cropfactor>5.6</cropfactor>
    </camera>
    <lens>
        <maker>Canon</maker>
        <model>Canon PowerShot SX710 HS &amp; compatibles, with CHDK's DNG</model>
        <model lang="en">fixed lens, with CHDK's DNG</model>
        <model lang="de">festes Objektiv, mit CHDK-DNG</model>
        <mount>canonSX710HS</mount>
        <cropfactor>5.6</cropfactor>
        <aspect-ratio>4:3</aspect-ratio>
        <calibration>
            <distortion model="ptlens" focal="4.5" a="0.03396" b="-0.11175" c="-0.01779" />
            <distortion model="ptlens" focal="5.5" a="0.03289" b="-0.11253" c="0.02444" />
            <distortion model="ptlens" focal="6.6" a="0.02218" b="-0.08388" c="0.03521" />
            <distortion model="ptlens" focal="9.5" a="0.01211" b="-0.04143" c="0.02055" />
            <distortion model="ptlens" focal="13.5" a="-0.0022" b="0.00264" c="-0.00558" />
            <distortion model="ptlens" focal="18.2" a="-0.00212" b="0.00441" c="-0.00378" />
            <distortion model="ptlens" focal="23.4" a="0.00328" b="-0.01242" c="0.01298" />
            <distortion model="ptlens" focal="29.1" a="-0.00036" b="-0.00082" c="0.00209" />
            <distortion model="ptlens" focal="36.2" a="0.00129" b="-0.00626" c="0.00705" />
            <distortion model="ptlens" focal="46.3" a="0.00715" b="-0.02588" c="0.0244" />
            <distortion model="ptlens" focal="46.3" a="0.00715" b="-0.02588" c="0.0244" />
            <distortion model="ptlens" focal="62.9" a="0.00067" b="-0.00852" c="0.01085" />
            <distortion model="ptlens" focal="92.0" a="-0.00306" b="0.00016" c="-0.00374" />
            <distortion model="ptlens" focal="128.7" a="-0.0037" b="0.00365" c="-0.00582" />
            <distortion model="ptlens" focal="135.0" a="0.00333" b="-0.02616" c="0.03388" />
        </calibration>
    </lens>
</lensdatabase>

The lens’ model name follows the Lensfun convention for CHDK. However, it makes auto-detection of the lens fail in many applications, such as Darktable. You can work around it by copying the model name exactly from the <camera> tag. The XML file itself should go in ~/.local/share/lensfun.
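For example (the profile filename here is arbitrary):

$ mkdir -p ~/.local/share/lensfun
$ cp sx710hs.xml ~/.local/share/lensfun/

Applications using Lensfun should pick up the profile the next time they start.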

The two pictures at the beginning of the post show some examples of uncorrected and corrected photos using the newly acquired calibration data.

CHDK’s set_zoom range for the Canon PowerShot SX710 HS

CHDK supports programmatically setting the zoom level on the camera. However, the exact zoom range, and apparently the mapping between CHDK zoom levels and actual focal lengths, is camera-dependent.

Zoom to focal length, Canon PowerShot SX710HS

In order to figure out the exact zoom range and the relation between the numerical zoom steps and the focal length, I used the following Lua script, which gets the current zoom setting and sets it to a given value. I also configured the camera’s display to show the exact focal length at all times.

-- Guy Rutenberg 2018-08-10
-- 

--[[
@title Get and Set Zoom
@chdk_version 1.3
@param z Zoom level
@default z 0
--]]

z_old = get_zoom()
print("Current zoom "..z_old)
print("Zooming to "..z)
set_zoom(z)

Next, I set the zoom to multiple values and plotted them against the actual focal length. The results are shown in the graph above.
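Setting each value by hand gets tedious. CHDK's Lua API also exposes get_focal_length(), so the sweep could be automated with a loop. The following is an untested sketch; it assumes get_focal_length() is available on this port and returns the focal length in mm multiplied by 1000:

--[[
@title Zoom Sweep
@chdk_version 1.3
--]]

-- step through the zoom range; 111 (the maximum) is added
-- separately since it is not a multiple of 5
steps = {}
for z = 0, 110, 5 do steps[#steps + 1] = z end
steps[#steps + 1] = 111

for _, z in ipairs(steps) do
    set_zoom(z)
    sleep(1000) -- give the zoom mechanism time to settle
    -- assumption: get_focal_length() returns mm * 1000
    print(z .. " -> " .. (get_focal_length() / 1000) .. "mm")
end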

The range for the zoom is 0 to 111, which corresponds to focal lengths of 4.5 to 135 mm. The change is pretty linear in two sections: 0 to about 85, and again from 85 up to 111.

Google AdSense for WordPress – No Plugin Needed

Adding Google AdSense ads to your WordPress blog used to be a tedious task. Either you needed to manually modify your theme, or you had to use a plugin, such as Google’s own AdSense plugin. Even then, placements were limited and handling both mobile and desktop themes was complicated at best. Recently, two things have changed: Google retired the AdSense plugin and introduced Auto ads.

At first, the situation seemed to have taken a turn for the worse. Without the official plugin, you had to resort to using a third-party plugin or manually placing ads in your theme. But Auto ads made things much simpler. Instead of having to manually place your ads, you can let Google do it for you. It works great on both desktop and mobile themes.

The easiest way to enable Auto ads is using a child theme. First, you need to get the Auto ads ad code. Next, in your child theme’s functions.php add the following lines, making sure to replace the JavaScript snippet with your own:

// Add Google Adsense
function my_google_adsense_header() {
?>
<script async src="//pagead2.googlesyndication.com/pagead/js/adsbygoogle.js"></script>
<script>
     (adsbygoogle = window.adsbygoogle || []).push({
          google_ad_client: "ca-pub-4066984350135216",
          enable_page_level_ads: true
     });
</script>
<?php
}
add_action( 'wp_head', 'my_google_adsense_header');

Google Analytics for WordPress

To set up Google Analytics tracking for WordPress you don’t need any third-party plugin. It can easily be done using a child theme. A child theme is code that modifies the current theme in a way that won’t interfere with future upgrades. To enable Google Analytics, start by creating a child theme using the official documentation.
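As a minimal sketch of what the official documentation describes: a child theme is just a directory containing a style.css that names the parent theme via the Template header, plus a functions.php for our code. The theme names below are hypothetical:

$ mkdir -p wp-content/themes/twentyseventeen-child
$ cat > wp-content/themes/twentyseventeen-child/style.css <<'EOF'
/*
Theme Name: Twenty Seventeen Child
Template: twentyseventeen
*/
EOF
$ touch wp-content/themes/twentyseventeen-child/functions.php

After activating the child theme in the WordPress admin, continue with the steps below.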

Now, you need to get your Google Analytics tracking code. Over the years the tracking code has had a few different versions. You should make sure you are using the latest tracking code, which currently looks like:

<!-- Global site tag (gtag.js) - Google Analytics -->
<script async src="https://www.googletagmanager.com/gtag/js?id=UA-380837-9"></script>
<script>
  window.dataLayer = window.dataLayer || [];
  function gtag(){dataLayer.push(arguments);}
  gtag('js', new Date());

  gtag('config', 'UA-380837-9');
</script>

To get the tracking code, follow the instructions on this page.

Now we add the tracking code to each page using the child theme. In your child theme’s directory, edit the functions.php file and add the following lines. Replace the tracking code with the one you acquired.

// Add Google Analytics tracking
function my_google_analytics_header() {
?>
<!-- Global site tag (gtag.js) - Google Analytics -->
<script async src="https://www.googletagmanager.com/gtag/js?id=UA-380837-9"></script>
<script>
  window.dataLayer = window.dataLayer || [];
  function gtag(){dataLayer.push(arguments);}
  gtag('js', new Date());

  gtag('config', 'UA-380837-9');
</script>
<?php
}
add_action( 'wp_head', 'my_google_analytics_header');

This adds the tracking code to the <head> of every page.
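To verify the snippet is actually emitted, you can fetch a page and look for the gtag loader (example.org stands in for your own site):

$ curl -s https://example.org/ | grep googletagmanager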

Creating Local Backups using `rdiff-backup`

rdiff-backup provides an easy way to maintain reverse-incremental backups of your data. Reverse incremental backups differ from normal incremental backups in that each new backup synthetically updates the full backup, keeping reverse diffs of all the files that changed. It is best illustrated by an example. Let’s consider backups taken on three consecutive days:
1. Full backup (1st day).
2. Full backup (2nd day), reverse-diff: 2nd -> 1st.
3. Full backup (3rd day), reverse diffs: 3rd -> 2nd, 2nd -> 1st.

Compare that with the regular incremental backup model which would be:
1. Full backup (1st day).
2. Diff: 2nd -> 1st, full backup (1st day).
3. Diffs: 3rd -> 2nd, 2nd -> 1st, full backup (1st day).

This makes purging old backups especially easy: with reverse incremental backups you can simply purge the reverse-diffs as they expire, because newer backups never depend on older ones. In contrast, in the regular incremental model, each incremental backup depends on every prior backup in the chain, going back to the full backup. Thus, you can’t remove the full backup until all the incremental backups that depend on it expire as well. This means that most of the time you need to keep more than one full backup, which takes up precious disk space.

rdiff-backup has some disadvantages as well:
1. Backups are not encrypted, making it unsuitable as-is for remote backups.
2. Only the reverse-diffs are compressed.

The advantages of rdiff-backup make it suitable for creating local Time Machine-like backups.

The following script, set via cron to run daily (a sample crontab entry is shown at the end of this section), can be used to back up your home directory:

#! /bin/sh

SOURCE="/home/user/"
TARGET="/home/user/backups/rdiff-home/"

## Backup
rdiff-backup --exclude-if-present .nobackup --exclude-globbing-filelist /home/user/backups/home-exclude --print-statistics $SOURCE $TARGET

## Remove old data
rdiff-backup --remove-older-than 1M --force --print-statistics $TARGET

where `/home/user/backups/home-exclude` should look like:

+ /home/user/Desktop
+ /home/user/Documents
+ /home/user/Music
+ /home/user/Pictures
+ /home/user/Videos
+ /home/user/.vim
+ /home/user/.vimrc
+ /home/user/.ssh
+ /home/user/.gnupg
**

The final ** line excludes everything else, so that only the listed files and directories are backed up.

The --exclude-if-present .nobackup option allows you to easily exclude directories: just drop a .nobackup file into any directory you wish to ignore. The --force argument when purging old backups allows it to remove more than one expired backup in a single run.
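For example, to exclude a (hypothetical) scratch directory from future backups:

$ touch ~/Documents/scratch/.nobackup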

Listing backup chains:

$ rdiff-backup -l ~/backups/rdiff-home/

Restoring files from the most recent backup is simple. Because rdiff-backup keeps the latest backup as a normal mirror on the disk, you can simply copy the file you need out of the backup directory. To restore older versions of a file:

$ rdiff-backup --restore-as-of 10D ~/backups/rdiff-home/.vimrc restored_vimrc
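Finally, to run the backup script daily as mentioned above, make it executable and add a crontab entry; the script path here is hypothetical:

$ chmod +x /home/user/bin/backup-home.sh
$ crontab -e

# run the backup every day at 03:30
30 3 * * * /home/user/bin/backup-home.sh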

Set default application using `xdg-mime`

You can use the xdg-mime utility to query the default MIME type associations and change them:

xdg-mime query default video/mp4

This will return the .desktop file associated with the default application for opening mp4 files. To change the default association:

xdg-mime default vlc.desktop video/mp4

Here you specify the .desktop file of the application that should open files of the given MIME type. To check the MIME type of a given file, use:

file -ib filename
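Putting it together, a short session might look like this (movie.mp4 is a hypothetical file, and the outputs are illustrative):

$ file -ib movie.mp4
video/mp4; charset=binary
$ xdg-mime query default video/mp4
vlc.desktop
$ xdg-mime default vlc.desktop video/mp4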

Installing Firefox Quantum on Debian Stretch

Debian only provides the ESR (Extended Support Release) line of Firefox. As a result, the latest version of Firefox currently available for Debian Stretch is Firefox 52, which is pretty old. Recently, Firefox 57, also known as Quantum, was released as beta. It provides many improvements over older Firefox releases, in both security and performance.

Begin by downloading the latest beta (for Firefox 57) and extracting it to your home directory:


$ wget -O firefox-beta.tar.bz2 "https://download.mozilla.org/?product=firefox-beta-latest&os=linux64&lang=en-US"
$ tar -C ~/.local/ -xvf firefox-beta.tar.bz2

This installs Firefox for your current user. Because Firefox is installed in a user-specific location (and without root privileges), it will also auto-update itself when new versions are released.

If you prefer using the stable version of Firefox, simply replace the first step with:


$ wget -O firefox-stable.tar.bz2 "https://download.mozilla.org/?product=firefox-latest&os=linux64&lang=en-US"

Next, we take care of desktop integration. Put the following in ~/.local/share/applications/firefox-beta.desktop:


[Desktop Entry]
Type=Application
Name=Firefox Beta
Exec=/home/guyru/.local/firefox/firefox %u
X-MultipleArgs=false
Icon=firefox-esr
Categories=Network;WebBrowser;
Terminal=false
MimeType=text/html;text/xml;application/xhtml+xml;application/xml;application/vnd.mozilla.xul+xml;application/rss+xml;application/rdf+xml;image/gif;image/jpeg;image/png;x-scheme-handler/http;x-scheme-handler/https;
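Optionally, you can also make it the default browser; assuming the desktop file above, xdg-settings can do that:

$ xdg-settings set default-web-browser firefox-beta.desktop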

Duply credential error when using Amazon S3

Duply is a convenient wrapper around duplicity, a tool for encrypted incremental backups that I’ve used for the last couple of years. After a recent upgrade, my Amazon S3 backups failed, reporting the following error:

    'Check your credentials' % (len(names), str(names)))
NoAuthHandlerFound: No handler was ready to authenticate. 1 handlers were checked. ['HmacAuthV1Handler'] Check your credentials

Boto, which duplicity relies on for the Amazon S3 backend, requires authentication parameters to be passed through the AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY environment variables. As different backends require different variables, duply used to make that transparent: one would just set TARGET_USER and TARGET_PASS and duply would take care of the rest. However, duply 1.10 broke compatibility and requires you to set the variables yourself. Hence, the fix is to replace the TARGET_* variables with exported AWS_* variables:

# TARGET_USER='XXXXXXXXXXXX'
# TARGET_PASS='XXXXXXXXXXXX'
export AWS_ACCESS_KEY_ID='XXXXXXXXXXXX'
export AWS_SECRET_ACCESS_KEY='XXXXXXXXXXXX'
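With the exported variables in place, you can verify that authentication works again (the profile name here is hypothetical):

$ duply myprofile status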

Patching an Existing Debian Package

This tutorial walks you through patching an existing Debian package. It is useful if you want to create a .deb of a package after fixing some bug or otherwise modifying the source. We will use hugin as our example.

We start by fetching the source package:

$ apt-get source hugin
$ cd hugin-2017.0.0+dfsg

We will need a tool named quilt to make the process easier.

# apt install quilt

Before using quilt, we want to make it aware of the debian/patches directory which holds the patches. Adding the following lines to ~/.quiltrc will make quilt search up the directory tree for the debian/patches directory:

d=. ; while [ ! -d $d/debian -a `readlink -e $d` != / ]; do d=$d/..; done
if [ -d $d/debian ] && [ -z $QUILT_PATCHES ]; then
        # if in Debian packaging tree with unset $QUILT_PATCHES
        QUILT_PATCHES="debian/patches"
        QUILT_PATCH_OPTS="--reject-format=unified"
        QUILT_DIFF_ARGS="-p ab --no-timestamps --no-index --color=auto"
        QUILT_REFRESH_ARGS="-p ab --no-timestamps --no-index"
        QUILT_COLORS="diff_hdr=1;32:diff_add=1;34:diff_rem=1;31:diff_hunk=1;33:diff_ctx=35:diff_cctx=33"
        if ! [ -d $d/debian/patches ]; then mkdir $d/debian/patches; fi
fi

Now starts the actual patching process. The patches are applied in series, and our new patch should be the last one. We start by applying any existing patches, and then create a new patch:

$ quilt push -a
$ quilt new 44_setlocale.patch

I chose the 44_ prefix because hugin already has a patch named 43_fallbackhelp.patch, and the convention is to name patches so that the names reflect the order in which they are applied. Next we tell quilt which files we are going to modify, and then edit them:

$ quilt add src/hugin1/hugin/huginApp.cpp
$ vim src/hugin1/hugin/huginApp.cpp

Alternatively, instead of editing the files manually, quilt import can be used to import an existing patch.
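For example (the patch path is hypothetical):

$ quilt import ../fix-setlocale.patch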

Each patch comes with its own metadata to let other people know who wrote it and what it does. Use

$ quilt header --dep3 -e

to edit this metadata. For example:

Description: Call setlocale()
This fixes a bug in wxExecute, see https://trac.wxwidgets.org/ticket/16206
The patch has been submitted to upstream, https://groups.google.com/d/msg/hugin-ptx/FCi7ykPDZ5E/3w8E5U1SCQAJ
Author: Guy Rutenberg <guyrutenberg@gmail.com>
Last-Update: 2017-10-04
---
This patch header follows DEP-3: http://dep.debian.net/deps/dep3/
See DEP-3 for more details about the different fields.

After we finish editing, we finalize the patch and unapply all the patches:

$ quilt refresh
$ quilt pop -a

Now you can continue to build the deb from source as usual. We use debchange to create a new version, and debuild to build the package. After the package is built, it can be installed using debi:

$ DEBEMAIL="Guy Rutenberg <guyrutenberg@gmail.com>" debchange --nmu
$ debuild -us -uc -i -I -j7


Use `mk-build-deps` instead of `apt-get build-dep`

If you are building a package from source on Debian or Ubuntu, the first step is usually to install the build-dependencies of the package. This is normally done with one of two commands:

$ sudo apt-get build-dep PKGNAME

or

$ sudo aptitude build-dep PKGNAME

The problem is that there is no easy way to undo or revert the installation of the build dependencies. All the installed packages are marked as manually installed, so later you cannot simply expect to “autoremove” those packages. Webupd8 suggests a clever one-liner that tries to parse the build dependencies out of apt-cache and mark them as automatically installed. However, as mentioned in the comments, it may be too liberal in marking packages as automatically installed, and hence remove too many packages.

The real solution is mk-build-deps. First you have to install it:

$ sudo apt install devscripts

Now, instead of using apt-get or aptitude directly to install the build-dependencies, use mk-build-deps.

$ mk-build-deps PKGNAME --install --root-cmd sudo --remove

mk-build-deps will create a new package, called PKGNAME-build-deps, which depends on all the build-dependencies of PKGNAME, and then install it, thereby pulling in all the build-dependencies. As those packages are now installed as dependencies, they are marked as automatically installed. Once you no longer need the build-dependencies, you can remove the package PKGNAME-build-deps, and apt will autoremove all the build-dependencies which are no longer necessary.
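For example, using hugin from the previous section, cleaning up afterwards would look something like:

$ sudo apt remove hugin-build-deps
$ sudo apt autoremove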