Technology Tales

Adventures & experiences in contemporary technology

Command Line Processing of EXIF Image Metadata

8th July 2013

There is a bill making its way through the U.K. parliament at the moment that could reduce the power of copyright when it comes to images placed on the web. The current situation is that anyone who creates an image automatically holds the copyright for it. However, the new legislation will remove that if it becomes law as it stands. As it happens, the Royal Photographic Society is doing what it can to prevent any changes to what we have now. There may be the barrier of due diligence, but how many of us take steps to mark our own intellectual property? For one, I have been less than attentive to this and now wonder if there is anything more that I should be doing. Others may copyleft their images, but I don’t want to find myself unable to share my own photos because another party is claiming rights over them. Watermarking is one option, but I want to add something to the image metadata too.

That got me wondering about adding metadata to any images that I post online to assert my status as the copyright holder. It may not be perfect, but any action is better than doing nothing at all. Given that I don’t post photos anywhere that strips EXIF metadata as part of the uploading process, the information should be there for anyone who bothers to check, though there may not be many who do.

Because I also wanted to batch process images, I looked for a command line tool to do the needful and found ExifTool. Being a Perl library, it is cross-platform, so you can use it on Linux, Windows and even OS X. To install it on a Debian- or Ubuntu-based Linux distro, just use the following command:

sudo apt-get install libimage-exiftool-perl

The form of the command that I found useful for adding the actual copyright information is below:

exiftool -P -copyright="(c) John …" -ext jpg -overwrite_original

The -P switch preserves the timestamp of the image file, while the -overwrite_original one ensures that you don’t end up with unwanted backup files. The copyright message goes within the quotes along with the -copyright option. With a little shell scripting, you can traverse a directory structure and change the metadata for any image files contained in different sub-folders. If you wish to do more than this, there’s always the user documentation to be consulted.
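
For illustration, the traversal can also be handed over to find, which runs ExifTool on every matching file beneath the current directory. This is only a sketch, and the copyright text is a placeholder to be replaced with your own:

# apply the copyright tag to every JPEG under the current directory tree
find . -type f -iname "*.jpg" -exec exiftool -P -copyright="(c) Your Name Here" -overwrite_original {} +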

ImageMagick and Ubuntu 9.04

5th May 2009

Using a command line tool like ImageMagick for image processing may sound counter-intuitive, but there’s no need to do everything on a case-by-case interactive basis. Image resizing and format conversion come to mind here. Helper programs are used behind the scenes too, with Ghostscript being used to create Postscript files, for example.
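
As a flavour of what that batch processing looks like, here is a sketch using ImageMagick’s own commands; the geometry and file names are only placeholders:

# resize every JPEG in the current directory to fit within 1024x768
mogrify -resize 1024x768 *.jpg

# convert an image from PNG to JPEG
convert image.png image.jpg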

The subject of helper programs brings me to an issue that has hampered me recently. While I am aware that there are tools like F-Spot available, I am also wont to use a combination of shell scripting (BASH & KSH), Perl and ImageMagick for organising my digital photos. My preference for using Raw camera files (DNG & CRW) means that ImageMagick cannot access these without a little helper. In the case of Ubuntu, it’s UFRaw. However, Jaunty Jackalope appears to have seen UFRaw updated to a version that is incompatible with the included version of ImageMagick (6.4.5 as opposed to 3.5.2 at the time of writing). The result is that the command issued by ImageMagick to UFRaw (issue the command man ufraw-batch to see the details) is not accepted by the included version of the latter, 0.15 if you’re interested. It seems that an older release of UFRaw accepted the output device ppm16 (16-bit PPM files), but this should now be specified as ppm for the output device and 16 for the output depth. In a nutshell, where the parameter output-type did the lot, you now need both output-type and output-depth.
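
To see the change in standalone terms, the difference looks something like the following sketch of ufraw-batch calls; the option spellings are as I read them from the man page, and the file names are placeholders:

# older UFRaw releases: one option covered both the format and the bit depth
ufraw-batch --out-type=ppm16 --output=image.ppm image.dng

# UFRaw 0.15: the output format and the bit depth are given separately
ufraw-batch --out-type=ppm --out-depth=16 --output=image.ppm image.dng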

I thought of decoupling things by using UFRaw to create 16-bit PPM files for processing by ImageMagick, but to no avail: the identify command wouldn’t return the date on which the image was taken. I even changed the type to 8-bit JPEGs with added EXIF information, but no progress was made. In the end, a mad plan came to mind: creating a VirtualBox VM running Debian. The logic was that if Debian deserves its reputation for solidity, dependencies like ImageMagick and UFRaw shouldn’t be broken, and I wasn’t wrong. To make it fly though, I needed to see if I could get Guest Additions installed on Debian. Out of the box, the supported kernel version must be at least 2.6.27 and Debian’s is 2.6.26, so additional work was on the cards. First, GCC, Make and the correct kernel header files need to be installed. Once those are in place, the installation works smoothly and a restart sets the goodies in motion. To make the necessary shared folder available, a command like the following was executed:

mount -t vboxsf [Shared Folder name] [mount point]
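
Stepping back a little, the preparation mentioned above amounted to something like this on the Debian guest; again a sketch, with the package names being the standard Debian ones:

# build prerequisites for the Guest Additions kernel modules
apt-get install gcc make linux-headers-$(uname -r)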

Once the shared folder was mounted and ImageMagick installed, the processing that I have been doing for new DSLR images was reinstated. Ironically, Debian’s version of ImageMagick, 6.3.7, is even older than Ubuntu’s, but it works and that’s the main thing. There is an Ubuntu bug report for this on Launchpad, so I hope that it gets fixed at some point in the near future. However, that may mean awaiting 9.10, or Karmic Koala, so I’m glad to have the workaround in the meantime.

Recursive FTP with the command line

6th August 2008

Here’s a piece of Linux/UNIX shell scripting code that will do a recursive FTP refresh of a website for you:

lftp <<EndFTP >>~/Tmp/log_file.tmp 2>>~/Tmp/log_file.tmp
open ${HOSTNAME}
user ${USER} ${PSSWD}
mirror -R -vvv "${REP_SRC}" "${REP_DEST}"
EndFTP

When my normal FTP scripting approach left me with a broken WordPress installation and an invalid ticket in the project’s Trac system that I had to close, I went looking for a more robust way of achieving website updates, and that’s what led me to seek out the options available for FTP transfers that explicitly involve directory recursion. The key pieces in the code above are the use of lftp in place of ftp, my more usual tool for the job, and the invocation of the mirror command that comes with lftp. The -R switch ensures that file transfer is from local to remote (the reverse is the default) and -vvv turns on maximum verbosity, a very useful thing when you find that it takes longer than the more usual means. It’s all much slicker than writing your own script to do the back-work of ploughing through the directory structure and ensuring that the recursive transfers take place. Saying that, it is possible to have a one-line variant of the above, as sketched below, but the way that I have set things up might be more familiar to users of ftp.
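
For what it’s worth, the one-line variant might look something like this sketch, assuming the same variables as are used above; -u supplies the credentials and -e runs the given commands once connected:

lftp -u "${USER},${PSSWD}" -e "mirror -R -vvv ${REP_SRC} ${REP_DEST}; quit" ${HOSTNAME}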

Automating FTP I: UNIX and Linux

11th April 2008

Having got tired of repeatedly typing everything at the prompt of an interactive command line FTP session, and of doing similar things via the GUI route, I started to wonder if there was a scripting alternative and, lo and behold, I found one after a spot of googling. There are various opportunities for its extension, such as prompting for the username and password instead of the risky approach of including them in the script (a sketch of that follows after the code), or cycling through a directory structure, but here’s the foundation stone for such tinkering anyway:

HOSTNAME='ftp.server.host'
USER='user'
PSSWD='password'
REP_SRC='source_directory'
REP_DEST='destination_directory'
FILENAME='*'

rm -rf log_file.tmp

cd "${REP_DEST}"

ftp -i -n -v <<EndFTP >>log_file.tmp 2>>log_file.tmp
open ${HOSTNAME}
user ${USER} ${PSSWD}
prompt
cd "${REP_SRC}"
mget "${FILENAME}"
EndFTP

cd ~
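
As for the prompting extension mentioned above, a minimal sketch would be to read the credentials at run time so that nothing sensitive sits in the script; stty is used to stop the password echoing to the terminal:

printf "Username: "
read USER
printf "Password: "
stty -echo
read PSSWD
stty echo
printf "\n"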

Negative logic in Korn shell scripts

16th October 2007

I was looking for a way to express negative logic, that is, doing something when a condition is not satisfied, and found that one way to do it is to do nothing when the condition is satisfied and something when it isn’t. Being used to saying “do something when a condition is false”, this did come as a surprise. It may be that I find another way on my UNIX shell scripting journey. In the meantime, the code below will only create a directory when it doesn’t already exist.

dirname=test
if [[ -d $dirname ]]
then
    :    # the colon operator means do nothing
else
    mkdir "$dirname"
fi
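
As it turns out, the test itself can be negated with the ! operator inside the [[ ]] brackets, which does away with the empty branch; here is a minimal sketch of the same example in that form:

dirname=test
if [[ ! -d $dirname ]]
then
    mkdir "$dirname"
fi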

Checking existence of files and directories on UNIX using shell scripting

23rd April 2007

Having had a UNIX shell script attempt to copy a non-existent file, I decided to take another look at ways to test for the existence of a file. For directory existence checking, I had been testing the return code from the cd command, and I suppose that the ls command might help for files. However, I did find a better way:

if [ -f "$filename" ]
then
    echo "This filename [$filename] exists"
elif [ -d "$dirname" ]
then
    echo "This dirname [$dirname] exists"
else
    echo "Neither [$dirname] nor [$filename] exists"
fi

The -d and -f flags within the evaluation expressions test for the existence of directories and files, respectively. One gotcha is that those spaces within the brackets are important too, but it is a very handy way of doing what I wanted.
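
Applied to the original problem, the same sort of test can guard the copy itself; a minimal sketch, with the variable names being placeholders:

if [ -f "$filename" ]
then
    cp "$filename" "$destination"
else
    echo "Skipping copy: [$filename] does not exist" >&2
fi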

Trying out OpenSolaris

2nd February 2007

Having been programming (mostly in SAS, as it happens) on Sun’s venerable Solaris operating system at work since the start of this year, I saw the chance to try the x86 edition of OpenSolaris in a VMware virtual machine as a good opportunity for advancing my skills.

Prior to this, my exposure to Solaris was when I was at university, and things have moved on a bit since then, not least on the technology side but also in terms of my own skills. In those days, my mindset had been shaped by exposure to Mac OS and Windows with their point-and-click functionality; the fact that the terminals we were using were ancient didn’t make for a positive impression either, as you can see below. The concept of tackling a command line, even one as powerful as that in UNIX, armed with a good book was somehow foreign to me.

View of old Solaris desktop

Mind you, in those pre-Safari days, getting your hands on books not in the university library was an expensive outing for the student finances. Now, armed with years of programming and web development experience, I see the UNIX command line as a powerful tool to be used to the greatest advantage. Years of exposure to Perl and Linux have made the tool a less daunting one for me. Also, the availability of shell scripting makes the Windows batch file language look positively archaic. The default ksh shell (I believe that it is ksh88) in Solaris is not as friendly as it could be, but bash is available on demand, so life isn’t that uncomfortable on the command line.

To date, my experience of OpenSolaris has been brief because I wrecked the installation while trying to sort out an annoying graphics issue that appeared after installing VMware Tools (drivers for various pseudo-devices) on OpenSolaris; I have yet to put things back. The installation procedure is pretty painless for what is a technical operating system. The Community: Tools section of the OpenSolaris website has articles on installation, and installation under VMware is discussed on Developer’s Quarterdeck Log.

As regards a desktop environment, you have a choice between the ubiquitous Gnome and Sun’s own CDE, of which I have seen plenty at work. As it happened, I installed the developer edition, but there are the usual Linux mainstays on the desktop: StarOffice (in place of OpenOffice), GIMP, Mozilla Firefox, etc. One thing that I wasn’t able to sort out was the internet connection, and that may be because ZoneAlarm was blacklisting VMware at the time of installation. All in all, it looked like a far friendlier environment for users than that which I encountered during my early years on UNIX. I must get it back in action and take things on from here…
