Technology Tales

Adventures & experiences in contemporary technology

Command Line Processing of EXIF Image Metadata

8th July 2013

There is a bill making its way through the U.K. parliament at the moment that could reduce the power of copyright when it comes to images placed on the web. The current situation is that anyone who creates an image automatically holds the copyright for it. However, the new legislation will remove that if it becomes law as it stands. As it happens, the Royal Photographic Society is doing what it can to avoid any changes to what we have now. There may be the barrier of due diligence but how many of us take steps to mark our own intellectual property? For one, I have been less than attentive to this and now wonder if there is anything more that I should be doing. Others may copyleft their images but I don't want to find myself unable to share my own photos because another party is claiming rights over them. Watermarking is one option, but I also want to add something to the image metadata too.

That got me wondering about adding metadata to any images that I post online to assert my status as the copyright holder. It may not be perfect but any action is better than doing nothing at all. Given that I don't post photos anywhere that strips EXIF metadata as part of the uploading process, the information should be there for anyone who bothers to check, though there may not be many who do.

Because I also wanted to batch process images, I looked for a command line tool to do the needful and found ExifTool. Being a Perl library, it is cross-platform so you can use it on Linux, Windows and even OS X. To install it on a Debian or Ubuntu based Linux distro, just use the following command:

sudo apt-get install libimage-exiftool-perl

The form of the command that I found useful for adding the actual copyright information is below:

exiftool -P "-copyright=(c) John …" -ext jpg -overwrite_original .

The -P switch preserves the timestamp of the image file, the -overwrite_original one ensures that you don't end up with unwanted backup files, and the final . points the command at the current directory. The copyright message goes within the quotes along with the -copyright option. With a little shell scripting, you can traverse a directory structure and change the metadata for any image files contained in different sub-folders. If you wish to do more than this, there's always the user documentation to be consulted.
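
Below is a minimal sketch of that sort of traversal. The notice text and starting directory are placeholders to be changed to suit and, as the final comment notes, exiftool's own -r switch offers an even shorter route:

#!/bin/bash
# Stamp a copyright notice into every JPEG under a directory tree.
# The notice and the starting directory are placeholders.
NOTICE="(c) Your Name Here"
START_DIR="$HOME/Pictures"

find "$START_DIR" -type d | while read -r folder; do
    exiftool -P -overwrite_original "-copyright=$NOTICE" -ext jpg "$folder"
done

# exiftool can also recurse by itself:
# exiftool -r -P -overwrite_original "-copyright=$NOTICE" -ext jpg "$START_DIR"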

Saving Windows Command Prompt & PowerShell command history to a file for later usage

15th May 2013

It's amazing what ideas Linux gives you that you wouldn't encounter so clearly in the world of Windows. One of these is outputting command line history so that a script can be created from it. In the Windows world, this would be called a batch file. Linux usefully has the history command and it does the needful for taking a snapshot like so:

history > ~/commands.sh

All of the commands stored in a terminal's command history get written to commands.sh in the user's home area. The command for doing the same thing from the Windows command line is not as obvious because it uses the doskey command, which is intended for command line macro writing and execution. Usefully, it has a history option that tells it to output all the commands issued in a command line session. Unless you create a file with them in there, there seems to be no way to store all those commands across sessions, unlike UNIX and Linux. Therefore, a command like the following is a partial solution that is more permanent than using the F7 key on your keyboard:

doskey /history > c:\commands.bat

Windows PowerShell has something similar too, and its Get-History cmdlet even has the aliases history and h. All PowerShell scripts have file extensions of ps1 and the example below follows that scheme:

get-history > c:\commands.ps1

However, I believe that even PowerShell doesn't carry over command history between sessions, though Microsoft are working on adding this useful functionality. They could co-opt Cygwin of course, but that doesn't seem to be their way of going about things.
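
For comparison, this is roughly how things can be arranged on the Linux side: bash will happily keep appending to its history file as you go, so nothing gets lost between sessions. What follows is a minimal sketch for ~/.bashrc and the size values are only examples:

# ~/.bashrc: keep shell history across sessions
shopt -s histappend                            # append to ~/.bash_history rather than overwriting it
export HISTSIZE=10000                          # commands kept in memory (example value)
export HISTFILESIZE=20000                      # commands kept in the history file (example value)
PROMPT_COMMAND="history -a; $PROMPT_COMMAND"   # write each command out as soon as it has run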

Using the Windows Command Line for Security Administration

24th July 2009

While there are point and click tools for the job, being able to set up new user groups, attach them to folders and assign users to them from the command line has major advantages when there are a number to be set up, and logs of execution can be retained too. In light of this, it seems a shame that terse documentation, along with the difficulty of tracking down answers to any questions using Google or whatever happens to be your search engine of choice, makes it less easy to discern what commands need to be run. This is where a book would help, but the whole experience is in direct contrast to the community of information providers that is the Linux user community, with Ubuntu being a particular shining example. Saying that, the Windows help system is not so bad once you can track down what you need. For instance, knowing that you need commands like CACLS and NET LOCALGROUP, the ones that have been doing the work for me, it offers useful information quickly enough. To illustrate the usefulness of the aforementioned commands, here are a few scenarios.

Creating a new group:

net localgroup [name of new group] /comment:"[more verbose description of new group]" /add

Add a group to a folder:

cacls [folder address] /t /e /p [name of group]

The /t switch gets cacls to apply changes to the ACL for the specified folder and all its subfolders, recursive action in other words, while the /e specifies ACL editing rather than its replacement and /p induces replacement of permissions for a given user or group. Using :n, :f, :c or :r directly after the name of a specified user or group assigns no, full, change (write) or read access, respectively. Replacing /p with /r revokes access and leaving off the :n/:f/:c/:r will remove the group or user from the folder.

Add a user to a group:

net localgroup [name of group] [user name (with domain name if on a network)] /add

In addition to NET LOCALGROUP, there is also NET GROUP for wider network operations, something that I don't have cause to do. Casting the thinking net even wider, I suspect that VB scripting and its ability to tweak Windows Management Instrumentation (WMI) might offer more functionality than what is above (PowerShell also comes to mind while we are on the subject), but I am sharing what has been helping me and it can be hard to find if you don't know where to look.

ImageMagick and Ubuntu 9.04

5th May 2009

Using a command line tool like ImageMagick for image processing may sound like a really counter-intuitive thing to do, but there's no need to do everything on a case by case interactive basis. Image resizing and format conversion come to mind here. Helper programs are used behind the scenes too, with Ghostscript being used to handle PostScript files, for example.

The subject of helper programs brings me to an issue that has hampered me recently. While I am aware that there are tools like F-Spot available, I am also wont to use a combination of shell scripting (BASH & KSH), Perl and ImageMagick for organising my digital photos. My preference for using Raw camera files (DNG & CRW) means that ImageMagick cannot access these without a little helper. In the case of Ubuntu, it’s UFRaw. However, Jaunty Jackalope appears to have seen UFRaw updated to a version that is incompatible with the included version of ImageMagick (6.4.5 as opposed to 3.5.2 at the time of writing). The result is that the command issued by ImageMagick to UFRaw -- issue the command man ufraw-batch to see the details -- is not accepted by the included version of the latter, 0.15 if you’re interested. It seems that an older release of UFRaw accepted the output device ppm16 (16-bit PPM files) but this should now be specified as ppm for the output device and 16 for the output depth. In a nutshell, where the parameter output-type did the lot, you now need both output-type and output-depth.
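
To put that in more concrete terms, producing a 16-bit PPM by hand went from one option to a pair of them, along the lines of the following; I am assuming that the command line spellings are --out-type and --out-depth as per the man page, and the file names are only examples:

# Older UFRaw releases: a single option did the lot
ufraw-batch --out-type=ppm16 --output=IMG_1234.ppm IMG_1234.dng

# Newer releases: output type and bit depth are specified separately
ufraw-batch --out-type=ppm --out-depth=16 --output=IMG_1234.ppm IMG_1234.dng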

I thought of decoupling things by using UFRaw to create 16-bit PPM files for processing by ImageMagick but to no avail. The identify command wouldn't return the date on which the image was taken. I even changed the type to 8-bit JPEGs with added EXIF information but no progress was made. In the end, a mad plan came to mind: creating a VirtualBox VM running Debian. The logic was that if Debian deserves its reputation for solidity, dependencies like ImageMagick and UFRaw shouldn't be broken, and I wasn't wrong. To make it fly though, I needed to see if I could get Guest Additions installed on Debian. Out of the box, the supported kernel version must be at least 2.6.27 and Debian's is 2.6.26, so additional work was on the cards. First, GCC, Make and the correct kernel header files need to be installed. Once those are in place, the installation works smoothly and a restart sets the goodies in motion. To make the necessary shared folder available, a command like the following was executed:

mount -t vboxsf [Shared Folder name] [mount point]

Once that deed was done and ImageMagick installed, the processing that I have been doing for new DSLR images was reinstated. Ironically, Debian's version of ImageMagick, 6.3.7, is even older than Ubuntu's but it works and that's the main thing. There is an Ubuntu bug report for this on Launchpad so I hope that it gets fixed at some point in the near future. However, that may mean awaiting 9.10 or Karmic Koala, so I'm glad to have the workaround in the meantime.
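
For reference, the sort of check that kept failing under Ubuntu looks like the line below and it behaves as expected in the Debian guest; the file name is only an example:

# Pull the capture date from an image's EXIF block with ImageMagick;
# this prints something like 2009:05:03 14:22:10 when the tag is present.
identify -format "%[EXIF:DateTimeOriginal]\n" IMG_1234.jpg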

Investigating Textpattern

9th March 2009

With the profusion of Content Management Systems out there, open source and otherwise, my curiosity has been aroused for a while now. In fact, Automattic’s aspirations for WordPress (the engine powering this blog) now seem to go beyond blogging and include wider CMS-style usage. Some may even have put the thing to those kinds of uses but I am of the opinion that it has a way to go yet before it can put itself on a par with the likes of Drupal and Joomla!.

Speaking of Drupal, I decided to give it a go a while back and came away with the impression that it's a platform for an entire website. At the time, I was attracted by the idea of having one part of a website on Drupal and another using WordPress, but the complexity of the CSS in the Drupal template thwarted my efforts and I desisted. The heavy connection between the template and the back end cut down on the level of flexibility too. That mix of different platforms might seem odd in architectural terms, but my main website also had a custom PHP/MySQL driven photo gallery, and migrating everything into Drupal wasn't something that I was planning. In hindsight, I might have been trying to get Drupal to perform a role for which it was never meant, so I am not holding its non-fulfillment of my requirements against it. Drupal may have changed since I last looked at it but I decided to give an alternative a go regardless.

Towards the end of last year, I began to look at Textpattern (otherwise known as Txp) in the same vein and it worked well enough after a little effort that I was able to replace what was once a visitor dossier with a set of travel jottings. In some respects, Textpattern might feel less polished when you start to compare it with alternatives like WordPress or Drupal but the inherent flexibility of its design leaves a positive impression. In short, I was happy to see that it allowed me to achieve what I wanted to do.

If I remember correctly, Textpattern's default configuration is that of a blog and it can be used for that purpose. So, I got in some content and started to morph the thing into what I had in mind. My ideas weren't entirely developed, so some of that was going on while I went about bending Txp to my will. Most of that involved tinkering in the Presentation part of the Txp interface though. It differs from WordPress in that design information like (X)HTML templates and CSS is stored in the database rather than in the file system à la WP. Txp also has its own tag language and, though it contains conditional tags, I find that encasing PHP in <txp:php></txp:php> tags is a more succinct way of doing things; only pure PHP code can be used in this way and not a mixture of such in <?php ?> tags and (X)HTML. A look at the tool's documentation together with perusal of Apress' Textpattern Solutions got me going in this new world (it was thus for me, anyway). The mainstay of the template system is the Page and each Section can use a different Page. Each Page can share components and, in Txp, these get called Forms. These are included in a Page using tags of the form <txp:output_form form="form1" />. Style information is edited in another section and you can have several style sheets too.

The Txp Presentation system is made up of Sections, Pages, Forms and Styles. The first of these might appear to be in the wrong place, since being under the Content tab would seem more appropriate, but the ability to attach different page templates to different sections places their configuration where you find it in Textpattern, and the ability to show or hide sections might have something to do with it too. As it happens, I have used the same template for all bar the front page of the site and got it to display single or multiple articles as appropriate using the Category system. It may be a hack but it appears to work well in practice. Being able to make a page template work in the way that you require really offers a great amount of flexibility and I have gone with one sidebar rather than two as found in the default setup.

Txp also has a facility to add plugins (look in the Admin section of the UI) and this is very different from WordPress in that installation involves the loading of an encoded text file, probably for the sake of maintaining the security and integrity of your installation. I added the navigation facility for my sidebar and breadcrumb links in this manner, and back end stuff like the TinyMCE editor and Akismet came as plugins too. There may not be as many of these for Textpattern but the ones that I found were enough to fulfill my needs. If there are plugin configuration pages in the administration interface, you will find these under the Extensions tab.

To get the content in, I went with the more laborious copy, paste and amend route. Given that I was coming from the plain PHP/XHTML way of doing things, the import functionality was never going to do much for me with its focus on Movable Type, WordPress, Blogger and b2. The fact that you can only import content into a particular section may displease some too. Peculiarly, there is no easy facility for Textpattern-to-Textpattern transfer apart from doing a MySQL database copy. Some alternatives to this were suggested but none seemed to work as well as the basic MySQL route. TinyMCE made editing easier once I went and turned off Textile processing of the article text. This was done on a case by case basis because I didn't want to have to deal with any unintended consequences arising from turning it off at a global level.

While on the subject of content, this is also the part of the interface where you manage files and graphics along with administering things like comments, categories and links (think blogroll from WordPress). Of these, it is the comment and link facilities that I don't use, and I have even turned comments off in the Txp preferences. I use categories to bundle together similar articles for appearance on the same page and am getting to use the image and file management side of things as time goes on.

All in all, it seems to work well even if I wouldn't recommend it to many of those at whom WordPress is aimed. My reason for saying that is that it is a technical tool and is used best if you are prepared to get your hands dirtier with code cutting than other alternatives require. I, for one, don't mind that at all because working in that manner might actually suit me. Nevertheless, not all users of the system need to have the same level of knowledge or access, and it is possible to set up users with different permissions to limit their exposure to the innards of the administration. In line with Textpattern's being a publishing tool, you get roles such as Publisher (administrator in other platforms), Managing Editor, Copy Editor, Staff Writer, Freelancer, Designer and None. Those names may mean more to others but I have yet to check out what those access levels entail because I use it on a single user basis.

There may be omissions from Txp, like graphical presentation of visitor statistics in place of the listings that are there now, and the administration interface might do with a little polish, but it does what I want from it and that makes those other considerations less important. That more cut down feel makes it that little bit more useful in my view and the fact that I have created A Wanderer's Miscellany may help to prove the point. You might even care to take a look at it to see what can be done and I am sure that it isn't even close to exhausting the talents of Textpattern. I can only hope that I have done justice to it in this post.

Recursive FTP with the command line

6th August 2008

Here’s a piece of Linux/UNIX shell scripting code that will do a recursive FTP refresh of a website for you:

lftp <<EndFTP >>~/Tmp/log_file.tmp 2>>~/Tmp/log_file.tmp
open ${HOSTNAME}
user ${USER} ${PSSWD}
mirror -R -vvv "${REP_SRC}" "${REP_DEST}"
EndFTP

When my normal FTP scripting approach left me with a broken WordPress installation and an invalid ticket in the project's Trac system that I had to close, I started looking for a more robust way of achieving website updates, and that's what led me to seek out the options available for FTP transfers that explicitly involve directory recursion. The key pieces in the code above are the use of lftp in place of ftp, my more usual tool for the job, and the invocation of the mirror command that comes with lftp. The -R switch ensures that file transfer is from local to remote (vice versa is the default) and -vvv turns on maximum verbosity, a very useful thing when you find that it takes longer than the more usual means. It's all much slicker than writing your own script to do the back-work of ploughing through the directory structure and ensuring that the recursive transfers take place. Saying that, it is possible to have a one-line variant of the above, as sketched below, but the way that I have set things up might be more familiar to users of ftp.
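
For completeness, that one-line variant might look something like the following, assuming the same environment variables are in place; the quit at the end stops lftp from dropping into its interactive prompt:

lftp -u "${USER},${PSSWD}" -e "mirror -R -vvv '${REP_SRC}' '${REP_DEST}'; quit" "${HOSTNAME}" >>~/Tmp/log_file.tmp 2>&1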

JavaScript: write it yourself or use a library?

3rd July 2008

I must admit that I have never been a great fan of JavaScript. For one thing, its need to interact with browser objects places you at the mercy of the purveyors of such pieces of software. Debugging is another fine art that can seem opaque to the uninitiated, since the amount and quality of the logging is determined by an interpreter that isn't provided by the language's overseers. All in all, it seems to present a steep and obstacle-strewn learning curve to newcomers. As it happens, I have always found server side scripting languages like PHP and Perl to be more to my taste and I have no aversion at all to writing SQL.

In the late 1990’s when I was still using free web hosting, JavaScript probably was the best option for my then new online photo gallery. Whatever was the truth, it certainly was the way that I went. Learning Java or Flash might have been useful but I never managed to devote sufficient time to the task so JavaScript turned out to be the way forward until I got a taste of server side scripting. Moving to paid hosting allowed for that to develop and the JavaScript option took a back seat.

Based on my experience of the browser wars and working with JavaScript throughout their existence, I was more than a little surprised at the buzz surrounding AJAX. Ploughing part of the way through WROX’s Beginning AJAX did nothing to sell the technology to me; it came across as a very dry jargon-blighted read. Nevertheless, I do see the advantages of web applications being as responsive as their desktop equivalents but AJAX doesn’t always guarantee this; as someone that has seen such applications crawling on IE6, I can certainly vouch for this. In fact, I suspect that may be behind the appearance of technologies such as AIR and Silverlight so JavaScript may get usurped yet again, just like my move to a photo gallery powered on the server side.

Even with these concerns, using JavaScript to add a spot more interactivity is never a bad thing even if it can be overdone, hence the speed problems that I have witnessed. In fact, I have been known to use DOM scripting but I need to have the use in mind before I can experiment with a technology; I cannot do it the other way around. Nevertheless, I am keen to see what JavaScript libraries such as jQuery and Prototype might have to offer (both have been used in WordPress). I have happened on their respective websites so they might make good places to start and who knows where my curiosity might take me?

On numeric for loops in Korn shell scripting

18th October 2007

The time-honoured syntax for a for loop in a UNIX script is what you see below, and that is what works with the default shell in Sun's Solaris UNIX operating system, ksh88.

for i in 1 2 3 4 5 6 7 8 9 10
do
        if [[ -d dir$i ]]
        then
            :
        else
            mkdir dir$i
        fi
done

There is a much nicer syntax supported since the advent of ksh93. It follows the C language conventions found in all sorts of places like Java, Perl, PHP and so on. Here is an example:

for (( i=1; i<11; i++ ))
do
        if [[ -d dir$i ]]
        then
            :
        else
            mkdir dir$i
        fi
done
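
As an aside, where a shell with brace expansion is to hand, there is no need to type out the list of numbers at all. The sketch below is written with bash in mind; whether a given ksh93 build accepts the {1..10} form depends on its version and settings:

for i in {1..10}
do
        if [[ ! -d dir$i ]]
        then
            mkdir dir$i
        fi
done
# shorter still in bash: mkdir -p dir{1..10}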

IE6 and JavaScript performance

22nd June 2007

Having been exposed to an application at work that uses a lot of JavaScript, I fully appreciate what some mean when they talk about IE6's inefficient handling of JavaScript. After seeing a web page taking an age to reload and your CPU taking a hammering because of JavaScript processing, the penny does tend to drop… Needless to say, this very much impacts the world of AJAX-driven web applications with their heavy dependence on client-side JavaScript. While IE7 does come to the rescue, there remain plenty of IE6 users still out there and this is reflected in website statistics. It points to a certain level of inertia in the browser market that not only afflicts the uptake of IE7 but also the likes of Mozilla, Opera and Safari. It also means that anyone developing AJAX applications very much needs to continue testing in IE6, especially if the product of their labours is for wider public use.

An example of such an application is Zimbra, an open source web application for messaging and collaboration, and the people behind it have generously shared the results of their browser performance benchmarking. They did comparisons of IE6 vs. IE7 and Firefox 2 vs. IE7. IE6 easily came out as the worst in these while Firefox 2 was the best.

I suppose that the next question to be asked centres around the type of code that is processed inefficiently by IE6. I wouldn't be at all surprised if a list emerged but here's one: using Microsoft's proprietary innerHTML property to update the DOM for a web page. Having a quick trawl on Google, this came up for mention as a cause of memory leaks. It is also a Microsoft innovation that never got taken up by those overseeing web standards, hardly a surprise since a spot of DOM scripting achieves the same end. It may be faster to code than any alternatives, and it does have some support from other browsers, but it does seem to have got a bad name and so should be avoided if possible. That said, it would be interesting to see a performance comparison between innerHTML and DOM methods in IE6.

A web development toolbox

23rd March 2007

Having been on a web-building journey from Geocities to having a website with my own domain hosted by Fasthosts, it should come as no surprise that I have encountered a number of tools and technologies over this time and that my choices and knowledge have evolved too. I’ll muse over the technologies first before going on to the tools that I use.

Technologies

XHTML

When I started building websites, HTML 4 was not long in existence and I devoured most if not all of Elizabeth Castro’s Peachpit Visual Quickstart guide to the language within a weekend. Having previously used fairly primitive WYSIWYG tools like Netscape Composer and Claris Home Page, it was an empowering experience and the first edition (it is now on its third) of Jennifer Niederst Robbins’ Web Design in a Nutshell took things much further, becoming something of a bible for a number of years.

When it first appeared, XHTML 1.0 wasn't a major change from HTML 4, but its stricter, more XML-compliant syntax was meant to point the way to the future, and semantic markup was at its heart at least as much as it was for HTML 4. XHTML 2.0 is on the horizon and, after the modular approach of XHTML 1.1 (which I have never used), it will be interesting to see how it develops. Nevertheless, there is a surprising development in that some people are musing over the idea of having an HTML 5. Let's hope that the (X)HTML apple cart doesn't get completely overturned after some years of relative stability. I still bear scars from the browser wars that raged in the 1990s and don't want to see standards wars supplanting the relative peace that we have now. That said, I don't mind peaceful progression.

CSS

It only seems to be coming into its own in the last few years and it is truly an amazing technology in spite of the hobbles that MSIE places on our ambitions. CSS Zen Garden has been a major source of ideas; I wouldn't have been able to customise this blog as much as I have without it. I was an early adopter of the technology and got burnt by inconsistent browser support; Netscape 4 was the proverbial bête noire back then, fulfilling the role that MSIE plays today. In those days, it was the idea of controlling text display and element backgrounds from a single place that appealed. Since then, I have progressed to using CSS to replace table-based layouts and to control element positioning. It can do more…

JavaScript

Having had a JavaScript-powered photo gallery before my current Perl-driven one, I can say that I have definitely sampled this ever-pervasive scripting language. Being a client-side language rather than a server-side one, it does place you rather at the mercy of the browser purveyors and it never ceases to amaze me that there is a buzz around AJAX because of this. In fact, the abundance of AJAX cross-browser function libraries is testimony to the need for browser-specific code. Despite my preferences for server-side scripting, I still find a use for JavaScript and its main use for me these days is to dynamically control CSS elements to do such things as control the height of a page element or whether it is shown or not. Apparently, CSS may get some dynamic capabilities in the future and reduce my dependence on JavaScript. In the meantime, Jeremy Keith’s DOM Scripting (Friends of Ed) will prove as much of an asset as it has done.

XML

These days, a lot of the raw data underlying my personal website is stored in XML. I did try to dynamically transform the display of the XML into something meaningful with CSS and XSLT when I first scaled its dizzy heights but I soon resorted to other techniques. Browser support and the complexity of what I required were the major contributors to this. The new strategy involved two different approaches. The first was to create PHP/XHTML pages from the precursor XML offline and this is how I generate the website’s directory pages. The other one is to process the XML as text to dynamically supply an XHTML page as the user visits it; this is the way that the photo gallery works.

Perl

This still powers all of my photo gallery. While thoughts of changing it all to PHP linger, there is a certain something about the Perl language that keeps it there. I suppose it is that PHP is entangled in the HTML while Perl encases the whole business, and I am reasonably familiar with its syntax these days, which is why it still does a lot of the data processing grunt work that I need.

PHP

PHP is everywhere these days, though it doesn't attract quite the level of hype that used to be the case. It still appears with its sidekick MySQL in many website applications. Blogging software such as WordPress and content management systems like Drupal, Mambo and Joomla! wouldn't exist without the pair. It appears on my website as the glue that holds my visitor directories together and is the processing engine of my WordPress blog. And if I ever add a Drupal element to the site, by no means a foregone conclusion though I am spending a lot of time with it at the moment, PHP will continue its presence in my website scripting as it powers that too.

Applications

Macromedia HomeSite

I have a liking for hand coding, so this does most of what I need. When Macromedia (itself since taken over by Adobe, of course) took over Allaire, HomeSite sadly lost its WYSIWYG capability, but the application still soldiers on even though Dreamweaver offers a lot to code cutters these days. Nevertheless, it does have certain advantages over Dreamweaver: it is a fleeter beast to start up and colour codes Perl syntax.

Macromedia Dreamweaver

There was a time when Dreamweaver was solely a tool for visual web page development, but the advent of Dreamweaver UltraDev added server-side development capabilities to the Dreamweaver family. These days, there is only one Dreamweaver version, but UltraDev’s capabilities still live on in the latest version and I would not be surprised if they were taken further in these database-driven times.

Nowadays, Dreamweaver isn’t an application where I spend a great deal of time. In former times, when my site was made up of static HTML pages, I used Dreamweaver a lot even if its rendering capabilities were a step behind the then-current browser versions. I suppose that it didn’t fit the way in which I worked, but its template-driven workflow would have been a boon back then.

However, my move from a static site to a dynamic one, starting with my photo gallery, has meant that I haven't used it as much since then. That said, with my increasing use of PHP/MySQL components on my site, its server-side abilities could yet get the level of investigation that they deserve.

Altova XMLSpy Professional

Adding MySQL databases to my web hosting costs money, not a lot but it could be spent on other (more important?) things. Hence, I use XML as the data store for my photo gallery and XML files are pre-processed into XHTML/PHP pages for my visitor directories prior to uploading onto the server.

I use XMLSpy to edit and manage the XML files that I use: its ability to view XML in grid format is a killer feature as far as I am concerned, and XML validation also proves very useful, particularly with regard to ensuring that DTDs and XML files are in step and that XSLT files are coded correctly. There are other features that I need to explore and that would also take my knowledge of XML further to boot, not at all a bad thing.

Saxon

For processing XML into another file format such as XHTML, you need an XSLT processor and I use the free version of Saxon to do the needful; Saxonica offers commercial versions of it. There is, I believe, a processor in XMLSpy but I don't use it because Saxon's command line interface fits better into my workflow. This is a Perl-driven process where XML files are read and XSLT files, one per XML file, are built before both are fed to Saxon for transforming into XHTML/PHP files. It all works smoothly and updating the XML inputs is all that is required.
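
To give a flavour of that command line fit, a single transformation boils down to something like the line below; the jar and file names are only examples and the -s/-xsl/-o option form is the one used by the Saxon releases that I have seen, so older ones may take their arguments in a different order:

# Transform one XML file into an XHTML/PHP page using its matching stylesheet
# (jar and file names are examples only)
java -jar saxon9he.jar -s:gallery.xml -xsl:gallery.xsl -o:gallery.php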

AceFTP

If I were looking for an FTP client now, it would be FileZilla but AceFTP has served me well over the last few years and it looks as if that will continue. It does have some extra features over FileZilla: transfers between remote sites, and scheduling, for example. I have yet to use either but they look valuable.

Hutmil

In bygone days when I had loads of static HTML files, making changes was a bit of a chore if they affected every single file. An example is changing the year on the copyright message on the page footers. Hutmil, which I found on a magazine cover-mounted disc, was a great time saver in those days. Today, I achieve this by putting this information into a single file and getting Perl or PHP to import that when building the page. The same “define once, use anywhere” approach underlies CSS as well and scripting very usefully allows you to take that into the XHTML domain.

Apache

Apache is ubiquitous these days and both the online and offline versions of my site are powered by it. It does require some configuration but it is a very powerful piece of kit. The introduction of 2.2.x meant a big change in the way that configuration files were modularised: while most things were contained in a single file for 2.0.x, the settings are broken up into different files in 2.2.x and it can take a while to find things again. Without having it on my home PC, I would not be able to use Perl, PHP or MySQL. Apart from this, I especially like its virtual site capability, which is very useful for offline development; a sketch of such a virtual site follows.
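
As a rough sketch of what one of those virtual sites involves in 2.2-era terms, the block below is all that is needed, with the host name and document root being placeholders; it goes into httpd.conf or a file included from it, and the made-up host name also needs an entry in the hosts file pointing it at 127.0.0.1:

# Example name-based virtual host for offline development (placeholder names and paths)
NameVirtualHost *:80

<VirtualHost *:80>
    ServerName mysite.local
    DocumentRoot "/path/to/mysite"
    <Directory "/path/to/mysite">
        Order allow,deny
        Allow from all
    </Directory>
</VirtualHost>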

WordPress

My hosting supplier offers blogs on Blogware, but that didn't offer the level of configuration that I would have liked, and the same is probably true of any hosted blog service. I can't speak for Blogger but WordPress.com does have its restrictions too. To make my hillwalking blog fit in with the appearance of my photo gallery, I popped over to WordPress.org to download WordPress so that I could host a blog myself and have maximum control over its appearance. WordPress supports themes, so I created my own and got my blog pages looking as if they are part of my website, rather than looking like something that was bolted on. Now that I think of it, what about WordPress.com supporting user-created themes? I suppose that there is the worry of insecure PHP code, but what about it?

MySQL

I am in two minds on whether this is a technology or a tool. SQL certainly would be a technology standard but I am not so clear on what MySQL would be. In any case, I have classed it as a tool and a very useful one at that. It is the linchpin for my WordPress blogs and, if I go for a content management system like Drupal, its role would surely grow. While I do have a lot of experience with using SAS SQL and this helps me to deal with other varieties, there is still a learning curve with MySQL that gets me heading for a good book, and Kofler's The Definitive Guide to MySQL 5 (Apress) seems to perform more than adequately in this endeavour.

Paint Shop Pro

As someone who hosts an online photo gallery, it won't come as a surprise that I have had exposure to image editors. Despite various other flirtations, Paint Shop Pro has been my tool of choice over the years, but it is now set to be usurped by a member of Adobe's Photoshop family. Paint Shop Pro does have books devoted to it, but it seems that Photoshop gets better coverage and I feel that my image processing needs to be taken up a gear, hence the potential move to Photoshop.
