Technology Tales

Adventures & experiences in contemporary technology

Turning off seccomp sandbox in vsftpd

21st September 2013

Within the last week, I set up a virtual web server using Arch Linux to satisfy my own curiosity, since the DIY nature of Arch means that you can build up exactly what you need without having any real constraints put upon you. What didn’t surprise me was that it took more work than the virtual server that I created using Ubuntu Server, but I didn’t expect ProFTPD to be missing from the main repositories. The package can be found in the AUR, but I didn’t fancy the prospect of dragging more work onto myself, so I went with vsftpd (Very Secure FTP Daemon) instead. In contrast to ProFTPD, this is available in the standard repositories, and there is a guide to its use in the Arch user documentation.

However, while vsftpd worked well just after installation, connections to the virtual FTP server soon began to fail, with FileZilla issuing only uninformative messages. In fact, it was the standard command line FTP client on my Ubuntu machine that was more revealing. It issued the following message, which led me to the cause after my engaging the services of Google:

500 OOPS: priv_sock_get_cmd

With version 3.0 of vsftpd, a new feature was introduced, and it appears that this has caused problems for a few people. That feature is seccomp sandboxing, and it can be turned off by adding the following line to /etc/vsftpd.conf:

seccomp_sandbox=NO
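
For the edit to take effect, the FTP service needs restarting. On a systemd-based distribution like Arch, something along these lines should do it, assuming that the service unit is named vsftpd:

sudo systemctl restart vsftpd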

That solved my problem, and version 3.0.2 of vsftpd should address the issue with seccomp sandboxing anyway. Should that fix not prove as robust as it could be because seccomp isn’t supported by the Linux kernel that you are using, turning off the new feature still needs to remain an option though.

Creating a test web server using Ubuntu Server 13.04 and VirtualBox

1st September 2013

Having seen Linux Format cover tools like Vagrant and Puppet that manage virtual machines, I have been attracted by the prospect of a virtual web server running on my own PC. Certainly, having the LAMP software stack in a VM means that the corresponding tools don’t need to be added to a host system should its operating system need a fresh installation.

As intriguing as tools like Vagrant may be, I decided that I needed to learn a bit more about getting server instances set up in VirtualBox anyway. Thus, I went and downloaded the latest version of Ubuntu Server and gave that a go. One lesson that I learned was that Bridged Networking needs to be added to the VM before installation of the operating system, unless you fancy overcoming the challenge of getting Ubuntu Server to recognise an altered or additional network interface. In my case, I added an extra adapter for the Bridged Networking and left the original in place as NAT. The reason for having Bridged Networking set up is that it allows access to the virtual web server from the host once you know the IP address, and that information can be obtained by executing the ifconfig command on the virtual machine.
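
As a sketch of that last step, the command run on the VM itself would be something like the following; the interface name eth0 is an assumption and yours may differ:

ifconfig eth0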

With the networking sorted, the next step was to install the 64-bit edition of Ubuntu Server. Unlike its desktop counterpart, this is all driven by text menus, but it remains fairly intuitive and there is hardly anything there that you wouldn’t see with another Linux distribution. A useful touch is the menu for selecting the types of server services that you’d like installed. From this, I chose the web server and SSH options, and I seem to remember that there was a database server option too. If there had been an FTP server option, I would have chosen that as well, but it was no ordeal to add ProFTPD later on anyway.

All of this set-up was done through the VirtualBox GUI, just to keep life more straightforward. Even so, I only selected 12 MB of video memory and was tempted to cut the overall memory back from 512 MB, but left things be for now. However, what I have begun to do is start and stop the virtual machine from the command line since servers are headless operations anyway. With SSH enabled, there is little need to have the VirtualBox GUI going. The command for starting the server is below:

VBoxManage startvm "Ubuntu Server" --type=headless

There is a VBoxHeadless command for the same end too, but VBoxManage does what I need. The startvm option is what tells VBoxManage to start the server, and the virtual machine’s name is enclosed in quotes. The --type=headless switch ensures that no window pops up. To stop the virtual web server cleanly, a command like the following is needed:

VBoxManage controlvm "Ubuntu Server" acpipowerbutton

Again, the VBoxManage command gets used, and the acpipowerbutton option ensures that a clean shutdown is performed; not doing so results in the server not fully starting up the next time, according to my experiences thus far. Getting the virtual web server to start and stop along with the host machine itself would be the obvious next step, but this looks more complex, so I plan to leave things a while before trying that experiment.
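
Before powering down the host, it is worth checking that the guest really has shut down; the following command lists any virtual machines that are still running:

VBoxManage list runningvms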

Adding Microsoft Core Fonts to Fedora 19

6th July 2013

While I have a previous posting from 2009 that discusses adding Microsoft’s Core Fonts to the then current version of Fedora, it struck me that I hadn’t laid out the series of commands that were used. Instead, I referred to an external and unofficial Fedora FAQ. That’s still there, but I also felt that I was leaving things a little to chance, given how websites can disappear quite suddenly.

Even after nearly four years, it still amazes me that you cannot install Microsoft’s Core Fonts in Fedora as you would in Ubuntu, Linux Mint or even Debian. Therefore, the following series of steps is as necessary now as it was then.

The first step is to add a number of precursor applications such as wget for command line file downloading from websites, cabextract for extracting the contents of Windows CAB files, rpmbuild for creating RPM installers, and the X font server (xfs) that chkfontpath needs:

sudo yum -y install rpm-build cabextract ttmkfdir wget xfs

Here, I have gone with terminal commands that use sudo, but you could become the superuser (root) for all of this, and there are those who believe you should. The -y switch tells yum to go ahead without prompting you for permission before it does any installations. The next step is to download the specification file for the Microsoft fonts package with wget:

sudo wget http://corefonts.sourceforge.net/msttcorefonts-2.0-1.spec

Once that is done, you need to install the chkfontpath package because the RPM for the fonts cannot be built without it:

sudo rpm -ivh http://dl.atrpms.net/all/chkfontpath

Once that is in place, you are ready to create the RPM file using this command:

sudo rpmbuild -ba msttcorefonts-2.0-1.spec

After the RPM has been created, it is time to install it:

sudo yum install --nogpgcheck ~/rpmbuild/RPMS/noarch/msttcorefonts-2.0-1.noarch.rpm

When installation has completed, the process is done. Because I used sudo, all of this happened in my own home area, so there was a need for some housekeeping afterwards. If you did it by becoming the root user, the files would be in root’s home area instead, and that’s the scenario in the online FAQ.
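
In my case, that housekeeping amounted to little more than deleting the RPM build tree and the downloaded spec file, along these lines (paths as per the defaults above; sudo may be needed if any of the files ended up owned by root):

rm -rf ~/rpmbuild
rm msttcorefonts-2.0-1.spec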

A need to update graphics hardware

16th June 2013

Not being a gaming enthusiast, having to upgrade graphics cards in PCs is not something that I do very often or even rate as a priority. However, two PCs in my possession have had that very piece of hardware upgraded on them, and it’s not because anything was broken either. My backup machine has seen quite a few Linux distros on there since I built it nearly four years ago. The motherboard is an ASRock K10N78 that I sourced from MicroDirect, and it has an onboard NVIDIA graphics chip that has performed well, if not spectacularly. One glitch that always existed was less than optimal text rendering in web browsers, but that never was enough to get me to add a graphics card to the machine.

More recently, I ran into trouble with Sabayon 13.04, with only the 2D variant of the Cinnamon desktop environment working on it, and things became totally non-functional when a full re-installation of the GNOME edition was attempted. Everything went fine until I added the latest updates to the system, when a reboot revealed that it was impossible to boot into a desktop environment. Some will relish this as a challenge, but I need to admit that I am not one of those. In fact, I tried out two Arch-based distros on the same PC and got the same results following a system update on each. So, my explorations of Antergos and Manjaro have to continue in virtual machines instead.

To get a working system, I gave Linux Mint 15 Cinnamon a go, and that worked a treat. However, I couldn’t ignore that the cutting edge distros that I tried before it all took exception to the onboard NVIDIA graphics. systemd has been implemented in all of these, and it seems reasonable to think that it is coming to Linux Mint at some stage in the future, so I went about getting a graphics card to add to the machine. Having had good experiences with ATI’s Radeon in the past, I stuck with it even though it now is in the hands of AMD. Not being that fussed so long as there was Linux driver support, I picked up a Radeon HD 6450 card from PC World. Adding it to the PC was a simple matter of switching off the machine, slotting in the card, closing it up and powering it on again. Only later on did I set the BIOS to look for PCI Express graphics before anything else, and I could have got away without doing that. Then, I made use of the Additional Drivers applet in the Linux Mint settings panel to add the proprietary driver before restarting the machine to see if there were any visual benefits. To sort out the web browser font rendering, I used the Fonts applet in the same settings panel and selected full RGBA hinting. The improvement was unmissable, if still not like the appearance of fonts on my main machine. Overall, there had been an improvement and a spot of future-proofing too.

That tinkering with the standby machine got me wondering about what I had in my main PC. As well as onboard Radeon graphics, it also gained a Radeon HD 4650 card, for which 3D support wasn’t being made available to VMware Player by Ubuntu GNOME 12.10 or 13.04, and VMware wasn’t happy about this when a virtual machine was set to have 3D support. Adding the latest fglrx driver only ensured that I got a command line instead of a graphical interface. Issuing one of the following commands and rebooting was the only remedy:

sudo apt-get remove fglrx

sudo apt-get remove fglrx-updates
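
If you are unsure which of the two packages is on your system, querying dpkg before picking the removal command is a quick check:

dpkg -l | grep fglrx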

Looking at the AMD website revealed that they no longer support the 2000, 3000 or 4000 series of Radeon cards with their latest Catalyst driver, and the last version that did support them would not install on my machine since it was built for version 3.4.x of the Linux kernel. A new graphics card then was in order if I wanted 3D graphics in VMware VMs, and both GNOME and Cinnamon appear to need this capability. Another ASUS card, a Radeon HD 6670, duly was acquired and installed in a manner similar to the Radeon HD 6450 in the standby PC. Apart from not needing to alter the font rendering (there is a Fonts tab in GNOME Tweak Tool where this can be set), the only real difference was adding the Jockey software to my main PC for installation of the proprietary Radeon driver. The following command does this:

sudo apt-get install jockey-kde

When that was done, I issued the jockey-kde command and selected the first entry on the list. The machine worked as it should on restarting, apart from an AMD message in the bottom right-hand corner bemoaning unrecognised hardware. There had been two entries on that Jockey list with exactly the same name, so it was time to select the second of these and see how it went. On restarting, the incompatibility message had gone and all was well. VMware even started virtual machines with 3D support without any messages, so the upgrade did the needful there.

Hearing of someone doing two PC graphics card upgrades in a weekend may make you see them as an enthusiast, but my disinterest in computer gaming belies this. Maybe it highlights that Linux operating systems need 3D acceleration more than might be expected. The Cinnamon desktop environment now issues messages if it is operating in 2D mode with software 3D rendering, and GNOME always had the tendency to fall back to classic mode, as it had been doing when Sabayon was installed on my standby PC. However, there remain cases where Linux can rejuvenate older hardware: I installed Lubuntu onto a machine with ten-year-old technology (an 1100 MHz Athlon CPU, 1 GB of RAM and 60 GB of hard drive space, in a case dating from 1998) and it works surprisingly well too.

It seems that having fancier desktop environments like GNOME Shell and Cinnamon means having the hardware on which they can run. For a while, I have been tempted by the possibility of a new PC, since even my main machine is not far from four years old either. However, I also spied a pre-built CPU, motherboard and RAM bundle featuring an Intel Core i5-4670 CPU, 8 GB of Corsair Vengeance Pro Blue memory and a Gigabyte Z87-HD3 ATX motherboard, with a heatsink and fan for the CPU, for around £420. Even for someone who has used AMD CPUs since 1998, that does look tempting, but I’ll hold off before making any such upgrade decisions. Apart from exercising sensible spending restraint, waiting for Linux UEFI support to mature a little more may be no bad idea either.

Update 2013-06-23: The new graphics card in my main machine is working as it should and has reduced the number of system error report messages turning up too; maybe Ubuntu GNOME 13.04 didn’t fancy the old graphics card all that much. A rogue .fonts.conf file was found in my home area on the standby machine, and removing it has improved how fonts are displayed on there immeasurably. If you find one on your system, it’s worth removing or renaming it to see if that helps. Otherwise, tinkering with the font rendering settings is another beneficial act, and it even helps on Debian 6, which uses GNOME 2! Seeing what happens on Debian 7.1 could be something that I go testing sometime.

A little look at Debian 7.0

12th May 2013

Having a virtual machine with Debian 6 on there, I was interested to hear that Debian 7.0 is out, so I decided to give it a go in another VM. Installing it using the Net Install CD image took a little while but proved fairly standard with my choice of the GUI-based option. GNOME was the desktop environment with which I went, and all started up without any real fuss after the installation was complete; it even disconnected the CD image from the VM before rebooting, a common failing in many Linux operating system installations that lands you into the installation cycle again unless you kill the virtual machine.

Though the GNOME desktop looked familiar, a certain amount of conservatism reigned too, since the version was 3.4.2. That was no bad thing, since it made raiding the GNOME Extensions site for a set of mature extensions all the easier. In fact, a certain number of these were included in the standard installation anyway, and the omission of a power off entry on the user menu was corrected as a matter of course without needing any intervention from this user. Adding to what already was there made for a friendlier desktop experience in a short period of time.

Debian’s variant of Firefox, Iceweasel, is at version 10, so a bit of tweaking is needed to get the latest version. LibreOffice is there now too, though at version 3.5 rather than 4. Shotwell likewise is the older 0.12 and not the 0.14 that is found in the likes of Ubuntu 13.04. As it happens, GIMP is about the only application at a current version, 2.8, though a slower release cycle may be the cause of that. All in all, the general sense is that older versions of current software are being included for the sake of stability, and that is sensible, so I am not complaining very much about this at all.

The reason for not complaining is that the very reason for having a virtual machine with Debian 6 on there is to have Zinio and Dropbox available too. Adobe’s curtailment of support for Linux means that any application needing Adobe AIR may not work on a more current Linux distribution. That affects Zinio, so I’ll be retaining a Debian 6 instance for a while yet, unless a bout of testing reveals that a move to the newer version is possible. As for Dropbox, I am not sure that I can recall why I moved it onto Debian, but it’s working well on there, so I am in no hurry to move it over either. There are times when slower software development cycles are better…

Piggybacking an Android Wi-Fi device off your Windows PC’s internet connection

16th March 2013

One of the disadvantages of my Google/Asus Nexus 7 is that it needs a Wi-Fi connection to get online. Most of the time, this is not a problem, since I also have a Huawei mobile Wi-Fi hub from T-Mobile, and this seems to work just about anywhere in the U.K. Away from the U.K. though, it won’t work, because roaming is not switched on for it, and that may be no bad thing given the fees that could introduce. My HTC Desire S could deputise, but I need to watch costs with that too.

There’s also the factor of download caps, and those apply both to the Huawei and to the HTC. Recently, I added Anquet’s Outdoor Map Navigator (OMN) to my Nexus 7 through the Google Play store for a fee of £7, and that allows access to any walking maps that I have bought from Anquet. However, those are large downloads, so the caps start to come into play. Frugality would help, but I began to look at other possibilities that make use of a laptop’s Wi-Fi functionality.

Looking on the web, I found two options that work on Windows 7 (8 should be OK too): Connectify Hotspot and Virtual Router Manager. The first of these is commercial software, but there is a Lite edition for those wanting to try it out; whether it is a time-limited demo is not something that I can confirm, though that did not seem to be the case, since it looked as if it was only missing features that you’d get if you paid for the Pro variant. The second option is an open source one and is free of charge, apart from an invitation to donate to the project.

Though online tutorials show the usage of either of these to be straightforward, my experiences were not all that positive at the outset. In fact, there was something extra that I needed to do, and that is why this post has come to exist at all. That was the case even after the restart that Connectify Hotspot needed as part of its installation; it runs as a system service, so that’s why the restart was needed. In fact, it was Virtual Router Manager that told me what the issue was, and it needed no reboot. Neither did it disconnect the laptop from the network like the Connectify offering did to me, and that was the cause of the latter’s ejection from that system; limitations in favour of its paid edition aside, it may have the snazzier interface, but I’ll take effective simplicity any day.

Using Virtual Router Manager turns out to be simple enough. It needs a network name (also known as an SSID), a password to restrict who accesses the network, and the internet connection to be shared. In my case, that was Local Area Connection in the drop-down list. With all the required information entered, I was ready to start the router using the Start Network Router button. The text on this changes to Stop Network Router when the hub is operational, or at least it should have done for me on the first time that I ran it. What I got instead was the following message:

The group or resource is not in the correct state to perform the requested operation.

The above may not say all that much, but it becomes more than ample information if you enter it into the likes of Google. Behind the scenes, Virtual Router Manager is using native Windows functionality to create a Wi-Fi hub from a PC, and it appears to be the Microsoft Virtual WiFi Miniport Adapter from what I have seen. When I tried setting up an ad hoc Wi-Fi network from a laptop to the Nexus 7 using Windows’ own network set-up capability via its Control Panel, it didn’t do what I needed, so there might be something that the third-party software adds beyond this. So, the interesting thing about the solution to my Virtual Router Manager problem was that it needed me to delve into the innards of Windows a little.

Firstly, there’s running Command Prompt (All Programs > Accessories) from the Start Menu with Administrator privileges. It helps here if the account with which you log into Windows is in the Administrators group, since all you have to do then is right-click on the Start Menu entry and choose the Run as administrator entry in the pop-up context menu. With a command line window now open, you then need to issue the following command:

netsh wlan set hostednetwork mode=allow ssid=[network name] key=[password] keyUsage=persistent

When that had done its thing, Virtual Router Manager worked without a hitch, though it did turn itself off after a while, and that may be no bad thing from the security standpoint. On the Android side, it was a matter of going into Settings > Wi-Fi and choosing the new network that had been created on the laptop. This sort of thing may apply to other types of tablet (dare I mention iPads?), so you could connect anything to the hub without needing to do any more on the Windows side.
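
For anyone wanting to control the hosted network by hand rather than through Virtual Router Manager, the same native functionality can be started and stopped from an administrator command prompt with commands like these:

netsh wlan start hostednetwork
netsh wlan stop hostednetwork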

For those wanting to know what’s going on behind the scenes on Windows, there’s a useful tutorial on Instructables that shows what third party software is saving you from having to do. Even if I never go down the more DIY route, I probably have saved myself having to buy a mobile Wi-Fi hub for any trips to Éire. For now, the Irish 3G dongle that I already have should be enough.

Getting an Epson Perfection 4490 Photo scanner going with Ubuntu GNOME Remix 12.10

7th March 2013

My Epson Perfection 4490 Photo scanner has been in my possession for a while now, and it’s impossible to justify any replacement given that it works well and that digital photography has taken over from film for me. Every time I install an operating system afresh, I need to reinstate it again, and last year’s installation of Ubuntu GNOME Remix 12.10 only saw me do the deed recently. When I did so, it was brought back to me that I’d never documented on here how this was done. Given that I sometimes use this place as a repository of stuff to which I need to refer again in the future, it seemed remiss of me, so here it is for you all.

Though I had XSane and Simple Scan already installed on the system, SANE wasn’t on there, so I went and added it and a few other extras using the following command:

sudo apt-get install sane sane-utils libsane-extras

Then, it was onto the Epson website for their Perfection 4490 Photo Linux drivers, since SANE’s support for this scanner seemingly remains incomplete, even though the device pre-dates my move to Linux in 2007. Three files were needed, and the following commands install them (depending on when you do this, the file names may be different, so just change them to whatever they are for you; it can be done with a single command too, but there is not enough width for that here):

sudo dpkg -i iscan-data_1.22.0-1_all.deb
sudo dpkg -i iscan_2.29.1-5~usb0.1.ltdl7_i386.deb
sudo dpkg -i iscan-plugin-gt-x750_2.1.2-1_i386.deb

With those in place, there was one other task that needed doing so that scanning could be done without resorting to running the scanning software with sudo privileges. To free up access for a normal user account, I needed a HAL device information file. These normally live in /usr/share/hal/fdi/, but they change with every installation, so any modifications that you may make there are going to be lost. Therefore, there is no point modifying either /usr/share/hal/fdi/preprobe/10osvendor/20-libsane.fdi or /usr/share/hal/fdi/preprobe/10osvendor/20-libsane-extras.fdi, where scanner information usually is to be found.

The first task in creating an fdi file was to issue the lsusb command and look for a line corresponding to my scanner. This is the one that I got:

Bus 001 Device 004: ID 04b8:0119 Seiko Epson Corp. Perfection 4490 Photo

From this, I gleaned the manufacturer ID and model ID as 04b8 and 0119, respectively; these are needed later on. Next, I needed to create the hal/fdi/preprobe/ folder structure under /etc, since it was not there. Then, I created epson4490photo.fdi in the bottom folder of the tree (/etc/hal/fdi/preprobe/epson4490photo.fdi) as follows:

sudo mkdir -p /etc/hal/fdi/preprobe && cd /etc/hal/fdi/preprobe && sudo touch epson4490photo.fdi

Then, I edited the new file using the following command:

gksu gedit epson4490photo.fdi &

When open, I added in the following text:

<?xml version="1.0" encoding="UTF-8"?>
<deviceinfo version="0.2">
  <device>
    <match key="info.subsystem" string="usb">
      <!-- Epson Perfection 4490 Photo -->
      <match key="usb.vendor_id" int="0x04b8">
        <match key="usb.product_id" int="0x0119">
          <append key="info.capabilities" type="strlist">scanner</append>
          <merge key="scanner.access_method" type="string">proprietary</merge>
        </match>
      </match>
    </match>
  </device>
</deviceinfo>

It’s all in XML, so the place to look is immediately beneath the scanner name comment. The int attributes of the two match elements immediately following the comment line are populated using the information from the lsusb command output, with 0x prefixing both the manufacturer and model identifiers. The element with a key attribute of usb.vendor_id takes the former, and that with a key attribute of usb.product_id takes the latter. With epson4490photo.fdi saved, I rebooted the machine to restart HAL, and all was as I wanted it to be, apart maybe from XSane making complaints that seemed not to be of any actual consequence. With Epson’s Image Scan! and Simple Scan on the PC, there’s no need to be bothered with those messages. Choice is good when you have it, especially when you have expended some effort to get that far.
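
As a quick check that the scanner is accessible from a normal user account, the scanimage tool from the sane-utils package installed earlier can be asked to list detected devices:

scanimage -L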

Using a variant of Debian’s Iceweasel that keeps pace with Firefox

5th February 2013

Left to its own devices, Debian will leave you with an ever-ageing re-branded version of Firefox that was installed at the same time as the rest of the operating system. From what I have found, the main cause of this was Mozilla’s wanting to retain control of its branding and trademarks in a manner not in keeping with Debian’s Free Software rules. This didn’t affect just Firefox but also Thunderbird, Sunbird and SeaMonkey, with Debian’s equivalents for these being Icedove, Iceowl and Iceape, respectively.

While you can download a tarball of Firefox from the web and use that, it’d be nice to get a variant that updates through Debian’s normal apt-get channels. In fact, Iceweasel does get updated whenever there is a new release of Firefox, even if these updates never find their way into the usual repositories. While I have been known to take advantage of the more frozen state of Debian compared with other Linux distributions, I don’t mind getting Iceweasel updated so that it isn’t a security worry.

The first step in doing so is to add the following lines to /etc/apt/sources.list with root access (using sudo, gksu or su to assume root privileges), since the file normally cannot be edited by normal users:

deb http://backports.debian.org/debian-backports squeeze-backports main
deb http://mozilla.debian.net/ squeeze-backports iceweasel-release

With the file updated and saved, the next step is to update the repositories on your machine using the following command:

sudo apt-get update

With the above complete, it is time to overwrite the existing Iceweasel installation with the latest one, using an apt-get command that specifies the squeeze-backports repository as its source with the -t switch. While Iceweasel itself is installed from the iceweasel-release part of squeeze-backports, there are dependencies that need to be satisfied, and these come from the main squeeze-backports one. The actual command used is below:

sudo apt-get install -t squeeze-backports iceweasel

While that was all that I needed to do to get Iceweasel 18.0.1 in place, some may need the pkg-mozilla-archive-keyring package installed too. For those needing more information than what’s here, there’s always the Debian Mozilla team.
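
Should apt complain about unverifiable signatures from the mozilla.debian.net repository, installing that keyring package ought to quieten it; the command would be along these lines, assuming that the package is available from the repositories added above:

sudo apt-get install pkg-mozilla-archive-keyring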

Installing the Cinnamon Desktop Environment on Sabayon Linux

26th January 2013

During the week, I did an update on my Sabayon system, and GNOME 3.6 came on board without too much of a bother. There was no system meltdown or need for an operating system re-installation. However, there was one matter that rankled: adding and updating extensions from extensions.gnome.org was impossible. The process would create a new folder in ~/.local/share/gnome-shell/extensions/ but not fill it with anything at all. Populating it from my Ubuntu GNOME Remix 12.10 machine didn’t seem to achieve the needful either, and I am left wondering if it is down to the version of GNOME Shell being 3.6.2. However, even adding an entry for the current version of GNOME Shell to an extension’s metadata.json didn’t appear to do what I wanted, so resolving this issue needs further enquiry.
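
For reference, that version whitelisting lives in the shell-version array of each extension’s metadata.json file; a minimal sketch, with a hypothetical uuid and name, looks something like this:

{
  "uuid": "example-extension@example.org",
  "name": "Example Extension",
  "description": "For illustration only",
  "shell-version": ["3.4", "3.6", "3.6.2"]
}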

In the meantime, I added the Cinnamon desktop environment using the following command and will be using that from now on. If the GNOME Shell extension issue ever gets sorted, I may move back, but there is no rush. GNOME 3.8 sounds like it’s bringing an interesting option that makes use of the approach Linux Mint took for version 12 of that distribution, and I can await that, especially if it avoids the need for adding extensions on a personal basis like now.

sudo equo update && sudo equo install cinnamon

With the installation completed by the above command, it was a matter of logging out and choosing the Cinnamon entry (there is a 2D version too) from the session drop-down menu on the login screen to get it going. Then, it was a matter of tweaking Cinnamon to my heart’s content. Getting a two-panel layout required logging out and in again, as well as choosing the appropriate setting in the Cinnamon Panel options tab. Next, I decided to check on what themes are available at cinnamon.linuxmint.org before settling on Cinnamint 1.6. It all feels very comfortable, apart from not having the automatically growing list of workspaces that is a default offering in GNOME Shell. That goes against the design principles of Cinnamon though, so all that is left is the hope that someone makes an extension for it.

Adding Quicktime movie support to Firefox running on Linux

24th January 2013

For whatever reason, my installation of Ubuntu GNOME Remix 12.10 didn’t have Quicktime support added to Firefox by default. To make a website work near enough as intended, this needed to be remedied. The solution was to issue a command to install the missing software:

sudo apt-get install mplayer gnome-mplayer gecko-mediaplayer

MPlayer is the main component here, but running the above command adds all the required supporting packages too, so a Firefox restart was all that was needed to get things going.
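
To confirm that the new plugin has registered after the restart, the following can be entered into the Firefox address bar to list the installed plugins:

about:plugins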
