Technology Tales

Adventures & experiences in contemporary technology

Restoring GNU Parallel Functionality in Ubuntu GNOME 13.04

31st July 2013

There is a handy command line utility called GNU Parallel that allows you to run Linux commands on more than one CPU core at a time, parallelising the processing of the task at hand. Here is a form of the command that is similar to one that I often use:

ls *.* | parallel gm convert -sharpen 1x3 {} sharpened_images/{}

What it does is pipe a list of files in a folder to GraphicsMagick for sharpening, with the results being written to a sharpened_images directory. The {} placeholders are where the filenames go in the sharpening command.
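One small point worth noting is that GraphicsMagick will not create the destination folder for itself, so sharpened_images needs to exist before the command above is run; something like the following beforehand does the job:

mkdir -p sharpened_images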

This worked fine in Ubuntu GNOME 12.10 but stopped doing so after I upgraded to the next version. A look on the web set me to running the following command:

parallel --version

That produced output that included the following line:

WARNING: YOU ARE USING --tollef. IF THINGS ARE ACTING WEIRD USE --gnu.

Rerunning the original command with the --gnu option worked, but there was a more permanent solution than resorting to something like this every time:

ls *.* | parallel --gnu gm convert -sharpen 1x3 {} sharpened_images/{}

That was to edit /etc/parallel/config with root privileges and delete the --tollef option from there. With that completed, all was as it should be again, and it makes me wonder why the change was made in the first place. Perhaps because of issues like this, there is even a discussion about the possibility of removing the --tollef option altogether, since it is raising more questions than it answers.
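For anyone who prefers not to open an editor, the same change can be made from the command line; here is a minimal sketch (assuming the file lives at /etc/parallel/config, as it did on my system) that deletes any line mentioning the offending option:

sudo sed -i '/--tollef/d' /etc/parallel/config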

Command Line Processing of EXIF Image Metadata

8th July 2013

There is a bill making its way through the U.K. parliament at the moment that could reduce the power of copyright when it comes to images placed on the web. The current situation is that anyone who creates an image automatically holds the copyright for it. However, the new legislation will remove that if it becomes law as it stands. As it happens, the Royal Photographic Society is doing what it can to avoid any changes to what we have now. There may be the barrier of due diligence, but how many of us take steps to mark our own intellectual property? For one, I have been less than attentive to this and now wonder if there is anything more that I should be doing. Others may copyleft their images, but I don't want to find myself unable to share my own photos because another party is claiming rights over them. Watermarking is one option, but I also want to add something to the image metadata too.

That got me wondering about adding metadata asserting my status as the copyright holder to any images that I post online. It may not be perfect, but any action is better than doing nothing at all. Given that I don't post photos anywhere that strips EXIF metadata as part of the uploading process, it should be there to see for anyone who bothers to check, though there may not be many who do.

Because I also wanted to batch process images, I looked for a command line tool to do the needful and found ExifTool. Being a Perl library, it is cross-platform so you can use it on Linux, Windows and even OS X. To install it on a Debian or Ubuntu based Linux distro, just use the following command:

sudo apt-get install libimage-exiftool-perl

The form of the command that I found useful for adding the actual copyright information is below:

exiftool -P "-copyright=(c) John …" -ext jpg -overwrite_original .

The -P switch preserves the timestamp of the image file, while the -overwrite_original one ensures that you don't end up with unwanted backup files; the trailing dot points the command at the current directory. The copyright message goes within the quotes as part of the -copyright option. With a little shell scripting, you can traverse a directory structure and change the metadata for any image files contained in different sub-folders, as the sketch below shows. If you wish to do more than this, there's always the user documentation to be consulted.
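As a sketch of that sort of traversal, ExifTool's own -r switch will recurse into sub-folders by itself, while find can be pressed into service if more control is wanted; both of the lines below assume JPEG files and the same copyright text as above:

exiftool -P -r -overwrite_original "-copyright=(c) John …" -ext jpg .

find . -type f -iname "*.jpg" -exec exiftool -P -overwrite_original "-copyright=(c) John …" {} +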

A need to update graphics hardware

16th June 2013

Not being a gaming enthusiast, having to upgrade graphics cards in PCs is not something that I do very often or even rate as a priority. However, two PCs in my possession have had that very piece of hardware upgraded on them, and it's not because anything was broken either. My backup machine has seen quite a few Linux distros on there since I built it nearly four years ago. The motherboard is an ASRock K10N78 that was sourced from MicroDirect, and it has an onboard NVIDIA graphics chip that has performed well, if not spectacularly. One glitch that always existed was less than optimal text rendering in web browsers, but that never was enough to get me to add a graphics card to the machine.

More recently, I ran into trouble with Sabayon 13.04, with only the 2D variant of the Cinnamon desktop environment working on it, and things became totally non-functional when a full re-installation of the GNOME edition was attempted. Everything went fine until I added the latest updates to the system, when a reboot revealed that it was impossible to boot into a desktop environment. Some will relish this as a challenge, but I have to admit that I am not one of them. In fact, I tried out two Arch-based distros on the same PC and got the same results following a system update on each. So, my explorations of Antergos and Manjaro have to continue in virtual machines instead.

To get a working system, I gave Linux Mint 15 Cinnamon a go and that worked a treat. However, I couldn't ignore that the cutting edge distros that I tried before it all took exception to the onboard NVIDIA graphics. systemd has been implemented in all of these, and it seems reasonable to think that it is coming to Linux Mint at some stage in the future, so I went about getting a graphics card to add to the machine. Having had good experiences with ATi's Radeon in the past, I stuck with it even though it now is in the hands of AMD. Not being that fussed so long as there was Linux driver support, I picked up a Radeon HD 6450 card from PC World. Adding it to the PC was a simple matter of switching off the machine, slotting in the card, closing it up and powering it on again. Only later on did I set the BIOS to look for PCI Express graphics before anything else, and I could have got away without doing that. Then, I made use of the Additional Drivers applet in the Linux Mint settings panel to add the proprietary driver before restarting the machine to see if there were any visual benefits. To sort out the web browser font rendering, I used the Fonts applet in the same settings panel and selected full RGBA hinting. The improvement was unmissable, even if the results still were not quite like the appearance of fonts on my main machine. Overall, there had been an improvement and a spot of future proofing too.

That tinkering with the standby machine got me wondering about what I had on my main PC. As well as onboard Radeon graphics, it also had gained a Radeon 4650 card, for which 3D support wasn't being made available to VMware Player by Ubuntu GNOME 12.10 or 13.04, and VMware Player wasn't happy about this when a virtual machine was set to have 3D support. Adding the latest fglrx driver only ensured that I got a command line instead of a graphical interface. Issuing one of the following commands and rebooting was the only remedy:

sudo apt-get remove fglrx

sudo apt-get remove fglrx-updates

Looking at the AMD website revealed that they no longer support 2000, 3000 or 4000 series Radeon cards with their latest Catalyst driver, and the last version that did support them would not install on my machine since it was built for version 3.4.x of the Linux kernel. A new graphics card then was in order if I wanted 3D graphics in VMware VMs, and both GNOME and Cinnamon appear to need this capability. Another ASUS card, a Radeon HD 6670, duly was acquired and installed in a manner similar to the Radeon HD 6450 in the standby PC. Apart from not needing to alter the font rendering (there is a Fonts tab in GNOME Tweak Tool where this can be set), the only other difference was adding the Jockey software to my main PC for installation of the proprietary Radeon driver. The following command does this:

sudo apt-get install jockey-kde

When that was done, I issued the jockey-kde command and selected the first entry on the list. The machine worked as it should on restarting, apart from an AMD message in the bottom right hand corner bemoaning unrecognised hardware. There had been two entries on that Jockey list with exactly the same name, so it was time to select the second of these and see how it went. On restarting, the incompatibility message had gone and all was well. VMware even started virtual machines with 3D support without any messages, so the upgrade did the needful there.

Hearing of someone doing two PC graphics card upgrades in a single weekend may make you see them as an enthusiast, but my disinterest in computer gaming belies this. Maybe it highlights that Linux operating systems need 3D more than might be expected. The Cinnamon desktop environment now issues messages if it is operating in 2D mode with software 3D rendering, and GNOME always had the tendency to fall back to classic mode, as it had been doing when Sabayon was installed on my standby PC. However, there remain cases where Linux can rejuvenate older hardware: I installed Lubuntu onto a machine with ten year old technology (an 1100 MHz Athlon CPU, 1 GB of RAM and 60 GB of hard drive space in a case dating from 1998) and it works surprisingly well too.

It seems that having fancier desktop environments like GNOME Shell and Cinnamon means having the hardware on which they can run. For a while, I have been tempted by the possibility of a new PC, since even my main machine is not far from four years old either. However, I also spied a pre-built CPU, motherboard and RAM bundle featuring an Intel Core i5-4670 CPU, 8 GB of Corsair Vengeance Pro Blue memory and a Gigabyte Z87-HD3 ATX motherboard (together with a heatsink and fan for the CPU) for around £420. Even for someone who has used AMD CPUs since 1998, that does look tempting, but I'll hold off before making any such upgrade decisions. Apart from exercising sensible spending restraint, waiting for Linux UEFI support to mature a little more may be no bad idea either.

Update 2013-06-23: The new graphics card in my main machine is working as it should and has reduced the number of system error report messages turning up too; maybe Ubuntu GNOME 13.04 didn’t fancy the old graphics card all that much. A rogue .fonts.conf file was found in my home area on the standby machine and removing it has improved how fonts are displayed on there immeasurably. If you find one on your system, it’s worth doing the same or renaming it to see if it helps. Otherwise, tinkering with the font rendering settings is another beneficial act and it even helps on Debian 6 too and that uses GNOME 2! Seeing what happens on Debian 7.1 could be something that I go testing sometime.

Moving a Windows 7 VM from VirtualBox to VMware Player

14th October 2012

Seeing how well Windows 8 was running in a VMware Player virtual machine, and that without installing VMware Tools in the guest operating system, I was reminded of how sluggish my Windows 7 VirtualBox VM had become. Therefore, I decided to try migrating the VM from VirtualBox to VMware. My hope was that it would be as easy as exporting to an OVA file (File > Export Appliance… in VirtualBox) and importing that into VMware (File > Open a VM in Player). However, even selecting OVF compatibility was insufficient for achieving this, and the size of the virtual disks meant that the export took a while to run as well. The solution was to create a new VM in VirtualBox from the OVA file and use the newly created VMDK files with VMware. That worked successfully and I now have a speedier, more responsive Windows 7 VM for my pains.
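As an aside, the export does not have to be done through the GUI; VirtualBox's VBoxManage tool can produce the OVA file from a terminal too. A minimal sketch, where the VM name and output file name are only examples, would be:

VBoxManage export "Windows 7" -o windows7.ova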

Access to host directories needed reinstatement using a combination of the VMware Shared Folders feature and updating drive mappings in Windows 7 itself to use what appear to it like network drives in the Shared Folders directory on the \\vmware-host domain. For that to work, VMware Tools needed to be installed in the guest OS (go to Virtual Machine > Install VMware Tools to make available a virtual CD from which the installation can be done), as I discovered when trying the same thing with my Windows 8 VM, where I dare not install VMware Tools because of the trouble they caused when I last attempted it.
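For what it's worth, the drive mapping itself can be scripted from a Windows command prompt as well; a sketch of the sort of thing, with the share name being a made-up example, is:

net use Z: "\\vmware-host\Shared Folders\Documents"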

Moving virtual machine software brought about its side effects though. Software like Windows 7 detects that it's on different hardware, so reactivation can be needed. Windows 7 reactivation was a painless online affair, but it wasn't the same for Photoshop CS5. That meant that I needed help from Adobe's technical support people to get past the number of PCs for which the software already had been activated. In hindsight, deactivation should have been done prior to the move, but that's a lesson that I know well now. Technical support sorted my predicament politely and efficiently while reinforcing the aforementioned learning point. Moving virtual machine platforms is very like moving from one PC to the next, and it hadn't clicked with me quite how real those virtual machines can be when it comes to software licensing.

Apart from that and figuring out how to do it, the move went smoothly. An upgrade to the graphics driver on the host system and getting Windows 7 to recheck the capabilities of the virtual machine even gained me a fuller Aero experience than I had before. Full screen operation is quite reasonable too (CTRL + ALT + ENTER activates and deactivates it) and photo editing now feels less boxed in as well.

Changing to Nvidia Graphics Drivers on Linux Mint Debian Edition 64-bit

22nd April 2012

One way of doing this is to go to the Nvidia website and download the latest file from the relevant page on there. Then, the next stage is to restart your PC and choose rescue mode instead of the more usual graphical option. This drops you onto a command shell that requests your root password. Once that is entered, you can move onto the next stage of the exercise: migrate to the directory where the *.run file is located and issue a command similar to the following:

bash NVIDIA-Linux-x86_64-295.40.run

The above was the latest file available at the time of writing so the name may have changed by the time that you read this. If the executable asks to modify your X configuration file, I believe that the best course is to let it do that. Editing it yourself or running nvidia-xconfig are alternative approaches if you so prefer.

Proprietary Nvidia drivers are included in the repositories for Linux Mint Debian Edition, so that may be a better course of action since you will get updates through normal system update channels. In that case, the course of action is to start by issuing the following installation commands:

sudo apt-get install module-assistant
sudo apt-get install nvidia-kernel-common
sudo apt-get install nvidia-glx
sudo apt-get install kernel-source-NVIDIA
sudo apt-get install nvidia-xconfig

Once those have completed, issuing the following in turn will complete the job ahead of a reboot:

sudo m-a a-i nvidia
sudo modprobe nvidia
sudo nvidia-xconfig

If you reboot before running the above, like I did, you will get a black screen with a flashing cursor instead of a full desktop because X has failed to load. The remedy then is to reboot the machine, choose the rescue mode option, provide the root password and issue the three commands (at this point, the sudo prefix can be dropped because it's unneeded). Another reboot will see order restored and the new driver in place. Running the following at that point will provide a check on things, as will the general appearance of everything:

glxinfo | grep render
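On a working setup, the output should include lines along these general lines (the renderer name will obviously vary with the hardware, so the second line here is only a placeholder):

direct rendering: Yes
OpenGL renderer string: [name of the Nvidia adapter]

Anything mentioning a software or Mesa renderer instead would suggest that the proprietary driver has not taken over.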

Creating placeholder graphics in SAS using PROC GSLIDE for when no data are available

18th March 2012

Recently, I found myself with a plot to produce but no data to present, so a placeholder output was needed. For a listing or a table, this is a matter of detecting whether there are observations to be listed or summarised and then issuing a placeholder listing using PROC REPORT if there are no data available. Using SAS/GRAPH, something similar can be achieved using one of its curiosities.

In the case of SAS/GRAPH, PROC GSLIDE looks like the tool to use for the same purpose. The procedure does get covered as part of a SAS Institute SAS/GRAPH training course, but they tend to gloss over it. After all, there is little reason to go creating presentations in SAS when PowerPoint and its kind offer far more functionality. However, it would make an interesting tale to tell how GSLIDE became part of SAS/GRAPH in the first place. Its existence makes me wonder if it pre-dates the main slideshow production tools that we use today.

The code that uses PROC GSLIDE to create a placeholder graphic is as follows (detection of the number of observations in a SAS dataset is another entry on here):

proc gslide;
note height=10;
note j=center "No data are available";
run;
quit;

PROC GSLIDE is one of those run group procedures in SAS so a QUIT statement is needed to close it. The NOTE statements specify the text to be added to the graphic. The first of these creates a blank line of the required height for placing the main text in the middle of the graphic. It is the second one that adds the centred text that tells users of the generated output what has happened.

Using ODS Graphics to Create Plots Using PROC LIFETEST

3rd September 2010

One of the nice things about SAS 9.2 is that the creation of statistical graphics is enhanced using ODS. One of the beneficiaries of this is PROC LIFETEST, a procedure that gained a lot when data sets could be created from it using ODS OUTPUT statements. Before that, it was a matter of creating text output and converting it to a SAS data set using the Data Step, and that was a nuisance on a system that attached special significance to output destinations set up using PROC PRINTTO. What you'll find below is a sample of the type of code for creating a Kaplan-Meier survival plot for time to adverse events resulting in discontinuation of study treatment, with actual and censored times. The IMAGENAME parameter on the ODS GRAPHICS statement controls the name of the file, and it is possible to change the type using the IMAGEFMT parameter too.

ods graphics on / imagename="fig5";
proc lifetest data=km3 method=km plots=survival;
time timetoae*cens_ae(0);
run;
ods graphics off;

Converting from CGM to Postscript

24th November 2009

One thing that I recently had to investigate was the possibility of converting CGM vector graphics files into Postscript and from there into PDF. Having used ImageMagick for converting images before, that was an obvious option. However, it cannot process CGM files on its own and needs a delegate or helper application as well. This is the case with raw digital camera files too, with UFRaw being the program chosen. For CGM images, the more obscure RALCGM is what's needed, and tracking it down is a bit of an art. The history is that it was developed at the U.K.'s Rutherford Appleton Laboratory, but it seems that it was left to go off into the wilderness rather than someone keeping an eye on things. With that in mind, here are the installation packages for Windows and Linux (RPM):

Windows Installer

Linux RPM

RALCGM is a handy command line tool that can convert from CGM to Postscript on its own without any need for ImageMagick at all. From what I have seen, fonts on graphical output may look greyer than black, but it otherwise does its job well. However, considering that it is a freely available tool, one cannot complain too much. There are other packages for doing vector to raster conversion, and the ones that I have seen do have GUIs, but the freedom to look at paid-for software wasn't mine to have. The required command looks something like the following:

ralcgm -d PS -oL test.cgm test.ps

The switch -d PS uses the software’s Postscript driver and -oL specifies landscape orientation. If you want to find out more, here’s a PDF rendition of the help file that comes with the thing:

RALCGM Documentation
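To take the Postscript output on to PDF, as mentioned at the start, Ghostscript's ps2pdf tool will finish the job; a minimal sketch using the file produced by the command above would be:

ps2pdf test.ps test.pdf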

Still able to build PC systems

25th October 2009

This weekend has been something of a success for me on the PC hardware front. Earlier this year, a series of mishaps rendered my former main home PC unusable; it was a power failure that finished it off for good. My remedy was a rebuild using my then usual recipe of a Gigabyte motherboard, an AMD CPU and Crucial memory. However, assembling the said pieces never returned the thing to life, and I ended up in no man's land for a while, dependent on my backup machine and laptop. That wouldn't have been so bad but for the need to access data from the old behemoth's hard drives, though an external drive housing set that in order. Nevertheless, there is something unfinished about working with machines that have a series of external drives hanging off them. That appearance of disarray was set to rights by the arrival of a bare bones system from Novatech in July, with any assembly work restricted to the kitchen table. There was a certain pleasure in seeing a system come to life after my developing a fear that I had lost all of my PC building prowess.

That restoration of order still left finding out why those components bought earlier in the year didn’t work together well enough to give me a screen display on start-up. Having electronics testing equipment and the knowledge of its correct use would make any troubleshooting far easier but I haven’t got these. There is a place near to me where I could go for this but you are left wondering what might be said to a PC build gone wrong. Of course, the last thing that you want to be doing is embarking on a series of purchases that do not fix the problem, especially in the current economic climate.

One thing to suspect when all doesn't turn out as hoped is the motherboard and, for whatever reason, I always suspect it last. It now looks as if that needs to change after I discovered that it was the Gigabyte motherboard that was at fault. Whether it was faulty from the outset, or came a cropper from a rogue power supply or carelessness with static protection, is something that I'll never know. An Asus motherboard did go rogue on me in the past, and it might be that it ruined CPUs and even a hard drive before I laid it to rest. Its eventual replacement put a stop to a year of computing misfortune and kick-started my reliance on Gigabyte. That faith is under question now, but the 2009 computing hardware mishap seems to be behind me; from here on, any PC rebuilds will be done on tables and motherboards will be suspected earlier when anything goes awry.

Returning to the present, my acquisition of an ASRock K10N78 and subsequent building activities have brought a new system using an AMD Phenom X4 CPU and 4 GB of memory into use. In fact, I am writing these very words using the thing. It's all in a new TrendSonic case too (placing an elderly behemoth into retirement), with a SATA hard drive and DVD writer. The new motherboard has onboard audio and graphics, so external cards are not needed unless you are an audiophile and/or a gamer; for the record, I am neither. Those additional facilities make for easier building and fault-finding should the undesirable happen.

The new box is running the release candidate of Ubuntu 9.10 and it seems to be working without a hitch too. Earlier builds of 9.10 broke in their VirtualBox VM so you should understand the level of concern that this aroused in my mind; the last thing that you want to be doing is reinstalling an operating system because its booting capability breaks every other day. Thankfully, the RC seems to have none of these rough edges so I can upgrade the Novatech box, still my main machine and likely to remain so for now, with peace of mind when the time comes.

Harnessing the power of ImageMagick

26th October 2008

Using the command line to process images might sound senseless but the tools offered by ImageMagick certainly prove that it has its place. I have always been wary of using bulk processing for my digital photo files (some digitised from film prints with a scanner) but I do agree that some of it is needed to free up some time for other more necessary things. With this in mind, it is encouraging to see the results from ImageMagick and I can see it making a major difference to how I maintain my online photo gallery.

For instance, making thumbnail images for the gallery certainly seems to be one of those operations where command line bulk processing comes into its own and ImageMagick’s own convert command is heaven sent for this one. For resizing images, all that’s needed is the following:

convert -resize 40% input.jpg output.jpg
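To give a flavour of the bulk processing that the next paragraph mentions, a simple shell loop is enough to apply the same resizing to every JPEG in a folder, with the thumbnails going into a separate directory whose name here is only an example:

mkdir -p thumbnails
for f in *.jpg; do convert -resize 40% "$f" thumbnails/"$f"; done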

Add a spot of further shell scripting and even a dash of Perl, and the possibilities for this sort of thing become clearer; this is but the tip of the proverbial iceberg. The -rotate switch will do what the name suggests, and there is a whole plethora of other options on tap. So long as you have Ghostscript on your system, conversion of graphics to Postscript (and Encapsulated Postscript too) and PDF files is possible, with the -page option controlling the margin around the image itself in the resulting outputs. Unfortunately, portrait is the sole orientation on offer, but a bit of judicious post processing will turn things around. Here's a command that'll do the trick:

convert -page 792x612+72+72 input.png ps2:output.ps

For retrieving image metadata like resolution and size, the identify command comes into play. The -verbose option invokes the output of all manner of image metadata, so using grep or egrep is perhaps advisable, especially for bulk processing with the likes of Perl. Having the ability to stream image metadata makes loading databases like MySQL less of a chore than the manual data entry that has been my way of doing things until now.
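As a sketch of that kind of metadata retrieval, the following pulls out just the geometry and resolution details for a single file (the filename is only an example); the same pattern extends readily to loops over whole directories:

identify -verbose photo.jpg | grep -E "Geometry|Resolution"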
