Technology Tales

Adventures & experiences in contemporary technology

Moving from Ubuntu 10.10 to Linux Mint 10

23rd April 2011

With a long Easter weekend available to me and with thoughts of forthcoming changes in the world of Ubuntu, I got to wondering about the merits of moving my main home PC to Linux Mint instead. Though there is a rolling variant based on Debian, I went for the more usual one based on Ubuntu that uses GNOME. For the record, Linux Mint isn’t just about the GNOME desktop: you can have it with Xfce, LXDE and KDE desktops as well. While I have been known to use Lubuntu and like its LXDE implementation, I stuck with the option with which I have most experience.

Once I selected the right disk for the boot loader, the main installation of Mint went smoothly. By default, Ubuntu seems to take care of this but Mint leaves it to you. When you have your operating system files on sdc, installation on the default of sda isn’t going to produce a booting system. Instead, I ended up with GRUB errors and, while I suppose that I could have resolved these, the lazier option of repeating the install with the right boot loader location was the one that I chose. It produced the result that I wanted: a working and loading operating system.
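Had I gone down the repair route instead, it would have looked something like the following from a live session; this is only a sketch and the device names are examples, so they would need to match your own partition layout:

# Mount the installed system and reinstall GRUB onto the disk that the machine boots from
sudo mount /dev/sdc1 /mnt
sudo mount --bind /dev /mnt/dev
sudo mount --bind /proc /mnt/proc
sudo mount --bind /sys /mnt/sys
sudo chroot /mnt grub-install /dev/sdc
sudo chroot /mnt update-grub

Repeating the installation was the less thoughtful option, but it got me to the same place.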

However, there was something not right about the way that windows were displayed on the desktop, with title bars and window management not working as they should. Creating a new account showed that the settings carried over from Ubuntu in my home area were the cause. Again, I opted for the less strenuous option and moved things from the old account to the new one. One outcome of that decision was a lot of use of the chown command in order to get file and folder permissions set for the new account. To make this all happen, the new account needed to be made into an Administrator just like its predecessor; by default, more restrictive desktop accounts are created using the Users and Groups application from the Administration submenu. Once I was happy that the migration was complete, I backed up any remaining files from the old user folder and removed it from the system. Some of the old configuration files were to find a new life with Linux Mint.
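For anyone interested, the permission fixing amounted to repeated runs of something like the command below; the user name and path are examples rather than the ones I actually used:

# Hand ownership of the migrated files and folders to the new account
sudo chown -R newuser:newuser /home/newuser/Documents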

In the middle of the above, I also got to customising my desktop to get the feel that I find amenable. For example, I do like a panel at the top and another at the bottom, yet Linux Mint only comes with the latter by default. The main menu was moved to the top because I have become used to having it there, and switchers for windows and desktops were added at the bottom. Those were only a few items from what turned out not to be a short list of things that I fancied having: clock, bin, desktop clearance, application launchers, broken application killer, user switcher, off button for the PC, run command and notification area. It all was gentle tinkering but still is the sort of thing that you wouldn’t want to have to do over and over again. Let’s hope that is the case for Linux Mint upgrades in the future. That the configuration files for all of these are stored in the home area should hopefully make life easier, especially when an in-situ upgrade like that for Ubuntu isn’t recommended by the Mint team.

With the desktop arranged to my liking, the longer job of adding to the collection of software on there, while pruning a few unwanted items too, was next. Having had Apache, PHP and MySQL on the system before I popped in that Linux Format magazine cover disk for the installation, I wanted to restore them. To get the off-line websites back, I had made copies of the old Apache settings and these were copied over the defaults in /etc/apache (in fact, I simply overwrote the apache directory in /etc, but the effect was the same). MySQL Administrator had been used to take a backup of the old databases too. In the interests of spring cleaning, I only migrated a few of the databases from the old system to the new one. In fact, there was an element of such tidying in my mind when I decided to change Linux distribution in the first place; Ubuntu hadn’t been installed afresh onto the system for a while anyway and some undesirable messages were appearing at update time, though they were far from being critical errors.
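In outline, the restoration went something like this sketch; the paths and database name are examples, the Apache directory may be called apache2 on your system and the database backup had been taken as a plain SQL dump:

# Replace the default Apache settings with the saved copies and restart the server
sudo rm -rf /etc/apache
sudo cp -r ~/backups/apache /etc/
sudo /etc/init.d/apache2 restart

# Recreate a chosen database and load its SQL dump back in
mysqladmin -u root -p create mydatabase
mysql -u root -p mydatabase < ~/backups/mydatabase.sql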

The web server reinstatement was only part of the software configuration that I was doing and there was a lot of use of apt-get while this was in progress. A rather diverse selection was added: Emacs, NEdit, ClamAV, Shotwell (just make sure that your permissions are sorted before getting this to use older settings because anything inaccessible just gets cleared out; F-Spot was never there in the first place in my case but it may differ for you), UFRaw, Chrome, Evolution (I never have been a user of Mozilla Thunderbird, the default email client on Mint), Dropbox, FileZilla, MySQL Administrator, MySQL Query Browser, NetBeans, POEdit, Banshee (Rhythmbox is what comes with Mint but I replaced it with this), VirtualBox and GParted. This is quite a list and, while I maybe should have engaged the services of dpkg to help automate things, I didn’t on this occasion, though Mint seems to have a front end that does the same sort of thing. Given that the community favours clean installations, it’s little wonder that something like this is on offer in the suite of tools in the standard installation. This is the type of rigmarole that one would not want to bring upon oneself too often.
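Should I take the dpkg route the next time around, it probably would go something like the following, with the selections list saved from the old installation before it was wiped; treat this as a sketch rather than a record of what I actually did:

# On the old system: record the full list of installed packages
dpkg --get-selections > ~/package-selections.txt

# On the new system: feed the list back in and let apt install the lot
sudo dpkg --set-selections < ~/package-selections.txt
sudo apt-get dselect-upgrade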

With desktop tinkering and software installations complete, it was time to do a little more configuration. In order to get my HP laser printer going, I ran hp-setup to download the (proprietary, RMS will not be happy…) driver for it because it otherwise wouldn’t work for me. Fortune was removed from the terminal sessions because I like them to be without such things. To accomplish this, I edited /etc/bash.bashrc and commented out the /usr/games/fortune line before using apt-get to clear the software from my system. Being able to migrate my old Firefox and Evolution profiles, albeit manually, has been another boon. Without doubt, there are more adjustments that I could be making but I am happy to do these as and when I get to them. So far, I have a more than usable system, even if I engaged in more customisation than many users would.
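For the record, the fortune removal amounted to something like the following; the package name is an assumption on my part for the Ubuntu base, so check what apt reports on your own system:

# Comment out the line that calls /usr/games/fortune in the system-wide bashrc
sudo sed -i '\|/usr/games/fortune|s|^|#|' /etc/bash.bashrc

# Then remove the package itself (fortune-mod is my guess at the package name)
sudo apt-get remove fortune-mod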

It probably is useful to finish this by sharing my impressions of Linux Mint. What goes without saying is that some things are done differently and that is to be expected. Distribution upgrades are just one example but there are tools available to make clean installations that little bit easier. To my eyes, the desktop looks very clean and the font display is carried over from Ubuntu, not at all a bad thing. That may sound a small matter but it does appear to me that Fedora and openSUSE could learn a thing or two about how to display fonts on screen. It is the sort of thing that adds the spot of polish that leaves a much better impression. So far, it hasn’t been any hardship to find my way around and I can make the system fit my wants and needs. That it looks set to stay that way is another bonus. We have a lot of change coming in the Linux world with GNOME 3 on the way and Ubuntu’s decision to use Unity as its main desktop environment. While watching both of these developments mature, it looks as if I’ll be happily using Mint. Change can refresh but a bit of stability is good too.

An avalanche of innovation?

23rd September 2010

It seems that, almost in spite of the uncertain times or maybe because of them, it feels like an era of change on the technology front. Computing is the domain of many of the postings on this blog and a hell of a lot seems to be going mobile at the moment. For a good while, I managed to stay clear of the attractions of smartphones until a change of job convinced me that having a BlackBerry was a good idea. Though the small size of the thing really places limitations on the sort of web surfing experience that you can have with it, you can keep an eye on the weather, news, traffic, bus and train times so long as the website in question is built for mobile browsing. Otherwise, it’s more of a nuisance than a patchy phone network (in the U.K., T-Mobile could do better on this score as I have discovered for myself; thankfully, a merger with the Orange network is coming next month).

Speaking of mobile websites, it almost feels as if a free for all has recurred for web designers. Just when the desktop or laptop computing situation had more or less stabilised, along come a whole pile of mobile phone platforms to make things interesting again. Familiar names like Opera, Safari, Firefox and even Internet Explorer are to be found popping up on handheld devices these days along with less familiar ones like Web ‘n’ Walk or BOLT. The operating system choices vary too with iOS, Android, Symbian, Windows and others all competing for attention. It is the sort of flowering of innovation that makes one wonder if a time will come when things begin to consolidate but it doesn’t look like that at the moment.

The transformation of mobile phones into handheld computers isn’t the only big change in computing, with the traditional formats of desktop and laptop PC’s being flexed in all sorts of ways. First, there’s the appearance of netbooks and I have succumbed to the idea of owning an Asus Eee. Though you realise that these are not full size laptops, it still didn’t hit me how small these were until I owned one. They are undeniably portable and tablets look even more interesting in the aftermath of Apple’s iPad. You may call them over-sized mobile phones but the idea of making a touchscreen do the work for you has made the concept fly for many. Even so, I cannot say that I’m overly tempted though I have said that before about other things.

Another area of interest for me is photography and it is around this time of year that all sorts of innovations are revealed to the public. It’s a long way from what we thought was the digital photography revolution when digital imaging sensors started to take the place of camera film in otherwise conventional compact and SLR cameras, making the former far more versatile than they used to be. Now, we have SLD cameras from Olympus, Panasonic, Samsung and Sony that eschew the reflex mirror and prism arrangement of an SLR, using digital sensors and electronic viewfinders while offering the possibility of lens interchangeability and better quality than might be expected from such small cameras. In recent months, Sony has offered SLR-style cameras with translucent mirror technology instead of the conventional mirror that is flipped out of the way when a photographic image is captured. Change doesn’t end there, with movie making capabilities being part of the toolset of many a newly launched compact, SLD and SLR camera. The pixel race also seems to have ended, though increases still happen, as with the Pentax K-5 and Canon EOS 60D (both otherwise conventional offerings that have caught my eye, though so much comes on the market at this time of year that waiting is better for the bank balance).

The mention of digital photography brings to mind the subject of digital image processing and Adobe Photoshop Elements 9 has just been announced after Photoshop CS5 appeared earlier this year. It almost feels as if a new version of Photoshop or its consumer cousin is released every year, causing me to skip releases when I don’t see the point. Elements 6 and 8 were such versions for me and I’ll be in no hurry to upgrade to 9 yet either, though the prospect of using content aware filling to eradicate unwanted objects from images is tempting. Nevertheless, that shouldn’t stop anyone trying to exclude them in the first place. In fact, I may need to reduce the overall number of images that I collect in favour of bringing away only good ones. The outstanding question is whether I can slow down and calm my eagerness to bring at least one good image away from an outing by capturing anything that seems promising at the time. Some experimentation is needed, but being a little more choosy can save work later on.

While back on the subject of software, I’ll voyage into the world of the web before bringing these meanderings to a close. It almost feels as if there is web-based application following web-based application these days, when Twitter and Facebook nearly have become household names and cloud computing is a phrase that turns up all over the place. In fact, the former seems to have encouraged a whole swathe of applications all of itself. Applications written using technologies well used on the web must stuff many a mobile phone app store too and that brings me full circle, for it is these that put so much functionality on our handsets, with Java seemingly powering those I use on my BlackBerry. Then there’s the spat between Apple and Adobe regarding the former’s support for Flash.

To close this mental amble, there may be technologies that didn’t come to mind while I was pondering this piece but they doubtless enliven the technological landscape too. However, what I have described is enough to take me back more than ten years, to when desktop computing and the world of the web were a lot more nascent than is the case today. The changes that were ongoing then feel a little exciting now that I look back on them and it does feel as if the same sort of thing is recurring, though with things like phones creating the interest in place of new developments in desktop computing such as a new version of Windows (though 7 was anticipated after Vista). Web designers may complain about a lack of standardisation and they’re not wrong, but this may be an era of technological change that in time may be remembered with its own fondness too.

Why the manual step?

18th January 2010

One of the consequences of buying a new camera is that your current photo processing software may not be fully equipped for the job of handling the images that it creates. This is a particular issue with raw image files and Adobe Photoshop Elements 5 was unable to completely handle DNG files made with my Pentax K10D until I upgraded to version 7. Yes, I do realise that upgrading sooner should have been in order, but I only lost the white balance adjustments so I could live with things as they were until upgrading gained a more compelling case.

As things stood, Elements 7 was unable to import CR2 files from my Canon PowerShot G11 into the Organiser, so it was off to the appropriate page on the Adobe website for a Camera Raw updater. I picked up the latest release of Camera Raw (5.6 at the time of writing) even though it was found in the Elements 8 category (don’t be put off by this because the release notes address the version compatibility question more extensively). Strangely, the updater doesn’t complete everything because you still need to back up the original Camera Raw.8bi and copy the new one from the zip archive in its place. Quite why this couldn’t have been more automated, even with user prompts for file names and locations, is beyond me but that is how it is. However, once all was in place, CR2 files were handled by Elements without missing a beat.

A new acquisition

16th January 2010

Back in the early days of this blog, I mulled over the idea of having a high-end digital compact camera to complement a DSLR that then was delivering very dusty images; that Canon EOS 10D has been cleaned since then and comes in for occasional use to this day. That was nearly three years ago and a first generation Ricoh GR Digital was the item that then was catching my eye. At the time, I failed to justify spending that much money on such a thing and ended up acquiring a new Pentax K10D DSLR instead. The question that rattled about my head was this: what was the point of spending DSLR money on a compact camera? It’s one that never really went away and comes to mind when you see the prices of interchangeable lens compacts like Olympus’ Pen and equivalent offerings from Panasonic and Ricoh (there, it’s interchangeable lens units rather than actual lenses).

The strongest counterpoint to the cost conundrum is the little matter of size. SLR (film or digital) cameras are sizeable things and there is a place for having something that drops into a pocket. It is that which has propelled me into taking delivery of a Canon PowerShot G11. It may need a good-sized pocket but, unless you are going out with no jacket, it shouldn’t be a problem most of the time. For those shorter sorties when I don’t fancy bringing an SLR out, it’s well-built and looks the business, though some acclimatisation is in order to make the best of the knobs, buttons and menus. Nevertheless, the included manual will help with this process (there’s a paper quick start guide and more detailed documentation on CD).

Canon PowerShot G11

The camera hasn’t seen extensive use just yet so here are a few early impressions. Firstly, there’s the matter of size: it’s even smaller than the first camera that I ever bought (more than fifteen years ago) and that was a Ricoh 35 mm compact film camera. That comparison is even more striking when you consider the feature sets. The Ricoh was a fixed 35 mm lens affair with things like date and time stamping, ISO choice and a nod towards scenic mode selection. In contrast, the much newer Canon is loaded with the sorts of things that normally are found almost exclusively on SLR’s, starting with its effective 28-140 mm focal length range.

Exposure modes such as manual, aperture priority and shutter priority complement scene-based modes and another for movies (not a concern of mine, it has to be said). As if that weren’t enough, there’s exposure compensation too. It came as a surprise to me to find a form of manual focussing included though it is not as convenient as turning a focussing ring on a lens. You can see the inbuilt flash above but there’s also a hotshoe and a place to attach a tripod too. Settings like white balance and file format are accessed using the Function/Set button with the lever underneath the shutter release button controlling the focal length of the lens. In addition, there’s also image stabilisation and that’s important when you’re using live view to compose a photo. Spot metering and focal point selection are other things that find their way into the package. Some may be excited by other things but exposure and focussing are essential for any photographic efforts.

An optical viewfinder is included and it has dioptre settings too, but my first impressions are that live view through the rear screen trumps it and I see no need for such things on SLR’s. The screen also flips out from the camera body and can be rotated, either for self-portraiture or for folding back in on the camera body for use like a non-articulated screen. Another use is on those occasions when the subject means holding the camera in positions that would be impossible with a conventional screen; holding the camera over your head or down low on the ground are the sorts of situations that come to mind.

Of course, there’s more there than those features that I have listed and the specifications on the Canon website are as good a place to start as any. So far, my only testing has been of the cursory checks variety, to make sure that the thing works properly. Still, this has given me more of a feel for the camera and how it operates. As you’d expect, high ISO settings are noisy but a bigger surprise was that the smallest aperture setting is f/8. Being used to SLR’s, I was expecting to get f/16 and its like on there, but a spot of internet investigation showed that I should have been taking the size of the sensor into account with my expectations. Any trials so far have been in dull weather so I’d need to use it in a wider variety of conditions before giving it the sort of wider appraisal that you’d find in the likes of Outdoor Photography (who liked it, it has to be said). For what it’s worth, I have found no major criticism so far, though I cannot see it usurping my SLR’s; that never was the intention anyway.

Converting from CGM to Postscript

24th November 2009

One thing that I recently had to investigate was the possibility of converting CGM vector graphics files into Postscript and from there into PDF. Having used ImageMagick for converting images before, that was an obvious option. However, it cannot process CGM files on its own and needs a delegate or helper application as well. This is the case with raw digital camera files too, with UFRaw being the program chosen. For CGM images, the more obscure RALCGM is what’s needed and tracking it down is a bit of an art. The history is that it was developed at the U.K.’s Rutherford Appleton Laboratory but it seems that it was left to go off into the wilderness rather than someone keeping an eye on things. With that in mind, here are the installation packages for Windows and Linux (RPM):

Windows Installer

Linux RPM

RALCGM is a handy command line tool that can convert from CGM to Postscript on its own without any need for ImageMagick at all. From what I have seen, fonts on graphical output may look greyer than black but it otherwise does its job well. However, considering that it is a freely available tool, one cannot complain too much. There are other packages for doing vector to raster conversion and the ones that I have seen do have GUI’s, but the freedom to look at for-cost software wasn’t mine to have. The required command looks something like the following:

ralcgm -d PS -oL test.cgm test.ps

The switch -d PS uses the software’s Postscript driver and -oL specifies landscape orientation. If you want to find out more, here’s a PDF rendition of the help file that comes with the thing:

RALCGM Documentation
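Since PDF was the final destination in my case, the full chain looked something like the following, with Ghostscript’s ps2pdf handling the second step; the file names are only examples:

# Convert CGM to landscape Postscript with RALCGM
ralcgm -d PS -oL test.cgm test.ps

# Then turn the Postscript into PDF using Ghostscript
ps2pdf test.ps test.pdf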

A certain lack of speed

2nd November 2009

A little while ago, I encountered a problem with ImageMagick processing DNG files in Ubuntu 9.04. Not realising that I could solve my own problem by editing a file named delegates.xml, I took to getting a Debian VM to do the legwork for me. That file is where you’ll find all the commands used when helper software is called upon by ImageMagick to help it on its way. On its own, ImageMagick cannot deal with DNG files, so the command line variant of UFRaw (itself a front end for DCRaw) is used to create a PNM file that ImageMagick can handle. The problem a few months back was that the command in delegates.xml wasn’t appropriate for a newer version of UFRaw and I got it into my head that things like this were hard-wired into ImageMagick. Now, I know better and admit my error.
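If you want to see what your own installation is doing, something like the following should reveal the relevant entry; the file path is the one that I would expect on Ubuntu, though it can move about between releases:

# List the delegates that ImageMagick knows about and pick out the UFRaw one
convert -list delegate | grep -i ufraw

# Or look at the configuration file directly
grep -n "ufraw" /etc/ImageMagick/delegates.xml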

With 9.10, it seems that the command in delegates.xml has been corrected but another issue has raised its head. UFRaw 0.15, it seems, isn’t the speediest when it comes to creating PNM files and, while my raw file processing script works after a spot of modification to deal with changes in output from the identify command used, it takes far too long to run. GIMP also uses UFRaw, so I wonder if the same problem has surfaced there too. It has been noticed by the Debian team, though, and they have a package for version 0.16 of the software in their unstable branch that looks as if it has sorted the speed issue. However, I am seeing that 0.15 is in the testing branch, so I’d be tempted to stick with Lenny (5.x) if any successor turns out to have slower DNG file handling with ImageMagick and UFRaw. In my estimation, 0.13 does what I need, so why go for a newer release if it turns out to be slower?

8?

12th October 2009

It now seems that we have a new version of Photoshop Elements from Adobe for every year unless you’re a Mac user. Version 7 convinced me to splash out and that gained me Camera Raw recognition of my Pentax K10D along with subtly enhanced image processing power that I have been putting to good use to get more pleasing results than I ever got before. What can be achieved by using levels, curves and the shadow/highlight adjustment tool for exposure correction has amazed me recently. Quick selection functionality has allowed me to treat skies differently from everything else in landscape photos, a flexible graduated filter if you like. It seems to work on Windows 7 along with Vista and XP so I plan to stick with it for a while yet. As you may have gathered from this, it would take some convincing to make me upgrade and, for me, version 8 doesn’t reach that mark. All in all, it seems that it is a way of giving Mac users a new release with added goodness after having to stay with 6 for so long; yes, there are new features like autotagging in the image organiser but they just don’t grab me. Given that they already have Aperture from Apple and Windows users seem to get more releases, it’s a wonder that any Mac user would toy with Elements anyway. Maybe, that’s Adobe’s suspicion too.

Anquet and VirtualBox Shared Folders

18th July 2009

For a while now, I have had Anquet installed in a virtual machine instance of Windows XP but it has been throwing errors continuously on start up. Perhaps surprisingly, it only dawned upon me recently what might have been the cause. A bit of fiddling revealed that storing the mapping data Linux side and sharing it into the VM wasn’t helping, and copying it to a VM hard drive set things to rights. This type of thing can also cause problems when it comes to getting Photoshop to save files using VirtualBox’s Shared Folders feature too. Untangling the situation is a multi-layered exercise. On the Linux side, permissions need to be in order and that involves some work with chmod (775 is my usual remedy) and chgrp to open things up to the vboxusers group. Adding in Windows’ foibles when it comes to networked drives and their mapping to drive letters brings extra complexity; shared folders are made visible to Windows as \\vboxsvr\shared_folder_name\. The solution is either a lot of rebooting, extensive use of the net use command or both. It induces the sort of toing and froing that makes copying things over and back as needed look less involved and more sensible, if a little more manual than might be liked.
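On the Linux side, the tidying up amounts to something like the following; the folder name is only an example:

# Open up the shared folder to the vboxusers group on the host
sudo chgrp -R vboxusers ~/maps
sudo chmod -R 775 ~/maps

# Inside the Windows guest, the share then gets mapped to a drive letter
# with something like: net use M: \\vboxsvr\maps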

A self-hosted online photo album option

16th July 2009

I was perusing a recent copy of Linux Format and encountered a feature describing a self-hosted alternative to the likes of Flickr: Gallery. From my quick look, it looks fully featured, offering themes and even shopping cart facilities for those who want to sell their wares. The screenshots on the open-source project’s website look promising but, for a fuller appraisal, I would need to spend some time trying to bend it to my will. Before anyone mentions it, I am aware that WordPress can be used for photoblogging, but this tool seems to take things a bit further. It’s the sort of thing about which I might have wondered, given the pervasiveness of content management systems these days. My own custom-built photo gallery is devoid of a slick back end, hence why Gallery caught my eye, but I’ll continue with it and may even get to adding the needful myself.

ImageMagick and Ubuntu 9.04

5th May 2009

Using a command line tool like ImageMagick for image processing may sound like a really counter-intuitive thing to do, but there’s no need to do everything on a case-by-case interactive basis. Image resizing and format conversion come to mind here. Helper programs are used behind the scenes too, with Ghostscript being used to create Postscript files, for example.
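As an illustration of the sort of non-interactive work that I mean, commands like these resize and convert images without any pointing or clicking; the file names and dimensions are only examples:

# Convert one file between formats while resizing it to fit within 1024x768
convert holiday.png -resize 1024x768 holiday.jpg

# Do the same for a whole folder of PNG files in one go
mogrify -format jpg -resize 1024x768 *.png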

The subject of helper programs brings me to an issue that has hampered me recently. While I am aware that there are tools like F-Spot available, I am also wont to use a combination of shell scripting (BASH & KSH), Perl and ImageMagick for organising my digital photos. My preference for using Raw camera files (DNG & CRW) means that ImageMagick cannot access these without a little helper. In the case of Ubuntu, it’s UFRaw. However, Jaunty Jackalope appears to have seen UFRaw updated to a version that is incompatible with the included version of ImageMagick (6.4.5 as opposed to 3.5.2 at the time of writing). The result is that the command issued by ImageMagick to UFRaw -- issue the command man ufraw-batch to see the details -- is not accepted by the included version of the latter, 0.15 if you’re interested. It seems that an older release of UFRaw accepted the output device ppm16 (16-bit PPM files) but this should now be specified as ppm for the output device and 16 for the output depth. In a nutshell, where the parameter output-type did the lot, you now need both output-type and output-depth.
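To make that concrete, here is a sketch of the change as I understand it; the file name is an example and the option spellings (--out-type and --out-depth in the releases that I have seen) are best checked against man ufraw-batch:

# Older UFRaw releases accepted a combined 16-bit PPM output type
ufraw-batch --out-type=ppm16 photo.dng

# UFRaw 0.15 wants the output type and the bit depth given separately
ufraw-batch --out-type=ppm --out-depth=16 photo.dng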

I thought of decoupling things by using UFRaw to create 16-bit PPM files for processing by ImageMagick, but to no avail: the identify command wouldn’t return the date on which the image was taken. I even changed the type to 8-bit JPEG’s with added EXIF information but no progress was made. In the end, a mad plan came to mind: creating a VirtualBox VM running Debian. The logic was that if Debian deserves its reputation for solidity, dependencies like ImageMagick and UFRaw shouldn’t be broken, and I wasn’t wrong. To make it fly though, I needed to see if I could get Guest Additions installed on Debian. Out of the box, the supported kernel version must be at least 2.6.27 and Debian’s is 2.6.26, so additional work was on the cards. First, GCC, Make and the correct kernel header files need to be installed; a sketch of that step appears after the mount command below. Once those are in place, the installation works smoothly and a restart sets the goodies in motion. To make the necessary shared folder available, a command like the following was executed:

mount -t vboxsf [Shared Folder name] [mount point]
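As for the preparatory step mentioned above, it went something along these lines on the Debian guest, run as root; the package names are my assumption, so adjust them to match your kernel:

# Install the build tools and kernel headers needed by the Guest Additions installer
apt-get install gcc make linux-headers-$(uname -r)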

With the shared folder mounted and ImageMagick installed, the processing that I had been doing for new DSLR images was restored. Ironically, Debian’s version of ImageMagick, 6.3.7, is even older than Ubuntu’s but it works and that’s the main thing. There is an Ubuntu bug report for this on Launchpad, so I hope that it gets fixed at some point in the near future. However, that may mean awaiting 9.10 or Karmic Koala, so I’m glad to have the workaround in the meantime.
