Technology Tales

Adventures & experiences in contemporary technology

Adding GNOME 3 to Linux Mint 11

3rd June 2011

On the surface of it, this probably sounds like a very strange thing to do: choose Linux Mint because its developers plan to stick with their current desktop interface for the foreseeable future, and then put a brand new one on there. However, that’s what last weekend’s dalliance with Fedora 15 caused. Not only did I find my way around GNOME Shell, but I got to like it so much that I missed it on returning to my Linux Mint machine.

The result was that I started to look on the web to see if anyone else had got the same brainwave. In fact, it was Mint’s being based on Ubuntu that allowed me to get GNOME 3 on there. The task can be summarised as involving three main stages: installing GNOME 3, adding extensions and adding the Cantarell font that it uses by default. After these steps, I gained a well-running GNOME 3 desktop on Linux Mint and it looks set to stay that way unless something untoward emerges.

Installing GNOME 3

The first step is to add the PPA repository for GNOME 3 using the following command:

sudo add-apt-repository ppa:gnome3-team/gnome3

Then, it was a case of issuing my usual update/upgrade command:

sudo apt-get update && sudo apt-get upgrade && sudo apt-get dist-upgrade

When that had done its thing and downloaded and installed quite a few upgrades along the way, it was time to add GNOME Shell using this command:

sudo apt-get install gnome-shell

When that was done, I rebooted my system to be greeted by a login screen very reminiscent of what I had seen in Fedora. While compiling this piece, I noticed that GNOME Session might need to be added before GNOME Shell, but I do not recall doing so myself. Maybe dependency resolution kept any problems at bay; there were no issues that I can remember beyond things not being configured as fully as I would have liked without further work. For the sake of safety, it might be a good idea to run the following before adding GNOME Shell to your PC:

sudo apt-get install gnome-session && sudo apt-get dist-upgrade

Configuration and Customisation

Once I had logged in, the desktop that I saw wasn’t at all unlike the Fedora one and everything seemed stable too. However, there was still work to do before I was truly at home with it. One thing that was needed was the ever useful GNOME Tweak Tool. This came in very handy for changing the theme that was on display to the standard Adwaita one that caught my eye while I was using Fedora 15. Adding buttons to application title bars for minimising and maximising their windows was another job that the tool allowed me to do. The command to get this goodness added in the first place is this:

sudo apt-get install gnome-tweak-tool

The next thing that I wanted to do was add some extensions so I added a repository from which to do this using the command below. Downloading them via Git and compiling them just wasn’t working for me so I needed another approach.

sudo add-apt-repository ppa:ricotz/testing

With that in place, I issued the following commands to gain the Dock, the Alternative Status Menu and the Windows Navigator. The second of these would have added a shutdown option to the user menu, but it seems to have been deactivated after a system update. Holding down the ALT key to change the Suspend entry to Power Off… will have to do me for now. Having the dock is the most important of these and that, thankfully, is staying the course and works exactly as it does on Fedora.

sudo apt-get install gnome-shell-extensions-dock
sudo apt-get install gnome-shell-extensions-alternative-status-menu
sudo apt-get install gnome-shell-extensions-windows-navigator

Adding Cantarell

The default font used by GNOME 3 in various parts of its interface is Cantarell, but my system was falling back to the standard sans-serif font because Cantarell wasn’t in place. That substitute didn’t look too good, so I set about tracking down the freely available Cantarell on the web. When that search brought me to Font Squirrel, I downloaded the zip file containing the required TTF files. The next step was to install them and, towards that end, I added Fontmatrix using this command:

sudo apt-get install fontmatrix

That gave me a tool with a nice user interface, but I made a mistake when using it. This was because I (wrongly) thought that it would copy files from the folder that I told the import function to use. Extracting the TTF files to /tmp meant that would have had to happen, but Fontmatrix just registered them in place instead. A reboot confirmed that they hadn’t been copied or moved at all, and I had rendered the user interface next to unusable through my own folly; by default, Ubuntu and Linux Mint delete files from /tmp on shutdown. The font selection capabilities of the GNOME Tweak Tool came in very handy for helping me to convert useless boxes into letters that I could read.

Another step was to change the font line near the top of the GNOME Shell stylesheet (I never thought that CSS usage would end up in places like this…) so that Cantarell wasn’t being sought and sans-serif text replaced the grey and white boxes. The stylesheet needs to be edited as superuser, so the following command is what’s needed; while I used sudo, gksu would have been just as useful here, if not the more correct choice for launching a graphical editor.

sudo gedit /usr/share/gnome-shell/theme/gnome-shell.css
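
Before touching anything, it might be wise to find the exact font line and keep a copy of the original stylesheet; the following is only a cautious sketch, since the precise wording of the font line can vary between GNOME releases:

grep -n "font-family" /usr/share/gnome-shell/theme/gnome-shell.css
sudo cp /usr/share/gnome-shell/theme/gnome-shell.css /usr/share/gnome-shell/theme/gnome-shell.css.orig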

Once I had extricated my system from that mess, a more conventional approach was taken and the command sequence below was what I followed, with extensive use of sudo to get things done. A new directory was created and the TTF files moved into it.

cd /usr/share/fonts/truetype
sudo mkdir ttf-cantarell
cd ttf-cantarell
sudo mv /tmp/*.ttf .

To refresh the font cache, I resorted to the command described in a tutorial in the Ubuntu Wiki:

sudo fc-cache -f -v

Once that was done, it was then time to restore the reference to Cantarell in the GNOME Shell stylesheet and reinstate its usage in application windows using the GNOME Tweak Tool. Since then, I have suffered no mishap or system issue with GNOME 3. Everything seems to be working quietly and I am happy to see that replacement of Unity with GNOME Shell will become an easier task in Ubuntu 11.10, the first alpha release of which is out at the time of writing. Could it lure me back from my modified instance of Linux Mint yet? I cannot say for sure, but it certainly cannot be ruled out at this stage.

Sometimes, a firmware update is in order

28th February 2011

After a recent trip to Oxford, I have started to mull over adding a longer lens (it could make more distant architectural detail photos a possibility) to complement my trusty Sigma 18-125mm f/3.8-5.6 DC HSM zoom lens, which is now approaching its third year in my hands. While I have made no decision about acquiring another lens, there are some tempting bargains out there, it seems. However, the real draw on my attention is the lack of autofocus with the aforementioned Sigma, and I now find it hard to believe that I was blaming the manufacturer for not keeping up with Pentax when it really was the other way around. A bit of poking around on the web revealed that all I needed to do was download a firmware update from the Pentax website. While being slowed down by the lack of autofocus cannot have done bad things for my photography, I still wonder why I didn’t try updating the camera sooner than I did.

In the download for updating my K10D, there was a README file containing the instructions for carrying out the update with the included binary file, which was set to take the camera from version 1.00 to 1.30 (hold down the Menu button while starting the camera to see what you have). In summary, both files were copied onto an SD card that was then inserted into the camera while it was turned off. The next step was to power up the camera with the Menu button held down to start the update. To stop erroneous updates, an “Are you sure?” style Yes/No menu pops up before anything else happens. Selecting Yes sets things in motion and you have to wait until the word “COMPLETE” appears in the bottom left corner before turning off the camera and removing the card. Now that I think of it, I should have checked the battery before doing anything because the consequences of losing power in the middle of the process would have been annoying, especially given how much I like the photographic results produced by the camera.

Risk taking aside, the process was worth its while, with HSM now working as it should have done all this time. It seems quiet and responsive too, from my limited tests to date. Even better, the autofocus doesn’t hunt anywhere near as much as the 18-55 mm Pentax kit lens that came with the camera. The next decision is whether to stick with my manual focussing ways or lapse into trusting autofocus from now on, though my better judgement is to stick with the slower approach unless the subjects are fast-moving. Now that I think of it, train and bus photos for my transport website have become a whole lot easier, as have any wildlife photos that I care to capture. Speaking of the latter brings me back to that telephoto quandary that I mentioned at the beginning. Well, there’s a tempting Sigma 50-200 mm that has caught my eye…

Why the manual step?

18th January 2010

One of the consequences of buying a new camera is that your current photo processing software may not be fully equipped for the job of handling the images that it creates. This is a particular issue with raw image files, and Adobe Photoshop Elements 5 was unable to completely handle DNG files made with my Pentax K10D until I upgraded to version 7. Yes, I do realise that upgrading at the time of getting the camera should have been in order, but I only lost the white balance adjustments, so I could live with things as they were until upgrading gained a more compelling case.

As things stood, Elements 7 was unable to import CR2 files from my Canon PowerShot G11 into the Organiser, so it was off to the appropriate page on the Adobe website for a Camera Raw updater. I picked up the latest release of Camera Raw (5.6 at the time of writing) even though it was found in the Elements 8 category (don’t be put off by this because the release notes address the version compatibility question more extensively). Strangely, the updater doesn’t complete everything because you still need to back up the original Camera Raw.8bi and copy the new one from the zip archive in its place. Quite why this couldn’t have been more automated, even with user prompts for file names and locations, is beyond me, but that is how it is. However, once all was in place, CR2 files were handled by Elements without missing a beat.

Command Line Software Management

2nd December 2009

One of the nice things about a Debian-based Linux distribution is that it is easy to pull a piece of software onto your system from a repository using either apt-get or aptitude. Some may prefer to have a GUI, but I find that the command line offers a certain extra transparency that stops the “what’s it doing?” type of question. That’s not to say that the GUI-based approach hasn’t a place; I mainly turn to it when seeking out a piece of software without knowing its aptitude-ready name, though a command line search can do much the same job, as shown below. Interestingly, there are signs that Canonical may be playing with the idea of making Ubuntu’s Software Centre a full application management tool, with updates and upgrades getting added to the current searching, installation and removal facilities. That may well be, but it’s going to take a lot of effort to get me away from the command line altogether.
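
For the record, here is a rough sketch of the sort of command line searching that I have in mind; the keyword and package name are only illustrations, and either apt-cache or aptitude will do the job:

apt-cache search music player
aptitude search amarok
apt-cache show amarok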

Fedora and openSUSE have their software management commands too, in the shape of yum and zypper respectively. The recent flurry of new operating system releases has had me experimenting with both of those distros on a real test machine. As might be expected, the usual battery of installation, removal and update activities is well served, and I have been playing with software searching using yum too. What has yet to mature is in-situ distribution upgrading à la Ubuntu. In principle, it is possible, but I got a black screen when I tried moving from openSUSE 11.1 to 11.2 within VirtualBox using instructions on the openSUSE website. Not wanting to wait, I reached for a Live CD instead and that worked a treat on both virtual and real machines. Being in an experimental turn of mind, I attempted the same to get from Fedora 11 to the beta release of its version 12. A spot of repository trouble got me using a Live CD in its place. You can perform an in-situ upgrade from a full Fedora DVD, but the only option is system replacement when you have a Live CD. Once installation is out of the way, YaST can be set aside in favour of zypper, and yum is good enough that Fedora’s GUI alternative can be ignored too. It’s nice to see good, transparent ideas taking hold elsewhere; they may make migration between distros much easier as well.
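
By way of illustration, the following is a sketch of the everyday equivalents that I have been using on those distros; the package name is only a placeholder.

On Fedora:

sudo yum update
yum search packagename
sudo yum install packagename
sudo yum remove packagename

On openSUSE:

sudo zypper refresh && sudo zypper update
sudo zypper install packagename
sudo zypper remove packagename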

Out of memory at line: 56

28th May 2009

This is an error that I have started to see a lot in the last few weeks. First, it was with Piwik and latterly with WordPress.com Stats. For the record, I have never seen it on up-to-date systems, but always with IE6 and at page unloading time. The CPU usage hits 100% before the error is produced, and that had me blaming JavaScript in error; it isn’t the cause of all ills. In fact, the cause seems to be a bug in a certain release of Adobe Flash 9, but I am of the opinion that the inclusion of certain features in a Flash movie is needed to trigger it too. I don’t have the exact details of this, but WordPress.com Stats worked without fault until a recent update and that is what is making me reach the conclusion that I have. That observation is making me wonder whether we are coming to a point where Flash compatibility is something that needs to be factored into the use of the said technology in a website or web application. Updating Flash will solve the problem on the client, but it might be better if it wasn’t triggered from the server side either.

Ubuntu upgrades: do a clean installation or use Update Manager?

9th April 2009

Part of some recent “fooling” brought on by the investigation of what turned out to be a duff DVD writer was a fresh installation of Ubuntu 8.10 on my main home PC. It might have brought on a certain amount of upheaval, but it was nowhere near as severe as that following the same sort of thing with a Windows system. A few hours was all that was needed, but it revived the question of whether it is better to do an upgrade every time a new Ubuntu release is unleashed on the world or to go for a complete virgin installation instead. With Ubuntu 9.04 in the offing, that question takes on a more immediate significance than it otherwise might do.

Various tricks make the whole reinstallation idea more palatable. For instance, many years of Windows usage have taught me the benefits of separating system and user files. The result is that my home directory lives on a different disk to my operating system files. Add to that the experience of being able to reuse that home drive across different Linux distros and even swapping from one distro to another becomes feasible. From various changes to my secondary machine, I can vouch that this works for Ubuntu, Fedora and Debian; the latter is what currently powers the said PC. You might have to use superuser powers to attend to ownership and access issues, but the portability is certainly there, and it applies to anything kept on other disks too.
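
To give a flavour of the arrangement, an /etc/fstab entry like the following keeps /home on its own drive, and a chown command sorts out ownership after a move between distros; the device name, filesystem and username here are only assumptions for the sake of illustration:

/dev/sdb1  /home  ext3  defaults  0  2

sudo chown -R john:john /home/john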

Naturally, there’s always the possibility of losing programs that you have had installed, but losing the clutter can be liberating too. However, assembling a script made up of one or more apt-get install commands can allow you to get many things back at a stroke. For example, I have a test web server (Apache/MySQL/PHP/Perl) set up, so this would be how I’d get everything back in place before beginning further configuration. It might be no bad idea to back up your collection of software sources too; I have yet to add all of the ones that I had been using back into Synaptic. Then there are closed source packages such as VirtualBox (yes, I know that there is an open source edition) and Adobe Reader. After reinstating the former, all my virtual machines were available for me to use again without further ado. Restoring the latter allowed me to grab version 9.1 (probably more secure anyway) and it inveigles itself into Firefox now too, so the number of times that I need to go through the download shuffle before seeing the contents of a PDF is much reduced, though not completely eliminated, by the Windows-like ability to see a PDF loaded in a browser tab. Moving from software to hardware for a moment, it looks like any bespoke actions, such as my activating an Epson Perfection 4490 Photo scanner, need to be repeated, but that was all that I needed to do. Getting things back into order is not so bad, but you need to allow a modicum of time for this.
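
As a minimal sketch of such a script for the web server example above, something like this would do the heavy lifting; the package names reflect the Ubuntu repositories of the time and are assumptions rather than a definitive list:

#!/bin/bash
# reinstate a basic Apache/MySQL/PHP/Perl test server after a clean installation
sudo apt-get update
sudo apt-get install -y apache2 mysql-server php5 libapache2-mod-php5 php5-mysql perl libapache2-mod-perl2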

What I have discussed so far are what might be categorised as the common or garden aspects of a clean installation, but I have seen some behaviours that make me wonder if the usual Ubuntu upgrade path is sufficiently complete in its refresh of your system. The counterpoint to all of this is that I may not have been looking for some of these things before now. That may apply to my noticing that DSLR support seems to be better, with my Canon and Pentax cameras both being picked up and mounted for me as soon as they are connected to a PC, the caveat being that they must themselves be powered on for this to happen. Another surprise that may be new is that the BBC iPlayer’s Listen Again works without further work from the user, a very useful development. It very clearly wasn’t that way before I took the invasive route. My previous tweaking might have prevented the in-situ upgrade from doing its thing, but I do see the point of not upsetting people’s systems with an overly aggressive update process, even if it means that some advances do not make themselves known.

So what’s my answer regarding which way to go once Ubuntu Jaunty Jackalope appears? For the sake of avoiding initial disruption, I’d be inclined to go down the Update Manager route first while reserving the right to do a fresh installation later on. All in all, I am left with the gut feeling that the jury is still out on this one.

Work locally, update remotely

4th December 2008

Here’s a trick that might have its uses: using a local WordPress instance to update your online blog (yes, there are plenty of applications that promise to edit your online blog but these need file permissions to the likes of xmlrpc.php to be opened up). Along with the right database access credentials and the ability to log in remotely, adding the following two lines to wp-config.php does the trick:

define('WP_SITEURL', 'http://localhost/blog');

define('WP_HOME', 'http://localhost/blog');

These two constants override what is in the database and allow you to update the online database from your own PC using WordPress running on a local web server (Apache or otherwise). One thing to remember here is that the online and offline directory structures need to be similar. For example, if your online WordPress files are in a blog directory in the root of the online web server file system (typically htdocs for Linux), then they need to be contained in a directory of the same name in the root of the offline server too. Otherwise, things could get confusing and perhaps messy. Another thing to consider is that you are modifying your online blog, so the usual rules about care and attention apply, particularly with respect to using the same version of WordPress both locally and remotely. This is especially a concern if you, like me, run development versions of WordPress to see if there are any upheavals ahead of us, like the overhaul that is coming in with WordPress 2.7.
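
As an illustration of the directory matching point, assuming a stock Apache setup on Ubuntu where the local web root is /var/www and the local WordPress files live in a folder in my home directory (both are assumptions for this sketch), the local copy would be put in place like this:

sudo mkdir -p /var/www/blog
sudo cp -r ~/wordpress/* /var/www/blog/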

An alternative use of this same trick is to keep a local copy of your online database in case of any problems while using a local WordPress instance to work with it. I used to have to edit the database backup directly (on my main Ubuntu system), first with GEdit but then using a sed command like the following:

sed -e 's/www\.onlinewebsite\.com/localhost/g' backup.sql > backup_l.sql

The -e switch tells sed to apply the regular expression substitution that follows it to the input, with the output being directed to a new file. It’s slicker than the interactive GEdit route, but it has been made redundant by defining constants for a local WordPress installation as described above.
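
Should anyone be wondering how the backup.sql file gets created in the first place, a mysqldump command along these lines is one way to pull the online database down to your own machine; the host, user and database names are placeholders and remote access needs to be permitted by the host:

mysqldump -h www.onlinewebsite.com -u username -p databasename > backup.sql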

Copying only updated and new files

20th October 2008

With Linux/UNIX, the command line remains ever useful and allows you to do all manner of things, including file copying that only adds new or updated files to a destination. Here’s a command that accomplishes this in Linux:

cp -urv [source] [destination]

The u switch does the update (only copying files that are newer than those at the destination or missing from it), r ensures recursion (by default, cp only copies files from a source directory and not anything sitting in subfolders) and v tells the command to report what it is doing.
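
As a usage example with hypothetical directory names, the following would top up a backup of a photo archive with anything new or changed since the last run:

cp -urv ~/Photos /media/backup/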

Though buried and hardly promoted, Windows also has its command line and here’s what accomplishes a similar result:

xcopy /d /e /y [source] [destination]

Anything’s better than having to approve or reject every instance where source and destination files are the same or, even worse, to overwrite a file when it is not wanted.

A quick way to do an update

8th August 2008

Here’s a quick way to get the latest updates on your PC using the command line if you are using Ubuntu or Debian:

sudo apt-get update && sudo apt-get upgrade

Of course, you can split these commands up if you prefer to look before you leap. At the very least, it’s so much slicker than the GUI route.

Append or update?

25th November 2007

SAS can generate many types of output: plain text, XML, PDF, RTF, Excel and so on. With all of these and SAS procedures like PROC REPORT, PROC TABULATE and the rest, it might seem surprising for me to say that I have been generating output with data step PUT and FILE statements. There was, of course, a reason for this: creating text files for loading into a new database-driven software application. At one stage, I also did some data interleaving at the output stage and that’s when I discovered that the default behaviour for SAS FILE statements is to completely overwrite a file unless the MOD option is specified. Adding that switches on append behaviour. The code below adds a header in one step while adding data below it in another. I know that there are slicker ways to achieve this, like setting up your data as you want it or using _N_ to ensure that something only appears once, but here’s another way. As with Perl, there’s often more than one way to do something in SAS.

* Step 1: write the header record, overwriting any existing file (ds_data is a fileref assigned elsewhere);
data _null_;
  file ds_data;
  put "fieldtype;datasetname;datasetlabel;datasetlayout;datasetclass;datasetstandardversion";
run;

* Step 2: append the data records below the header; the MOD option switches FILE into append mode;
data _null_;
  set ds_ispec;
  file ds_data mod;
  line="datasetstandard;"||trim(memname)||";"||trim(memlabel)||";;;"||trim(memver);
  put line;
run;
