Technology Tales

Adventures & experiences in contemporary technology

When a hard drive is unrecognised by the Linux hddtemp command

15th August 2021

Building a new PC in the middle of a heatwave is not to be recommended if you do not want to be concerned about how fast fans are spinning and how hot things are getting. Yet, that is what I did last month after delaying the task for numerous months.

My efforts mean that I have a system built around an AMD Ryzen 9 5950X CPU and a Gigabyte X570 Aorus Pro motherboard with 64 GB of memory, and things are settling down after the initial upheaval. That also meant some adjustments to the CPU fan profile in the BIOS for quieter running, while the use of a Be Quiet! Dark Rock 4 cooler also helps, as does a Be Quiet! Silent Wings 3 case fan. All are components from trusted brands, though I wonder how much abuse they got during their installation and subsequent running in.

Fan noise is as non-quantitative an indicator of heat levels as touch, so more quantitative means are in order. Aside from using a thermocouple device, there are built-in sensors too. Using Linux Mint means that I have the sensors command from the lm-sensors package for checking on CPU and other temperatures, though hddtemp is what you need for checking on the same for hard drives. The latter can be used as follows:

sudo hddtemp /dev/sda /dev/sdb

This has to happen with administrator access, and a list of drives needs to be provided because the command cannot find them by itself. In my case, I have no mechanical hard drives installed in non-NAS systems, having even replaced a 6 TB Western Digital Green disk with an 8 TB SSD, yet I got the following when I tried checking on things with hddtemp:

WARNING: Drive /dev/sda doesn't seem to have a temperature sensor.
WARNING: This doesn't mean it hasn't got one.
WARNING: If you are sure it has one, please contact me ([email protected]).
WARNING: See --help, --debug and --drivebase options.
/dev/sda: Samsung SSD 870 QVO 8TB: no sensor

The cause of the message for me was that there is no entry for the Samsung SSD 870 QVO 8TB in /etc/hddtemp.db, so one needed to be added there. Before that could be rectified, I needed to get some additional information using smartmontools, which had to be installed using the following command:

sudo apt-get install smartmontools
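As an aside, the same package can give a quick overall verdict on a drive's health, which is worth a look before going further; for example:

sudo smartctl -H /dev/sda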

What I needed to do was check the drive’s SMART data output for extra information and that was achieved using the following command:

sudo smartctl /dev/sda -a | grep -i Temp

What this does is look for the temperature information in the smartctl output using the grep command, with the output of the first command being passed to the second through a pipe. This yielded the following:

190 Airflow_Temperature_Cel 0x0032 072 050 000 Old_age Always - 28

The first number in the above (190) is the thermal sensor’s attribute identifier and that was needed in what got added to /etc/hddtemp.db. The following command added the necessary data to the aforementioned file:

echo \"Samsung SSD 870 QVO 8TB\" 190 C \"Samsung SSD 870 QVO 8TB\" | sudo tee -a /etc/hddtemp.db

Here, the output of the echo command was passed to the tee command for appending to the end of the file. In the echo output, the first part is the name of the drive, the second is the heat sensor's attribute identifier, the third is the temperature scale (C for Celsius or F for Fahrenheit) and the last part is the label; it can be anything that you like, but I kept it the same as the name. On re-running the hddtemp command, I got output like the following, so all was as I needed it to be.

/dev/sda: Samsung SSD 870 QVO 8TB: 28°C
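To confirm what ended up in the database file, a quick search should show the new entry, assuming it was appended as above:

grep "870 QVO" /etc/hddtemp.db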

Since then, temperatures may have cooled and the weather become more like what we usually get but I am still keeping an eye on things, especially when the system is put under load using Perl, R, Python or SAS. There may be further modifications such as changing the case or even adding water cooling, not least to have a cooler power supply unit, but nothing is being rushed as I monitor things to my satisfaction.

Sorting out sluggish start-up and shutdown times in Linux Mint 19

9th August 2018

The Linux Mint team never pushes anyone into upgrading to the latest version of their distribution, but curiosity often is a strong enough impulse to make me do just that. When doing so brings me across some rough edges, the wisdom of leaving things alone is evident. Nevertheless, it also brings its share of learning, and that is what I am sharing in this post. It also allows me to collect a number of titbits that may be of use to others.

Again, I went with the in-situ upgrade option, though the addition of the Timeshift backup tool means that this is less frowned upon than once would have been the case. It worked well too, apart from slow start-up and shutdown times, so I set about tracking down the causes on the two machines that I have running Linux Mint. As it happens, the cause was different on each machine.

On one PC, it was networking that was holding things up. The cause was my specifying a fixed IP address in /etc/network/interfaces instead of using the Network Settings GUI tool. Resetting the configuration file back to its defaults and using the Cinnamon settings interface took away the delays. It was inspecting /var/log/boot.log that highlighted the problem, so that is worth checking if I ever encounter slow start times again.
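For reference, the default file on an Ubuntu-based system like this contains little more than the loopback definition, so resetting it meant paring it back to something like the following (the contents can vary between releases):

# interfaces(5) file used by ifup(8) and ifdown(8)
auto lo
iface lo inet loopback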

As I mentioned earlier, the second PC had a very different problem though it also involved a configuration file. What had happened was that /etc/initramfs-tools/conf.d/resume contained the wrong UUID for my system’s swap drive so I was seeing messages like the following:

W: initramfs-tools configuration sets RESUME=UUID=<specified UUID for swap partition>
W: but no matching swap device is available.
I: The initramfs will attempt to resume from <specified file system location>
I: (UUID=<specified UUID for swap partition>)
I: Set the RESUME variable to override this.
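Before correcting the file, the right UUID for the swap partition has to be found, and a query like this using blkid (part of util-linux) should report it:

sudo blkid -t TYPE=swap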

Correcting the file and executing the following command fixed the issue by updating the affected initramfs image for all installed kernels and speeded up PC start-up times:

sudo update-initramfs -u -k all

Though it was not a cause of system sluggishness, I also sorted out another message that I kept seeing during kernel updates and removals on both machines. It had been there for a while, causing warnings about my system locale not being recognised. The problem has been described elsewhere as follows: /usr/share/initramfs-tools/hooks/root_locale expects to see individual locale directories in /usr/lib/locale, but locale-gen is configured to generate an archive file by default. Issuing the following command sorted that:

sudo locale-gen --purge --no-archive

Following these fixes, my new Linux Mint 19 installations have stabilised with speedier start-up and shutdown times. That allows me to look at what is on Flathub, to see what applications are available and whether they get updated to the latest version on an ongoing basis. That may be a topic for another entry on here, but the applications that I have tried work well so far.

Carrying out a hard reset of a home KVM switch

20th March 2017

During a recent upgrade from Linux Mint 18 to Linux Mint 18.1 on a secondary machine, I ran into bother with my Startech KVM (keyboard, video, mouse and audio sharing) switch. The PC failed to recognise the attachment of my keyboard and mouse so an internet search began.

Nothing promising came from it apart from resetting the KVM switch; that is, turning it off and back on again. That was something that I did try without success. What I had overlooked was that there were USB connections to the PCs that fed the device with a certain amount of power, and that was enough to keep it on.

Unplugging those USB cables as well as the power cable was needed to completely switch off the device. That provided the reset that I needed and all was well again. Otherwise, I would have been baffled enough to resort to buying a replacement KVM switch so the extra information avoided a purchase that could have cost in the region of £100. In other words, a little research had saved me money.

Trying out a new way to upgrade Linux Mint in situ while going from 17.3 to 18.1

19th March 2017

There was a time when the only recommended way to upgrade Linux Mint from one version to another was to do a fresh installation with back-ups of data and a list of the installed applications created from a special tool.

Even so, it never stopped me doing my own style of in situ upgrade though some might see that as a risky option. More often than not, that actually worked without causing major problems in a time when Linux Mint releases were more tightly tied to Ubuntu’s own six-monthly cycle.

In recent years, Linux Mint’s releases have kept in line with Ubuntu’s Long Term Support (LTS) editions instead. That means that any major change comes only every two years with minor releases in between those. The latter are delivered through Linux Mint’s Update Manager so the process is a simple one to implement. Still, upgrades are not forced on you so it is left to your discretion as to when you need to upgrade since all main and interim versions get the same extended level of support. In fact, the recommendation is not to upgrade at all unless something is broken on your own installation.

For a number of reasons, I followed that advice by sticking with Linux Mint 17.3 on my main machine instead of upgrading to Linux Mint 18. The fact that I had broken things on another machine using an older method of upgrading provided even more encouragement.

However, I subsequently discovered another means of upgrading between major versions of Linux Mint that has some endorsement from the project. There still are warnings about testing a live DVD version of Linux Mint on your PC first and backing up your data beforehand. Another task is ensuring that you are upgrading from a fully up to date Linux Mint 17.3 installation.
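Checking the installed release and freshening everything beforehand can be done along these lines:

cat /etc/linuxmint/info     # reports the installed Mint release
sudo apt-get update         # refresh the package lists
sudo apt-get upgrade        # bring everything up to date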

When you are ready, you can install mintupgrade using the following command:

sudo apt-get install mintupgrade

When that is installed, there is a sequence of tasks that you need to do. The first of these is to simulate an upgrade to test for the appearance of untoward messages and resolve them. Repeating the check until all is well comes recommended. The command is as follows:

mintupgrade check

Once you are happy that the system is ready, the next step is to download the updated packages so they are on your machine ahead of their installation. Only then should you begin the upgrade process. The two commands that you need to execute are below:

mintupgrade download
mintupgrade upgrade

Once these have completed, you can restart your system. In my case, the whole process worked well, with only my PHP installation needing attention. A clash between different versions of the scripting interpreter was addressed by removing the older one, keeping PHP 7 for the sake of testing. Beyond that, a reinstallation of VMware Player and the move from version 18 to version 18.1, there was hardly anything more to do and next to no real disruption. That is just as well, since I depend heavily on my main PC these days. The backup option of a full installation would have left me clearing things up for a few days afterwards, since I use a bespoke selection of software.

Pondering travel device consolidation using an Apple iPad Pro 12.9″

18th September 2016

It was a change of job in 2010 that got me interested in using devices with internet connectivity on the go. Until then, the attraction of smartphones had not been strong, so I got myself a BlackBerry on a pay as you go contract, but the entry-level device was painfully slow and the connectivity was 2G. It was a very sluggish start.

It was supplemented by an Asus Eee PC that I connected to the internet using broadband dongles and a Wi-Fi hub. This cumbersome arrangement did not work well on short journeys, and the variability of mobile network reception meant that longer journeys were not all that successful either. Usage in hotels and guest houses went better, though, and that meant the miniature laptop came with me on many a journey.

In time, I moved away from broadband dongles to using smartphones as Wi-Fi hubs and that largely is how I work with laptops and tablets away from home unless there is hotel Wi-Fi available. Even trips overseas have seen me operate in much the same manner.

One snag is that we seem to carry quite a number of different gadgets with us at a time, and that can cause inconvenience when going through airport security, since each device needs to be screened separately. When you are carrying a laptop, a tablet, a phone and a camera, it does take time to organise yourself, and you can meet impatient staff, as I found recently when returning from Oslo. Checking in whatever you can as hold luggage helps to get around at least some of the nuisance, and it might be time for better machinery to cut down on having to screen everything separately.

When you come away after an embarrassing episode, as I once did, the attractions of consolidating devices start to become plain. In fact, most people probably could get by with just their phone. It is when you take activities like photography more seriously that the gadget count increases. After all, the main reason that a laptop comes on trips beyond Britain and Ireland at all is to back up photos from my camera in case an SD card fails.

Apple iPad Pro 12.9″

Parking that thought for a while, let's go back to March this year, when temptation overcame what should have been a period of personal restraint. The result was that a 32 GB 12.9″ Apple iPad Pro came into my possession, along with an Apple Pencil and a Logitech CREATE Backlit Keyboard Case. The size of the screen should have struck me sooner, but it did not do so until I got the device home from the Apple Store. That screen was one of the main attractions, because maps can be shown with a greater field of view in a variety of apps, a big selling point for a hiker with a liking for maps who wants more than what is on offer from Apple, Google or even Bing. The precision of the Pencil is another boon that makes surfing the web so much easier, and the solid connection between the case and the iPad means that keyboard usage is less fiddly than it would be over Bluetooth. Having tried them with the BBC iPlayer app, I can confirm that the sound from the speakers is better than that of any other mobile device that I have used.

Already, it has come with me on trips around England and Scotland. These weekend trips saw the Asus Eee PC stay at home when it normally might have come with me, and taking just a single device along with a camera or two had its uses too. The screen is large for reading on a train, but I find that it works just as well so long as you have enough space. Otherwise, combining the use of a suite of apps with recourse to the web does much of the information seeking needed while on a trip away, and I was not found wanting. Battery life is good too, which helps.

Those trips allowed for a little light hotel room blog post editing too, and the iPad Pro did what was needed, though the ergonomics of reaching for the screen with the Pencil meant that my arm was held aloft more than was ideal. Another thing that raised questions in my mind was the appearance of word suggestions at the bottom of the screen, as if this were a mobile phone; I wondered if these were more of a hindrance than a help, given that I just fancied typing rather than pointing at the screen to complete words. Copying and pasting works too, but I have found the screen-based version a little clunky, so I must see if the keyboard one works just as well. The keyboard set-up is typical of a Mac, though, and that affects word selection: you need to use the OPTION key in the keyboard shortcut for this, not COMMAND or CONTROL as you might do on a PC.

Transcend JetDrive Go 300

Even with these eccentricities, I was left wondering if it had any utility when it came to backing up photos from digital cameras, and there is an SD card adapter that makes this possible. A failure of foresight on my part meant that the 32 GB capacity now is an obvious limitation, but I think I might have hit on a possible solution that does not involve uploading to an iCloud account. It involves clearing off the photos onto a 128 GB Transcend JetDrive Go 300 so that they do not clog up the iPad Pro's storage. The device having both Lightning and USB connectivity means that you can plug it into a laptop or desktop PC afterwards too. If that were to work as I would hope, then the laptop/tablet combination that I have been using for all overseas trips could be replaced, allowing a weight reduction as well as cutting the hassle at airport security.

Trips to Ireland may still see me sticking with a tried and tested combination, though, because I often have needed to do some printing while over there. While I have been able to print a test document from an iPad Mini on my home network-connected printer, not every model supports this, and support for NFC or AirPrint is not universal either. If this were not an obstacle, apps like Pages, Numbers and Keynote could have their uses for business-related work, and there are web-based offerings from Google, Microsoft and others too.

In conclusion, I have found that my iPad Pro does so much of what I need on a trip away that retiring the laptop/tablet combination for most of these is not as outrageous as it once would have seemed. In some ways, iOS has a way to go yet before it could take over from macOS but it remains in development so it will be interesting to see what happens next. All the while, hybrid devices running Windows 10 are becoming more pervasive and that might provide Apple with the encouragement that it needs.

Using PowerShell to reinstall Windows Apps

9th September 2016

Recently, I managed to use 10AppsManager to remove most of the built-in apps from a Windows 10 virtual machine that I have for testing development versions, in case anything ugly were to appear in a production update. Curiosity is my excuse for letting the tool do what it did, and some of the apps could do with restoration. Out of the lot, Windows Store is the main one that I have sorted so far.

The first step of the process was to start up PowerShell in administrator mode. On my system, this is as simple as clicking on the relevant item in the menu popped up by right clicking on the Start Menu button and clicking on the Yes button in the dialogue box that appears afterwards. In your case, it might be a case of right clicking on the appropriate Start Menu programs entry, selecting the administrator option and going from there.

With this PowerShell session open, the first command to issue is the following:

Get-AppxPackage -AllUsers > c:\temp\appxpackage.txt

This creates a listing of Windows app information and pops it into a text file in your choice of directory. Opening the text file in Notepad allows you to search it more easily and there is an entry for Windows Store:

Name                   : Microsoft.WindowsStore
Publisher              : CN=Microsoft Corporation, O=Microsoft Corporation, L=Redmond, S=Washington, C=US
Architecture           : X64
ResourceId             :
Version                : 11607.1001.32.0
PackageFullName        : Microsoft.WindowsStore_11607.1001.32.0_x64__8wekyb3d8bbwe
InstallLocation        : C:\Program Files\WindowsApps\Microsoft.WindowsStore_11607.1001.32.0_x64__8wekyb3d8bbwe
IsFramework            : False
PackageFamilyName      : Microsoft.WindowsStore_8wekyb3d8bbwe
PublisherId            : 8wekyb3d8bbwe
PackageUserInformation : {S-1-5-21-3224249330-198124288-2558179248-1001
IsResourcePackage      : False
IsBundle               : False
IsDevelopmentMode      : False
Dependencies           : {Microsoft.VCLibs.140.00_14.0.24123.0_x64__8wekyb3d8bbwe,
Microsoft.NET.Native.Framework.1.3_1.3.24201.0_x64__8wekyb3d8bbwe,
Microsoft.NET.Native.Runtime.1.3_1.3.23901.0_x64__8wekyb3d8bbwe,
Microsoft.WindowsStore_11607.1001.32.0_neutral_split.scale-100_8wekyb3d8bbwe}
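Incidentally, the same details can be extracted without the intermediate text file by filtering within PowerShell itself; a query along these lines should do it:

Get-AppxPackage -AllUsers Microsoft.WindowsStore | Format-List Name, InstallLocation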

Using the information from the InstallLocation field, the following command can be built and executed (here, it has gone over several lines so you need to get your version onto a single one):

Add-AppxPackage -Register "C:\Program Files\WindowsApps\Microsoft.WindowsStore_11607.1001.32.0_x64__8wekyb3d8bbwe\AppxManifest.xml" -DisableDevelopmentMode

Once the above had completed, the app was installed and ready to use again. As the mood took me, I then installed other apps from the Windows Store as I saw fit.
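Had I wanted to restore everything in one go, there is a widely circulated one-liner that re-registers every installed app from its manifest; it touches all packages, so it deserves care:

Get-AppxPackage -AllUsers | Foreach {Add-AppxPackage -DisableDevelopmentMode -Register "$($_.InstallLocation)\AppxManifest.xml"}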

Forcing an upgrade to Windows 10 Anniversary Update

6th September 2016

There remain people who advise those on Windows 7 or 8.x to hold fire on upgrading to Windows 10. Now that the free upgrade no longer is available, that advice may hold more weight than it did. Even so, there are those among us who have jumped ship and do not mind having the latest versions of things at no monetary cost to see what is available, and I must admit to being one of those.

After all, I do have a virtual machine with a pre-release version of the next update to Windows 10 installed on there, to see what might be coming our way and to get a sense of what changes it may bring so that I am ready for them. Otherwise, I usually am happy to wait, but I noticed that the Windows 10 Anniversary Update only came to my HP Pavilion dm4 laptop and not to other machines with Windows 10 installed, so I started to wonder why there was a lag when it came to automatic upgrades.

So that these things do not arrive when it is least convenient, I took advantage of a manual method in order to choose my timing. This did not involve installation from a disk image but was done in situ. The first part of the process is standard enough in that the Settings app was started and the Update & security item chosen. That dropped me onto the Windows Update screen, where I first clicked on the Check for updates button to see what would happen. When nothing came of that, the Learn more link was clicked to bring me onto part of the Microsoft support website, where I found that the Windows 10 Anniversary Update installer could be downloaded, so I duly did just that.

Running it produced a screen asking whether or not I wanted to proceed. Since I did, the appropriate button was clicked and the machine left alone until the process completed. Because the installer purely is a facilitator, the first stage is to download the rest of the files needed, and that will take a while on any connection. Once downloading was completed, the actual process of installation commenced, with several restarts before a log-in screen was again on offer. On logging in to the machine, the last part of the process started.

The process took quite a while but seemingly worked without a hitch. If there was anything that I needed to do, it was the re-installation of VirtualBox Guest Additions to restore access to shared folders, as well as dealing with a self-inflicted irritation. Otherwise, I have found that previously installed software works as expected and no file has gone missing. Waiting a while may have had its advantages too, because initial issues with the Anniversary Update will have been addressed, but it is best not to leave it too long or you could have the feeling of being forgotten. A happy balance needs striking.

Initial impressions of Windows 10

31st October 2014

Being ever curious on the technology front, the release of the first build of a Technical Preview of Windows 10 was enough to get me having a look at what was on offer. The furore regarding Windows 8.x added to the interest so I went to the download page to get a 64-bit installation ISO image.

That got installed into a fresh VirtualBox virtual machine, and the process worked smoothly to give something not so far removed from Windows 8.1. However, it took until release 4.3.18 of VirtualBox before the Guest Additions had caught up with the Windows prototype, so I signed up for the Windows Insider program and got a 64-bit ISO image to install the Enterprise preview of Windows 10 into a VMware virtual machine, since that supported full-screen display of the preview while VirtualBox caught up with it.

Of course, the most obvious development was the return of the Start Menu and it works exactly as expected too. Initially, the apparent lack of an easy way to disable App panels had me going to Classic Shell for an acceptable Start Menu. It was only later that it dawned on me that unpinning these panels would deliver to me the undistracting result that I wanted.

Another feature that attracted my interest is the new virtual desktop functionality. Here, I was expecting something like what I have used on Linux and UNIX, where each workspace is a distinct desktop with only the applications open in a given workspace showing on a panel there. Windows does not work that way: all applications are visible on the taskbar regardless of what workspace they occupy, which causes clutter. Another deficiency is having the Task View button instead of a desktop indicator on the taskbar. On Windows 7 and 8.x, I have been a user of VirtuaWin, and this still works largely in the way that I expect of it too, except for any application windows that have some persistence associated with them; the Task Manager is an example, and I include some security software in the same category too.

Even so, here are some keyboard shortcuts for anyone who wants to take advantage of the Windows 10 virtual desktop feature:

  • Create a new desktop: Windows key + Ctrl + D
  • Switch to previous desktop: Windows key + Ctrl + Left arrow
  • Switch to next desktop: Windows key + Ctrl + Right arrow

Otherwise, stability is excellent for a preview of a version of Windows that is early on its road to final release. An upgrade to a whole new build went smoothly when initiated following a prompt from the operating system itself. All installed applications were retained and a new taskbar button for notifications made its appearance alongside the existing Action Centre icon. So far, I am unsure what this does and whether the Action Centre button will be replaced in the fullness of time but I am happy to await where things go with this.

All is polished up to now, and there is nothing to suggest that Windows 10 will not be to 8.x what 7 was to Vista. The Start Screen has been dispatched after what has proved to be a misadventure on the part of Microsoft. The PC still is with us, and touchscreen devices like tablets are augmenting it instead of replacing it for any tasks involving some sort of creation. If anything, we have seen the PC evolve, with laptops perhaps becoming more like the Surface Pro, at least when it comes to hybrid devices. However, we are not as happy to smudge our PC screens quite like those on phones and tablets, so a return to a more keyboard and mouse centred approach for some devices is a welcome one.

What I have here are just a few observations and there is more elsewhere, including a useful article by Ed Bott on ZDNet. All in all, we are early in the process for Windows 10 and, though it looks favourable so far, I will continue to keep an eye on how it progresses. It needs to be less experimental than Windows 8.x and it certainly is less schizophrenic and should not be a major jump for users of Windows 7.

A reappraisal of Windows 8 and 8.1 licensing

15th November 2013

With the release of Windows 8 around this time last year, I thought that the full retail version that some of us got for fresh installations on PCs, real or virtual, had become a thing of the past. In fact, it did seem that every self-respecting technology news website and magazine was saying just that. The release that you would buy from Microsoft or from mainstream computer stores was labelled as an upgrade. That made it look as if you needed the OEM or System Builder edition for those PCs that needed a new Windows installation, and that the licence that you bought was then attached to the machine from when it first got installed on there.

As is usual with Microsoft, the situation is less clear-cut than that. For instance, there was some back-pedalling to allow OEM editions of Windows to be licensed for personal use on real or virtual PCs. With Windows 7 and its predecessors, it even was possible to install afresh on a PC without Windows by first installing an inactivated copy on there and then upgrading that as if it were a previous version of Windows. Of course, an actual licence for the previous version of Windows was needed for full compliance, if not for the actual installation. At times, Microsoft muddies the waters so as to keep its support costs down.

Even with Microsoft’s track record in mind, it still did surprise me when I noticed that Amazon was selling what appeared to be full versions of both Windows 8.1 and Windows 8.1 Pro. Having set up a 64-bit VirtualBox virtual machine for Windows 8.1, I discovered the same applied to software purchased from the Microsoft website. However, unlike the DVD versions, you do need an active Windows installation if you fancy a same-day installation of the downloaded software. For those without Windows on a machine, this can be as simple as downloading either the 32-bit or the 64-bit 90-day evaluation edition of Windows 8.1 Enterprise and using that as a springboard for the next steps. This need not be an actual in-situ installation either, since there are options to create an ISO or USB image of the installation disk for later use.

In my case, I created a 64-bit ISO image and used that to reboot the virtual machine that had Windows 8.1 Enterprise on there before continuing with the installation. By all appearances, there seemed to be little need for a pre-existing Windows instance for it to work so it looks as if upgrades have fallen by the wayside and only full editions of Windows 8.1 are available now. The OEM version saves money so long as you are happy to stick with just one machine and most users probably will do that. As for the portability of the full retail version, that is not something that I have tested and I am unsure that I will go beyond what I have done already.

My main machine has seen a change of motherboard, CPU and memory, so that could have deactivated a pre-existing Windows licence. However, I run Linux as my main operating system and, apart possibly from one surmountable hiccup, it proves surprisingly resilient in the face of such major system changes. For running Windows, I turn to virtual machines, and there were no messages about licence activation during the changeover either. Microsoft is anything but confiding when it comes to declaring what hardware changes inactivate a licence. Changing a virtual machine from VirtualBox to VMware or vice versa definitely does so, which is why I tend to avoid doing that. One item that is fundamental to either a virtual or a real PC is the mainboard, and I have seen suggestions that this is the critical component for Windows licence activation; it would make sense if that were the case.

However, this rule is not hard and fast either, since there appears to be room for manoeuvre should your PC break. It might be worth calling Microsoft after a motherboard replacement to see if they can help, and I have seen reports that they do. All in all, Microsoft often makes what appear to be simple rules, only to override them when faced with what happens in the real world. Is that why they can be unclear about some matters at times? Do they still hanker after how they want things to be, even when things are impossible to keep like that?

A need to update graphics hardware

16th June 2013

Not being a gaming enthusiast, upgrading graphics cards in PCs is not something that I do very often or even rate as a priority. However, two PCs in my possession have had that very piece of hardware upgraded, and not because anything was broken either. My backup machine has seen quite a few Linux distros on there since I built it nearly four years ago. The motherboard is an ASRock K10N78 that I sourced from MicroDirect, and it has an onboard NVIDIA graphics chip that has performed well, if not spectacularly. One glitch that always existed was less than optimal text rendering in web browsers, but that never was enough to get me to add a graphics card to the machine.

More recently, I ran into trouble with Sabayon 13.04, with only the 2D variant of the Cinnamon desktop environment working on it, and things became totally non-functional when a full re-installation of the GNOME edition was attempted. Everything went fine until I added the latest updates to the system, when a reboot revealed that it was impossible to boot into a desktop environment. Some will relish this as a challenge, but I have to admit that I am not one of those. In fact, I tried out two Arch-based distros on the same PC and got the same results following a system update on each. So, my explorations of Antergos and Manjaro have to continue in virtual machines instead.

To get a working system, I gave Linux Mint 15 Cinnamon a go and it worked a treat. However, I could not ignore that the cutting-edge distros that I tried before it all took exception to the onboard NVIDIA graphics. systemd has been implemented in all of these, and it seems reasonable to think that it is coming to Linux Mint at some stage in the future, so I went about getting a graphics card to add to the machine. Having had good experiences with ATI's Radeon in the past, I stuck with it even though it now is in the hands of AMD. Not being that fussed so long as there was Linux driver support, I picked up a Radeon HD 6450 card from PC World. Adding it to the PC was a simple matter of switching off the machine, slotting in the card, closing it up and powering it on again. Only later did I set the BIOS to look for PCI Express graphics before anything else, and I could have got away without doing that. Then, I made use of the Linux Mint Additional Drivers applet in its settings panel to add in the proprietary driver before restarting the machine to see if there were any visual benefits. To sort out the web browser font rendering, I used the Fonts applet in the same settings panel and selected full RGBA hinting. The improvement was unmissable, if still not like the appearance of fonts on my main machine. Overall, there had been an improvement and a spot of future-proofing too.

That tinkering with the standby machine got me wondering about what I had on my main PC. As well as onboard Radeon graphics, it also gained a Radeon HD 4650 card, for which 3D support wasn't being made available to VMware Player by Ubuntu GNOME 12.10 or 13.04, and the player wasn't happy when a virtual machine was set to have 3D support. Adding the latest fglrx driver only ensured that I got a command line instead of a graphical interface. Issuing one of the following commands and rebooting was the only remedy:

sudo apt-get remove fglrx

sudo apt-get remove fglrx-updates
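Incidentally, a quick way to confirm whether 3D acceleration really is on offer after a driver change is the glxinfo tool from the mesa-utils package; something like this should reveal the renderer in use:

sudo apt-get install mesa-utils
glxinfo | grep -E -i "direct rendering|renderer"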

Looking at the AMD website revealed that they no longer support 2000, 3000 or 4000 series Radeon cards with their latest Catalyst driver, and the last version that did support them would not install on my machine since it was built for version 3.4.x of the Linux kernel. A new graphics card then was in order if I wanted 3D graphics in VMware VMs, and both GNOME and Cinnamon appear to need this capability. Another ASUS card, a Radeon HD 6670, duly was acquired and installed in a manner similar to the Radeon HD 6450 on the standby PC. Apart from not needing to alter the font rendering (there is a Fonts tab in GNOME Tweak Tool where this can be set), the only real exception was to add the Jockey software to my main PC for installation of the proprietary Radeon driver. The following command does this:

sudo apt-get install jockey-kde

When that was done, I issued the jockey-kde command and selected the first entry on the list. The machine worked as it should on restarting, apart from an AMD message in the bottom right-hand corner bemoaning unrecognised hardware. There had been two entries on that Jockey list with exactly the same name, so it was time to select the second of these and see how it went. On restarting, the incompatibility message had gone and all was well. VMware even started virtual machines with 3D support without any messages, so the upgrade did the needful there.

Hearing of someone doing two PC graphics card upgrades in a weekend may make you see them as an enthusiast, but my disinterest in computer gaming belies this. Maybe it highlights that Linux operating systems need 3D support more than might be expected. The Cinnamon desktop environment now issues messages if it is operating in 2D mode with software 3D rendering, and GNOME always had the tendency to fall back to classic mode, as it had been doing when Sabayon was installed on my standby PC. However, there remain cases where Linux can rejuvenate older hardware: I installed Lubuntu onto a machine with ten-year-old technology (an 1100 MHz Athlon CPU, 1 GB of RAM and 60 GB of hard drive space in a case dating from 1998) and it works surprisingly well too.

It seems that having fancier desktop environments like GNOME Shell and Cinnamon means having the hardware on which they can run. For a while, I have been tempted by the possibility of a new PC, since even my main machine is not far from four years old either. However, I also spied a pre-built CPU, motherboard and RAM bundle featuring an Intel Core i5-4670 CPU, 8 GB of Corsair Vengeance Pro Blue memory and a Gigabyte Z87-HD3 ATX motherboard (with a heatsink and fan for the CPU) for around £420. Even for someone who has used AMD CPUs since 1998, that does look tempting, but I'll hold off before making any such upgrade decisions. Apart from exercising sensible spending restraint, waiting for Linux UEFI support to mature a little more may be no bad idea either.

Update 2013-06-23: The new graphics card in my main machine is working as it should and has reduced the number of system error report messages turning up too; maybe Ubuntu GNOME 13.04 didn't fancy the old graphics card all that much. A rogue .fonts.conf file was found in my home area on the standby machine, and removing it has improved how fonts are displayed there immeasurably. If you find one on your system, it's worth doing the same or renaming it to see if it helps. Otherwise, tinkering with the font rendering settings is another beneficial act, and it even helps on Debian 6, which uses GNOME 2! Seeing what happens on Debian 7.1 could be something that I go testing sometime.
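Renaming rather than deleting keeps a way back should anything look worse afterwards; something as simple as this does it:

mv ~/.fonts.conf ~/.fonts.conf.bak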
