Technology Tales

Adventures & experiences in contemporary technology

Installing Firefox Developer Edition in Linux Mint

22nd April 2018

Having moved beyond the slow response and larger memory footprint of Firefox ESR, I am using Firefox Developer Edition in its place even if it means living without a status bar at the bottom of the window. Hopefully, someone will create an equivalent of the old add-on bar extensions that worked before the release of Firefox Quantum.

Firefox Developer Edition may be pre-release software with some extras for web developers, like being able to drill into an HTML element and see its properties, but I am finding it stable enough for everyday use. It is speedy too, which helps, and it has its own profile, so it can co-exist on the same machine as regular releases of Firefox like its ESR and Quantum variants.

Installation takes a little added effort though, and there are various options available. My chosen method involved Ubuntu Make. Installing this involves setting up a new PPA as the first step, and the following commands added the software to my system:

sudo add-apt-repository ppa:ubuntu-desktop/ubuntu-make
sudo apt-get update
sudo apt-get install ubuntu-make

With the above completed, it was simple to install Firefox Developer Edition using the following command:

umake web firefox-dev

Where things got a bit more complicated was getting entries added to the Cinnamon Menu and Docky. The former was sorted using the cinnamon-menu-editor command, but the latter needed some tinkering with the firefox-developer.desktop file found in .local/share/applications/ within my user area to get the right icon shown. Discovering this took me into .gconf/apps/docky-2/Docky/Interface/DockPreferences/%gconf.xml, where I found the location of the firefox-developer.desktop file that needed changing. Once this was completed, there was nothing else to do from the operating system side.
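
For anyone needing to do the same, a desktop entry of this general shape is what is involved. The paths below are only indicative, since Ubuntu Make decides where Firefox Developer Edition lands (typically somewhere under .local/share/umake/ in your home area), so check your own installation before borrowing any of it.

# ~/.local/share/applications/firefox-developer.desktop (paths are illustrative only)
[Desktop Entry]
Type=Application
Name=Firefox Developer Edition
Exec=/home/yourname/.local/share/umake/web/firefox-dev/firefox %u
Icon=/home/yourname/.local/share/umake/web/firefox-dev/browser/chrome/icons/default/default128.png
Terminal=false
Categories=Network;WebBrowser;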

Within Firefox itself, I opted to turn off warnings about password logins on non-HTTPS websites by going to about:config using the address bar, looking for security.insecure_field_warning.contextual.enabled and changing its value from True to False. Some may decry this, but there are local websites on my machine that need attention at times. Otherwise, Firefox is installed with user access, so I can update it as if it were a Windows or macOS application, which is useful given that there are frequent new releases. All is going as I want it so far.
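
Incidentally, the same preference can be kept in a user.js file within the Firefox profile directory for anyone who would rather not revisit about:config after a profile rebuild; the single line below is all that it takes, though I only changed it through the interface myself.

user_pref("security.insecure_field_warning.contextual.enabled", false);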

Fixing Background Image Display in GNOME Shell 3.10

2nd May 2014

On upgrading from Ubuntu GNOME 13.10 to Ubuntu GNOME 14.04, a few rough edges were to be noticed. One was the display of my chosen background image: it was garbled. Later, I discovered that there is a maximum width of 2560 px for background images in GNOME Shell these days and that things get messy beyond that.

In my case, the image width was around 6000 px and I was used to it being resized automatically by GNOME Shell 3.8 and its predecessors. It seems that this functionality was removed after that though, so the workaround of manually resizing the image in GIMP needed to be employed. Though having big images open in memory creates an additional overhead, not handling them at all well looks like a bug caused by setting 2560 px as a maximum screen width for the GNOME Shell panel and the complete removal of Nautilus from desktop rendering duties. Let’s hope that sense is seen with ever larger screen sizes and resolutions coming our way.
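
GIMP was what I used, but anyone wanting to script the same fix could do it from a terminal with ImageMagick; the filenames below are only for illustration, and the single figure asks for a width of 2560 px with the aspect ratio preserved.

convert big-wallpaper.jpg -resize 2560 resized-wallpaper.jpg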

It’s the sort of thing that got me looking at adding Cinnamon 2.2 for a while, before I discovered that background image scaling can be set using the indispensable GNOME Tweak Tool. LinuxG.net has a useful tutorial on this for anyone with such an adventurous streak in them. For now though, I am OK with my set-up, but the GNOME project’s focus on minimalism could affect us in other ways, so I can see why Clem Lefebvre started the Cinnamon project primarily for Linux Mint; the desktop environment is appearing elsewhere too. After all, Gedit lost its menu bar in GNOME 3.12, so it’s just as well that we have alternative choices.
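
If memory serves, GNOME Tweak Tool is only flipping a dconf key here, so the same scaling behaviour should be obtainable from a terminal with something like the following; 'scaled' is one of several permissible values ('zoom', 'centered', 'spanned' and so on).

gsettings set org.gnome.desktop.background picture-options 'scaled'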

Update 2014-05-06: It seems that the desktop image bug that afflicts GNOME Shell 3.10 got sorted for GNOME Shell 3.12. At least, that is the impression that an Antergos instance in a VirtualBox virtual machine gives me.

A reappraisal of Windows 8 and 8.1 licensing

15th November 2013

With the release of Windows 8 around this time last year, I thought that the full retail version that some of us got for fresh installations on PCs, real or virtual, had become a thing of the past. In fact, it did seem that every self-respecting technology news website and magazine was saying just that. The release that you would buy from Microsoft or from mainstream computer stores was labelled as an upgrade. That made it look as if you needed the OEM or System Builder edition for those PCs that needed a new Windows installation, and that the licence you bought was then attached to the machine from when it was installed on there.

As is usual with Microsoft, the situation is less clear-cut than that. For instance, there was some back-pedalling to allow OEM editions of Windows to be licensed for personal use on real or virtual PCs. With Windows 7 and its predecessors, it even was possible to install afresh on a PC without Windows by first installing an unactivated copy on there and then upgrading that as if it were a previous version of Windows. Of course, an actual licence for the previous version of Windows was needed for full compliance, if not for the actual installation. At times, Microsoft muddies waters so as to keep its support costs down.

Even with Microsoft’s track record in mind, it still did surprise me when I noticed that Amazon was selling what appeared to be full versions of both Windows 8.1 and Windows 8.1 Pro. Having set up a 64-bit VirtualBox virtual machine for Windows 8.1, I got to discover that the same applies to software purchased from the Microsoft website. However, unlike the DVD versions, you do need an active Windows installation if you fancy a same-day installation of the downloaded software. For those without Windows on a machine, this can be as simple as downloading either the 32-bit or the 64-bit 90-day evaluation edition of Windows 8.1 Enterprise and using that as a springboard for the next steps. This need not only be an actual in-situ installation; there also are options to create an ISO or USB image of the installation disk for later use.

In my case, I created a 64-bit ISO image and used that to reboot the virtual machine that had Windows 8.1 Enterprise on there before continuing with the installation. By all appearances, there seemed to be little need for a pre-existing Windows instance for it to work so it looks as if upgrades have fallen by the wayside and only full editions of Windows 8.1 are available now. The OEM version saves money so long as you are happy to stick with just one machine and most users probably will do that. As for the portability of the full retail version, that is not something that I have tested and I am unsure that I will go beyond what I have done already.

My main machine has seen a change of motherboard, CPU and memory, so it could have de-activated a pre-existing Windows licence. However, I run Linux as my main operating system and, apart possibly from one surmountable hiccup, this proves surprisingly resilient in the face of such major system changes. For running Windows, I turn to virtual machines, and there were no messages about licence activation during the changeover either. Microsoft is anything but confiding when it comes to declaring what hardware changes deactivate a licence. Changing a virtual machine from VirtualBox to VMware or vice versa definitely does it, so I tend to avoid doing that. One item that is fundamental to either a virtual or a real PC is the mainboard, and I have seen suggestions that this is the critical component for Windows licence activation; it would make sense if that were the case.

However, this rule is not hard and fast either, since there appears to be room for manoeuvre should your PC break. It might be worth calling Microsoft after a motherboard replacement to see if they can help you; from what I have seen, it is. All in all, Microsoft often makes what appear to be simple rules, only to override them when faced with what happens in the real world. Is that why they can be unclear about some matters at times? Do they still hanker after how they want things to be, even when they are impossible to keep like that?

A display of brand loyalty

12th July 2013

Since 2007, my main camera has been a Pentax K10D DSLR and it has gone on many journeys with me. In fact, more than 15,000 images have been captured with it and I have classed it as an unfailing servant. The autofocus may not be the fastest, but my subjects tend to be stationary: landscapes, architecture, flora and transport. Even any bus and train photos have included parked vehicles rather than moving ones, so there never have been issues. The hint of underexposure in any photos always can be sorted because DNG files are what I create, with all the raw capture information that it is possible to retain. In fact, it has been hard to justify buying another SLR because the K10D has done so well for me.

In recent months, I have been looking at processed photos and asking myself if time has moved along for what is not far from being a six-year-old camera. At various times, I have been looking at higher members of the Pentax range while wondering if an upgrade would be a good idea. First, there was the K7 and then the K5, before the K5 II got launched. Even though its predecessor is still to be found on sale, it was the newer model that became my choice.

Pentax K5 II

My move to Pentax in 2007 was a case of brand disloyalty, since I had been a Canon user from when I acquired my first SLR, an EOS 300. Even now, I still have a Powershot G11 that finds itself slipped into a pocket many a time. Nevertheless, I find that Canon images feel a little washed out prior to post-processing and that hasn’t been the case with the K10D. In fact, I have been hearing good things about Nikon cameras delivering punchy results, so one of them would be a contender were it not for how well the Pentax has performed.

So, what has my new K5 II body gained me that I didn’t have before? For one thing, the autofocus is a major improvement on that in the K10D. It may not stop me persevering with manual focusing for most of the time, but there are occasions when the option of solid autofocus is good to have. Other advances include a 16.3 megapixel sensor with a much larger ISO range. The advances in sensor technology since the K10D appeared may give me better quality photos, and noise is something that my eyes may have begun to detect in K10D photos, even at my usual ISO of 400.

There have been innovations that I don’t need too. Live View is something that I use heavily with the Powershot G11 because it has such a pitiful optical viewfinder. The K5 II has a very bright and sharp one, so that function lies dormant, especially after I witnessed dodgy autofocus performance with it in use; manual focusing should be OK, I reckon. By default too, the screen stays on all the time, and that’s a nuisance for an optical viewfinder user like me, so I looked through the manual and the menus to switch the thing off. My brief flirtation with the image level display met an end for much the same reason, though it’s good that it’s there. There is some horizon auto-correction available as a feature, and this is left on to see what it offers, since there have been a multitude of times when I needed to sort out crooked horizons caused by my handholding the camera.

The K5 II may have a 3″ screen on its back, but it has done nothing to increase the size of the camera. If anything, it is smaller than the K10D, which usefully means that I am not on the lookout for a new camera holster. Not having a bigger body also means there is little change in how the camera feels in the hand compared with the older one.

In many ways, the K5 II works very like the K10D once I took control over settings that didn’t suit me. Both have Shake Reduction in their camera bodies, though the setting has been moved into the settings menu in the new camera, whereas the older one had a separate switch on its body. Since I’d be inclined to leave it on all the time and prefer not to have it knocked off accidentally, this is not an issue. Otherwise, many of the various switches are in the same places, so it’s not that hard to find my way around them.

That’s not to say that there aren’t other changes, like the addition of a lock to the mode dial, but I have used Canon EOS camera bodies with that feature, so I do not consider it a step backwards. The exposure compensation button has been moved to the top of the camera, where I found it very easily and have been using it perhaps more than on the K10D; it’s also something that I use on the G11, so that experimentation is being brought across to the K5 II now as well. Beside it, there’s a new ISO button, so further experimentation can be attempted with that to see how it does.

If I have any criticism, it’s about the clutter of the menus on the K5 II. The long lists through which you scrolled on the K10D have been replaced with a series of extra tabs, so that on-screen scrolling is not needed as before. However, I reckon that this breaks things up too much and makes working through the settings look more forbidding to anyone who is not so technical in mindset. Nevertheless, settings such as the type of file to capture are there, and I continue to use RAW DNG files as is usual for me, though JPEG and Pentax’s own RAW format also are there. For a while, I had forgotten to set the date, but I soon found out what I had done and the situation was remedied. The same sort of thing applied to storing files in different folders according to the capture date. For my own reasons, I turned this off to put everything into a single PENTX directory to suit my own workflow. My latest discovery among the menus was the ability to add photographer and copyright holder information to the EXIF metadata attached to the image files created by the camera. With legislative proposals that dilute the automatic rights of copyright holders going through the U.K. parliament, this seems a very timely inclusion, even if most would prefer that there were no change to copyright law.

Of course, the worth of any camera is in the images that it produces and I have been happy with what I have been getting so far. The bigger files mean that fewer images fit on a memory card than before. Thankfully, SDHC card capacities have grown, even if I don’t wish to machine-gun my photography altogether. While out and about, I was surprised to see apertures like F/14 and F/18 when I was more accustomed to a progression like F/11, F/13, F/16, F/19, F/22, etc. Most of those older values still are there though, so there hasn’t been a complete break with convention. The same comment applies to shutter speeds, where ones like 1/100 and 1/160 made their appearance where I might have expected just ones like 1/90, 1/125, 1/250 and so on. The extra possibilities, and that is what they are, do allow more flexibility, I suppose, and may even make it easier to make correct exposures, though any judgement of correctness has to be in the eye of a photographer and not what a computer algorithm in a camera determines. For much of the time until now, I have stuck with an ISO of 400, apart from a little testing in a woodland area of an evening soon after the camera arrived.

Since the K5 II came my way a few months ago, I have been meaning to collect my thoughts on here and there has been a delay while I brought my thinking to a sensible close. At one point, it felt like there was so much to say that the piece became larger in my mind than even what you have been reading now. After all, there are other things that I can adjust to see how the resulting images look, and white balance is but one of these. The K10D isn’t beyond experimentation either, especially since I discovered that shake reduction was switched off, and that has me asking if the lack of quality that I mentioned earlier has another explanation. Of course, actually making use of my tripod would be another good suggestion, so it’s safe to say that yet more photographic explorations await.

A need to update graphics hardware

16th June 2013

Not being a gaming enthusiast, having to upgrade graphics cards in PCs is not something that I do very often or even rate as a priority. However, two PCs in my possession have had that very piece of hardware upgraded on them, and it’s not because anything was broken either. My backup machine has seen quite a few Linux distros on there since I built it nearly four years ago. The motherboard is an ASRock K10N78 that was sourced from MicroDirect and it has an onboard NVIDIA graphics chip that has performed well, if not spectacularly. One glitch that always existed was less than optimal text rendering in web browsers, but that never was enough to get me to add a graphics card to the machine.

More recently, I ran into trouble with Sabayon 13.04: only the 2D variant of the Cinnamon desktop environment would work on it, and things became totally non-functional when a full re-installation of the GNOME edition was attempted. Everything went fine until I added the latest updates to the system, whereupon a reboot revealed that it was impossible to boot into a desktop environment. Some will relish this as a challenge, but I need to admit that I am not one of those. In fact, I tried out two Arch-based distros on the same PC and got the same results following a system update on each. So, my explorations of Antergos and Manjaro have to continue in virtual machines instead.

To get a working system, I gave Linux Mint 15 Cinnamon a go and that worked a treat. However, I couldn’t ignore that the cutting-edge distros that I tried before it all took exception to the onboard NVIDIA graphics. systemd has been implemented in all of these and it seems reasonable to think that it is coming to Linux Mint at some stage in the future, so I went about getting a graphics card to add into the machine. Having had good experiences with ATi’s Radeon in the past, I stuck with it even though it now is in the hands of AMD. Not being that fussed so long as there was Linux driver support, I picked up a Radeon HD 6450 card from PC World. Adding it into the PC was a simple matter of switching off the machine, slotting in the card, closing it up and powering it on again. Only later on did I set the BIOS to look for PCI Express graphics before anything else, and I could have got away without doing that. Then, I made use of the Linux Mint Additional Drivers applet in its settings panel to add in the proprietary driver before restarting the machine to see if there were any visual benefits. To sort out the web browser font rendering, I used the Fonts applet in the same settings panel and selected full RGBA hinting. The improvement was unmissable, though still not quite like the appearance of fonts on my main machine. Overall, there had been an improvement and a spot of future-proofing too.
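
Before reaching for any drivers on a machine like this, a quick check from a terminal confirms that the new card has been detected at all; seeing the Radeon named in the output is what you want.

lspci | grep -i vga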

That tinkering with the standby machine got me wondering about what I had on my main PC. As well as onboard Radeon graphics, it had gained a Radeon 4650 card, for which 3D support wasn’t being made available by Ubuntu GNOME 12.10 or 13.04 to VMware Player, and the latter wasn’t happy about this when a virtual machine was set to have 3D support. Adding the latest fglrx driver only ensured that I got a command line instead of a graphical interface. Issuing one of the following commands and rebooting was the only remedy:

sudo apt-get remove fglrx

sudo apt-get remove fglrx-updates
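
If you are unsure which of the two packages is actually on your system, dpkg can settle the matter before anything gets removed:

dpkg -l | grep fglrx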

Looking at the AMD website revealed that they no longer support 2000, 3000 or 4000 series Radeon cards with their latest Catalyst driver, and the last version that did would not install on my machine since it was built for version 3.4.x of the Linux kernel. A new graphics card then was in order if I wanted 3D graphics in VMware VMs, and both GNOME and Cinnamon appear to need this capability. Another ASUS card, a Radeon HD 6670, duly was acquired and installed in a manner similar to the Radeon HD 6450 on the standby PC. Apart from not needing to alter the font rendering (there is a Fonts tab in GNOME Tweak Tool where this can be set), the only real difference was the need to add the Jockey software to my main PC for installation of the proprietary Radeon driver. The following command does this:

sudo apt-get install jockey-kde

When that was done, I issued the jockey-kde command and selected the first entry on the list. The machine worked as it should on restarting, apart from an AMD message in the bottom right-hand corner bemoaning unrecognised hardware. There had been two entries on that Jockey list with exactly the same name, so it was time to select the second of these and see how it went. On restarting, the incompatibility message had gone and all was well. VMware even started virtual machines with 3D support without any messages, so the upgrade did the needful there.
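
A quick way to confirm that the proprietary driver really has taken over after a reboot is to look for its kernel module and for direct rendering; glxinfo comes from the mesa-utils package if it is not there already.

lsmod | grep fglrx
glxinfo | grep -i "direct rendering"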

Hearing of someone doing two PC graphics card upgrades in a weekend may make you see them as an enthusiast, but my disinterest in computer gaming belies this. Maybe it highlights that Linux operating systems need 3D more than might be expected. The Cinnamon desktop environment now issues messages if it is operating in 2D mode with software 3D rendering, and GNOME always had the tendency to fall back to classic mode, as it had been doing when Sabayon was installed on my standby PC. However, there remain cases where Linux can rejuvenate older hardware: I installed Lubuntu onto a machine with ten-year-old technology (an 1100 MHz Athlon CPU, 1 GB of RAM and a 60 GB hard drive in a case dating from 1998) and it works surprisingly well too.

It seems that having fancier desktop environments like GNOME Shell and Cinnamon means having the hardware on which they can run. For a while, I have been tempted by the possibility of a new PC, since even my main machine is not far from four years old either. However, I also spied a pre-built CPU, motherboard and RAM bundle featuring an Intel Core i5-4670 CPU, 8 GB of Corsair Vengeance Pro Blue memory and a Gigabyte Z87-HD3 ATX motherboard (with a heatsink and fan for the CPU) for around £420. Even for someone who has used AMD CPUs since 1998, that does look tempting, but I’ll hold off before making any such upgrade decisions. Apart from exercising sensible spending restraint, waiting for Linux UEFI support to mature a little more may be no bad idea either.

Update 2013-06-23: The new graphics card in my main machine is working as it should and has reduced the number of system error report messages turning up too; maybe Ubuntu GNOME 13.04 didn’t fancy the old graphics card all that much. A rogue .fonts.conf file was found in my home area on the standby machine and removing it has improved how fonts are displayed on there immeasurably. If you find one on your system, it’s worth doing the same or renaming it to see if it helps. Otherwise, tinkering with the font rendering settings is another beneficial act, and it even helps on Debian 6, which uses GNOME 2! Seeing what happens on Debian 7.1 could be something that I go testing sometime.
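
Renaming rather than deleting the file, as suggested above, is easily done from a terminal, and rebuilding the font cache afterwards does no harm either; logging out and back in again makes sure that every application picks up the change.

mv ~/.fonts.conf ~/.fonts.conf.bak
fc-cache -f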

Best left until later in the year?

26th January 2010

In the middle of last year, my home computing experience was one of feeling displaced. A combination of a stupid accident and a power outage had rendered my main PC unusable. What followed was an enforced upgrade that used a combination familiar to me: Gigabyte motherboard, AMD CPU and Crucial memory. However, assembling that lot and attaching components from the old system resulted in the sound of whirring fans but nothing appearing on-screen. Not having useful beeps to guide me meant that it was a case of undertaking educated guesswork until the motherboard was found to be at fault. In a situation like this, a deeper knowledge of electronics would have been handy and might have saved me money too. As for the motherboard, it is hard to say whether it was faulty from the outset or whether there was a mishap along the way, either due to ineptitude with static or incompatibility with a power supply. What really tells the tale on the mainboard is the fact that all of the other components are working well in other circumstances, even that old power supply.

A few years back, I had another experience with a problematic motherboard, an Asus this time, which ate CPUs and damaged a hard drive before I stabilised things. That was another upgrade attempted in the first half of the year. My first round of PC building was in the third quarter of 1998 and that went smoothly once I realised that a new case was needed. Similarly, another PC rebuild around the same time of year in 2005 was equally painless. Based on these experiences, I should not be blamed for waiting until later in the year before doing another rebuild, preferably a planned one rather than an emergency one.

Of course, there may be another factor involved too. The hint was a non-working Sony DVD writer that was acquired early last year, when it really was obvious that we were in the middle of a downturn. Could older unsold inventory be a contributor? Well, it fits in with seeing poor results twice. In addition, it would certainly tally with a problematical PC rebuild in 2002, following the end of the Dot Com bubble and after the deadly Al Qaeda attack on New York’s World Trade Centre. An IBM hard drive that was acquired then may not have been the best example of the bunch, and the same comment could apply to the Asus motherboard. The resulting construction may have been limping, but it was working and I tolerated it.

In contrast, last year’s episode had me launched into using a Toshiba laptop and a spare older PC for my needs, with an external hard drive enclosure used to extract my data onto other external hard drives to keep me going. It felt a precarious arrangement, but it was a useful experience in ways too. There was cause for making the acquaintance of nearby PC component stores that I hadn’t visited before, and I got to learn about things that otherwise wouldn’t have come my way. Using an external hard drive enclosure for accessing data on hard drives from a non-functioning PC is one of these. Discovering that it is possible to boot from external optical and hard disk drives came as a surprise too; it works so long as there is motherboard support for it. Another experience came from a crisis of confidence that had me acquiring a bare-bones system from Novatech and populating it with optical and hard disk drives. Then, I discovered that I have no need for power supplies rated at more than 300 watts (around 200 W suffices). Turning my PC off more often became a habit friendly both to the planet and to household running costs too. Then, there’s the beneficial practice of shopping locally: it can suffice even if what is on offer isn’t what PC magazines stick on their hot lists, and shopping online for those pieces doesn’t guarantee success either. All of these were useful lessons and, while I’d rather not throw away good money after bad, it goes to show that even unsuccessful acquisitions had something to offer in the form of learning opportunities. Whether you consider that worthwhile is up to you.

Still able to build PC systems

25th October 2009

This weekend has been something of a success for me on the PC hardware front. Earlier this year, a series of mishaps rendered my former main home PC unusable; it was a power failure that finished it off for good. My remedy was a rebuild using my then usual recipe of a Gigabyte motherboard, AMD CPU and Crucial memory. However, assembling the said pieces never returned the thing to life and I ended up in no man’s land for a while, dependent on my backup machine and laptop. That wouldn’t have been so bad but for the need to access data from the old behemoth’s hard drives, though an external drive housing set that in order. Nevertheless, there is something unfinished about working with machines that have a series of external drives hanging off them. That appearance of disarray was set to rights by the arrival of a bare-bones system from Novatech in July, with any assembly work restricted to the kitchen table. There was a certain pleasure in seeing a system come to life after developing a fear that I had lost all of my PC building prowess.

That restoration of order still left the question of why those components bought earlier in the year didn’t work together well enough to give me a screen display on start-up. Having electronics testing equipment and the knowledge of its correct use would make any troubleshooting far easier, but I haven’t got these. There is a place near me where I could go for this, but you are left wondering what might be said about a PC build gone wrong. Of course, the last thing that you want to be doing is embarking on a series of purchases that do not fix the problem, especially in the current economic climate.

One thing to suspect when all doesn’t turn out as hoped is the motherboard and, for whatever reason, I always suspect it last. It now looks as if that needs to change after I discovered that it was the Gigabyte motherboard that was at fault. Whether it was faulty from the outset or came a cropper with a rogue power supply or carelessness with static protection is something that I’ll never know. An Asus motherboard did go rogue on me in the past and it might be that it ruined CPUs and even a hard drive before I laid it to rest. Its eventual replacement put a stop to a year of computing misfortune and kick-started my reliance on Gigabyte. That faith is under question now, but the 2009 computing hardware mishap seems to be behind me; from now on, PC rebuilds will be done on tables and motherboards will be suspected earlier when anything goes awry.

Returning to the present, my acquisition of an ASRock K10N78 and subsequent building activities have brought a new system using an AMD Phenom X4 CPU and 4 GB of memory into use. In fact, I am writing these very words using the thing. It’s all in a new TrendSonic case too (placing an elderly behemoth into retirement) and with a SATA hard drive and DVD writer. The new motherboard has onboard audio and graphics, so external cards are not needed unless you are an audiophile and/or a gamer; for the record, I am neither. Those additional facilities make for easier building and fault-finding should the undesirable happen.

The new box is running the release candidate of Ubuntu 9.10 and it seems to be working without a hitch too. Earlier builds of 9.10 broke in their VirtualBox VMs, so you should understand the level of concern that this aroused in my mind; the last thing that you want to be doing is reinstalling an operating system because its booting capability breaks every other day. Thankfully, the RC seems to have none of these rough edges, so I can upgrade the Novatech box, still my main machine and likely to remain so for now, with peace of mind when the time comes.

A restoration of order

12th July 2009

This weekend, I finally put my home computing displacement behind me. My laptop had become my main PC, with a combination of external hard drives and an Octigen external hard drive enclosure keeping me motoring in laptop limbo. Having had no joy in the realm of PC building, I decided to go down the partially built route and order a bare-bones system from Novatech. That gave me a Foxconn case and motherboard loaded up with an AMD 7850 dual-core CPU and 2 GB of RAM. With the motherboard offering onboard sound and video capability, all that was needed was to add drives. I added no floppy drive, but instead installed a SATA DVD writer (not sure that it was a successful purchase, though that can be resolved at my leisure) and the hard drives from the old behemoth that had been serving me until its demise. A session of work on the kitchen table and some toing and froing ensued as I inched my way towards a working system.

Once I had set all the expected hard disks into place, Ubuntu was capable of being summoned to life, with the only impediment being an insistence on scanning the 1 TB Western Digital drive and getting stuck along the way. Not having the patience, I skipped this at start-up and later unmounted the drive to let fsck do its thing while I got on with other tasks; the hold-up had been the presence of VirtualBox disk images on the drive. Speaking of VirtualBox, I needed to scale back the capabilities of Compiz so that things would work as they should. Otherwise, it was a matter of updating various directories with files that had appeared on external drives without making it into their usual storage areas. Windows would never have been so tolerant and, as if to prove the point, I needed to repair an XP installation in one of my virtual machines.
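
For anyone wanting to do the same by hand, the manual check is only a couple of commands; the device name below is just for illustration, so something like sudo fdisk -l will tell you what yours actually is before you start.

sudo umount /dev/sdb1
sudo fsck -f /dev/sdb1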

In the instructions that came with the new box, Novatech stated that time was a vital ingredient for a build and they weren’t wrong. The delivery arrived at 09:30 and I later got a shock when I saw the time to be 15:15! However, it was time well spent and I noticed the speed increase when putting ImageMagick through its paces with a Perl script. In time, I might get brave and be tempted to add more memory to get up to 4 GB; the motherboard may only have two slots, but that’s not such a problem given that I plan to stick with 32-bit Linux for a while to come. My brief brush with its 64-bit counterpart revealed some roughness that warded me off for a little while longer. For now, I’ll leave well alone and allow things to settle down again. Lessons for the future remain and I may even mull over them in another post…

Out of memory at line: 56

28th May 2009

This is an error that I have started to see a lot in the last few weeks. First, it was with Piwik and latterly with WordPress.com Stats. For the record, I have never seen it on up-to-date systems, but always with IE6 and at page unloading time. The CPU usage hits 100% before the error is produced, and that had me blaming JavaScript in error; it isn’t the cause of all ills. In fact, the cause seems to be a bug in a certain release of Adobe Flash 9, but I am of the opinion that the inclusion of certain features in a Flash movie is needed to trigger it too. I don’t have the exact details of this, but WordPress.com Stats worked without fault until a recent update and that is what is leading me to the conclusion that I have reached. That observation is making me wonder whether we are coming to a point where Flash compatibility is something that needs to be factored into the use of the said technology in a website or web application. Updating Flash will solve the problem on the client, but it might be better if the problem wasn’t being triggered from the server side in the first place.

Whither Fedora?

10th January 2009

There is a reason why things have got a little quieter on this blog: my main inspiration for many of the posts that make their way on here, Ubuntu, is just working away without much complaint. I have to say that BBC iPlayer isn’t working so well for me at the moment, so I need to take a look at my setup. Otherwise, everything is continuing quietly. In some respects, that’s no bad thing and it allows me to spend my time on other pursuits like hill walking and photography. I suppose that the calm is also a reflection of the fact that Ubuntu has matured, but there is a sense that some changes may be on the horizon. For one thing, there are the opinions of a certain Mark Shuttleworth, but the competition is progressing too.

That latter point brings me to Linux Format’s recently published verdict that Fedora has overtaken Ubuntu. I do have a machine with Fedora on there and it performs what I ask of it without any trouble. However, I have never tried all of the sorts of things on it that I ask of Ubuntu, so my impressions are not in-depth ones. Going deeper into the subject mightn’t be such a bad use of a few hours. What I am not planning to do is convert my main Ubuntu machine to Fedora. I moved from Windows because of constant upheavals and I have no intention of bringing those upon myself without good reason, and that’s just not there at the moment.

Speaking of upheavals, one thought that is entering my mind is that of upgrading that main machine. Its last rebuild was over three years ago and computer technology has moved on a bit since then, with dual and quad-core CPUs from Intel and AMD coming into the fray. Of course, the cost of all of this needs to be considered too and that is never more true than in these troubled economic times. If you had asked me about the prospect of a system upgrade a few weeks ago, I would have ruled it out of hand. What has got me wondering is my continued use of virtualisation and the resources that it needs. I am getting mad notions like the idea of running more than one VM at once and I do need to admit that it has its uses, even if it puts CPUs and memory through their paces. Another attractive idea would be getting a new and bigger screen, particularly with what you can get for around £100 these days. However, my 17″ Iiyama is doing very well, so this is one for the wish list more than anything else. None of the changes that I have described are imminent, but I have noticed how fast I am filling disks up with digital images, so an expansion of hard disk capacity has come much higher up the to-do list.

If I ever get to doing a full system rebuild with a new CPU, memory and motherboard (I am not so sure about graphics, since I am no gamer), the idea of moving into the world of 64-bit computing comes about. The maximum amount of memory usable by 32-bit software is 4 GB, so 64-bit is a must if I decide to go beyond this limit. That all sounds very fine but for the possibility of problems arising with support for legacy hardware. It sounds like another bridge to be assessed before its crossing, even if two upheavals can be made into one.

Aside from system breakages, the sort of hardware and software changes over which I have been musing here are optional and can be done in my own time. That’s probably just as well, for a very good reason that I have mentioned earlier. Being careful with money becomes more important at times like these, and it’s good that free software not only offers freedom of choice and usage but also a way off the closed commercial software acquisition treadmill with all of its cost implications, leaving money for much more important things.
