Technology Tales

Adventures & experiences in contemporary technology

A reappraisal of Windows 8 and 8.1 licensing

15th November 2013

With the release of Windows 8 around this time last year, I thought that the full retail version that some of us used for fresh installations on PCs, real or virtual, had become a thing of the past. In fact, it did seem that every self-respecting technology news website and magazine was saying just that. The release that you would buy from Microsoft or from mainstream computer stores was labelled as an upgrade. That made it look as if you needed the OEM or System Builder edition for any PC needing a fresh Windows installation, with the licence that you bought becoming attached to the machine on which it first was installed.

As is usual with Microsoft, the situation is less clear-cut than that. For instance, there was some back-pedalling to allow OEM editions of Windows to be licensed for personal use on real or virtual PCs. With Windows 7 and its predecessors, it even was possible to install afresh on a PC without Windows by first putting an inactivated copy on there and then upgrading that as if it were a previous version of Windows. Of course, an actual licence for the previous version of Windows was needed for full compliance, if not for the actual installation. At times, Microsoft muddies the waters so as to keep its support costs down.

Even with Microsoft’s track record in mind, it still did surprise me when I noticed that Amazon was selling what appeared to be full versions of both Windows 8.1 and Windows 8.1 Pro. Having set up a 64-bit VirtualBox virtual machine for Windows 8.1, I discovered the same for software purchased from the Microsoft website. However, unlike the DVD versions, you do need an active Windows installation if you fancy a same-day installation of the downloaded software. For those without Windows on a machine, this can be as simple as downloading either the 32-bit or the 64-bit 90-day evaluation edition of Windows 8.1 Enterprise and using that as a springboard for the next steps. The result need not be an actual in-situ installation either, because there are options to create an ISO or USB image of the installation disk for later use.

In my case, I created a 64-bit ISO image and used that to reboot the virtual machine that had Windows 8.1 Enterprise on there before continuing with the installation. By all appearances, there seemed to be little need for a pre-existing Windows instance for it to work, so it looks as if upgrades have fallen by the wayside and only full editions of Windows 8.1 are available now. The OEM version saves money so long as you are happy to stick with just one machine, and most users probably will do that. As for the portability of the full retail version, that is not something that I have tested and I am unsure that I will go beyond what I have done already.

My main machine has seen a change of motherboard, CPU and memory, so that could have deactivated a pre-existing Windows licence. However, I run Linux as my main operating system and, apart possibly from one surmountable hiccup, this proves surprisingly resilient in the face of such major system changes. For running Windows, I turn to virtual machines and there were no messages about licence activation during the changeover either. Microsoft is anything but forthcoming when it comes to declaring what hardware changes deactivate a licence. Changing a virtual machine from VirtualBox to VMware or vice versa definitely does it, so I tend to avoid doing that. One item that is fundamental to either a virtual or a real PC is the mainboard, and I have seen suggestions that this is the critical component for Windows licence activation; it would make sense if that were the case.

However, this rule is not hard and fast either, since there appears to be room for manoeuvre should your PC break. It might be worth calling Microsoft after a motherboard replacement to see if they can help you, and I have seen reports that they do. All in all, Microsoft often makes what appear to be simple rules, only to override them when faced with what happens in the real world. Is that why they can be unclear about some matters at times? Do they still hanker after how they want things to be, even when they are impossible to keep like that?

Pondering storage options

1st June 2011

The combination of curiosity and a little spare time had me browsing online computing technology stores recently. A spot of CD and DVD burning brought on by a flurry of Linux distribution testing reminded me of the possibilities of optical storage. Because I have built up a sizeable library of digital photos, ensuring that I have backups of them is something that needs doing. A 2 TB Samsung external hard drive is brought to life every now and again for that purpose, but the prospect of using Blu-ray discs has appealed to me. After all, capacities of 25 GB for single-layer discs and 50 GB for dual-layer ones sound not inappropriate for my purposes. However, they aren’t a cheap option at the time of writing, with each disc costing in the region of £3-4 at one place where I was looking. The cost of BD writers themselves seems not to be so bad, though, with a few in the £60-100 bracket; any lower than this and you could end up with a combo drive that reads Blu-ray discs and writes to DVDs and CDs, so a modicum of concentration is needed. As attractive as the idea might be, the cost of BD media means that I’ll wait a little while before deciding to take the plunge. The price premium at the moment is a reminder of the way that things used to be when CD and DVD writers first came on the market. It is very telling when discs come packaged in jewel cases, something that you won’t see too often with CDs or DVDs.

Another piece of storage excitement that hasn’t escaped me is the advent of SSDs. With none of the moving parts of conventional hard drives, they bring a speed boost. Concerns about their lifetimes and the number of read/write events per drive would stall me when it comes to storing personal data on them, but using them for the likes of operating system files sounds attractive, especially with my partiality to Linux perhaps not hammering drives so much. As with any new technology, there is a price premium, though a drive big enough for hosting an operating system can be acquired for less than £100. As with many of my hardware purchase brainwaves, there’s no rush, but this is an option that I’ll keep at the back of my mind.

Another appealing notion is the idea of getting a NAS so that files can be shared between a few computers. While I have seen prices starting at just above £70 for single-disk enclosures, these generally are a more expensive option than external drives, and that’s before you consider the cost of any hard drives. Nevertheless, there are the advantages of a unit that contains more than a single hard drive while operating as a print server for any compatible printer too. When you get to four or five hard drive trays, the cost has mounted, but that could be when they pay their way too. What reminded me of these was a bookazine on home networking that I recently found at a branch of WHSmith, and their attractions are subject to the networking side of things being made to work without a drama. Once that’s out of the way, their usefulness really does appeal.

Mulling over all these brainwaves is one thing but it doesn’t mean that the purse strings will become too loose in this age of economic constraint. In fact, pondering them may serve to staunch any impulse purchases. Sometimes, a spot of virtual shopping serves to control things rather than losing the run of oneself.

On Upgrading to Linux Mint 11

31st May 2011

For a Linux distribution that focuses on user-friendliness, it does surprise me that Linux Mint offers no seamless upgrade path. In fact, the underlying philosophy is that upgrading an operating system is a risky business. However, I have been doing in-situ upgrades with both Ubuntu and Fedora for a few years without any real calamities. A mishap with a hard drive that resulted in lost data in the days when I mainly was a Windows user puts this into sharp relief. These days, I am far more careful, but thought nothing of sticking a Fedora DVD into a drive to move my Fedora machine from 14 to 15 recently. Apart from a few rough edges and the need to get used to GNOME 3 and make it a better fit for me, there was no problem to report. The same sort of outcome used to apply to those online Ubuntu upgrades that I was accustomed to doing.

The recommended approach for Linux Mint is to back up your package lists and your data before the upgrade. Doing the former is a boon because it automates adding the extras that a standard CD or DVD installation doesn’t include. While I did do a little backing up of data, it wasn’t total because I know how to identify my drives and take my time over things. Apache settings and the contents of MySQL databases were my main concern because of where these are stored.
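For anyone wondering what saving a package list looks like in practice, the usual Debian-style approach is something like the following; the file name is just an example of my own choosing:

# save the names and states of all installed packages to a file
dpkg --get-selections > ~/package-selections.txt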

When I was ready to do so, I popped a DVD in the drive and carried out a fresh installation into the partition where my operating system files are kept. Being a Live DVD, it allowed me to set up any drive and partition mappings with reference to Mint’s Disk Utility. What didn’t go so well was the GRUB installation, and that was due to a choice that I made on one of the installation screens. Despite doing an installation of version 10 just over a month ago, I had overlooked an intricacy of the task and placed GRUB on the operating system files partition rather than at the top level of the disk where it is located. Instead of trying to address this manually, I took the easier yet more time-consuming step of repeating the installation like I did the last time. If there were a graphical tool for addressing GRUB problems, I might have gone for that instead, but I am left wondering why there isn’t one included at all. Maybe it’s something that the people behind GRUB should consider creating, unless there is one out there already about which I know nothing.

With the booting problem sorted, I tried logging in, only to find a problem with my desktop that made the system next to unusable. It was back to the DVD, and I moved many of the configuration files and folders (the ones with names beginning with a “.”) from my home directory in the belief that there might have been an incompatibility. That action gained me a fully usable desktop environment, but I now think that the cause of my problem may have been different to what I initially suspected. Later, I discovered that the ownership of files elsewhere in my home area wasn’t associated with my user ID, even though that had not changed during the installation. As it happened, a few minutes with the chown command were enough to sort out the permissions issue.
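For reference, a sweep like the one that fixed things for me can be done with a single recursive command; the path here is only a placeholder for wherever the wrongly owned files happen to be:

# reassign ownership of everything under the given directory to the current user
sudo chown -R $USER:$USER /path/to/affected/files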

The restoration of the extra software that I had added beyond what standardly gets installed took its share of time, but the use of a previously prepared list made things so much easier. It didn’t work entirely smoothly because some packages couldn’t be found the first time around, so a second pass was needed. Nevertheless, that is nothing compared to the effort needed to do the same thing by issuing one installation command at a time. Once the usual distribution software updates were in place, all that was left was to update VirtualBox to the latest version, install a Citrix client and add a PHP plugin to NetBeans. Then, next to everything was in place for me.
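In case it is useful to anyone, restoring software from a previously saved selections file can go along these lines, again with an example file name:

# mark the saved selections for installation and let APT act on them
sudo dpkg --set-selections < ~/package-selections.txt
sudo apt-get dselect-upgrade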

Next, Apache settings were restored, as were the databases that I used for offline web development. That nearly was all that was needed to get offline websites working, but for the need to add an alias for localhost.localdomain. That required installation of the Network Settings tool so that I could add the alias in its Hosts tab. With that out of the way, the system had been settled in and was ready for real work.
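For reference, the same alias also can be added by editing /etc/hosts directly so that its loopback line reads like this:

127.0.0.1    localhost localhost.localdomain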

Network Settings on Linux Mint 11

In light of some of the glitches that I saw, I can understand the level of caution regarding a more automated upgrade process on the part of the Linux Mint team. Even so, I still wonder if the more manual alternative that they have pursued brings its own problems in the form of those that I met. The fact that the whole process took a few hours, in comparison to the single hour taken by the in-situ upgrades that I mentioned earlier, is another consideration that makes you wonder if it is all worth it every six months or so. Saying that, there is something to letting a user decide when to upgrade rather than luring one along to a new version, a point that is more than pertinent in light of the recent changes made to Ubuntu and Fedora. Whichever approach you care to choose, there are arguments in favour as well as against.

Restoring the MBR for Windows 7

25th November 2010

During my explorations of dual-booting Windows 7 and Ubuntu 10.10, I ended up restoring the master boot record (MBR) so that Windows 7 could load again, or at least to find out whether it would start at all. The first hint that came to me when I went searching was the bootsect command, but this only updates the master boot code on the partition, so it did nothing for me. What got things going again was the bootrec command.

To use either of these, I needed to boot from a Windows 7 installation DVD. With my Toshiba Equium laptop, I needed to hold down the F12 key until I was presented with a menu that allowed me to choose the drive from which I wanted to boot the machine, the DVD drive in this case. Then, the disk started and gave me a screen where I selected my location before moving to the next one, where I selected the Repair option. After that, I got a screen where my Windows 7 installation was located. Once that was selected, I moved on to another screen from which I started a command-line session. Then, I could issue the commands that I needed.

bootsect /nt60 C:

This would repair the boot sector on the C: drive in a way that is compatible with BOOTMGR. This wasn’t enough for me but was something worth trying anyway in case there was some corruption.

bootrec /fixmbr
bootrec /fixboot

The first of these restores the MBR and the second sorts out the boot sector on the system drive (where the Windows directory resides on your system). In the event, I ran both of these and Windows restarted again, proving that it had come through disk partition changes without a glitch, though CHKDSK did run in the process; that’s understandable. There’s another option for those wanting to get back a boot menu and here it is:

bootrec /rebuildbcd

Though I didn’t need to do so, I ran that too but later used EasyBCD to remove the boot menu from the start-up process because it was surplus to my requirements. That’s a graphical tool that has gained something of a reputation since Microsoft dispensed with the boot.ini file that came with Windows XP for later versions of the operating system.
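For those who would rather stay at the command line than reach for EasyBCD, Windows does include the bcdedit tool for inspecting and pruning boot menu entries; the {GUID} below is a placeholder for whatever identifier the listing command reports:

rem list all entries in the boot configuration data store
bcdedit /enum
rem remove an unwanted entry using the identifier shown by the listing
bcdedit /delete {GUID}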

On upgrading from Fedora 13 to Fedora 14

7th November 2010

My Fedora box recently got upgraded to the latest version of the distribution (14) and I stuck to a method that I have used successfully before, one that isn’t that common with variants of Linux either. What I did was to go to the Fedora website, download a full DVD image, burn it to a disk and boot from that. Then, I chose the upgrade option from the menus and all went smoothly, with only commonplace options needing selection from the menus and no data being lost either. Apparently, this way of going about things is only offered by the DVD option because the equivalent Live CD versions only do full installations.

However, there was another option that I fancied trying but was stymied by messages about a troublesome Dropbox repository. As I later discovered, that would have been easily sorted but I went for a tried and tested method instead. This was a pity because only two commands would have needed to be issued when logged in as root and it would have been good to have had a go with them:

yum update yum
yum --releasever=14 update --skip-broken

These may have done what I habitually do with Ubuntu upgrades, but trying them out will have to await either the release of the next version or my getting around to setting up a Fedora virtual machine to see what happens. The latter course of action might be sensible anyway, to see if all works without any problem before doing it on a real PC installation.
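As for the troublesome repository, the easy sorting that I mentioned probably would have been to leave it out of the transaction altogether; the repository ID below is only my guess at what the Dropbox one might be called on a given system:

# upgrade while ignoring the troublesome third-party repository
yum --releasever=14 --disablerepo=dropbox update --skip-broken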

Upgrading to Fedora 13

1st June 2010

After having a spin of Fedora’s latest in a VirtualBox virtual machine on my main home PC, I decided to upgrade my Fedora box. First, I needed to battle imperfect Internet speeds to get an ISO image that I could burn to a DVD. Once that was in place, I rebooted the Fedora machine using the DVD and chose the upgrade option to avoid bringing a major upheaval upon myself. You need the full DVD for this because only a full installation is available from Live ISO images and CDs.

All was graphical easiness and I got back into Fedora again without a hitch. Along with other bits and pieces, MySQL, PHP and Apache are working as before. If there was any glitch, it was with NetBeans 6.8, because the upgrade from the previous version didn’t seem to be as complete as hoped. However, it was nothing that an update of the open source variant of Java and NetBeans itself couldn’t resolve. There may have been untidy poking around before the solution was found, but all has been well since then.
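For anyone facing the same, an update along the following lines probably would do the job; the package names are assumptions based on how Fedora labels OpenJDK and NetBeans rather than a record of what I actually typed:

# refresh the open source Java runtime and NetBeans from the Fedora repositories
su -c 'yum update java-1.6.0-openjdk netbeans'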

Best left until later in the year?

26th January 2010

In the middle of last year, my home computing experience was one of feeling displaced. A combination of a stupid accident and a power outage had rendered my main PC unusable. What followed was an enforced upgrade using a combination that was familiar to me: Gigabyte motherboard, AMD CPU and Crucial memory. However, assembling that lot and attaching components from the old system resulted in the sound of whirring fans but nothing appearing on-screen. Not having useful beeps to guide me meant that it was a case of undertaking educated guesswork until the motherboard was found to be at fault. In a situation like this, a deeper knowledge of electronics would have been handy and might have saved me money too. As for the motherboard, it is hard to say whether it was faulty from the outset or whether there was a mishap along the way, due either to ineptitude with static or incompatibility with a power supply. What really tells the tale on the mainboard is the fact that all of the other components are working well in other circumstances, even that old power supply.

A few years back, I had another experience with a problematic motherboard, an Asus this time, that ate CPUs and damaged a hard drive before I stabilised things. That was another upgrade attempted in the first half of the year. My first round of PC building was in the third quarter of 1998 and that went smoothly once I realised that a new case was needed. Similarly, another PC rebuild around the same time of year in 2005 was equally painless. Based on these experiences, I should not be blamed for waiting until later in the year before doing another rebuild, preferably a planned one rather than an emergency.

Of course, there may be another factor involved too. The hint was a non-working Sony DVD writer that was acquired early last year, when it really was obvious that we were in the middle of a downturn. Could older unsold inventory be a contributor? Well, it fits in with seeing poor results twice. In addition, it would certainly tally with a problematical PC rebuild in 2002, following the end of the dot-com bubble and after the deadly Al Qaeda attack on New York’s World Trade Center. An IBM hard drive that was acquired then may not have been the best example of the bunch, and the same comment could apply to the Asus motherboard. The resulting construction may have been limping, but it was working and I tolerated it.

In contrast, last year’s episode had me launched into using a Toshiba laptop and a spare older PC for my needs, with an external hard drive enclosure used to extract my data onto other external hard drives to keep me going. It felt a precarious arrangement, but it was a useful experience in ways too. There was cause for making acquaintance with nearby PC component stores that I hadn’t visited before, and I got to learn about things that otherwise wouldn’t have come my way. Using an external hard drive enclosure for accessing data on hard drives from a non-functioning PC is one of these. Discovering that it is possible to boot from external optical and hard disk drives came as a surprise too; it works so long as there is motherboard support for it. Another experience came from a crisis of confidence that had me acquiring a bare-bones system from Novatech and populating it with optical and hard disk drives. Then, I discovered that I have no need for power supplies rated at more than 300 watts (around 200 W suffices). Turning my PC off more often became a habit friendly both to the planet and to household running costs too. Then, there’s the beneficial practice of shopping locally; it can suffice even if local stores don’t stock what PC magazines put on their hot lists, and shopping online for those pieces doesn’t guarantee success either. All of these were useful lessons and, while I’d rather not throw away good money after bad, it goes to show that even unsuccessful acquisitions had something to offer in the form of learning opportunities. Whether you consider that worthwhile is up to you.

Command Line Software Management

2nd December 2009

One of the nice things about a Debian-based Linux distribution is that it is easy to pull a piece of software onto your system from a repository using either apt-get or aptitude. Some may prefer to have a GUI, but I find that the command line offers a certain extra transparency that stops the “what’s it doing?” type of question. That’s never to say that the GUI-based approach hasn’t a place; I only go using it when seeking out a piece of software without knowing its aptitude-ready name. Interestingly, there are signs that Canonical may be playing with the idea of making Ubuntu’s Software Centre a full application management tool, with updates and upgrades getting added to the current searching, installation and removal facilities. That well may be, but it’s going to take a lot of effort to get me away from the command line altogether.
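For the uninitiated, typical invocations look like these, with gimp serving merely as an example of a repository-ready package name:

# refresh the package indexes from the configured repositories
sudo apt-get update
# install a named package along with its dependencies
sudo apt-get install gimp
# search package names and descriptions when the exact name is unknown
aptitude search gimp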

Fedora and openSUSE have their software management commands too, in the shape of yum and zypper respectively. The recent flurry of new operating system releases has had me experimenting with both of those distros on a real test machine. As might be expected, the usual battery of installation, removal and update activities is well served, and I have been playing with software searching using yum too. What has yet to mature is in-situ distribution upgrading à la Ubuntu. In principle, it is possible, but I got a black screen when I tried moving from openSUSE 11.1 to 11.2 within VirtualBox using instructions on the openSUSE website. Not wanting to wait, I reached for a Live CD instead and that worked a treat on both virtual and real machines. Being in an experimental turn of mind, I attempted the same to get from Fedora 11 to the beta release of its version 12. A spot of repository trouble got me using a Live CD in its place. You can perform an in-situ upgrade from a full Fedora DVD, but the only option is system replacement when you have a Live CD. Once installation is out of the way, YaST can be ignored in favour of zypper, and yum is good enough that Fedora’s GUI-using alternative can be ignored. It’s nice to see good transparent ideas taking hold elsewhere, and they may make migration between distros much easier too.
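To show how alike these tools are, here are rough yum and zypper counterparts to the apt-get and aptitude commands given earlier, with gimp again standing in as an example package:

# Fedora: search for and then install a package
yum search gimp
su -c 'yum install gimp'
# openSUSE: the zypper equivalents
zypper search gimp
sudo zypper install gimp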

Rough?

11th November 2009

Was it because Canonical and friends kept Ubuntu in such a decent state from 8.04 through to 9.04 that things went a little quiet in the blogosphere on the subject of the well-known Linux distribution? If so, 9.10 might be proving more of a talking point and you have to wonder if this is such a good thing with the appearance of Windows 7 on the scene. Looking on the bright side, 10.04 will be an LTS release so there is some chance that any rough edges that are on display now could be resolved by next April. Even so, it might have been better not to see anything so obvious at all.

In truth, Ubuntu always has had its gaps and I have seen a few of their ilk over the last two years. Of these, a few have triggered postings on here. In fact, issues with accessing the BBC iPlayer still bring a goodly number of folk to this website. That may just be a matter of grabbing RealPlayer, now helpfully available as a DEB package, from the requisite place on the web and ensuring that ubuntu-restricted-extras is in place too, but you have to know that in the first place. Even so, there have been unexpected behaviours, like Palimpsest seeing every partition on a disk as a different drive, and SIL RAID mappings being seen for hard drives that used to live on the main home PC that bit the dust earlier this year; the latter only happens on one of the machines that I have running Ubuntu, so it may be a hardware thing, and the newly added hard drive uses none of the SIL mapping either. Perhaps more seriously (is it something that a new user should be encountering?), a misfiring variant of Brasero had me moving to K3b. Then, UFRaw was sluggish in batch mode, but that’s nothing that having a Debian VM won’t overcome. Rough edges like these do get you asking if 9.10 was ready for the big time, while making you reluctant to recommend it to mainstream users like my brother.
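For completeness, getting the iPlayer side of things in order goes something like this; the DEB file name is only a placeholder for whatever RealPlayer download you end up with:

# pull in the restricted codecs, fonts and plugins in one go
sudo apt-get install ubuntu-restricted-extras
# install the downloaded RealPlayer package
sudo dpkg -i realplayer-package.deb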

The counterpoint to the above is that 9.10 includes a host of under-the-bonnet changes, like the introduction of ext4 hard drive formatting, Xsplash to allow the faster system loading to occur unseen, and GNOME 2.28. To someone looking in from outside like me, that looks like a lot of work and might explain the ingress of the annoyances that I have seen. Add to that the fact that we are between Debian releases, so things like the optimised packaging of ImageMagick or UFRaw may not be so high up the list of things to do, especially with the more general speed optimisations that were put in place for 9.10. With 10.04 set to be an LTS release, I’d be hoping that consolidation is the order of the day over the next five or six months, but it seems to be the inclusion of new features and other such progress that gets magazine reviewers giving higher ratings (Linux Format has given it a mark of 9 out of 10). With the mooted inclusion of GNOME 3 and its dramatically different interface in 10.10, they should get their fill of that. However, I’d like to see some restraint for the sake of a smooth transition from the familiar GNOME 2.x to the new. If GNOME 3 stays very like its alpha builds, then the question of how users will take to it arises. Of course, there’s some time yet before we see GNOME 3 and, having seen how the Ubuntu developers transformed GNOME 2.28, I wouldn’t be surprised if the impact of any change could be dulled.

In summary, my few weeks with Ubuntu 9.10 as my main OS have thrown up no major roadblocks that would cause me to look at moving elsewhere; Fedora would be tempting if that situation were to arise. The irritations that I have seen are more like signs of a lack of polish and remain peripheral to day-to-day working if you discount CD/DVD burning. To be honest, there always have been roughnesses in Ubuntu but has the lack of sizeable change spoilt us? Whatever about how things feel afterwards, big changes can mean new problems to resolve and inspire blog posts describing any solutions so it’s not all bad. If that’s what Canonical wants to see, they might get it and the year ahead looks as if it is going to be an interesting one after a recent quieter period.

Still able to build PC systems

25th October 2009

This weekend has been something of a success for me on the PC hardware front. Earlier this year, a series of mishaps rendered my former main home PC unusable; it was a power failure that finished it off for good. My remedy was a rebuild using my then usual recipe of a Gigabyte motherboard, AMD CPU and Crucial memory. However, assembling the said pieces never returned the thing to life and I ended up in no man’s land for a while, dependent on my backup machine and laptop. That wouldn’t have been so bad but for the need to access data from the old behemoth’s hard drives, though an external drive housing set that in order. Nevertheless, there is something unfinished about working with machines that have a series of external drives hanging off them. That appearance of disarray was set to rights by the arrival of a bare-bones system from Novatech in July, with any assembly work restricted to the kitchen table. There was a certain pleasure in seeing a system come to life after my developing a fear that I had lost all of my PC building prowess.

That restoration of order still left finding out why those components bought earlier in the year didn’t work together well enough to give me a screen display on start-up. Having electronics testing equipment and the knowledge of its correct use would make any troubleshooting far easier but I haven’t got these. There is a place near to me where I could go for this but you are left wondering what might be said to a PC build gone wrong. Of course, the last thing that you want to be doing is embarking on a series of purchases that do not fix the problem, especially in the current economic climate.

One thing to suspect when all doesn’t turn out as hoped is the motherboard and, for whatever reason, I always suspect it last. It now looks as if that needs to change after I discovered that it was the Gigabyte motherboard that was at fault. Whether it was faulty from the outset, or came a cropper with a rogue power supply or carelessness with static protection, is something that I’ll never know. An Asus motherboard did go rogue on me in the past and it might be that it ruined CPUs and even a hard drive before I laid it to rest. Its eventual replacement put a stop to a year of computing misfortune and kick-started my reliance on Gigabyte. That faith is under question now, but the 2009 computing hardware mishap seems to be behind me; any future PC rebuilds will be done on tables, and motherboards will be suspected earlier when anything goes awry.

Returning to the present, my acquisition of an ASRock K10N78 and the subsequent building activities have brought a new system using an AMD Phenom X4 CPU and 4 GB of memory into use. In fact, I am writing these very words using the thing. It’s all in a new TrendSonic case too (placing an elderly behemoth into retirement), with a SATA hard drive and a DVD writer. The new motherboard has onboard audio and graphics, so external cards are not needed unless you are an audiophile and/or a gamer; for the record, I am neither. Those additional facilities make for easier building and fault-finding should the undesirable happen.

The new box is running the release candidate of Ubuntu 9.10 and it seems to be working without a hitch too. Earlier builds of 9.10 broke in their VirtualBox VM so you should understand the level of concern that this aroused in my mind; the last thing that you want to be doing is reinstalling an operating system because its booting capability breaks every other day. Thankfully, the RC seems to have none of these rough edges so I can upgrade the Novatech box, still my main machine and likely to remain so for now, with peace of mind when the time comes.
