TOPIC: INSTALLATION
Ubuntu upgrades: do a clean installation or use Update Manager?
9th April 2009
Part of some recent "fooling" brought on by the investigation of what turned out to be a duff DVD writer was a fresh installation of Ubuntu 8.10 on my main home PC. It might have brought on a certain amount of upheaval, but it was nowhere near as severe as that following the same sort of thing with a Windows system. While a few hours was all that was needed, the exercise left me wondering whether it is better to perform just an upgrade every time a new Ubuntu release is unleashed on the world or to go for a complete virgin installation instead. With Ubuntu 9.04 in the offing, that question takes on a more immediate significance than it otherwise might do.
Various tricks make the whole reinstallation idea more palatable. For instance, many years of Windows usage have taught me the benefits of separating system and user files. The result is that my home directory lives on a different disk to my operating system files. Add to that the experience of being able to reuse that home drive across different Linux distros, and even swapping from one distro to another becomes feasible. From various changes to my secondary machine, I can vouch that this works for Ubuntu, Fedora and Debian; the latter is what currently powers the said PC. Though you might have to use superuser powers to attend to ownership and access issues, the portability is certainly there, and it applies to anything kept on other disks too.
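For what it's worth, the ownership fix usually amounts to no more than a recursive chown once the reused home directory is mounted; a minimal sketch, with fred standing in for whichever username applies:
sudo chown -R fred:fred /home/fred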
Naturally, there's always the possibility of losing programs that you have had installed, but losing the clutter can be liberating too. However, assembling a script made up of one or more apt-get install commands can allow you to get many things back at a stroke. For example, I have a test web server (Apache/MySQL/PHP/Perl) set up, so this would be how I'd get everything back in place before beginning further configuration. It might be no bad idea to back up your collection of software sources too; I have yet to add all the ones that I have been using back into Synaptic. Then there are closed source packages such as VirtualBox (yes, I know that there is an open-source edition) and Adobe Reader. After reinstating the former, all my virtual machines were available for me to use again, without further ado. Restoring the latter allowed me to grab version 9.1 (probably more secure anyway), and it inveigles itself into Firefox now too, so the number of times that I need to go through the download shuffle before seeing the contents of a PDF is much reduced, though not eliminated, by the Windows-like ability to see a PDF loaded in a browser tab. Moving from software to hardware for a moment, it looks like any bespoke actions, such as my activating an Epson Perfection 4490 Photo scanner, need to be repeated, but that was all that I had to do. Getting things back into order is not so bad, even if you have to allow a modicum of time for this.
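For anyone wanting to take the idea further, one hedged way of getting many packages back at a stroke is to capture the package list before the wipe and feed it back in afterwards; the file names here are only placeholders, and copying the sources list covers the software sources point too:
dpkg --get-selections > ~/package-selections.txt
cp /etc/apt/sources.list ~/sources.list.backup
Then, on the fresh system:
sudo dpkg --set-selections < ~/package-selections.txt
sudo apt-get dselect-upgrade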
What I have discussed so far are what might be categorised as the common or garden aspects of a clean installation, yet I have seen some behaviours that make me wonder if the usual Ubuntu upgrade path is sufficiently complete in its refresh of your system. The counterpoint to all of this is that I may not have been looking for some of these things before now. That may apply to my noticing that DSLR support seems to be better, with my Canon and Pentax cameras both being picked up and mounted for me as soon as they are connected to a PC, the caveat being that they need to be powered on themselves for this to happen. Another surprise that may be new is that the BBC iPlayer's Listen Again facility works without further effort from the user, a very useful development; it certainly wasn't that way before I took the more invasive route. My previous tweaking might have prevented the in situ upgrade from doing its thing, but I do see the point of not upsetting people's systems with an overly aggressive update process, even if it means that some advances do not make themselves known.
So what's my answer regarding which way to go once Ubuntu Jaunty Jackalope appears? For the sake of avoiding initial disruption, I'd be inclined to go down the Update Manager route first, while reserving the right to do a fresh installation later on. All in all, I am left with the gut feeling that the jury is still out on this one.
Finding out what kernel version is running
5th May 2008
Here's the command that does the deed for me on Ubuntu:
uname -a
Usually, I only need it to find out what header files I need for any VMware repeat installations or reconfigurations.
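On that front, uname -r prints just the kernel release, which can be dropped straight into the header package name; something along these lines should pull in the matching headers on Ubuntu:
sudo apt-get install linux-headers-$(uname -r)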
Getting VMware Workstation working on Ubuntu 8.04
28th April 2008
With every change of kernel, a re-installation of VMware becomes necessary, and my move to Ubuntu "Hardy Heron" 8.04 was punctuated by the same activity. However, the advent of the 2.6.24.x kernel meant that my usual means were no longer successful, so a new approach was needed.
That involved the mysteriously named vmware-any-any patch, and version 116 of this seemed to set things to rights for me. Stopping the installation before vmware-config.pl runs is the best course of action, since it will only fail anyway. Downloading vmware-any-any-update-116.tgz, extracting from the archive and running runme.pl using sudo continues the process.
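For the record, the sequence comes down to something like the following; the name of the directory inside the archive is an assumption on my part, so check what the extraction actually produces before changing into it:
tar xzvf vmware-any-any-update-116.tgz
cd vmware-any-any-update-116
sudo ./runme.pl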
While it seemed to have worked for me, I must wonder why VMware seems unbothered by the idea of keeping up with Linux kernels and C compilers. Doing so would certainly have removed the need for the user community to do anything about the problems that others and I keep seeing; it's a very unusual arrangement.
Do I still need serial numbers?
19th November 2007
My spot of bad luck with Windows in August highlighted the importance of hanging on to serial numbers for software that I had purchased over the internet and downloaded. Though I could get at the ones that I needed, they were retained in a motley mix of text files and emails; one was even rediscovered by pottering back to the website of the purveyor. While the security of the installation files themselves was another matter of some concern, I was rather more organised in that regard. Both of these are things that need checking before Windows falls to pieces on you and needs to be reinstalled. Of course, human nature being what it is, we often find ourselves picking up the pieces after a calamity has struck when a spot of planning would have made things that bit easier.
Linux does make life easier on this front: commercial applications are anything but the dominant force that they are in the world of Windows. That means that serial numbers are few and far between, and I only need the one for VMware Workstation. The mention of VMware brings me to my retention of Windows, so knowing where serial numbers are located remains a good idea. Even so, I cloned my Windows VM so that any Windows restoration following a destructive crash should be a quicker affair. Now that I am a Linux user, Windows crashes should not encroach as much on my home computing any more and Linux should be more stable anyway...
Setting up openSUSE in VMware Workstation
12th November 2007
While it should have been as straightforward as following the instructions on the openSUSE website, a bug in VMware Tools derailed things for me. The usual procedure would have you starting by selecting Install VMware Tools from the VM menu before popping into the virtual machine to do the rest. Once binutils, gcc, gcc-c++, kernel-source and make are in place, the next steps should involve using YaST to install the RPM before running the vmware-config-tools.pl script from the terminal.
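As a rough sketch of that terminal part, and assuming the VMware Tools virtual CD turns up at /media/cdrom with an RPM named roughly as below, it would look something like this:
sudo rpm -Uvh /media/cdrom/VMwareTools-*.rpm
sudo vmware-config-tools.pl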
However, a bug in compat_slab.h puts a stop to any hopes of installing the vmhgfs component. That's needed if you like to enable the shared folders feature; looking in /mnt/hgfs then would get you to any shared folders. While everything else will be there, why miss out on one piece of functionality when it comes in useful?
Having found a useful thread on the subject, here's my way forward: it follows the expected procedure up to the point of installing the RPM. With VMware Tools installation on a Linux guest, you have two options: use the RPM as described or use the compressed tarball. The latter seems the better course. Extract the contents into a folder and navigate to that folder. Once there, go into vmware-tools-distrib/lib/modules/source and extract the file vmhgfs.tar. Proceed into the resulting vmhgfs-only directory, wherever you put it, and perform the following edit of compat_slab.h:
Change
#if LINUX_VERSION_CODE < KERNEL_VERSION(2, 6, 22) || defined(VMW_KMEMCR_HAS_DTOR)
to
#if LINUX_VERSION_CODE <= KERNEL_VERSION(2, 6, 22) || defined(VMW_KMEMCR_HAS_DTOR)
After that, recreate and replace vmhgfs.tar before issuing the following command in the terminal window while in the vmware-tools-distrib directory: ./vmware-config-tools.pl (anything prefixed with "./" picks up the file from the current working directory rather than where system binaries are stored). Though a kernel compilation will be involved, all the defaults should be sensible. Hopefully, all will work well after this.
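Pulling the whole workaround together, the run of commands looks something like this; the tarball name stands in for whatever version you have, and the edit is the one shown above:
tar xzvf VMwareTools-x.x.x.tar.gz
cd vmware-tools-distrib/lib/modules/source
tar xvf vmhgfs.tar
(edit vmhgfs-only/compat_slab.h as described above)
tar cvf vmhgfs.tar vmhgfs-only
cd ../../..
sudo ./vmware-config-tools.pl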
Update: I am left with a number of outstanding issues that I need to resolve. Lack of internet access from the VM is one of them, and a constant forgetfulness regarding the nationality of my keyboard (it's British) might be another. In the interim, I have removed VMware tools until I can spend some time setting these to rights. That means internet access has returned, and the British keyboard layout is being interpreted correctly for now...
A fallback installation routine?
9th November 2007
In a previous sustained spell of Linux meddling, the following installation routine was one that I encountered rather too often when RPMs didn't do what I required of them (having a SUSE distro in a world dominated by a Red Hat standard didn't make things any easier...):
tar xzvf progname.tar.gz; cd progname
The first part of the command extracts from a tarball compressed using gzip and the second one changes into the new directory created by the extraction. For files compressed with bzip2, use:
tar xjvf progname.tar.bz2; cd progname
The command below configures, compiles and installs the package, running the final installation step as root in its own shell; the quotes keep make install together as the single command passed to su.
./configure; make; su -c "make install"
Yes, the procedure is a bit convoluted, but it would have been fine if it always worked. My experience was that the process was a far from foolproof one. For instance, an unsatisfied dependency is all that is needed to stop you in your tracks. Attempting to install a GNOME application on a KDE-based system is as good a way to encounter this result as any. Other horrid errors also played havoc with hopeful plans from time to time.
It shouldn't surprise you to find that I will be staying away from the compilation/installation business with my main Ubuntu system. Synaptic Package Manager and its satisfactory dependency resolution fulfil my needs well and there is the Update Manager too; I'll be leaving it for Canonical to do the testing and make the decisions regarding what is ready for my PC as they maintain their software repositories. My past tinkering often created a mess, and I'll be leaving that sort of experimentation for the safe confines of a virtual machine from now on...
Turning the world on its head: running VMware on Ubuntu
2nd November 2007
When Windows XP was my base operating system, I used VMware Workstation to peer into the worlds of Windows 2000, Solaris and various flavours of Linux, including Ubuntu. Now that I am using Ubuntu instead of what became a very flaky XP instance, VMware is still with me, and I am using it to keep a foot in the Windows universe. In fact, I have Windows 2000 and Windows XP virtual machines available to me that should supply my Windows needs.
An evaluation version of Workstation 6 is what I am using to power them and I must admit that I am likely to purchase a licence before the evaluation period expires. Installation turned out to be a relatively simple affair, starting with my downloading a compressed tarball from the VMware website. The next steps were to decompress the tarball (Ubuntu has an excellent tool, replete with a GUI, for this) and run vmware-install.pl. I didn't change any of the defaults and everything was set up without a bother.
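For anyone preferring the terminal for the extraction step as well, the whole thing comes down to something like the following; the archive and directory names are approximations on my part, so adjust them to match your download:
tar xzvf VMware-workstation-6.0.x-xxxxx.i386.tar.gz
cd vmware-distrib
sudo ./vmware-install.pl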
In use, a few things have come to light. The first is that virtual machines must be stored on drives formatted with Ext3 or some other native Linux file system rather than on NTFS. Do the latter, and you get memory errors when you try starting a virtual machine; I know that I did and that every attempt resulted in failure. After a spot of backing up files, I converted one of my SATA drives from NTFS to Ext3. For the sake of safety, I also mounted it as my home directory; the instructions on Ubuntu Unleashed turned out to be invaluable for this. I moved my Windows 2000 VM over and it worked perfectly.
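For completeness, and with the obvious warning that formatting wipes everything on the partition, the conversion boils down to something like this once the backups are done (the device name is only an example, so check yours with sudo fdisk -l first):
sudo mkfs.ext3 /dev/sdb1
An entry along these lines in /etc/fstab then mounts the drive as /home:
/dev/sdb1  /home  ext3  defaults  0  2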
Next on the list was a series of peculiar errors that came to light when I was attempting to install Windows XP in a VM created for it. VMware was complaining about a CPU not being able to run fast enough; 2 MHz was being stated for an Athlon 64 3000+ chip running at 1.58 GHz! Clearly, something was getting confused. Also, my XP installation came to a halt with a BSOD stating that a driver had gone into a loop, with Framebuf fingered as the suspect. I was seeing two symptoms of the same problem and its remedy was unclear. A message on a web forum put the idea of rebooting Ubuntu into my head, and that resolved the problem. For now, I'll be keeping an eye on it, though.
Otherwise, everything seems to be going well with this approach, and that's an encouraging sign. It looks as if my current Linux-based set-up is one with which I am going to stay. This week has been an interesting one already, and I have no doubt that I'll continue to learn more as time goes on.
Setting up a test web server on Ubuntu
1st November 2007
Installing all the bits and pieces is painless enough so long as you know what's what; Synaptic does make it thus. Interestingly, Ubuntu's default installation is a lightweight affair, with the addition of further components involving downloading the packages from the web. The whole process is all very well integrated and doesn't make you sweat every time you need to install additional software. In fact, it resolves any dependencies for you so that those packages can be put in place too; it lists them, you select them and Synaptic does the rest.
Returning to the job in hand, my shopping list included Apache, Perl, PHP and MySQL, the usual suspects in other words. Perl was already there, as it is on many UNIX systems, so installing the appropriate Apache module was all that was needed. PHP needed the base installation as well as the additional Apache module. MySQL needed the full treatment too, though its being split up into different pieces confounded things a little for my tired mind. Then, there were the MySQL modules for PHP to be set in place too.
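For illustration, that shopping list maps onto package names roughly like these; the exact names vary between releases, so take them as a guide rather than gospel:
sudo apt-get install apache2 libapache2-mod-perl2 php5 libapache2-mod-php5 mysql-server php5-mysql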
The addition of Apache preceded all of these, but I have left it until now to describe its configuration, something that took longer than for the others; the installation itself was as easy as it was for the others. However, what surprised me were the differences in its configuration set-up when compared with Windows. There are times when we get the same software on different operating systems, only to find that the configuration files are arranged differently. The first difference is that the main configuration file is called apache2.conf on Ubuntu rather than httpd.conf as on Windows. Like its Windows counterpart, Ubuntu's Apache does use subsidiary configuration files. However, there is an additional layer of configurability added courtesy of a standard feature of UNIX operating systems: symbolic links. Rather than having a single folder with all the configuration files stored therein, there are two pairs of folders, one pair for module configuration and another for site settings: mods-available/mods-enabled and sites-available/sites-enabled, respectively. In each pair, there is a folder with all the files and another containing symbolic links. It is the presence of a symbolic link for a given configuration file in the latter that activates it. I learned all this when trying to get mod_rewrite going and changing the web server folder from the default to somewhere less susceptible to wrecking during a re-installation or, heaven forbid, a destructive system crash. It's unusual, but it does work, even if it takes that little bit longer to get things sorted out when you first meet up with it.
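The symbolic links themselves need not be made by hand, since a pair of helper commands manages them; a sketch of enabling mod_rewrite and a site definition (mysite standing in for whatever file sits in sites-available) would be:
sudo a2enmod rewrite
sudo a2ensite mysite
sudo /etc/init.d/apache2 reload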
Apart from the Apache set up and finding the right things to install, getting a test web server up and running was a fairly uneventful process. All's working well now, and I'll be taking things forward from here; making website Perl scripts compatible with their new world will be one of the next things that need to be done.
A move to Ubuntu?
30th October 2007
After a pretty rotten weekend attempting to keep Windows XP running, I finally lost the will to persevere and began yearning for stability. That has taken me into the world of Ubuntu; I am writing this in Firefox running on the said Linux distribution. Thanks to the wonders of VMware, I have been able to observe the swish and slick nature of Ubuntu, and I must admit that it did sway me. Installation has been slick and efficient and is a dream compared to XP, let alone previous Linux incarnations that I have encountered over the years. Start-up is also speedy. All in all, there appears to be a certain confidence about the OS that was sadly absent from my Windows experience in recent times.
Still, I am not deserting the world of Windows completely. As it happens, I installed Ubuntu on a spare hard drive that I had, so the Windows installation is still out there. In addition, VMware virtual machines should allow me to keep a foot in that world without the ever-present risk of a PC getting rendered inoperable. There is also the unfinished business of making myself at home on Ubuntu, hopefully without my wrecking anything. I have yet to give my hardware a full workout to check that all is well. Setting up a web development capability is also on the cards, as is getting those virtual machines up and running. Assuming that there are no showstoppers, it could be an interesting ride.
A simple way to create a batch file running a load of files one after another
25th October 2007
Repairing Windows installations like I have had to do all too often in the last few weeks means that I have a load of updates that need to be performed. My preference for using Shavlik NetChk Protect means that I have a folder full of executable patch files. That encapsulates the first step: creating a folder and adding the files that you want to run. The next step is to run a command like this:
dir /b * > exec.bat
The /b is the switch that gives a bare file list, and that list is stored in exec.bat. Running exec.bat then affords a bit more automation. While it is true that this might need a spot more sophistication to be truly automatic, it's still a good start.
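One small refinement that occurs to me: filtering on the extension stops exec.bat from listing itself along with any stray non-executables, so a hedged variant like this may be a touch closer to hands-off:
dir /b *.exe > exec.bat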