Choices, choices…
10th November 2007

While choice is a great thing, too much of it can be confusing, and the world of Linux is one very full of decisions. The first of these centres around the distro to use when taking the plunge; you quickly find that there can be quite a lot to it. In fact, it is a little like buying your first SLR/DSLR or your first car: you only really know what you are doing after your first one. Putting it another way, you only know how to get a house built after you have done just that.
With that in mind, it is probably best to play a little on the fringes of the Linux world before committing yourself. It used to be that you had two main choices for your dabbling:
- using a spare PC
- dual booting with Windows by either partitioning a hard drive or dedicating one for your Linux needs.
In these times, innovations such as Live CD distributions and virtualisation technology keep you away from such measures. In fact, I would suggest starting with the former and progressing to the latter for more detailed perusal; it's always easy to wipe and restore virtual machines anyway, so you can evaluate several distros at the same time if you have the hard drive space. It is also a great way to decide which desktop environment you like. Otherwise, terms like KDE, GNOME, XFCE, etc. might not mean much.
The mention of desktop environments brings me to software choices because they do drive what software is available to you. For instance, the Outlook lookalike that is Evolution is more likely to appear where GNOME is installed than where you have KDE. The opposite applies to the music player Amarok. Nevertheless, you do find certain stalwarts making a regular appearance; Firefox, OpenOffice and the GIMP all fall into this category.
The nice thing about Linux is that distros more often than not contain all the software that you are likely to need. However, that doesn't mean that everything is on the disk and that you have to select what you need during the installation. Though there might have been a time when it felt like that, my recent experience has been that a minimal installation is set in place that does all the basics, allowing you to add the extras later on an as-needed basis. I have also found that online updates are a strong feature.
Picking up what you need when you need it has major advantages, the big one being that Linux grows with you. You can add items like Apache, PHP and MySQL when you know what they are and why you need them. It's a long way from having to pick applications of which you knew very little at installation time, with the suspicion that any future addition might land you in dependency hell while compiling application source code; the temptation to install everything in sight was a strong one. The "learn before you use" approach favoured by how things are done nowadays is an excellent one.
Even if life is easier in the Linux camp these days, there is no harm in sketching out your software needs. Any distribution should be able to fulfil most if not all of them. As it happened, the only third party application that I have needed to install on Ubuntu without recourse to Synaptic was VMware Workstation, and that procedure thankfully turned out to be pretty painless.
A fallback installation routine?
9th November 2007

In a previous sustained spell of Linux meddling, the following installation routine was one that I encountered rather too often when RPMs didn't do what I required of them (having a SUSE distro in a world dominated by a Red Hat standard didn't make things any easier...):
tar xzvf progname.tar.gz; cd progname
The first part of the command extracts from a tarball compressed using gzip, and the second changes into the new directory created by the extraction. For files compressed with bzip2, use:
tar xjvf progname.tar.bz2; cd progname
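As a throwaway illustration of those tar flags (x to extract, z for gzip, v for verbose output, f to name the file), here is a sketch that builds a dummy progname tarball on the spot and then unpacks it; the name and the /tmp paths are purely illustrative:

```shell
#!/bin/sh
set -e
# Build a throwaway gzip-compressed tarball to extract from; the
# "progname" name and /tmp location are purely illustrative.
mkdir -p /tmp/tardemo/progname
echo "hello" > /tmp/tardemo/progname/README
cd /tmp/tardemo
tar czf progname.tar.gz progname
rm -r progname
# The routine from the post: x=extract, z=gzip, v=verbose, f=file name.
tar xzvf progname.tar.gz; cd progname
cat README
```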
The command below configures, compiles and installs the package, running the final step as root in its own shell (the quotes are needed so that su treats make install as a single command):
./configure; make; su -c "make install"
Yes, the procedure is a bit convoluted, but it would have been fine if it always worked. My experience was that the process was far from foolproof. For instance, an unsatisfied dependency is all that is needed to stop you in your tracks; attempting to install a GNOME application on a KDE-based system is as good a way to encounter this result as any. Other horrid errors also played havoc with hopeful plans from time to time.
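One small refinement worth noting: chaining the steps with semicolons means each one runs regardless of whether the previous step failed, whereas && stops at the first error, so ./configure && make && su -c "make install" would never attempt an install after a failed build. A minimal demonstration of the difference:

```shell
#!/bin/sh
# ';' runs the next command even after a failure...
false; echo "after ';' this line still runs"
# ...while '&&' short-circuits on the first non-zero exit status.
false && echo "after '&&' this line is never reached"
echo "finished"
```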
It shouldn't surprise you to find that I will be staying away from the compilation/installation business with my main Ubuntu system. Synaptic Package Manager and its satisfactory dependency resolution fulfil my needs well and there is the Update Manager too; I'll be leaving it for Canonical to do the testing and make the decisions regarding what is ready for my PC as they maintain their software repositories. My past tinkering often created a mess, and I'll be leaving that sort of experimentation for the safe confines of a virtual machine from now on...
Importing bookmarks into Firefox
8th November 2007

Moving from one operating system to another like I have means that a certain amount of migration is in order. While I have already talked about migrating my email, there are lesser acts too. One of these is carrying across bookmarks into the new world. This should be an easy thing to achieve and, for the most part, it is. However, the Import... entry on the File menu of the main browser only brings in bookmarks from other applications. To get more flexibility, you need to open the Bookmarks Manager window from the Bookmarks menu (Organise Bookmarks... is the entry that you need). The File menu of the Bookmarks Manager has entries named Import... and Export...; their functions should be very apparent. The former will read from a file, very useful if you do not want to disrupt what you already have. Another migration option is the potentially disruptive act of copying an alternative bookmarks.html file into your Firefox profile folder and overwriting the one that's already there.
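For the copy-and-overwrite route, something along these lines would do it, with a backup taken first; the stand-in directory here plays the part of the real profile folder, which would normally be somewhere like ~/.mozilla/firefox/xxxxxxxx.default:

```shell
#!/bin/sh
set -e
# Stand-in for the Firefox profile folder so the steps can be shown safely;
# substitute the real ~/.mozilla/firefox/<profile> path in practice.
profile=/tmp/fxprofile-demo
mkdir -p "$profile"
echo "<!-- old bookmarks -->" > "$profile/bookmarks.html"
echo "<!-- imported bookmarks -->" > /tmp/imported-bookmarks.html
# Back up the existing file before the disruptive overwrite.
cp "$profile/bookmarks.html" "$profile/bookmarks.html.bak"
cp /tmp/imported-bookmarks.html "$profile/bookmarks.html"
```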
The case of a wide open restriction
7th November 2007

The addition of IMAP capability to Gmail attracted a lot of attention in the blogosphere last week, and I managed to flick the switch for the beast courtesy of the various instructions that were out there. However, when I pottered back to the settings, the IMAP settings had disappeared.
A brief look at the Official Gmail Blog confirmed why: the feature wasn't to be available to those who hadn't set their language as US English. My setting of UK English explained why I wasn't seeing it again, a strange observation given that they are merely variants of the same language; I have no idea why I saw it the first time around.
My initial impression was that the language setting used was an operating system or browser one, but this is not how it is. In fact, it is the language that you set for Gmail itself in its settings; choosing US English was sufficient to make the IMAP settings reappear, while choosing UK English made them disappear again.
Personally, I am not certain why the distinction was made in the first place, but I have Evolution merrily working away with Gmail's IMAP interface without a bother. To get it going, I found that imap.gmail.com needed an SSL connection while smtp.gmail.com needed a TLS one. After that, I was away; no port numbers needed to be supplied, unlike with Outlook.
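For reference, the connection details boil down to the following; the port numbers are the standard defaults, which Evolution filled in by itself:

```
Receiving (IMAP): imap.gmail.com, SSL connection (default port 993)
Sending (SMTP):   smtp.gmail.com, TLS connection (default port 587)
```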
A matter of fonts…
6th November 2007

It's when you pop from one operating system to another that you realise how operating-system-specific fonts are. For instance, only one of the names in the following list is understood by Firefox on Ubuntu, the last one: Trebuchet MS, Lucida Grande, Verdana, Arial, Sans-Serif. The reason that Sans-Serif is understood is that it's a generic font family name in the world of CSS. However, that does not mean that you are not still at the mercy of operating system fonts. In fact, font metrics vary and 16px in one font isn't the same as 16px in another; that can mean broken layouts if you are sufficiently clumsy.
As it happens, the main menu bar on my hillwalking blog should all fit on one line, yet it took up two lines when viewed on Linux. If it did that neatly, there wouldn't be much of a problem, but it didn't. While some CSS hacking could have repaired the situation, I went for a simpler solution for now: picking a Linux sans serif font that fitted the bill better. So popping in mentions of "Nimbus Sans L" in appropriate places in my stylesheet was the way that I went. Since I don't know how this appears in other Linux distributions, the wonders of virtualisation should allow me to find out.
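The change amounts to nothing more exotic than extending the font stack; the selector below is a hypothetical one for illustration, but the idea is to slot Nimbus Sans L in ahead of the generic fallback so that Linux systems match it before giving up:

```css
/* Hypothetical navigation selector; the point is the added Nimbus Sans L,
   which Linux systems will match before falling back to sans-serif. */
#menubar {
  font-family: "Trebuchet MS", "Lucida Grande", Verdana, Arial,
               "Nimbus Sans L", sans-serif;
}
```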
If I was really concerned about the fonts that were being used, I could have gone with a server-side approach: embedded fonts. I haven't tried this for a while but differing browser support was a major issue when I did: you had to create a set of files for IE and for Netscape when I was investigating such things, hardly convenient even in those days when Opera was merely a speck on the horizon and Mozilla was nascent. Though it's a valid approach for those exclusive fonts, so is questioning why you are using them in the first place. Adobe's Flash is another option for those who obsess with fonts, though how users take to this remains an open question, as does the accessibility of the approach.
For now, I plan to continue evaluating how applications appear across different operating systems. For this purpose, virtualisation serves as an excellent tool, as do Live CDs. The latter is particularly useful for Linux distributions, while the former has application in more scenarios: OpenSolaris and, with a spot of tinkering, OS X come to mind. This presents an appealing prospect, especially considering that Firefox has essentially become a cross-platform standard in today's computing environment. Mind you, seeing how websites are rendered by Safari running on OS X might be of interest to some.
Looking at it from the user's point of view rather than the web developer's, there remains the question of visiting websites that break because of the font conundrum. If you find this happening to you a lot, it may be an idea to bring in some TrueType or OpenType fonts. With Ubuntu, this is straightforward: fire up Synaptic, search for msttcorefonts and install that package along with any of its dependencies. Logging off and on again will make the new fonts available. There was a time when more work was needed than that...
Restoring Anquet Mapping
4th November 2007

One of the issues with an upheaval like that which I have experienced over the last week is that some things get messed up. Because of my hillwalking, I have electronic mapping for planning my excursions into the outdoors. My choice for this has been Anquet; I evaluated it, and it did what I needed. While I could have gone further with the evaluation, I couldn't be bothered with the hassle. Since it's a Windows-only offering as far as I know, good old VMware really proves its worth for this.
Though the software itself is a free download, it's the maps where they make their money, particularly with the 1:25000 scale mapping. I have a lot of 1:50000 mapping, so it was with some disappointment that I discovered that it was no longer usable because the licences had got lost. Some of the data files needed updating anyway, so I went ahead and did that to get something back. While that exercise cost me some money, it got me new licences and made unlocking possible. For the mapping where there were no updates, I needed to delete it and download it again, a slow process taking up many hours due to the size of the files and the way that Anquet delivers them. The updating had taken a similar amount of time.
At the end of this, all was well again. However, it would have been better if the licences hadn't disappeared and Anquet had a better way of restoring things than they do. I shouldn't have had to re-download several gigabytes of data, even if it was better than having to fork out more cash to replace what I had.
Moving Emails from Outlook to Evolution
3rd November 2007

It seems a little strange to my eyes, but Evolution cannot import Outlook PST files. On one level, I see a certain amount of sense: after all, Outlook is a Windows application and Evolution remains resolutely on the Linux side of the divide. Nevertheless, it is still a pesky nuisance.
The cure is, very oddly, to import the data from Outlook into Mozilla Thunderbird and then pop the Thunderbird files into the Evolution mail folder. Evolution and Thunderbird share the same mbox file format, so all is hunky-dory: Evolution should just realise that they are there and bring them in.
That's what happened for me, and I have now migrated all of my old emails. Evolution's single file import wizard is there for those times when a spot of extra persuasion is needed; the data files are those without the file extensions. As it happened, I didn't need it.
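The hand-off itself is just a file copy: Thunderbird's mail lives in extension-less mbox files, while the accompanying .msf files are only index data and can be left behind. The sketch below uses stand-in /tmp directories in place of the real profile locations, which would be something like ~/.mozilla-thunderbird/<profile>/Mail/Local Folders on the Thunderbird side and ~/.evolution/mail/local on the Evolution side:

```shell
#!/bin/sh
set -e
# Stand-in directories so the copy can be demonstrated safely.
tbird=/tmp/tbird-demo
evo=/tmp/evo-demo
mkdir -p "$tbird" "$evo"
# A minimal mbox file plus the index file that should be left behind.
printf 'From - Thu Nov  1 00:00:00 2007\nSubject: test\n\nbody\n' > "$tbird/Inbox"
: > "$tbird/Inbox.msf"
# Copy only the files without extensions: the mbox data, not the indexes.
for f in "$tbird"/*; do
  case "${f##*/}" in
    *.*) ;;                # skip .msf and friends
    *)   cp "$f" "$evo/" ;;
  esac
done
```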
Turning the world on its head: running VMware on Ubuntu
2nd November 2007

When Windows XP was my base operating system, I used VMware Workstation to peer into the worlds of Windows 2000, Solaris and various flavours of Linux, including Ubuntu. Now that I am using Ubuntu instead of what became a very flaky XP instance, VMware is still with me, and I am using it to keep a foot in the Windows universe. In fact, I have Windows 2000 and Windows XP virtual machines available to me that should supply my Windows needs.
An evaluation version of Workstation 6 is what I am using to power them and I must admit that I am likely to purchase a licence before the evaluation period expires. Installation turned out to be a relatively simple affair, starting with my downloading a compressed tarball from the VMware website. The next steps were to decompress the tarball (Ubuntu has an excellent tool, replete with a GUI, for this) and run vmware-install.pl. I didn't change any of the defaults and everything was set up without a bother.
In use, a few things have come to light. The first is that virtual machines must be stored on drives formatted with Ext3 or some other native Linux file system rather than on NTFS. Do the latter, and you get memory errors when you try starting a virtual machine; I know that I did and that every attempt resulted in failure. After a spot of backing up files, I converted one of my SATA drives from NTFS to Ext3. For the sake of safety, I also mounted it as my home directory; the instructions on Ubuntu Unleashed turned out to be invaluable for this. I moved my Windows 2000 VM over, and it worked perfectly.
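Making the reformatted drive come up in the right place at every boot comes down to an /etc/fstab entry along these lines; the device name here is a hypothetical one, so check yours before committing anything:

```
# Hypothetical /etc/fstab line: mount the newly Ext3-formatted SATA drive
# as /home with default options and fsck pass 2.
/dev/sdb1  /home  ext3  defaults  0  2
```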
Next on the list was a series of peculiar errors that came to light when I was attempting to install Windows XP in a VM created for it. VMware was complaining about a CPU not being able to run fast enough; 2 MHz was being stated for an Athlon 64 3000+ chip running at 1.58 GHz! Clearly, something was getting confused. Also, my XP installation came to a halt with a BSOD stating that a driver had gone into a loop, with Framebuf fingered as the suspect. I was seeing two symptoms of the same problem, and its remedy was unclear. A message on a web forum put the idea of rebooting Ubuntu into my head, and that resolved the problem. For now, I'll be keeping an eye on it, though.
Otherwise, everything seems to be going well with this approach, and that's an encouraging sign. It looks as if my current Linux-based set up is one with which I am going to stay. This week has been an interesting one already, and I have no doubt that I'll continue to learn more as time goes on.
Setting up a test web server on Ubuntu
1st November 2007

Installing all the bits and pieces is painless enough so long as you know what's what; Synaptic does make it thus. Interestingly, Ubuntu's default installation is a lightweight affair, with the addition of any extra components involving downloading the packages from the web. The whole process is very well integrated and doesn't make you sweat every time you need to install additional software. In fact, it resolves any dependencies for you so that those packages can be put in place too; it lists them, you select them and Synaptic does the rest.
Returning to the job in hand, my shopping list included Apache, Perl, PHP and MySQL, the usual suspects in other words. Perl was already there, as it is on many UNIX systems, so installing the appropriate Apache module was all that was needed. PHP needed the base installation as well as the additional Apache module. MySQL needed the full treatment too, though its being split up into different pieces confounded things a little for my tired mind. Then, there were the MySQL modules for PHP to be set in place too.
The addition of Apache preceded all of these, but I have left it until now to describe its configuration, something that took longer than for the others; the installation itself was as easy as it was for the others. However, what surprised me were the differences in its configuration set-up when compared with Windows. There are times when we get the same software but on different operating systems, which means that configuration files get set up differently. The first difference is that the main configuration file is called apache2.conf on Ubuntu rather than httpd.conf as on Windows. Like its Windows counterpart, Ubuntu's Apache does use subsidiary configuration files. However, there is an additional layer of configurability added courtesy of a standard feature of UNIX operating systems: symbolic links. Rather than having a single folder with all the configuration files stored therein, there are two pairs of folders, one pair for module configuration and another for site settings: mods-available/mods-enabled and sites-available/sites-enabled, respectively. In each pair, there is a folder with all the files and another containing symbolic links. It is the presence of a symbolic link for a given configuration file in the latter that activates it. I learned all this when trying to get mod_rewrite going and changing the web server folder from the default to somewhere less susceptible to wrecking during a re-installation or, heaven forbid, a destructive system crash. It's unusual, but it does work, even if it takes that little bit longer to get things sorted out when you first meet up with it.
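The available/enabled mechanism can be demonstrated with a pair of stand-in folders; on a real Ubuntu box these live under /etc/apache2, and the helper commands a2enmod, a2dismod, a2ensite and a2dissite manage the links for you:

```shell
#!/bin/sh
set -e
# Stand-in for /etc/apache2 so the symlink mechanics can be shown safely.
conf=/tmp/apache-demo
mkdir -p "$conf/mods-available" "$conf/mods-enabled"
echo "RewriteEngine on" > "$conf/mods-available/rewrite.load"
# Creating the symbolic link is what activates the module; removing it
# deactivates the module without touching the file in mods-available.
ln -sf "$conf/mods-available/rewrite.load" "$conf/mods-enabled/rewrite.load"
```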
Apart from the Apache set up and finding the right things to install, getting a test web server up and running was a fairly uneventful process. All's working well now, and I'll be taking things forward from here; making website Perl scripts compatible with their new world will be one of the next things that need to be done.
Ubuntu: an appraisal of hardware support
31st October 2007

After a painless start with Ubuntu, I have been able to overcome the obstacles placed in my way thus far. In fact, the experience is certain to yield a goodly number of blog posts, never a bad thing from my point of view. And so to this instalment...
For this post, I'll stick with the hardware side of things. Compared with previous voyages into the Linux universe, I have not encountered any "brick walls" placed in my path. Though audio support was one bugbear in the past, Ubuntu simply took care of that with no intervention from me: I popped in a CD and music was played back to me, leaving me with the same confidence regarding MP3 files. In the same way, graphics were set up to my liking without my having to lift a finger; while there is a proprietary ATI driver available, I'll stick with the standard set-up since it works well enough for me. Printer set-up needed a prod from my end, but it got on with things and found my HP LaserJet 1018 with nary a bother, and all was set up rapidly. All other items of hardware but one scarcely merit a mention, so seamless was their detection and set-up.
The one piece of hardware that made me work was my Epson Perfection 4490 Photo scanner. Though it wasn't supported out of the box, a spot of googling was all that it took to find out how to set things to rights. In fact, the best answer turned out to be on Ubuntu's forum, hardly a surprise really. The step-by-step instructions sent me over to Epson's repository of open source Linux drivers for the correct files; I did need to make sure I wasn't selecting 4990 in place of 4490, an easy mistake to make. I snagged Debian RPMs and used alien to convert them to DEB files. Running dpkg as root did the installation, and quick checks with the sane-find-scanner and scanimage commands revealed that all was well, to my clear relief.
Hardware support has always been an Achilles heel for Linux but, based on this experience, the Linux community seem to be more on top of it than ever before. The proprietary nature of the devices is an ever-present challenge for driver developers, so getting as far as they have is an impressive achievement. It's a long way from the roadblocks caused by temperamental support for modems, sound cards, printers and scanners, and I seem to have got over the biggest hurdle on my Linux journey this time around.