Technology Tales

Adventures & experiences in contemporary technology

Sorting out MySQL on Arch Linux

5th November 2011

Seeing Arch Linux running so solidly in a VirtualBox virtual machine has me contemplating whether I should have it installed on a real PC. Saying that, recent announcements about how Linux Mint plans to handle GNOME 3 have caught my interest, even if the idea of using a rolling distribution as my main home operating system still has a lot of appeal. The upheaval that comes my way every six months when a new version of Linux Mint is released is the main cause of that appeal.

While remaining undecided, I continue to evaluate the idea of Arch Linux acting as my main OS for day-to-day home computing. Towards that end, I have set up a working web server instance on there using the usual combination of Apache, Perl, PHP and MySQL. Of these, it was MySQL that went the least smoothly because the daemon wouldn't start for me.

It was then that I turned to Google for inspiration, and a combination of actions got me the result that I wanted. One problem was a lack of disk space caused by months of software upgrades. Since package managers in other Linux distros allow you to clear obsolete installation files from disk, I decided to see if the same was possible with pacman, the Arch Linux command-line package manager. The following command, executed as root, cleared about 2 GB of cruft for me:

pacman -Sc

The S in the switch tells pacman to perform package database synchronization while the c instructs it to clear its cache of obsolete packages. In fact, using the following command as root every time an update is performed both updates software and removes redundant or outmoded packages:

pacman -Syuc

So that I don't forget the needful housekeeping, this is what I will use in future, with y being the switch for a package database refresh and u triggering a system upgrade. It's nice to have everything happen together without too much effort.

To do the debugging that led me to the above, along with other things, I issued the following command:

mysqld_safe --datadir=/var/lib/mysql/ &

This starts up the MySQL daemon in safe mode if all is working properly, which it wasn't in my case. Nevertheless, it creates a useful log file called myhost.err in /var/lib/mysql/, and this gave me the messages that allowed me to debug what was happening. It led me to installing net-tools and inetutils using pacman; it was the latter of these that put hostname on my system and got the MySQL server startup a little further along. Other actions included unlocking the ibdata1 data file and removing the ib_logfile0 and ib_logfile1 files so as to gain something of a clean sheet. The kill command was used to shut down any lingering mysqld sessions too. To ensure that the ibdata1 file was unlocked, I executed the following commands:

mv ibdata1 ibdata1.bad
cp -a ibdata1.bad ibdata1

These renamed the original and then created a new duplicate of it, with the -a switch on the cp command preserving file attributes and permissions in the copy. Along with the various file operations, I also created a link to my.cnf, the MySQL configuration file on Linux systems, in /etc using the following command executed by root:

ln -s /etc/mysql/my.cnf /etc/my.cnf

While I am unsure if the symbolic link made a real difference, what certainly helped was uncommenting the lines in my.cnf that pertained to InnoDB tables. What directed me to these were complaints from mysqld_safe in the myhost.err log file. All I did was uncomment the lines beginning with "innodb"; these were lines 116-118, 121-122 and 124-127 in my configuration file, but it may be different in yours.
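For illustration, the uncommented block ended up looking roughly like the following; these are the stock values from the sample configuration file as best I recall them, so treat the exact settings as an assumption and check your own my.cnf:

innodb_data_home_dir = /var/lib/mysql
innodb_data_file_path = ibdata1:10M:autoextend
innodb_log_group_home_dir = /var/lib/mysql
innodb_buffer_pool_size = 16M
innodb_additional_mem_pool_size = 2M
innodb_log_file_size = 5M
innodb_log_buffer_size = 8M
innodb_flush_log_at_trx_commit = 1
innodb_lock_wait_timeout = 50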

After all the above, the MySQL daemon ran happily and, more importantly, started when I rebooted the virtual machine. Thinking about it now, I believe that it was a lack of disk space, the locking of a data file and the lack of InnoDB support that were stopping the MySQL service from running. Running commands like mysqld start wasn't yielding useful messages, so a lot of digging was needed to get the result that I needed. In fact, that's one of the reasons why I am sharing my experiences here.

In the end, creating databases and loading them with data was all that was needed for me to start seeing functioning websites on my (virtual) Arch Linux system. It turned out to be another step on the way to making it workable as a potential replacement for the Linux distributions that I use most often (Linux Mint, Fedora and Ubuntu).
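For completeness, the database creation and loading took a form like the following, with the database and file names here being placeholders rather than the ones I actually used:

mysql -u root -p -e "CREATE DATABASE mysite;"
mysql -u root -p mysite < mysite_backup.sql

With that done, the websites had what they needed to come to life.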

A waiting game

20th August 2011

Having been away every weekend in July, I was looking forward to a quiet one at home to start August. However, there was a problem with one of my websites hosted by Fasthosts that was set to occupy me for the weekend and a few weekday evenings afterwards.

The issue appeared to be slow site response, so I followed the advice given to me by second-line support when this website displayed the same type of behaviour: upgrade from Apache 1.3 to 2.2 using the control panel. Unfortunately for me, that didn't work smoothly at all, and there seemed to be serious file loss as a result. Raising a ticket with the support desk only got me the answer that I had to wait for completion, and I have now come to the conclusion that the migration process may have got stuck somewhere along the way. Maybe another ticket is in order.

There were a number of causes of the waiting that gave rise to the title of this post. Firstly, support for low-cost hosting isn't exactly timely, and I do wonder if it's any better for more prominent websites. Restoration of websites by FTP is another activity that takes up plenty of time, as does rebuilding databases and populating them with data. Lastly, there's changing the DNS details for a website. In hindsight, there may be ways of reducing the time demands of each of these. For instance, contacting a support team by telephone may be quicker, unless there is a massive queue awaiting attention; there was a wait of several hours one night when a security changeover affected a multitude of Fasthosts users. Of course, telephone support is not a panacea at the best of times, as we have known since all those stories began to do the rounds in the middle of the 1990s. Doing regular backups would help with the second, though the ones that I was using for the restoration weren't too bad at all; nevertheless, they weren't complete, so there was unfinished business that required resolution later. The last of these is helped along by more regular PC restarts, which clear the local DNS cache, so that unexpected discovery will remain a lesson for the future, though I don't plan on moving websites around for a while. After all, getting DNS details propagated more quickly really is a big help.

While awaiting a response from Fasthosts, I began to ponder the idea of using an alternative provider. Perusal of the latest digital edition of .Net (I now subscribe to the non-paper edition so as to cut down on the clutter caused by having paper copies about the place) ensued before I decided to investigate the option of using Webfusion. Having decided to stick with shared hosting, I gave their Unlimited Linux option a go. For someone accustomed to monthly billing, it was unusual to see annual, two-year and three-year payment schemes too. The first of these appears to be the default option, so a little care and attention is needed if you want something else. In order to encourage you to stay with Webfusion longer, the monthly cost is on a sliding scale: the longer the period you buy, the lower the cost of a month's hosting.

Once the account was set up, I added a database and set to the long process of uploading files from my local development site using FileZilla. Having got a MySQL backup from the Fasthosts site, I used the provided phpMyAdmin interface to upload the data in pieces not exceeding the 8 MB file size limit. It isn't possible to connect remotely to the MySQL server using the likes of MySQL Administrator, so I had to bear with this not so smooth process. SSH is another connection option that isn't available, but I never used it much on Fasthosts sites anyway. There were some questions to the support people along the way; the first of these got a timely answer, though later ones took longer. Still, getting advice on the address of the test website was a big help while I was sorting out the DNS changeover.
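One way of keeping each piece of a backup under a limit like that, assuming there is still command-line access to the original server or a local copy of the data, is to dump tables individually with mysqldump; the database and table names here are placeholders for illustration:

mysqldump -u myuser -p mysite articles > articles.sql
mysqldump -u myuser -p mysite comments > comments.sql

Each resulting file then can be imported through phpMyAdmin in turn, so long as it stays within the upload limit.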

Speaking of the latter, it took a little doing and no little poking around Webfusion's FAQs before I made it happen. First, I tried using name servers that I found listed in one of the articles, but this didn't seem to achieve the end that I needed. Mind you, I would have seen the effects of this change a little earlier if I had rebooted my PC sooner than I did, but that didn't occur to me at the time. In the end, I switched to using my domain provider's name servers and added the required records to them to get things going. It was then that my website was back online in some fashion, so I could tie up any outstanding loose ends.

With the site essentially operating again, it was time to iron out the rough edges. The biggest of these was that mod_rewrite doesn't seem to work the same on the Webfusion server as it does on the Fasthosts ones. This meant that I needed to use the SCRIPT_URI CGI variable instead of PATH_INFO in order to keep using clean URLs for a PHP-powered photo gallery that I have. It took me a while to figure that out, and I felt much better when I managed to get the results that I needed. I also took the chance to tidy up site addresses with redirections in my .htaccess file in an attempt to ensure that I lost no regular readers, something that I seem to have achieved with some success because one such visitor later commented on a new entry in the outdoors blog.
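By way of illustration, the sort of redirection that I mean can be done with a Redirect directive (from Apache's mod_alias) in .htaccess; the addresses below are invented for the example rather than being my real ones:

Redirect 301 /gallery/old-page.php http://www.example.com/gallery/new-page.php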

Once any remaining missing images were reinstated or references to them removed, it was time to do a full backup for the sake of safety. The first of these activities was yet another consumer of time, while the second didn't take so long; I need to do the latter more often in case anything happens. Hopefully though, the relocated site's performance continues to be as solid as it is now.

The question as to what to do with the Fasthosts webspace remains outstanding. Currently, they are offering free upgrades to existing hosting packages so long as you commit for a year. After my recent experience, I cannot say that I'm so sure about doing that kind of thing; the offer even leaves me wondering if taking up that very upgrade was what broke my site in the first place, with the migration from Apache 1.3 to 2.2 appearing to have got stuck for whatever reason. Maybe another ticket should be raised, but I am not decided on that yet. All in all, what happened to that Fasthosts website wasn't the greatest of experiences, but the service offered by Webfusion has been rock solid thus far. While wondering if the service from Fasthosts isn't as good as it once was, I'll keep an open mind and wait to see if my impressions change over time.

Tinkering with Textpattern

26th April 2011

Textpattern 5 may be on the way, but that isn't to say that work on the 4.x branch has completely stopped, though it is less of a priority at the moment. After all, version 4.40 was slipped out not so long ago as a security release, a discovery that I made while giving a section of my outdoors website a spring refresh. During that activity, the TinyMCE plugin started to grate by issuing error messages in the form of dialogue boxes needing user input to dismiss them every time an article was opened or saved. Because of that nuisance, the guilty hak_tinymce plugin was ejected, with joh_admin_ckeditor replacing it and bringing CKEditor into use for editing my Textpattern articles. It is working well, though the narrow editing area causes the editor toolbars to take up too much vertical space; you can resize the editor to solve this, but it would be better if it could be made to remember those size settings.

Another find was atb_editarea, a plugin that colour-codes (X)HTML, PHP and CSS by augmenting the standard text editing for pages and stylesheets in the Presentation part of the administration interface. If I had had this at the start of my redesign, it would have made doing the needful that bit more user-friendly than the basic editing facilities that Textpattern offers by default. Of course, the tinkering never stops, so there's no such thing as finding something too late in the day for it to be useful.

Textpattern may not be getting the attention that some of its competitors are, but it isn't being neglected either; its users and developer community see to that. Saying that, it needs to get better at announcing new versions of the CMS so they don't slip by the likes of me who isn't looking all the time. With a major change of version number involved, curiosity is aroused as to what is coming next. So far, Textpattern appears to be taking an evolutionary course, and there's a lot to be said for such an approach.

Moving from Ubuntu 10.10 to Linux Mint 10

23rd April 2011

With a long Easter weekend available to me and with thoughts of forthcoming changes in the world of Ubuntu, I got to wondering about the merits of moving my main home PC to Linux Mint instead. Though there is a rolling variant based on Debian, I went for the more usual one based on Ubuntu that uses GNOME. For the record, Linux Mint isn't just about the GNOME desktop; you can have it with the Xfce, LXDE and KDE desktops as well. While I have been known to use Lubuntu and like its LXDE implementation, I stuck with the option of which I have most experience.

Once I selected the right disk for the boot loader, the main installation of Mint went smoothly. By default, Ubuntu seems to take care of this, but Mint leaves it to you. When you have your operating system files on sdc, installing the boot loader on the default of sda isn't going to produce a booting system. Instead, I ended up with GRUB errors and, while I suppose that I could have resolved these, the lazier option of repeating the install with the right boot loader location was the one that I chose. It produced the result that I wanted: a working and loading operating system.

However, there was something not right about the way that windows were displayed on the desktop, with title bars and window management not working as they should. Creating a new account showed that the settings carried over from Ubuntu in my home area were the cause. Again, I opted for the less strenuous option and moved things from the old account to the new one. One outcome of that decision was a lot of use of the chown command in order to get file and folder permissions set for the new account. To make this all happen, the new account needed to be made an administrator just like its predecessor; by default, more restrictive desktop accounts are created using the Users and Groups application from the Administration submenu. Once I was happy that the migration was complete, I backed up any remaining files from the old user folder and removed it from the system. Some of the old configuration files were to find a new life with Linux Mint.
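For anyone curious, the permission changes took roughly this form when run as root; the account name and folders are invented for the example:

chown -R john:john /home/john/Documents    # hand ownership of the copied files to the new account
chown -R john:john /home/john/.mozilla     # the same for hidden configuration folders brought across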

In the middle of the above, I also got to customising my desktop to get a feel that I find amenable. For example, I do like a panel at the top and another at the bottom; by default, Linux Mint only comes with the latter. The main menu was moved to the top because I have become used to having it there, and switchers for windows and desktops were added at the bottom. Those were only a few items from what turned out to be a not so short list of things that I fancied having: clock, bin, desktop clearance button, application launchers, broken application killer, user switcher, off button for the PC, run command box and notification area. It all was gentle tinkering, but it still is the sort of thing that you wouldn't want to have to do over and over again; let's hope that is the case for Linux Mint upgrades in the future. That the configuration files for all of these are stored in the home area hopefully should make life easier, especially when an in-situ upgrade like Ubuntu's isn't recommended by the Mint team.

With the desktop arranged to my liking, the longer job of adding to the collection of software on there, while pruning a few unwanted items too, was next. Having had Apache, PHP and MySQL on the system before I popped in that Linux Format magazine cover disk for the installation, I wanted to restore them. To get the offline websites back, I had made copies of the old Apache settings and simply copied them over the defaults in /etc/apache (in fact, I overwrote the apache directory in /etc, but the effect was the same). MySQL Administrator had been used to take a backup of the old databases too. In the interests of spring cleaning, I only migrated a few of the old databases from the old system to the new one. In fact, there was an element of such tidying in my mind when I decided to change Linux distribution in the first place; Ubuntu hadn't been installed afresh onto the system for a while anyway, and some undesirable messages were appearing at update time, though they were far from being critical errors.
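As a rough sketch of that restoration, and with the file and directory names below being illustrative rather than my actual ones, the steps looked something like this when executed as root:

cp -r /backup/apache/* /etc/apache/           # put the saved Apache settings back over the defaults
mysql -u root -p < /backup/databases.sql      # reload the dump taken with MySQL Administrator; if it lacks CREATE DATABASE statements, create the databases first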

The web server reinstatement was only part of the software configuration that I was doing, and there was a lot of use of apt-get while this was in progress. A rather diverse selection was added: Emacs, NEdit, ClamAV, Shotwell (just make sure that your permissions are sorted before getting this to use older settings, because anything inaccessible just gets cleared out; F-Spot was never there in the first place in my case, but it may differ for you), UFRaw, Chrome, Evolution (I never have been a user of Mozilla Thunderbird, the default email client on Mint), Dropbox, FileZilla, MySQL Administrator, MySQL Query Browser, NetBeans, POEdit, Banshee (Rhythmbox is what comes with Mint, but I replaced it with this), VirtualBox and GParted. This is quite a list and, while I maybe should have engaged the services of dpkg to help automate things, I didn't on this occasion, though Mint seems to have a front end for it that does the same sort of thing. Given that the community favours clean installations, it's little wonder that something like this is on offer in the suite of tools in the standard installation. This is the type of rigmarole that one would not want to bring upon oneself too often.
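As a sketch of the sort of thing I mean, and of the dpkg-based automation that I passed over on this occasion, commands along these lines would do it when executed as root; the package names are examples rather than my complete list:

# install a selection of the additions in one go
apt-get install emacs nedit clamav shotwell ufraw filezilla virtualbox-ose gparted

# capture the package selections on the old system and replay them on the new one
dpkg --get-selections > package-selections.txt
dpkg --set-selections < package-selections.txt
apt-get dselect-upgrade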

With desktop tinkering and software installations complete, it was time to do a little more configuration. In order to get my HP laser printer going, I ran hp-setup to download the (proprietary, RMS will not be happy…) driver for it because it otherwise wouldn't work for me. Fortune was removed from the terminal sessions because I like them to be without such things; to accomplish this, I edited /etc/bash.bashrc and commented out the /usr/games/fortune line before using apt-get to clear the software from my system. Being able to migrate my old Firefox and Evolution profiles, albeit manually, has been another boon. Without doubt, there are more adjustments that I could be making, but I am happy to do these as and when I get to them. So far, I have a more than usable system, even if I engaged in more customisation than many users would.
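For the record, the fortune removal mentioned above amounted to something like this when done as root; the package name is my assumption about what provides /usr/games/fortune on Mint, so check before removing anything:

# comment out the /usr/games/fortune line in /etc/bash.bashrc first, then:
apt-get remove fortune-mod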

It probably is useful to finish this by sharing my impressions of Linux Mint. What goes without saying is that some things are done differently, and that is to be expected. Distribution upgrades are just one example, but there are tools available to make clean installations that little bit easier. To my eyes, the desktop looks very clean, and the font display is carried over from Ubuntu, not at all a bad thing. That may sound a small matter, but it does appear to me that Fedora and openSUSE could learn a thing or two about how to display fonts on screen. It is the sort of thing that adds the spot of polish that leaves a much better impression. So far, it hasn't been any hardship to find my way around, and I can make the system fit my wants and needs. That it looks set to stay that way is another bonus. We have a lot of change coming in the Linux world with GNOME 3 on the way and Ubuntu's decision to use Unity as its main desktop environment. While watching both of these developments mature, it looks as if I'll be happily using Mint. Change can refresh, but a bit of stability is good too.

An avalanche of innovation?

23rd September 2010

It seems that, almost in spite of the uncertain times or maybe because of them, it feels like an era of change on the technology front. Computing is the domain of many of the postings on this blog and a hell of a lot seems to be going mobile at the moment. For a good while, I managed to stay clear of the attractions of smartphones until a change of job convinced me that having a BlackBerry was a good idea. Though the small size of the thing really places limitations on the sort of web surfing experience that you can have with it, you can keep an eye on the weather, news, traffic, bus and train times so long as the website in question is built for mobile browsing. Otherwise, it’s more of a nuisance than a patchy phone network (in the U.K., T-Mobile could do better on this score as I have discovered for myself; thankfully, a merger with the Orange network is coming next month).

Speaking of mobile websites, it almost feels as if a free for all has recurred for web designers. Just when the desktop or laptop computing situation had more or less stabilised, along come a whole pile of mobile phone platforms to make things interesting again. Familiar names like Opera, Safari, Firefox and even Internet Explorer are to be found popping up on handheld devices these days along with less familiar ones like Web ‘n’ Walk or BOLT. The operating system choices vary too with iOS, Android, Symbian, Windows and others all competing for attention. It is the sort of flowering of innovation that makes one wonder if a time will come when things begin to consolidate but it doesn’t look like that at the moment.

The transformation of mobile phones into handheld computers isn't the only big change in computing, with the traditional formats of desktop and laptop PCs being flexed in all sorts of ways. First, there's the appearance of netbooks, and I have succumbed to the idea of owning an Asus Eee. Though you realise that these are not full-size laptops, it didn't hit me how small they are until I owned one. They are undeniably portable, and tablets look even more interesting in the aftermath of Apple's iPad. You may call them over-sized mobile phones, but the idea of making a touchscreen do the work has made the concept fly for many. Even so, I cannot say that I'm overly tempted, though I have said that before about other things.

Another area of interest for me is photography, and it is around this time of year that all sorts of innovations are revealed to the public. It's a long way from what we thought was the digital photography revolution, when digital imaging sensors started to take the place of camera film in otherwise conventional compact and SLR cameras, making the former far more versatile than they used to be. Now, we have SLD cameras from Olympus, Panasonic, Samsung and Sony that eschew the reflex mirror and prism arrangement of an SLR, using digital sensors and electronic viewfinders while offering the possibility of lens interchangeability and better quality than might be expected from such small cameras. In recent months, Sony has offered SLR-style cameras with translucent mirror technology instead of the conventional mirror that is flipped out of the way when a photographic image is captured. Change doesn't end there, with movie-making capabilities being part of the toolset of many a newly launched compact, SLD and SLR camera. The pixel race also seems to have eased, though increases still happen, as with the Pentax K-5 and Canon EOS 60D (both otherwise conventional offerings that have caught my eye, though so much comes on the market at this time of year that waiting is better for the bank balance).

The mention of digital photography brings to mind the subject of digital image processing, and Adobe Photoshop Elements 9 has just been announced after Photoshop CS5 appeared earlier this year. It almost feels as if a new version of Photoshop or its consumer cousin is released every year, causing me to skip releases when I don't see the point. Elements 6 and 8 were such versions for me, and I'll be in no hurry to upgrade to 9 yet either, though the prospect of using content-aware filling to eradicate unwanted objects from images is tempting. Nevertheless, that shouldn't stop anyone trying to exclude them in the first place. In fact, I may need to reduce the overall number of images that I collect in favour of bringing away only good ones. The outstanding question is whether I can slow down and calm my eagerness to bring at least one good image away from an outing by capturing anything that seems promising at the time. Some experimentation is fine, but being a little more choosy can save work later on.

While back on the subject of software, I'll voyage into the world of the web before bringing these meanderings to a close. It almost feels as if there is web-based application following web-based application these days, when Twitter and Facebook have nearly become household names and cloud computing is a phrase that turns up all over the place. In fact, the former seems to have encouraged a whole swathe of applications all of itself. Applications written using technologies well used on the web must stuff many a mobile phone app store too, and that brings me full circle, for it is these that put so much functionality on our handsets, with Java seemingly powering those I use on my BlackBerry. Then there's the spat between Apple and Adobe regarding the former's stance on Flash.

To close this mental amble, there may be technologies that didn't come to mind while I was pondering this piece, but they doubtless enliven the technological landscape too. However, what I have described is enough to take me back more than ten years, to when desktop computing and the world of the web were a lot more nascent than is the case today. The changes that were ongoing then felt a little exciting now that I look back on them, and it does feel as if the same sort of thing is recurring, though with things like phones creating the interest in place of new developments in desktop computing such as a new version of Windows (though 7 was anticipated after Vista). Web designers may complain about a lack of standardisation, and they're not wrong, but this may be an era of technological change that in time may be remembered with its own fondness too.

On web browsers for BlackBerry devices

8th August 2010

The browser with which my BlackBerry Curve 8520 came is called Web'n'Walk and, while it does have its limitations, it works well enough for much of what I want to do. Many of the sites that I want to visit while away from a PC have mobile versions that are sufficiently functional for my needs. Names like GMail, Google Reader, Met Office and National Rail come to mind here, and the first two are regularly visited while on the move; they work well to provide what I need too. Nevertheless, one of the things that I have found with mobile web browsing is that I am less inclined to follow every link that might arouse my interest. Sluggish response times might have something to do with it, but navigating the web on a small screen is more work too. Therefore, I have been taking a more functional approach to web usage on the move rather than the more expansive one that tends to happen on a desktop PC.

For those times when the default browser was not up to the task, I installed Opera Mini. It certainly has come in very useful for keeping an eye on the Cheshire East bus tracker and for looking at websites without mobile versions when I decide to view such things. Downloading any of these does take time, and there's the reality of navigating a big page on a small screen. However, I have discovered that the browser has an annoying tendency to crash, and it did so once while I was awaiting a bus. The usual solution, rightly or wrongly, has been to delete the thing and reinstall it again, with the time and device restarts that entails. While I got away with it once, it seems to mean losing whatever bookmarks or favourites you have set up too, a real nuisance. Because of this, I am not going to depend on it as much any more. Am I alone in experiencing this type of behaviour?

Because of Opera's instability, I decided to seek alternative approaches. One of these was to set up bookmarks for the aforementioned bus tracker in Web'n'Walk. What is delivered is the WAP version of the site, and it's not that user-friendly at all. When it comes to selecting a bus stop to monitor, it asks for a stance number; if it weren't for my own nous, I wouldn't have been able to find the IDs that I needed. That's not brilliant, but I worked around it to make things work for me. The observation is one for those who design mobile versions of websites for public use.

Another development is the discovery of the Bolt Browser and, so far, it seems a worthy alternative to Opera Mini too. There are times when it lives up to the promise of faster web page loading, but that is dependent on the strength of the transmission signal. A trial with the Met Office website showed it to be capable, though there were occasions when site navigation wasn't as smooth as it could have been. Up to now, there have been no crashes like those that happened with Opera Mini, so it looks promising. If there is any criticism, it is that it took me a while to work out how to save favourites (or bookmarks); while the others that I have used have a button on the screen for doing so, Bolt needs you to use the application menu. Other than that, the software seems worthy of further exploration.

All in all, surfing the mobile web remains an area of continued exploration for me. Having found my feet with it, I remain on the lookout for other web browsers for the BlackBerry platform. It is true that OS 6 features a WebKit-powered browser, but I'm not buying another device to find out how good that is; what I am after are alternatives that work on the device that I have. A port of Firefox's mobile edition would be worthwhile, but its availability seems to be limited to Nokia's handsets for now. Only time will reveal where things are going.

Worth the attention?

21st July 2010

The latest edition of Web Designer has features and tutorials on modern trends and new ways to use fonts and typography in websites. One thing at the heart of the attention is the @font-face CSS rule. It's what allows you to break away from the limitations of whatever fonts your visitors might have on their PCs and use something hosted remotely.

In principle, that sounds a great idea, but there are caveats. The first of these is browser support for @font-face in the first place, though the modern browsers that I have tried seem to do reasonably well on this score; these include the latest versions of Firefox, Internet Explorer, Opera and Chrome. The new fonts may render fine, but there can be a short delay in the full loading of a web page because the fonts have to be downloaded first. With Firefox, the rendering seems to treat the process like an interleaved image, so you may see fonts from your own PC before the remote ones come into place, not an ideal situation in my opinion. I have also found that this is more noticeable on the Linux variant of the browser than on its Windows counterpart. Loading a page that is predominantly text is another scenario where you'll see the behaviour more clearly; having a sizeable image file loading seems to make things less noticeable. Opera is a particular offender when it comes to the delay, with IE8 loading things quite quickly and Chrome not being too bad either.

In the main, I have been using Google's Fonts Directory but, in the interests of supposedly getting a better response, I tried using font files stored on a test web server, only to discover that there was more of a lag with the fonts on my own web server. While I do not know what Google has done with their set-up, using their font delivery service appears to deliver better performance in my testing, so it'll be my choice for now. There's Typekit too, but I'll be hanging on to my money in the light of my recent experiences.
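For anyone who hasn't tried either route, the two approaches take forms like the following; the font choice, family name and file path are simplified examples rather than anything from my own pages, and older versions of Internet Explorer want an EOT file rather than a TTF one, so treat this as a sketch. Using Google's Fonts Directory is a matter of linking to their stylesheet and then referring to the font as usual:

<link rel="stylesheet" type="text/css" href="http://fonts.googleapis.com/css?family=Droid+Sans" />

body { font-family: "Droid Sans", sans-serif; }

Hosting the font file yourself means declaring it with @font-face before using it:

@font-face {
	font-family: "MyRemoteFont";
	src: url("/fonts/myremotefont.ttf");
}
h1 { font-family: "MyRemoteFont", sans-serif; }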

After my brush with remote font loading, I am inclined to wonder if the current hype about fonts applied using the @font-face directive is deserved until browsers get better and faster at loading them. As things stand, they may be better than before but the jury’s still out for me with Firefox’s rendering being a particular irritant. Of course, things can get better…

Exploring the mobile web

16th July 2010

With a change of job ahead of me, I decided to make my web usage a little more mobile. The result was the purchase of a BlackBerry 8520 Curve on a T-Mobile pay-as-you-go tariff to complement my existing phone. Part of the attraction was having email on the move and a little web access too. On both counts, it hasn't disappointed, though GPRS isn't the speediest for web browsing and you come to appreciate mobile versions of websites. It's just as well that this website that you're reading has a mobile version.

Hooking the BlackBerry up to GMail was no problem once I had paid my dues and the necessary set-up was done for me; it was only then that the required option was available through the set-up screens. RIM's own web browser may be no slouch when it comes to rendering websites, but I put Opera Mini in place as well for those times when the default option could be bettered, and they exist too. Speaking of RIM applications, there's one for Twitter too, though I added Übertwitter for the sake of greater flexibility (it can handle more than one account at a time, for example). In addition, I have instated applications for WordPress and LinkedIn, and it was then that I stopped myself spending too much time in BlackBerry App World. If I were of the Facebook persuasion, I might be interested in the default offering for that as well, but I have learnt to contain myself.

Of course, there are limitations to the device's capabilities with regard to email and web on the move. Long emails still need desktop access (messages can get truncated) and mobile-unfriendly websites take an age to load and explore; a small screen means much more finger work. After all, this is a small device, so the observations aren't really surprising; it's just that I encounter the reality of life on a small screen now. Nevertheless, useful sites like those from Google and the Met Office have mobile variants, though I'd like to see the latter including its rain radar as part of the package.

Speaking of life on a smaller scale, there's the size of the keyboard to consider too. So far, I haven't had much practice with it, but I am unsure as to how some people craft longer blog entries with the tiny keys. Then, there's the ever-present threat of arm discomfort and RSI that you have to watch. For that reason, I'll stick with use for an hour at a time rather than going mad altogether. Navigating around the screen using the tiny trackpad is something to which I am adjusting, and it works well enough so long as you're not looking through long web pages or emails.

To bring this piece to a close, the new gadget has been finding uses, and I don't plan on leaving it idle after paying over £150 for it. Apart from acting as an expensive calculator, it has already travelled abroad with me, with roaming not being a problem; I may have failed to get it to work with hotel broadband, but there was EDGE availability to keep things connected. All in all, the device is earning its keep and teaching me a few things about mobile handheld computing, with my main website in the process of being made more mobile-compatible: the front page and the photo gallery are gaining versions for handheld devices after the same was done for the outdoors blog earlier this year (I might make the design look more like the rest of the site though). Without something on which to do some real testing, that idea may not have become the reality that it is. The device may be no desktop substitute, but that's never to say that these devices will never get near that situation; after all, there was a time when no one could imagine the same for laptop PCs, and we all know what has happened with them.

Easier to print?

20th February 2010

One matter that really came to light was how well, or otherwise, the pages on here and on my hill walking and photography website came out on the printed page. After spotting a WordPress Codex article on the subject and with an eye on making things better, I have made a distinction between screen and print stylesheets. The code in the XHTML looks like this:

<link rel="stylesheet" href="/style.css" type="text/css" media="screen" />
<link rel="stylesheet" href="/style_print.css" type="text/css" media="print" />

The media attribute seems to be respected by the browsers that I have been using for testing (latest versions of Firefox, MSIE and Opera), so it then was a matter of using CSS to control what was shown and how it was displayed. Extraneous items like sidebars were excluded from the printed page in favour of the real content that visitors would be wanting anyway, and everything else was made as monochrome as possible, with images being the only things to escape. After all, people don't want to be wasting paper and ink in these cash-strapped times, and there's no need for any more colour than necessary either. Then, there's the distraction caused by non-functioning hyperlinks, which has inspired the sharing of some wisdom on A List Apart. Returning to my implementation, please let me know in the comments what you think of what I have done on here and whether there remains any room for improvement.
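For anyone wanting to try the same, rules along these lines in the print stylesheet do the sort of thing described above; the element IDs are invented for the example rather than being the ones used on this site:

#sidebar, #navigation { display: none; }      /* drop the extraneous furniture from the printed page */
body { color: #000; background: #fff; }       /* keep things as monochrome as possible */
a { color: #000; text-decoration: none; }     /* stop hyperlinks standing out where they cannot be clicked */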

A self-hosted online photo album option

16th July 2009

I was perusing a recent copy of Linux Format and encountered a feature describing a self-hosted alternative to the likes of Flickr: Gallery. From my quick look, it looks fully featured, offering themes and even shopping cart facilities for those who want to sell their wares. The screenshots on the open-source project’s website look promising but, for a fuller appraisal, I would need to spend some time trying to bend it to my will. Before anyone mentions it, I am aware that WordPress can be used for photoblogging, but this tool seems to take things a bit further. It’s the sort of thing about which I might have wondered, given the pervasiveness of content management systems these days. My own custom-built photo gallery is devoid of a slick back end, hence why Gallery caught my eye, but I’ll continue with it and may even get to adding the needful myself.
