Technology Tales

Adventures & experiences in contemporary technology

Wiping of hard drives with Linux

2nd December 2013

More than a decade of computer upgrades and rebuilds can leave obsolete kit on your hands, and the arrival during that time of legislation controlling the dumping of electronic goods can leave you wondering how anyone disposes of it at all. Thankfully, I discovered that the local council refuse site only a few miles away from me accepts such things for recycling, and it saw me a good few times over the last summer with obsolete and non-working gadgets that had stayed with me far too long. Some were as bulky as a computer monitor and a printer but others were relatively diminutive.

Disposing of non-working and utterly obsolete equipment is an easy choice, but I find it harder when a device still works as intended and might even have a use yet. When you realise that computer motherboards still come with PS/2, floppy and IDE ports, things get trickier. My Gigabyte Z87-HD3 mainboard has just one PS/2 port where its predecessors would have had two, the same applies to IDE sockets, and there is still a floppy drive socket on there too, a surprising sight for anyone used to thinking that such things are utterly outmoded these days. So, PC technology isn’t relinquishing backwards compatibility just yet, even though that mainboard is part of a system with an Intel Core i5-4670K CPU and 24 GB of RAM.

Even with that IDE port present, I was not tempted to reuse the leftover 10 GB and 20 GB hard drives that I have had for just over a decade. Ten years ago, that sort of capacity would have been respectable were it not for our voracious appetite for data storage thanks to photography, video and music. Apart from the size constraints, the speed of those drives cannot compare with what we have today either, something I saw quickly when I replaced a Samsung 160 GB hard drive of a similar age with a Samsung SSD.

The result of this line of thought was that I was minded to wipe the drives before recycling them, and Linux has a good tool for the job in the form of the dd command. It can overwrite the data on the disks so as to render the information virtually irretrievable. Also, Linux has a number of dummy devices that can supply junk data for overwriting purposes. They are in the same family as /dev/null, which is used to discard unwanted command output. The first is /dev/zero, which supplies a stream of zero bytes and is the one that I have used. However, there also are /dev/random and /dev/urandom for those wanting a more random element to the overwriting.

To overwrite data on a disk with zeroes while having feedback on progress, the following command achieves the required result:

sudo dd if=/dev/zero | pv | sudo dd of=/dev/sdd bs=16M

The whole operation needs to be executed with root privileges. The if parameter of dd specifies the input data, which is piped to the pv command to show a progress bar that dd would not produce by itself, before being passed on to another dd command with the disk to be overwritten specified using the of parameter. The bs parameter in that second dd command specifies the block size for the disk writing job. Unfortunately, pv is not installed by default so you need to add it yourself. On a Debian, Ubuntu or Linux Mint system, the command is the following:

sudo apt-get install pv
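
Should a more random overwrite be preferred, the same pipeline works with one of the random devices as the source; a minimal sketch, with /dev/sdd again standing in for the disk being wiped (expect this to run more slowly than the zero-based version):

sudo dd if=/dev/urandom | pv | sudo dd of=/dev/sdd bs=16M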

That pv sandwich also is invaluable for those times when dd is needed to copy partitions between different physical or virtual (in a virtual machine) disks. Without it, you might wonder what exactly is happening in the silence, which is especially concerning when you are retrying an operation that failed previously and each attempt takes a while to complete.
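
For the record, such a copy takes the same form as the wiping command; a minimal sketch, in which the source and destination partitions are hypothetical stand-ins that would need changing to match a real system (and the destination needs to be at least as large as the source):

sudo dd if=/dev/sda1 bs=16M | pv | sudo dd of=/dev/sdb1 bs=16M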

A reappraisal of Windows 8 and 8.1 licensing

15th November 2013

With the release of Windows 8 around this time last year, I thought that the full retail version that some of us used for fresh installations on PCs, real or virtual, had become a thing of the past. In fact, it did seem that every self-respecting technology news website and magazine was saying just that. The release that you would buy from Microsoft or from mainstream computer stores was labelled as an upgrade. That made it look as if you needed the OEM or System Builder edition for those PCs that needed a new Windows installation, with the licence that you bought then being attached to the machine on which it first got installed.

As is usual with Microsoft, the situation is less clear cut than that. For instance, there was some back-pedalling to allow OEM editions of Windows to be licensed for personal use on real or virtual PCs. With Windows 7 and its predecessors, it even was possible to install afresh on a PC without Windows by first putting an inactivated copy on there and then upgrading that as if it were a previous version of Windows. Of course, an actual licence for the previous version of Windows was needed for full compliance, if not for the actual installation. At times, Microsoft muddies the waters so as to keep its support costs down.

Even with Microsoft’s track record in mind, it still did surprise me when I noticed that Amazon was selling what appeared to be full versions of both Windows 8.1 and Windows 8.1 Pro. Having set up a 64-bit VirtualBox virtual machine for Windows 8.1, I got to discovering the same for software purchased from the Microsoft website. However, unlike the DVD versions, you do need an active Windows installation if you fancy a same-day installation of the downloaded software. For those without Windows on a machine, this can be as simple as downloading either the 32-bit or the 64-bit 90-day evaluation edition of Windows 8.1 Enterprise and using that as a springboard for the next steps. The result need not be an actual in-situ installation either, because there are options to create an ISO or USB image of the installation disk for later use.
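
As an aside, a virtual machine like that can be built from the command line as well as from the VirtualBox GUI; here is a minimal sketch using VBoxManage, in which the machine name, memory and disk sizes and the ISO filename all are hypothetical choices for illustration rather than what I actually used:

VBoxManage createvm --name "win81" --ostype Windows81_64 --register
VBoxManage modifyvm "win81" --memory 4096 --cpus 2
VBoxManage createhd --filename win81.vdi --size 40000
VBoxManage storagectl "win81" --name "SATA" --add sata
VBoxManage storageattach "win81" --storagectl "SATA" --port 0 --device 0 --type hdd --medium win81.vdi
VBoxManage storageattach "win81" --storagectl "SATA" --port 1 --device 0 --type dvddrive --medium Win81Enterprise.iso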

In my case, I created a 64-bit ISO image and used that to reboot the virtual machine that had Windows 8.1 Enterprise on there before continuing with the installation. By all appearances, there seemed to be little need for a pre-existing Windows instance for it to work so it looks as if upgrades have fallen by the wayside and only full editions of Windows 8.1 are available now. The OEM version saves money so long as you are happy to stick with just one machine and most users probably will do that. As for the portability of the full retail version, that is not something that I have tested and I am unsure that I will go beyond what I have done already.

My main machine has seen a change of motherboard, CPU and memory, so that could have deactivated a pre-existing Windows licence. However, I run Linux as my main operating system and, apart possibly from one surmountable hiccup, this proves surprisingly resilient in the face of such major system changes. For running Windows, I turn to virtual machines and there were no messages about licence activation during the changeover either. Microsoft is anything but confiding when it comes to declaring what hardware changes deactivate a licence. Moving a virtual machine from VirtualBox to VMware or vice versa definitely does it, so I tend to avoid doing that. One item that is fundamental to either a virtual or a real PC is the mainboard, and I have seen suggestions that this is the critical component for Windows licence activation; it would make sense if that were the case.

However, this rule is not hard and fast either, since there appears to be room for manoeuvre should your PC break. It might be worth calling Microsoft after a motherboard replacement to see if they can help you; from what I have seen, they will. All in all, Microsoft often makes what appear to be simple rules only to override them when faced with what happens in the real world. Is that why they can be unclear about some matters at times? Do they still hanker after how they want things to be, even when it is impossible to keep them like that?

Surveying changes coming in GNOME 3.10

20th October 2013

GNOME 3.10 came out last month, but it took until its inclusion in the Arch and Antergos repositories for me to see it in the flesh. Apart from the risk of instability, this is the sort of thing at which rolling distributions excel: they can give you a chance to see the latest software before it is included anywhere else. For the GNOME desktop environment, the alternative might have meant awaiting the next release of Fedora in order to glimpse what is coming. Ubuntu GNOME seems to be sticking with a release behind the latest version and, with many GNOME Shell extension writers not updating their extensions until Fedora has caught up with the latest release of GNOME, this is no bad thing; it means that a version of the desktop environment has been well bedded in by the time it reaches the world of Ubuntu too. Debian takes this even further by using a stable version from a few years ago, and there is an argument in favour of that from a solidity perspective.

Being in the habit of kitting out GNOME Shell with extensions, I have a special interest in seeing which ones still work, or could work with a little tweaking, and which have fallen from favour. In the top panel, the major change has been to replace the sound and user menus with a single aggregate menu. The user menu in particular has been in receipt of the attentions of extension writers, so their efforts either need rework or dropping after the latest development. The GNOME project seems to have picked up an annoying habit from WordPress in that the GNOME Shell API keeps changing and breaking extensions (plugins in the case of WordPress). There is one habit from WordPress that needs copying though, and that is documentation, especially of that API, for it is hardly anywhere to be found.

GNOME Shell theme developers don’t escape either: a large border appeared around the panel when I used Elementary Luna 3.4, so I turned to XGnome Enhanced (found via GNOME-Look.org) instead. The former no longer is being maintained because the developer no longer uses GNOME Shell and has not got the same itch to scratch; maybe someone else could take it over, because it worked well enough up until 3.8. So far, the new theme works for me, so that will be an option should there be a move to GNOME 3.10 on one of my PCs at some point in the future.

Returning to the subject of extensions, I had a go at seeing how the included Applications Menu extension works now, since it wasn’t the most stable of items before. That has improved and it looks very usable too, so I am not awaiting the updating of the Frippery equivalent. That the GNOME Shell backstage view has not moved on much from how it was in 3.8 could be seen as a disappointment, but the workaround will do just fine. Aside from the Frippery Applications Menu, there are other extensions that I use heavily that have yet to be updated for GNOME Shell 3.10. After a spot of success ahead of a possible upgrade to Ubuntu GNOME 13.10 and GNOME Shell 3.8 (though I remain with version 13.04 for now), I decided to see if I could port a number of these to the latest version of the user interface; the kind of edit usually involved is sketched after the list below. Feel free to make use of these updated items if you need them before they are updated on the GNOME Shell Extensions website:

Frippery Bottom Panel

Frippery Move Clock

Remove App Menu

Show Desktop
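
For the simpler cases, the porting can need no more than declaring support for the new release in an extension’s metadata.json file; what follows is a minimal sketch with hypothetical values rather than the contents of any of the above (extensions that the changed Shell API bites need code changes too):

{
  "uuid": "example-extension@example.com",
  "name": "Example Extension",
  "description": "A placeholder used for illustration only",
  "shell-version": ["3.8", "3.10"]
}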

There have been more changes coming in GNOME 3.10 than those in GNOME Shell, which essentially is a JavaScript construction. The consolidation of application title bars in GNOME applications continues, but a big exit button has appeared in the affected applications that wasn’t there before. Also, there remains the possibility of applying the previously shared modifications to Nautilus (also known as Files), and a number of these usefully extend themselves to other applications such as Gedit too. Speaking of Gedit, this gains a very useful x of y numbering for the string searching functionality, with x being the number of the occurrence of a certain piece of text in a file and y being its total number of occurrences. GNOME Tweak Tool has got an overhaul too and has lost the setting that makes a folder path entry box appear in Nautilus instead of the breadcrumb path; opening Dconf-Editor, going to org > gnome > nautilus > preferences and ticking the box for always-use-location-entry will do the needful.
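
For anyone preferring the terminal to Dconf-Editor, the same switch can be flicked with a one-line command; a minimal sketch using gsettings, assuming the schema and key are as given above:

gsettings set org.gnome.nautilus.preferences always-use-location-entry true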

Essentially, the GNOME project is continuing along the path on which it set out a few years ago. Though I would rather that GNOME Shell were more mature, invasive changes are coming still and it leaves me wondering if or when this might stop. Maybe that is the consequence of mounting a controversial experiment when users were happy with what was there in GNOME 2. The arrival of Fedora 20 should bring with it an increase in the number of GNOME Shell extensions that have been updated. So long as it remains stable, Antergos is a good way to have a look at the latest version of GNOME for now, and Cinnamon fans may be pleased that Cinnamon 2.0 is another desktop option for the Arch-based distribution. An opportunity to say more about that may arrive yet, once the Antergos installer stops failing at a troublesome package download; a separate VM is being set aside for a look at Cinnamon, because it destabilised GNOME during a previous look.

A Look at a Compact System Camera

4th September 2013

During August, I acquired an Olympus Pen E-PL5; it is an item to which I still am becoming accustomed and it looks as if that is set to continue. The main reason that it appealed to me was the idea of having a camera with much of the functionality of an SLR but with many of the dimensions of a compact camera. In that way, it was a step up from my Canon PowerShot G11 without my having to carry around something too bulky.

Olympus Pen E-PL5

Before I settled on the E-PL5, I had been looking at Canon’s EOS M and got to hear about its sluggish autofocus. That it had no mode dial on its top plate was another consideration, though it does pack in an APS-C sized sensor (with Canon’s tendency to overexpose finding a little favour with me too on inspection of images from a well-aged Canon EOS 10D) at a not so unappealing price of around £399. A sighting of a group test of it and similar cameras in Practical Photography was enough to land that particular issue into my possession, and they liked the similarly priced Olympus Pen E-PM2 more than the Canon. Though it was a Panasonic that won top honours in that test, I was intrigued enough by the Olympus option to have a further look. Unlike the E-PM2 and the EOS M, the E-PL5 does have a mode dial on its top plate, along with an extra grip, so that got my vote even if it meant paying a little extra. There was a time before now when Olympus Pen models attracted my attention due to sale prices, but this investment goes beyond that opportunism.

The E-PL5 comes in three colours: black, silver and white. Though I have a tendency to go for black when buying cameras, it was the silver option that took my fancy this time around for the sake of a spot of variety. The body itself is a very compact affair, so it is the lens that takes up most of the bulk. The standard 14-42 mm zoom ensures that this is not a camera for a shirt pocket, and I got a black Lowepro Apex 100 AW case for it; the case fits snugly around the camera, so much so that I was left wondering if I should have gone for a bigger one, but it has been working out fine anyway. The other accessory that I added was a 37 mm Hoya HMC UV filter so that the lens doesn’t get too knocked about while I have the camera with me on an outing of one sort or another, especially since its plastic construction protrudes a lot further than I was expecting and doesn’t retract fully into its housing like some Sigma lenses that I use.

When I first gave the camera a test run, I had to work out how best to hold it. After all, the powered zoom and autofocus on my Canon PowerShot G11 made that camera more intuitive to hold and it has been similar for any SLR that I have used. Having to work a zoom lens while holding a dinky body was fiddly at first until I worked out how to use my right thumb to keep the body steady (the thumb grip on the back of the camera is curved to hold a thumb in a vertical position) while the left hand adjusted the lens freely. Having an electronic viewfinder instead of using the screen would have made life a little easier but they are not cheap and I already had spent enough money.

The next task after working out how to hold the camera was to acclimatise myself to its exposure characteristics. In my experience so far, it appears to err on the side of overexposure. Because I had set it to store images as raw (ORF) files, this could be sorted later, but I prefer to have a greater sense of control at the photo capture stage. Until now, I have not found a spot or partial metering button like the one I would have on an SLR or on my G11. That has meant either using exposure compensation to go along with my preferred choice of aperture priority mode or going with fully manual exposure. Other modes are available and they should be familiar to any SLR user (shutter priority, program, automatic, etc.). Currently, I am using bracketing while finding my feet, after setting the ISO to 400, increasing the brightness of the screen and adding histograms to the playback views. With my hold on the camera growing more secure, I have been using the dial to change exposure settings such as aperture (f/16 remains a favourite of mine in spite of what others may think, given the size of a Micro Four Thirds sensor) and compensation while keeping the scene exactly the same, so as to test what the response to any changes might be.

While I still am finding my feet, I am seeing some pleasing results so far that encourage me to keep going; some remind me of my Pentax K10D. The E-PL5 certainly is slower to use than the G11, but that often can be a good thing when it comes to photography. That it forces a little relaxation in this often hectic world is another advantage. The G11 is having a quieter time at the moment, and any episodes of sunshine offer useful opportunities for further experimentation and acclimatisation too. So far, my entry into the world of compact system cameras has revealed them to be of a very different form to compact fixed-lens cameras or SLRs. Neither truly gets replaced; another type of camera has emerged.

A display of brand loyalty

12th July 2013

Since 2007, my main camera has been a Pentax K10D DSLR and it has gone on many journeys with me. In fact, more than 15,000 images have been captured with it and I have classed it as an unfailing servant. The autofocus may not be the fastest, but my subjects tend to be stationary: landscapes, architecture, flora and transport. Even any bus and train photos have included parked vehicles rather than moving ones, so there never have been issues. The hint of underexposure in any photos always can be sorted because DNG files are what I create, with all the raw capture information that it is possible to retain. In fact, it has been hard to justify buying another SLR because the K10D has done so well for me.

In recent months, I have been looking at processed photos and asking myself if time has moved along for what is not far from being a six-year-old camera. At various times, I have been looking at higher members of the Pentax range while wondering if an upgrade would be a good idea. First, there was the K7 and then the K5, before the K5 II got launched. Even though its predecessor is still to be found on sale, it was the newer model that became my choice.

Pentax K5-II

My move to Pentax in 2007 was a case of brand disloyalty, since I had been a Canon user from when I acquired my first SLR, an EOS 300. Even now, I still have a PowerShot G11 that finds itself slipped into a pocket many a time. Nevertheless, I find that Canon images feel a little washed out prior to post-processing, and that hasn’t been the case with the K10D. In fact, I have been hearing good things about Nikon cameras delivering punchy results, so one of them would be a contender were it not for how well the Pentax has performed.

So, what has my new K5 II body gained me that I didn’t have before? For one thing, the autofocus is a major improvement on that in the K10D. It may not stop me persevering with manual focusing for most of the time, but there are occasions when the option of solid autofocus is good to have. Other advances include a 16.3 megapixel sensor with a much larger ISO range. The advances in sensor technology since the K10D appeared may give me better quality photos, and noise is something that my eyes may have begun to detect in K10D photos, even at my usual ISO of 400.

There have been innovations that I don’t need too. Live View is something that I use heavily with the PowerShot G11 because it has such a pitiful optical viewfinder. The K5 II has a very bright and sharp one, so that function lies dormant, especially after I witnessed dodgy autofocus performance with it in use; manual focusing should be OK, I reckon. By default too, the screen stays on all the time, which is a nuisance for an optical viewfinder user like me, so I looked through the manual and the menus to switch the thing off. My brief flirtation with the image level display met an end for much the same reason, though it is good that it is there. There is some horizon auto-correction available as a feature, and this is left on to see what it offers, since there have been a multitude of times when I needed to sort out crooked horizons caused by my handholding the camera.

The K5 II may have a 3″ screen on its back, but that has done nothing to increase the size of the camera. If anything, it is smaller than the K10D, which usefully means that I am not on the lookout for a new camera holster. Not having a bigger body also means there is little change in how the camera feels in the hand compared with the older one.

In many ways, the K5 II works very like the K10D once I took control of settings that didn’t suit me. Both have Shake Reduction in their camera bodies, though the setting has been moved into the settings menu in the new camera where the older one had a separate switch on its body. Since I’d be inclined to leave it on all the time and prefer not to have it knocked off accidentally, this is not an issue. Otherwise, many of the various switches are in the same places, so it is not that hard to find my way around them.

That’s not to say that there aren’t other changes, like the addition of a lock to the mode dial, but I have used Canon EOS camera bodies with that feature so I do not consider it a step backwards. The exposure compensation button has been moved to the top of the camera, where I found it very easily, and I have been using it perhaps more than on the K10D; it is also something that I use on the G11, so that experimentation is being brought across to the K5 II now as well. Beside it, there is a new ISO button, so further experimentation can be attempted with that to see how it does.

If I have any criticism, it is about the clutter of the menus on the K5 II. The long lists through which you scrolled on the K10D have been replaced with a series of extra tabs, so that on-screen scrolling is not needed as before. However, I reckon that this breaks things up too much and makes working through the settings look more forbidding to anyone who is not so technical in mindset. Nevertheless, settings such as the type of file to capture are there, and I continue to use raw DNG files as is usual for me, though JPEG and Pentax’s own raw format also are available. For a while, I forgot to set the date, but I soon found out what I had done and the situation was remedied. The same sort of thing applied to storing files in different folders according to the capture date; for my own reasons, I turned this off to put everything into a single PENTX directory to suit my own workflow. My latest discovery among the menus was the ability to add photographer and copyright holder information to the EXIF metadata attached to the image files created by the camera. With legislative proposals that dilute the automatic rights of copyright holders going through the U.K. parliament, this seems a very timely inclusion, even if most would prefer that there were no change to copyright law.

Of course, the worth of any camera is in the images that it produces, and I have been happy with what I have been getting so far. The bigger files mean that fewer images fit on a memory card than before. Thankfully, SDHC card capacities have grown, even if I don’t wish to machine-gun my photography altogether. While out and about, I was surprised to see apertures like f/14 and f/18 when I was more accustomed to a progression like f/11, f/13, f/16, f/19, f/22, etc. Most of those older values still are there, so there has not been a complete break with convention. The same comment applies to shutter speeds, where ones like 1/100 and 1/160 made their appearance where I might have expected just ones like 1/90, 1/125, 1/250 and so on. The extra possibilities, and that is what they are, do allow more flexibility, I suppose, and may even make it easier to make correct exposures, though any judgement of correctness has to be in the eye of a photographer and not what a computer algorithm in a camera determines. For much of the time until now, I have stuck with an ISO of 400, apart from a little testing in a woodland area of an evening soon after the camera arrived.
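
As best I can tell, the newcomers fall out of a finer one-third-stop progression: each third of a stop multiplies the f-number by the sixth root of two (about 1.12), so working up from f/11 and rounding to the customary labels gives both the familiar and the unfamiliar values:

11 × 2^(1/6) ≈ 12.3, displayed as f/13
11 × 2^(2/6) ≈ 13.9, displayed as f/14
11 × 2^(3/6) ≈ 15.6, displayed as f/16
11 × 2^(4/6) ≈ 17.5, displayed as f/18
11 × 2^(5/6) ≈ 19.6, displayed as f/20
11 × 2^(6/6) = 22, displayed as f/22

The same one-third-stop reasoning would account for shutter speeds like 1/100 and 1/160 turning up alongside 1/125.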

Since the K5 II came my way a few months ago, I have been meaning to collect my thoughts on here, and there has been a delay while I brought my thinking to a sensible close. At one point, it felt like there was so much to say that the piece became larger in my mind than even what you have been reading now. After all, there are other things that I can adjust to see how the resulting images look, and white balance is but one of these. The K10D isn’t beyond experimentation either, especially since I discovered that its shake reduction was switched off, which has me asking if the lack of quality that I mentioned earlier has another explanation. Of course, actually making use of my tripod would be another good idea, so it is safe to say that yet more photographic explorations await.

Battery life

2nd October 2011

In recent times, I have lugged my Toshiba Equium with me while working away from home; I needed a full-screen laptop of my own for attending to various things after work hours, so it has to come with me. It is not the most portable of things given its weight and its lack of battery life. Now that I think of it, I reckon that it is more of a desktop PC replacement machine than a mobile workhorse. After all, it only lasts an hour on its own battery away from a power socket. Virgin Trains’ tightness with power sockets on their Pendolinos is another matter…

Leaving my BlackBerry aside, battery life seems to be something with which I haven’t had much luck, because my Asus Eee PC isn’t too brilliant either. Without decent power management, two hours seems to be as good as I get from its battery. However, three to four hours become possible with better power management software on board. That makes the netbook even more usable, though there are others out there offering longer battery life. Still, I am not tempted by these because the gadget works well enough for me that I don’t need to wonder about how much money I am spending on building a mobile computing collection.

While I am not keen on spending too much cash or having a collection of computers, the battery life situation with my Toshiba is more than giving me pause for thought. The figures quoted for MacBooks had me looking at them, though they aren’t at all cheap. Curiosity about the world of the Mac may make them attractive to me, but the prices forestalled that and the concept was left on the shelf.

Recently, PC Pro ran a remarkably well-timed review of laptops offering long battery life (in issue 205). The minimum lifetime in this collection was over five hours, so the list of reviewed devices is an interesting one for me. In fact, it even may become a shortlist should I decide to spend money on buying a more portable laptop than the Toshiba that I already have. The seventeen-hour battery life for a Sony VAIO SB series sounds intriguing, even if you need to buy an accessory to gain it. That it does over seven hours without the extra battery slice makes it more than attractive anyway. The review was food for thought and should come in handy if I decide that money needs spending.

An avalanche of innovation?

23rd September 2010

It seems that, almost in spite of the uncertain times or maybe because of them, it feels like an era of change on the technology front. Computing is the domain of many of the postings on this blog and a hell of a lot seems to be going mobile at the moment. For a good while, I managed to steer clear of the attractions of smartphones until a change of job convinced me that having a BlackBerry was a good idea. Though the small size of the thing really places limitations on the sort of web surfing experience that you can have with it, you can keep an eye on the weather, news, traffic, bus and train times, so long as the website in question is built for mobile browsing. Otherwise, it is more of a nuisance than a patchy phone network (in the U.K., T-Mobile could do better on this score, as I have discovered for myself; thankfully, a merger with the Orange network is coming next month).

Speaking of mobile websites, it almost feels as if a free-for-all has recurred for web designers. Just when the desktop and laptop computing situation had more or less stabilised, along comes a whole pile of mobile phone platforms to make things interesting again. Familiar names like Opera, Safari, Firefox and even Internet Explorer are to be found popping up on handheld devices these days, along with less familiar ones like Web ’n’ Walk or BOLT. The operating system choices vary too, with iOS, Android, Symbian, Windows and others all competing for attention. It is the sort of flowering of innovation that makes one wonder if a time will come when things begin to consolidate, but it doesn’t look like that at the moment.

The transformation of mobile phones into handheld computers isn’t the only big change in computing, with the traditional formats of desktop and laptop PCs being flexed in all sorts of ways. First, there is the appearance of netbooks and I have succumbed to the idea of owning an Asus Eee. Though you realise that these are not full-size laptops, it still didn’t hit me how small they were until I owned one. They are undeniably portable, and tablets look even more interesting in the aftermath of Apple’s iPad. You may call them over-sized mobile phones, but the idea of making a touchscreen do the work has made the concept fly for many. Even so, I cannot say that I’m overly tempted, though I have said that before about other things.

Another area of interest for me is photography, and it is around this time of year that all sorts of innovations are revealed to the public. It is a long way from what we thought was the digital photography revolution, when digital imaging sensors started to take the place of camera film in otherwise conventional compact and SLR cameras, making the former far more versatile than they used to be. Now, we have SLD cameras from Olympus, Panasonic, Samsung and Sony that eschew the reflex mirror and prism arrangement of an SLR in favour of a digital sensor and electronic viewfinder while offering the possibility of lens interchangeability and better quality than might be expected from such small cameras. In recent months, Sony has offered SLR-style cameras with translucent mirror technology instead of the conventional mirror that is flipped out of the way when a photographic image is captured. Change doesn’t end there, with movie-making capabilities being part of the toolset of many a newly launched compact, SLD and SLR camera. The pixel race also seems to have ended, though increases still happen, as with the Pentax K-5 and Canon EOS 60D (both otherwise conventional offerings that have caught my eye, though so much comes on the market at this time of year that waiting is better for the bank balance).

The mention of digital photography brings to mind the subject of digital image processing, and Adobe Photoshop Elements 9 has just been announced after Photoshop CS5 appeared earlier this year. It almost feels as if a new version of Photoshop or its consumer cousin is released every year, causing me to skip releases when I don’t see the point. Elements 6 and 8 were such versions for me and I’ll be in no hurry to upgrade to 9 yet either, though the prospect of using content-aware filling to eradicate unwanted objects from images is tempting. Nevertheless, that shouldn’t stop anyone trying to exclude them in the first place. In fact, I may need to reduce the overall number of images that I collect in favour of bringing away only good ones. The outstanding question on this is whether I can slow down and calm my eagerness to bring at least one good image away from an outing by capturing anything that seems promising at the time. Some experimentation and being a little more choosy can save work later on.

While back on the subject of software, I’ll voyage into the world of the web before bringing these meanderings to a close. It almost feels as if there is web-based application following web-based application these days, when Twitter and Facebook nearly have become household names and cloud computing is a phrase that turns up all over the place. In fact, the former seems to have encouraged a whole swathe of applications all of itself. Applications written using technologies well used on the web must stuff many a mobile phone app store too, and that brings me full circle, for it is these that put so much functionality on our handsets, with Java seemingly powering those I use on my BlackBerry. Then there is the spat between Apple and Adobe regarding the former’s support for Flash.

To close this mental amble, there may be technologies that didn’t come to mind while I was pondering this piece, but they doubtless enliven the technological landscape too. However, what I have described is enough to take me back more than ten years, to when desktop computing and the world of the web were a lot more nascent than is the case today. The changes that were ongoing then felt a little exciting now that I look back on them, and it does feel as if the same sort of thing is recurring, though with things like phones creating the interest in place of new developments in desktop computing such as a new version of Windows (though 7 was anticipated after Vista). Web designers may complain about a lack of standardisation, and they are not wrong, but this may be an era of technological change that in time may be remembered with its own fondness too.

A wider view

12th July 2010

After playing with the idea for a while, I finally have succumbed to the charms of buying a new and bigger screen. While I questioned the wisdom of replacing a 17″ screen that worked without fail, what is sitting in front of me as I write these words is a 24″ Iiyama ProLite B2409HDS and very nice it is too. This is my third Iiyama and I stayed local when it came to acquiring the thing. Mind you, bringing back a 7.7 kg box by public transport takes its toll when trying to carry it using the handle on its top.

Once the thing was home, its installation was a straightforward matter of attaching the base, releasing the pin from the back to raise the screen higher and attaching it to a PC. The screen can be raised to a good height that stops slouching and should promote decent posture. Though there is a DVI socket on the back of the monitor, I am using the D-SUB connection because that is what is on the back of my main home PC; adding a graphics card would allow the use of the DVI option, but that is something that will have to wait for now. What will continue to await use are the included speakers, because I never used those on the old panel either, mostly because I have a set of standalone speakers for that job.

Out of curiosity, I attached the new screen to a running PC. However, I soon found that any adjustments to the resolution produced disturbing flickering on the screen, though this was banished by a system reboot. Then, I upped the resolution to its maximum of 1920×1080 and the result is more than workable, with no discomfort. So far, I have put the extra display real estate to use for perusing digital maps and processing digital photos. It doesn’t do so much for the web, but there is a limit to the length a line of text should be anyway. Considering those width restrictions, it might be time to move away from my habit of maximising application windows to fill the screen, so as to have more open on the same desktop at once. That is another option for exploring later, but it is good to have such options too.

Now, I have to think up a use for the old Iiyama ProLite E431S that has served so well over the last few years. Various thoughts, like spreading a display over more than one screen or using it when I have two PCs going at once, have come into my head, but I am not rushing anything. One thing that I don’t plan on doing is retiring the thing just yet. Things have moved on from CRT monitors that start to ail after a few years of use, with their LCD successors showing more resilience and cutting down the cost of computing in the process. Seeing piles of CRTs awaiting dumping is a distressing sight that both can and should be consigned to history in these more environmentally aware days. Thoughts like that have the effect of curtailing any spending on gadgets for me, and I have no intention of building up a collection of LCD panels, so what I have will need to do me for a good few years. On the evidence of the screens that I have been using, there is good reason to expect plenty of longevity and good service to follow.

A bigger screen?

23rd February 2010

A recent bit of thinking has caused me to cast my mind back over all the screens that have sat in front of me while working with computers over the years. Well, things have come a long way from the spare television that I used with a Commodore 64 during the time that I occasionally got to explore the thing. Needless to say, a variety of dedicated CRT screens ensued as I started to make use of Apple and IBM-compatible PCs provided in computing labs and other such places, before I bought an example of the latter as my first ever PC of my own. That sported a 15″ display, which stood out a little in times when 14″ ones were mainstream, but a 17″ Iiyama followed it when its operational quality deteriorated. That Iiyama came south with me from Edinburgh as I moved to where the work was and offered sterling service before it too started to succumb to ageing.

During the time that the Iiyama CRT screen was my mainstay at home, there were changes afoot in the world of computer displays. A weighty 21″ Philips screen was what greeted me on my first day at work, but 21″ Eizo LCD displays were set to replace those behemoths and remain in use, as if to prove the longevity of LCD panels and the validity of using what had been sufficient for laptops for a decade or so. In fact, the same comment regarding reliability applies to the screen that I now use at home, a 17″ Iiyama LCD panel (yes, I stuck with the same brand when I changed technologies longer ago than I like to remember).

However, that hasn’t stopped me wondering about my display needs, and it is screen size that is making me think rather than the reliability of the current panel. That is a reflection on how my home computing needs have changed over time, and they show how my non-computing interests have evolved too. Photography is but one of these, and the move to digital capture has brought with it a great deal of image processing, so much that I wonder if I need to make fewer photos rather than bringing home so many that it can be hard to pick out the ones deserving of a wider viewing. That is but one area where a bigger screen would help, but there is another, and it arises from my interest in exploring the countryside on foot or on my bike: digital mapping. When planning outings, it would be nice to have a wider field of view so as to see more at a larger scale.

None of the above is a showstopper, as would be the case if the screen itself were unreliable, so I am going to take my time on this one. The prospect of sharing a desktop across two screens is another idea, but that needs some thought about where it all would fit; the room that I have set aside for working at my computer isn’t the largest, but it will need to do. After the space side of things, there is the matter of setting up the hardware. Quite how a dual display is going to work with a KVM setup is something to explore, as is the adding of extra video cards to existing machines. After the hardware fiddling, the software side of things is not a concern, because of when I used a laptop as my main machine for a while last year. That confirmed that Windows (Vista, though it has been possible since 2000 anyway…) and Ubuntu (other modern Linux distributions should work too…) can cope with desktop sharing out of the box.
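
For what it is worth, the Linux side of a dual display can be arranged from a terminal too; this is a minimal sketch using xrandr, where the output names are hypothetical stand-ins (the real ones for a given system appear in the output of a bare xrandr command):

xrandr --output VGA1 --auto --output DVI1 --auto --right-of VGA1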

Apart from the nice thoughts of having more desktop space, the other tempting side to all of this is what you can get for not much outlay. It isn’t impossible to get a 22″ display for less than £200, and the prices for 24″ ones are tempting too. That is a far cry from paying nearly £300 (if my memory serves me correctly) for that 17″ Iiyama, and I would hope that the quality is as good as ever.

It’s all very well talking about pricing, but you need to sit down and choose a make and model when it comes to deciding on a purchase. There is plenty of choice, so that would take a while, but magazine reviews will come in handy here. Saying that, last year’s computing misadventures have me questioning the sense of going for whatever a magazine places on its A-list. They also have me minded to go to a nearby computer shop to make a purchase rather than choosing a supplier on the web; it is easier to take back a faulty unit if you don’t have far to go. Speaking of faulty units, last year has left me contemplating waiting until the year is older before making any acquisitions of computer kit. All of that has put the idea of buying a new screen on the low-priority list: nice to have but not essential. For now, that is where it stays, but you never know what the attractions of a shiny new thing can do…

Best left until later in the year?

26th January 2010

In the middle of last year, my home computing experience was one of feeling displaced. A combination of a stupid accident and a power outage had rendered my main PC unusable. What followed was an enforced upgrade that used a combination that was familiar to me: Gigabyte motherboard, AMD CPU and Crucial memory. However, assembling that lot and attaching components from the old system resulted in the sound of whirring fans but nothing appearing on-screen. Not having useful beeps to guide me meant that it was a case of undertaking educated guesswork until the motherboard was found to be at fault. In a situation like this, a deeper knowledge of electronics would have been handy and might have saved me money too. As for the motherboard, it is hard to say whether it was faulty from the outset or whether there was a mishap along the way, either due to ineptitude with static or incompatibility with a power supply. What really tells the tale on the mainboard is the fact that all of the other components are working well in other circumstances, even that old power supply.

A few years back, I had another experience with a problematic motherboard, an Asus this time, that ate CPUs and damaged a hard drive before I stabilised things. That was another upgrade attempted in the first half of the year. My first round of PC building was in the third quarter of 1998, and that went smoothly once I realised that a new case was needed. Similarly, another PC rebuild around the same time of year in 2005 was equally painless. Based on these experiences, I should not be blamed for waiting until later in the year before doing another rebuild, preferably a planned one rather than an emergency.

Of course, there may be another factor involved too. The hint was a non-working Sony DVD writer that was acquired early last year, when it really was obvious that we were in the middle of a downturn. Could older unsold inventory be a contributor? Well, it fits in with seeing poor results twice. In addition, it would certainly tally with a problematical PC rebuild in 2002, following the end of the Dot Com bubble and after the deadly Al Qaeda attack on New York’s World Trade Centre. An IBM hard drive that I acquired then may not have been the best example of the bunch, and the same comment could apply to the Asus motherboard. The resulting construction may have been limping, but it was working and I tolerated it.

In contrast, last year’s episode had me launched into using a Toshiba laptop and a spare older PC for my needs, with an external hard drive enclosure used to extract my data onto other external hard drives to keep me going. It felt a precarious arrangement, but it was a useful experience in ways too. There was cause for making acquaintance with nearby PC component stores that I hadn’t visited before, and I got to learn about things that otherwise wouldn’t have come my way. Using an external hard drive enclosure to access data on hard drives from a non-functioning PC is one of these. Discovering that it is possible to boot from external optical and hard disk drives came as a surprise too; that will work so long as there is motherboard support for it. Another experience came from a crisis of confidence that had me acquiring a bare-bones system from Novatech and populating it with optical and hard disk drives. Then, I discovered that I have no need for power supplies rated at more than 300 watts (around 200 W suffices). Turning my PC off more often became a habit friendly both to the planet and to household running costs too. Then, there is the beneficial practice of shopping locally, which can suffice even if what you buy isn’t on the hot lists that PC magazines compile; shopping online for those pieces doesn’t guarantee success either. All of these were useful lessons and, while I’d rather not throw away good money after bad, it goes to show that even unsuccessful acquisitions had something to offer in the form of learning opportunities. Whether you consider that worthwhile is up to you.
