Technology Tales

Adventures & experiences in contemporary technology

A useful little device

1st October 2011

Last weekend, I ran into quite a lot of bother with my wired broadband service. Eventually, after a few phone calls to my provider, the fault was traced to my local telephone exchange and it took another few days before it finally got sorted. Before that, a new ADSL filter (from a nearby branch of Maplin, as it happened) was needed because the old one didn’t work with my phone. Without that, it wouldn’t have been possible, given the way that I had things set up, to establish whether the broadband was clashing with the phone. Resetting the router was next and then there was a password change before the exchange was blamed. After all that, connectivity is back again and I even upgraded in the middle of it all. Downloads are faster and television viewing is a lot smoother too. Having seen fairly decent customer service throughout all this, I am planning to stick with my provider for a while longer.

Of course, this outage could have left me disconnected from the Internet but for the rise of mobile broadband. Working off dongles is all very fine until coverage lets you down, and that seems to be my experience with Vodafone at the moment. Another fly in the ointment was my having a locked-down work laptop that didn’t entertain the software installation needed for running these things, a not unexpected state of affairs, though it is possible to connect over wired and wireless networks using VPN. With my needing to work from home on Monday, I really had to get that computer online. Saturday evening saw me getting my Toshiba laptop online using mobile broadband and then setting up an ad hoc network using Windows 7 to hook up the work laptop. To my relief, that did the trick, but the next day saw me come across another option in Argos (the range of computing kit in there still continues to surprise me) that made life even easier.
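
For anyone curious, Windows 7 can also arrange this sort of connection sharing from the command line through its hosted network feature, a close cousin of the ad hoc wizard that I actually used. What follows is only a sketch: the SSID and key are placeholders of my own, the commands need an administrator prompt, and Internet Connection Sharing still has to be switched on afterwards for the mobile broadband adapter (Properties > Sharing):

rem Define a software access point with a name and WPA2 key of your choosing:
netsh wlan set hostednetwork mode=allow ssid=HomeShare key=ChangeMe123

rem Bring the network up so that other machines can join it:
netsh wlan start hostednetwork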

While seeing if it was possible to connect a wired or wireless router to mobile broadband, I came across devices that both connect via the 3G network and act as wireless routers too. Vodafone has an interesting option into which you can plug a standard mobile broadband dongle for the required functionality. For a while now, 3 has had its MiFi, with the ability to connect to the mobile network and relay Wi-Fi signals too. Though it pioneered this as far as I know, others are following its lead, with T-Mobile offering something similar: its Wireless Pointer. Unsurprisingly, Vodafone has its own too, though I didn’t find any mention of mobile Wi-Fi on the O2 website.

That trip into Argos resulted in a return home to find out more about the latter device before making a purchase. Having had a broadly positive experience of T-Mobile’s network coverage, I was willing to go with it as long as it didn’t need a dongle; the T-Mobile one that I have seems not to be working properly, so I needed to make sure that wasn’t going to be a problem before I spent any money. When I brought home the Wireless Pointer, I swapped over the SIM card from the dongle to get going without too much fuss. Thankfully, the Wi-Fi is secured using WPA2 and the documentation tells you where to get the entry key. Having things secured like this means that someone cannot fritter away your monthly allowance, something that is as important for PAYG customers (like me) as for those on a contract. Of course, it also makes eavesdropping more difficult.

So far, I have stuck with using it while plugged into an electrical socket (USB computer connections are possible as well) but I need to check on the battery life too. Up to five devices can be connected by Wi-Fi and I can vouch that working with two connected devices is more than a possibility. My main PC has acquired a Belkin Wi-Fi dongle in order to use the Wireless Pointer and that has worked very well. In fact, I found that connectivity was independent of what operating system I used: Linux Mint, Ubuntu, Windows XP and Windows 7 all connected without any bother. The gadget fits in the palm of my hand, so it can hardly be called large, but it does what it sets out to do and I have been glad to have it so far.

All Change?

19th September 2011

Could 2011 be remembered as the year when the desktop computing interface got a major overhaul? One part of this, Windows 8, won’t be with us until next year, but enough has happened so far this year to generate a lot of comment. With many if not all of the changes, it is possible to detect the influence of interfaces used on smartphones. After all, the carryover from Windows Phone 7 to the new Metro interface is unmistakeable.

Two developments in the Linux world have spawned a hell of a lot of comment: Canonical’s decision to develop Unity for Ubuntu and the arrival of GNOME 3. While there have been many complaints about the changes made in both, there must be a fair few folk who are just getting on with using them without complaint. Maybe there are many who even quietly like the new interfaces. While I am not so sure about Unity, I surprised myself by taking to GNOME Shell so much that I installed it on Linux Mint. It remains a work in progress, as does Unity, but it’ll be very interesting to see it mature. Perhaps a good number of the growing collection of GNOME Shell plugins could make it into the main codebase. If that were to happen, I could see it being welcomed by a good few folk.

There was little doubt that the changes in GNOME 3 looked daunting, so Ubuntu’s taking a different approach is understandable, until you come to realise how much change that involves anyway. With GNOME 3 working so well for me, I feel disinclined to dally very much with Unity at all. In fact, I am writing these words on a Toshiba laptop running UGR, effectively Ubuntu running GNOME 3, and that could become my main home computing operating system in time.

For those who find these changes not to their taste, there are alternatives. Some Linux distributions are sticking with GNOME 2 as long as they can and there apparently has been some mention of a fork to keep a GNOME 2 interface available indefinitely. However, there are other possibilities, such as LXDE and XFCE, out there too. In fact, until GNOME 3 won me over, LXDE was coming to mind as a place of safety; then I learned that Linux Mint was retaining its desktop identity. As always, there’s KDE too, but I have never warmed to that for some reason.

The latest version of OS X, Lion, also included some changes inspired by iOS, the operating system that powers both the iPhone and iPad. However, while the current edition of PC Pro highlights some disgruntlement in professional circles regarding Apple’s direction, the changes do not seem to have aroused the kind of ire that has been abroad in the world of Linux. Is it because Linux users want to feel that they are in charge, while iMac and MacBook users are content to have decisions made for them so long as everything just works? Speaking for myself, the former description seems to fit me, though having choices means that I can reject decisions that I do not like so much.

At the time of writing, the release of a developer preview of the next version of Windows has been generating a lot of attention. It appears that changes are headed for Windows users too. However, I get the sense that a more conservative interface option will be retained, and that could be essential for avoiding the alienation of corporate users. After all, I cannot see the Metro interface gaining much favour in the working environment when so many of us have so much to do. Nevertheless, I plan to get my hands on the developer preview to have a look (the weekend proved too short for this). It will be very interesting to see how the next version of Windows develops and I plan to keep an eye on it as it does so.

It now looks as if many will have their work cut out if they are to avoid the direction in which desktop computing interfaces are going. Established paradigms are being questioned, particularly as a result of touch interfaces on smartphones and tablets. The Wii and Kinect have introduced other ways of interacting with computers too, so there’s a lot of mileage in rethinking how we work with them. So far, I have been able to deal with the changes in the world of Linux, but I am left wondering at the changes that Microsoft is making. After Vista, they need to be careful and they know that. Maybe they’ll be better at getting users through changes in computing interfaces than others, but it’ll be very interesting to see what happens. Unlike open source community projects, they have the survival of a massive multinational at stake.

On Making PROC REPORT Work Harder

1st September 2010

In the early years of my SAS programming career, there seemed to be just the one procedure to use if you wanted to create a summary table. That was TABULATE and it was great for generating columns according to the value of a variable such as the treatment received by a subject in a clinical study. To a point, it could generate statistics for you too and I often used it to sum frequency and percentage variables. Since then, it seems to have been enhanced a little and it surprised me with the statistics it could produce when I had a recent play. Here’s the code:

/* Summary statistics for AGE, split by SEX, with a format attached to each statistic */
proc tabulate data=sashelp.class;
  class sex;
  var age;
  table age*(n median*f=8. mean*f=8.1 std*f=8.1 min*f=8. max*f=8. lclm*f=8.1 uclm*f=8.1),
        sex / misstext="0";
run;

When you compare that with the idea of creating one variable per column and then defining them in PROC REPORT, as many do, it has to look more elegant, and the results aren’t bad either, though they can be tweaked further from the quick example that I generated. That last comment brings me to the point that PROC REPORT seems to have taken over from TABULATE wherever I care to look these days, and I do ask myself whether it is the right tool for the jobs it is being given or whether it is being used in the best way.

Using a DATA step to create one variable per column in a PROC REPORT output doesn’t strike me as the best way to write reusable code, but there are ways to make REPORT do more for you. For example, by defining GROUP, ACROSS and ANALYSIS columns in an output, you can persuade the procedure to do the summarising for you; there’s some example code below, with the comma nesting height under sex in the resulting table. Sums are created by default if you do this, and forgoing an ANALYSIS column definition means that you get a frequency table, not at all a useless thing in many cases.

/* Mean height by age (rows) and sex (columns); the comma nests HEIGHT under SEX */
proc report data=sashelp.class nowd missing;
  columns age sex,height;
  define age / group "Age";
  define sex / across "Sex";
  define height / analysis mean format=8.1 "Mean Height";
run;

For those times when you need to create more heavily formatted statistics (summarising a range as min-max rather than showing min and max separately, for example), you might feel that the GROUP/ACROSS set-up’s non-display of character values puts a stop to using that approach. However, I found that making every value combination unique and attaching a cell ID helps to work around the problem. Then, you can create a format control data set from the data, like in the code below, and create a format from that which you can apply to the cell IDs to display things as you need them. This method makes things more portable from situation to situation than adding or removing columns depending on the values of a classification variable.

/* Build a format control data set: one row per cell ID with its display text. */
/* REPORT is the data set holding the cell IDs and their DECODE display values */
/* and CELLFMT is just an example name for the format being created.           */
proc sql noprint;
  create table cntlin as
    select distinct "cellfmt" as fmtname,
           cellid as start,
           cellid as end,
           decode as label
    from report;
quit;

/* Turn the control data set into the CELLFMT format */
proc format lib=work cntlin=cntlin;
run;
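
To finish the job, the new format gets attached to the cell ID column. Here is a minimal sketch of how that might look, assuming numeric cell IDs and the same REPORT data set with AGE, SEX and CELLID variables as above; because each row/column combination carries one unique cell ID, the MEAN statistic collapses to that ID and the format swaps in the display text:

/* Display the formatted statistics: each cell holds one unique ID, */
/* so MEAN returns it unchanged and CELLFMT supplies the text.      */
proc report data=report nowd missing;
  columns age sex,cellid;
  define age / group "Age";
  define sex / across "Sex";
  define cellid / analysis mean format=cellfmt. " ";
run;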

Solving an upgrade hitch en route to Ubuntu 10.04

4th May 2010

After waiting until after a weekend in the Isle of Man, I got to upgrading my main home PC to Ubuntu 10.04. Before the weekend away, I had been updating a 10.04 installation on an old spare PC and that worked fine, so the prospects were good for a similar changeover on the main box. That may have been so, but breaking a computer is hardly the perfect complement to a getaway.

So as to keep the level of disruption to a minimum, I opted for an in-situ upgrade. The download was left to complete in its own good time and I returned to attend to installation messages asking me if I wished to retain old log files for the likes of Apache. When the system asked for a reboot at the end of the sequence of package downloading, installation and removal, I was ready to let it do the needful.
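
For the record, the same in-situ upgrade can be started from a terminal as well as from Update Manager; a one-liner, assuming the update-manager-core package is in place, for anyone who prefers that route:

# Check for a new Ubuntu release and begin an in-place upgrade:
sudo do-release-upgrade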

However, I met with a hitch when the machine restarted: it couldn’t find the root drive. Live CDs were pressed into service to shed light on what had happened. First up was an old disc for 9.10 before one for 10.04 Beta 1 was used. That identified a difference between the two that was to prove the cause of what I was seeing. 10.04 uses the /dev/hd*# nomenclature (/dev/hda1 is an example) for everything, including software RAID arrays (“fakeraid”). 9.10 used the /dev/mapper/sil_**************# convention for two of my drives and I get the impression that the names differ according to the chipset that is used.

During the upgrade process, the one thing that was missed was the changeover from /dev/mapper/sil_**************# to /dev/hd*# in the appropriate places in /boot/grub/menu.lst; look for the lines starting with the word kernel. When I did what the operating system forgot, I was greeted by a screen telling of the progress of checks on one of the system’s disks. That process took a while, but a login screen followed and I had my desktop much as before. The only other thing that I had to do was run gconf-editor from the terminal to send the title bar buttons to the right, where I am accustomed to having them. Since then, I have been working away as before.
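
To make the fix concrete, here is a sketch of the sort of edit involved; the kernel version and the RAID signature below are placeholders of my own rather than what was on my machine:

# Before, with the 9.10-era device mapper name on a kernel line:
kernel /boot/vmlinuz-2.6.32-21-generic root=/dev/mapper/sil_xxxxxxxxxxxxxx1 ro quiet splash

# After, with the naming that 10.04 expects:
kernel /boot/vmlinuz-2.6.32-21-generic root=/dev/hda1 ro quiet splash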

Some may decry the lack of change (ImageMagick and UFRaw could do with working together much faster, though) but I’m not complaining; the rough ride that was 9.10 drilled that into me. Nevertheless, I am left wondering how many are getting tripped up by what I encountered, even if the renaming means that Palimpsest (what Ubuntu calls Disk Utility) looks much tidier than it did. Could the same thing be affecting /etc/fstab too? The reason that I don’t know the answer to that question is that I changed all hard disk drive references to UUIDs a while ago, but it’s another place to look if the GRUB change isn’t fixing things for you. If my memory isn’t failing me, I seem to remember seeing /dev/mapper/sil_**************# drive names in there too.
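
Should anyone want to follow the same route, here is a minimal sketch of the UUID approach; the UUID shown is an invented one and the filesystem details will vary from machine to machine:

# List the UUID for each partition that the system can see:
sudo blkid

# An /etc/fstab line referring to the root partition by UUID rather than device name:
UUID=0a1b2c3d-4e5f-6789-abcd-ef0123456789 / ext4 errors=remount-ro 0 1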

A bigger screen?

23rd February 2010

A recent bit of thinking has caused me to cast my mind back over all the screens that have sat in front of me while working with computers over the years. Well, things have come a long way from the spare television that I used with a Commodore 64 on the occasions when I got to exploring the thing. Needless to say, a variety of dedicated CRT screens ensued as I started to make use of Apple and IBM-compatible PCs provided in computing labs and other such places, before I bought an example of the latter as my first ever PC of my own. That sported a 15″ display, which stood out a little in times when 14″ ones were mainstream, but a 17″ Iiyama followed it when its operational quality deteriorated. That Iiyama came south with me from Edinburgh as I moved to where the work was and offered sterling service before it too started to succumb to ageing.

During the time that the Iiyama CRT screen was my mainstay at home, there were changes afoot in the world of computer displays. A weighty 21″ Philips screen was what greeted me on my first day at work, but 21″ Eizo LCD displays were set to replace those behemoths and remain in use, as if to prove the longevity of LCD panels and the validity of using what had been sufficient for laptops for a decade or so. In fact, the same comment regarding reliability applies to the screen that I now use at home, a 17″ Iiyama LCD panel (yes, I stuck with the same brand when I changed technologies longer ago than I like to remember).

However, that hasn’t stopped me wondering about my display needs, and it’s screen size that is making me think rather than the reliability of the current panel. That is a reflection on how my home computing needs have changed over time, and they show how my non-computing interests have evolved too. Photography is but one of these, and the move to digital capture has brought with it a great deal more image processing, so much so that I wonder if I need to take fewer photos rather than bringing home so many that it can be hard to pick out the ones deserving of a wider viewing. That is but one area where a bigger screen would help, but there is another and it arises from my interest in exploring the countryside on foot or on my bike: digital mapping. When planning outings, it would be nice to have a wider field of view to be able to see more at a larger scale.

None of the above is a showstopper, as would be the case if the screen itself were unreliable, so I am going to take my time on this one. The prospect of sharing a desktop across two screens is another idea, but that needs some thought about where it all would fit; the room that I have set aside for working at my computer isn’t the largest, but it’ll need to do. After the space side of things, there’s the matter of setting up the hardware. Quite how a dual display is going to work with a KVM setup is something to explore, as is the adding of extra video cards to existing machines. The software side of things is not a concern for me because of when I used a laptop as my main machine for a while last year. That confirmed that Windows (Vista, but it has been possible since 2000 anyway…) and Ubuntu (other modern Linux distributions should work too…) can cope with desktop sharing out of the box.
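
On the Linux side, a spanned desktop can even be arranged from a terminal; here is a sketch using xrandr, where the output names are placeholders because they vary from machine to machine (running xrandr on its own lists the real ones):

# Extend the desktop so the external screen sits to the right of the laptop panel:
xrandr --output VGA1 --auto --right-of LVDS1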

Apart from the nice thought of having more desktop space, the other tempting side to all of this is what you can get for not much outlay. It isn’t impossible to get a 22″ display for less than £200 and the prices for 24″ ones are tempting too. That’s a far cry from paying nearly £300 (if my memory serves me correctly) for that 17″ Iiyama and I’d hope that the quality is as good as ever.

It’s all very well talking about pricing, but you need to sit down and choose a make and model when you get to deciding on a purchase. There is plenty of choice, so that would take a while, but magazine reviews will come in handy here. Saying that, last year’s computing misadventures have me questioning the sense of going for what a magazine places on its A-list. They also have me minded to go to a nearby computer shop to make a purchase rather than choosing a supplier on the web; it is easier to take back a faulty unit if you don’t have far to go. Speaking of faulty units, last year has left me contemplating waiting until the year is older before making any acquisitions of computer kit. All of that has put the idea of buying a new screen on the low-priority list: nice to have but not essential. For now, that is where it stays, but you never know what the attractions of a shiny new thing can do…

A multitude of operating systems

27th October 2009

Like buses, it seems that a whole horde of operating systems is descending upon us at once. OS X 10.6 came first and it was the turn of Windows 7 last week, with all of the excitement that it generated in the computing and technology media. Next up will be Ubuntu, already a source of some embarrassment for the BBC’s Rory Cellan-Jones when he got his facts muddled; to his credit, he later corrected himself, though I do wonder how well he appreciates that Ubuntu has its distinct flavours, with a netbook variant being different to the main offering that I use. Along with Ubuntu 9.10, Fedora 12 and openSUSE 11.2 are also in the wings. As if all these weren’t enough, the latest issue of PC Plus gives an airing to less well-known operating systems like Haiku (the project that carries on BeOS). The inescapable conclusion is that, far from the impressions of mainstream computer users who know only Windows, we are swimming in a sea of operating system options in which you may drown if you decide to try sampling them all. That may explain why I stick with Ubuntu for home use, for reasons of familiarity and reliability, and leave much of the distro hopping to others. Of course, it shouldn’t surprise anyone that Windows is the choice where I work, with 2000 being usurped by Vista in the next few weeks (IT managers always like to be behind the curve for the sake of safety).

Whither Fedora?

10th January 2009

There is a reason why things have got a little quieter on this blog: my main inspiration for many of the posts that make their way on here, Ubuntu, is just working away without much complaint. I have to say that BBC iPlayer isn’t working so well for me at the moment, so I need to take a look at my setup, but everything else is continuing quietly. In some respects, that’s no bad thing and allows me to spend my time on other pursuits like hill walking and photography. I suppose that the calm is also a reflection of the fact that Ubuntu has matured, but there is a sense that some changes may be on the horizon. For one thing, there are the opinions of a certain Mark Shuttleworth, but the competition is progressing too.

That latter point brings me to Linux Format’s recently published verdict that Fedora has overtaken Ubuntu. I do have a machine with Fedora on there and it performs what I ask of it without any trouble. However, I have never tried asking of it all the sorts of things that I ask of Ubuntu, so my impressions are not in-depth ones. Going deeper into the subject mightn’t be such a bad use of a few hours. What I am not planning to do is convert my main Ubuntu machine to Fedora. I moved from Windows because of constant upheavals and I have no intention of bringing those upon myself without good reason, and that just isn’t there at the moment.

Speaking of upheavals, one thought that is entering my mind is that of upgrading that main machine. Its last rebuild was over three years ago and computer technology has moved on a bit since then, with dual and quad-core CPUs from Intel and AMD coming into the fray. Of course, the cost of all of this needs to be considered too, and that is never more true than in these troubled economic times. If you had asked me about the prospect of a system upgrade a few weeks ago, I would have ruled it out of hand. What has got me wondering is my continued use of virtualisation and the resources that it needs. I am getting mad notions like the idea of running more than one VM at once, and I do need to admit that it has its uses, even if it puts CPUs and memory through their paces. Another attractive idea would be getting a new and bigger screen, particularly with what you can get for around £100 these days. However, my 17″ Iiyama is doing very well, so this is one for the wish list more than anything else. None of the changes that I have described are imminent, but I have noticed how fast I am filling disks up with digital images, so an expansion of hard disk capacity has come much higher up the to-do list.

If I ever get to doing a full system rebuild with a new CPU, memory and motherboard (I am not so sure about graphics, since I am no gamer), the idea of moving into the world of 64-bit computing comes about. The maximum amount of memory addressable by 32-bit software is 4 GB (2³² bytes), so 64-bit is a must if I decide to go beyond this limit. That all sounds very fine but for the possibility of problems arising with support for legacy hardware. It sounds like another bridge to be assessed before its crossing, even if two upheavals can be made into one.

Aside from system breakages, the sort of hardware and software changes over which I have been musing here are optional and can be done in my own time. That’s probably just as well, for a very good reason that I have mentioned earlier. Being careful with money becomes more important at times like these, and it’s good that free software not only offers freedom of choice and usage but also a way to leave the closed commercial software acquisition treadmill, with all of its cost implications, leaving money for much more important things.
