Technology Tales

Adventures in consumer and enterprise technology

Episodes of poor performance

15th January 2009

Over the last few days, I have been noticing from various sources that this blog isn't performing as I would want. The first hint was a comment on a tuxmachines.org mention of a recent entry (thanks for the support, by the way) that the link wasn't working as it should have been. Add to this various emails from Are My Sites Up? saying that the site seemed to be down. By all accounts, this free service that I found through Lifehacker would appear to be doing its job, and without the annoying advertising emails that Internetseer used to send me in addition to its weekly report when I used its free service. In fact, getting alert emails several times a day is a factor in favour of the newcomer.

With one exception, these problems would appear to be intermittent. The exception was when I was using the WP Super Cache WordPress plugin. When it seemed to break the site, I disabled it, even though it is meant to be helpful during episodes of heavy load. Otherwise, the outages would seem to be down to general flakiness of the service provided by my hosting provider. I have a site with them on an older server, and that seems to fare far better than the one playing host to this blog. This sort of thing does make me wonder if we are getting real progress, or whether it's a case of one step forward and two steps back. Nevertheless, I'll continue keeping an eye on things and, if there is too much deterioration, a move might be in order, but that's a good bit away yet.

Running Windows 7 within VirtualBox

12th January 2009

With all the fanfare that surrounded the public beta release of Windows 7, I suppose that the opportunity to give it a whirl was too good to miss. Admittedly, Microsoft bodged the roll-out by underestimating the level of interest and corralling everyone into a 24-hour time slot, with one exacerbating the other. In the event, they did eventually get their act together and even removed the 2.5 million licence limit. They really needed to get 7 right after the unloved offering that was Vista, so they probably worked out that the more testers they get, the better. After all, it might be observed that the era of making people pay to "test" your products might be behind us and that users just want things to work well, if not entirely faultlessly, these days.

After several abortive raids, I eventually managed to snag myself a licence and started downloading the behemoth using the supplied download manager. I foresaw it taking a long time and so stuck with the 32-bit variant so as not to leave open the possibility of that part of the process using up any more of my time. As it happened, the download did take quite a few hours to complete, but this part of the process was without any incident or fuss.

Once the DVD image was downloaded, it was onto the familiar process of building myself a VirtualBox VM as a sandbox to explore the forthcoming incarnation of Windows. After setting up the ISO file as a virtual DVD, installation itself was an uneventful process, yet subsequent activities weren't without their blemishes. The biggest hurdle to be overcome was to get the virtual network adapter set up and recognised by Windows 7. The trick is to update the driver using the VirtualBox virtual CD as the source, because Windows 7 will not recognise it using its own driver repository. Installing the other VirtualBox tools is a matter of going to the Compatibility page in the Properties for the relevant executable, the one with x86 in the file name in my case, and setting XP as the Windows version (though Vista apparently works just as well, I played safe and depended on my own experience). While I was at it, I allowed the file to run under the administrator account, too. Right-clicking on executable files will bring you to the compatibility troubleshooter that achieves much the same ends, but by a different route. With the Tools installed, all was workable rather than completely satisfactory. Shared folders have not worked for me, but that might need a new version of the VirtualBox software or getting to know any changes to networking that come with Windows 7. I plan to stick with using USB drives for file transfer for the moment. Though stretching the screen to fit the VirtualBox window was another thing that would not happen, that's a much more minor irritation.

With those matters out of the way, I added security software from the list offered by Windows with AVG, Norton and Kaspersky being the options on offer. I initially chose the last of these but changed my mind after seeing the screen becoming so corrupted as to make it unusable. That set me to rebuilding the VM and choosing Norton 360 after the second Windows installation had finished. That is working much better, and I plan to continue my tinkering beyond this. I have noticed the inclusion of PowerShell and an IDE for the same, so that could be something that beckons. All in all, there is a certain solidity about Windows 7, though I am not so convinced of the claim of speedy startups at this stage. Time will tell and, being a beta release, it's bound to be full of debugging code that will not make it into the final version that is unleashed on the wider public.

Whither Fedora?

10th January 2009

There is a reason why things have got a little quieter on this blog: my main inspiration for many posts that make their way on here, Ubuntu, is just working away without much complaint. Since BBC iPlayer isn't working so well for me at the moment, I need to have a look at my setup. Otherwise, everything is continuing quietly. In some respects, that's no bad thing and allows me to spend my time doing other things like engaging in hill walking, photography and other such things. While I suppose that the calm is also a reflection of the fact that Ubuntu has matured, there is a sense that some changes may be on the horizon. For one thing, there are the opinions of a certain Mark Shuttleworth, though the competition is progressing too.

That latter point brings me to Linux Format's recently published verdict that Fedora has overtaken Ubuntu. I do have a machine with Fedora that performs what I ask of it without any trouble. However, I have never tried all the sorts of things on it that I ask of Ubuntu, so my impressions are not in-depth ones. Going deeper into the subject mightn't be such a bad use of a few hours. What I am not planning to do is convert my main Ubuntu machine to Fedora. I moved from Windows because of constant upheavals, and I have no intention of bringing those upon myself without good reason, something that's just not there at the moment.

Speaking of upheavals, one thought that is entering my mind is that of upgrading that main machine. Since its last rebuild was over three years ago, computer technology has moved on a bit since then, with dual and quad-core CPUs from Intel and AMD coming into the fray. Of course, the cost of all of this needs to be considered too, which is never more true than in these troubled economic times. If you had asked me about the prospect of a system upgrade a few weeks ago, I would have ruled it out of hand.

What has got me wondering is my continued use of virtualisation and the resources that it needs. Mad notions like running more than one VM at once will put any CPU or memory through their paces. Another attractive idea would be getting a new and bigger screen, particularly with what you can get for around £100 these days. However, my 17" Iiyama is doing well enough to consign this one to the wish list for now. None of the changes that I have described are imminent, even if I have noticed how fast I am filling disks up with digital images, which makes an expansion of hard disk capacity a higher priority.

If I ever get to do a full system rebuild with a new CPU, memory and motherboard (I am not so sure about graphics since I am no gamer), the idea of moving into the world of 64-bit computing comes about. Since the maximum amount of memory usable by 32-bit software is 4 GB, 64-bit software is a must if I decide to go beyond this limit. That all sounds very fine, aside from the possibility of problems arising with support for legacy hardware. It sounds like another bridge to be assessed before its crossing, even if two upheavals can be made into one.

Besides system breakages, the sort of hardware and software changes over which I have been musing here are optional and can be done in my own time. That's probably just as well in a downturn like we are experiencing now. Being careful with money becomes more important at times like these, which means that it's fortunate that free software not only offers freedom of choice and usage but also a way to leave the closed commercial software acquisition treadmill with all of its cost implications, leaving money for much more important things.

Working with the ODS templates and styles when batch processing

8th December 2008

I ran into some trouble with creating new templates and styles while running a SAS job in batch mode. The cause was the use of the RSASUSER switch on the SAS command. This sets the SASUSER library to be read-only, and that is where new templates and styles are stored by default. The fix is to switch these to another library to which there is write access, WORK, for example. Here's the line of code that achieves the manoeuvre:

ODS PATH work.templat(update) sasuser.templat(read) sashelp.tmplmst(read);

Apparently, the same change might be needed for stored processes too, so it's one to keep in mind.

Work locally, update remotely

4th December 2008

Here's a trick that might have its uses: using a local WordPress instance to update your online blog. While there are plenty of applications that promise to edit your online blog, these need permissions to access the likes of xmlrpc.php. Along with the right database access credentials and the ability to log in remotely, adding the following two lines to wp-config.php does the trick:

define('WP_SITEURL', 'http://localhost/blog');

define('WP_HOME', 'http://localhost/blog');

These two constants override what is in the database and allow you to update the online database from your own PC using WordPress running on a local web server (Apache or otherwise). One thing to remember here is that the online and offline directory structures need to match. For example, if your online WordPress files are in blog in the root of the online web server file system (typically htdocs or html for Linux), then they need to be in a blog directory in the root of the offline server too. Otherwise, things could get confusing and perhaps messy. Another thing to consider is that you are modifying your online blog, so the usual rules about care and attention apply, particularly regarding using the same version of WordPress both locally and remotely. This is especially a concern if you, like me, run development versions of WordPress to see if there are any upheavals ahead of us, like the overhaul that is coming in with WordPress 2.7.

An alternative use of this same trick is to keep a local copy of your online database in case of any problems while using a local WordPress instance to work with it. I used to have to edit the database backup directly (on my main Ubuntu system), first with GEdit but then using a sed command like the following:

sed -e 's/www\.onlinewebsite\.com/localhost/g' backup.sql > backup_l.sql

The -e switch applies the regular expression substitution that follows it to the input, with the output being directed to a new file. Note the single quotes around the expression: without them, the shell strips the backslashes before sed sees them, and the escaped dots would then match any character rather than literal full stops. It's slicker than the interactive GEdit route, but it has been made redundant by defining constants for a local WordPress installation as described above.
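As a quick sanity check, the substitution can be tried on a throwaway file first; the sample dump contents and file names below are illustrative, not taken from any real backup:

```shell
# Create a tiny sample of a database dump (contents are illustrative)
cat > backup.sql <<'EOF'
INSERT INTO wp_options VALUES ('siteurl', 'http://www.onlinewebsite.com/blog');
INSERT INTO wp_options VALUES ('home', 'http://www.onlinewebsite.com/blog');
EOF

# Quote the expression so the shell does not strip the backslashes;
# the escaped dots then match literal full stops only
sed -e 's/www\.onlinewebsite\.com/localhost/g' backup.sql > backup_l.sql

cat backup_l.sql
```

Once the output looks right, the same command can be run against the real backup file.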

An early glimpse of Ubuntu 9.04

27th November 2008

Ubuntu development is so gradual these days that there's almost no point getting too excited about new versions. Its maturity means that updates aren't that much of an upheaval, and I must admit to liking it that way. Having a look at the first alpha release of Ubuntu 9.04, otherwise known as "Jaunty Jackalope", it appears that there isn't a change to that gradual, some may call it glacial, approach. The most significant change that I noted was the addition of an encrypted private area to your home user area. In the times in which we live, I can certainly see that coming in useful, though it may not set pulses racing in some quarters. OpenOffice is still at 2.4 and things don't appear very different on the surface at all. Of course, things like kernel changes and such like could be going on under the bonnet without many of us noticing it. Saying that, it played well with VirtualBox and I seem to remember virtual machine trouble with early builds of 8.10 so that can be taken as a plus point. I suppose that it is a case of wait and see before there is anything more obviously defining about 9.04. Anyway, they've got until April next year...

No disruption here

12th November 2008

It was just over a year ago that I gave Linux a go after Windows XP gave me a torrid time of it. Since then, I have been able to work more than happily with it and have picked up a few new and useful tricks along the way too. All in all, it has been a good experience and I have been able to resolve most of the issues that I have seen. The various Ubuntu upgrades along the way have been taken in their stride, too. Version 7.04 was the first one, with version 7.10 coming immediately afterwards. 8.04 went in equally seamlessly, as did 8.10. Some may decry what they might perceive as the glacial nature of any changes, but the flip-side is that change can cause disruption, so my vote is for the more gradual approach, whatever others might think. In line with this, I haven't noticed too many changes in Ubuntu's latest release, and any that I have seen have been of the pleasant kind. Saying that, it's so much better than the contortions surrounding Windows upgrades. All in all, Linux is being kind to me and I hope that it stays that way.

Photoshop Elements 7 first impressions and technical issues

10th November 2008

Lately, I have been playing around with Photoshop Elements 7, doing the same sort of things that I have been doing with Elements 5. Reassuringly, I can still find my way around, even if the screen furniture has been moved about a little. My Pentax K10D is recognised, and I am able to set the white balance to get sensible results. On the images that I was testing, things started to look too warm in the Cloudy and Shade settings, but that's all part and parcel of processing photos taken in early November. The results of my exertions look decent enough, and you can see them in a post on my hillwalking blog.

While I realise that Adobe has been promoting the ability to easily airbrush unwanted objects from images and enhance blue skies, there's no point having all of that if functionality available in previous versions does not work as expected. Thankfully, this is largely the case, albeit with a few niggles.

Since I have been working with the new Elements on a Windows XP SP3 virtual machine running in VirtualBox 2.04 on Ubuntu 8.10, I wonder if that contributed in any way to what I encountered. One gigabyte of memory is allocated to the VM. The files were stored in the Ubuntu file system and accessed using VirtualBox's functionality for connecting through to the host file system. File access was fine, apart from the inability to directly open a file for full editing from the Organiser, something that I have been doing very happily with Elements 5.

In addition, I noted a certain instability in the application, and using the hand tool to get to the top left-hand corner of an image sent the thing into a loop, again something that Elements 5 never does. Otherwise, things work as they should, even if what I saw points to the need for an update to correct glitches like these, and I hope that there is one. For now, I will persevere and see if I can make use of any additional functionality along the way.

Remove Revisions 2.2

3rd November 2008

There is already a post on here devoted to version 1.0 of this plugin, which clearly tells you what it does. The new version will work with the forthcoming WordPress 2.7 (itself a release that's had a development cycle with such upheavals that it would make you want to watch from the relative safety of the sidelines) and has been made to be a little more user-friendly in its actions; in fact, it behaves more like any other plugin now.

Download Remove Revisions 2.2

SAS Macro and Dataline/Cards Statements in Data Step

28th October 2008

Recently, I tried code like this in a SAS macro:

data sections;
    infile datalines dlm=",";
    input graph_table_number $15. text_line @1 @;
    datalines;
    "11.1           ,Section 11.1",
    "11.2           ,Section 11.2",
    "11.3           ,Section 11.3"
    ;
run;

While it works in its own right, including it as part of a macro yielded this type of result:

ERROR: The macro X generated CARDS (data lines) for the DATA step, which could cause incorrect results.  The DATA step and the macro will stop executing.
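For completeness, the failing situation looks something like the following minimal sketch; the macro name is illustrative, but any macro whose body contains a DATALINES or CARDS section triggers the same error:

```sas
%macro mksections;
    /* The data step itself is fine; wrapping it in a macro is what
       provokes the "macro generated CARDS" error at execution time */
    data sections;
        infile datalines dlm=",";
        input graph_table_number $15. text_line @1 @;
        datalines;
        "11.1           ,Section 11.1",
        "11.2           ,Section 11.2"
        ;
    run;
%mend mksections;

%mksections
```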

A bit of googling landed me on SAS-L where I spotted a solution like this one that didn't involve throwing everything out:

filename temp temp;

data _null_;
    file temp;
    put;
run;
data sections;
    length graph_table_number $15 text_line $100;
    infile temp dlm=",";
    input @;
    do _infile_=
    "11.1           ,Section 11.1",
    "11.2           ,Section 11.2",
    "11.3           ,Section 11.3"
    ;
        input graph_table_number $15. text_line @1 @;
        output;
    end;
run;

filename temp clear;

The FILENAME statement and the ensuing data step create a dummy file in the SAS work area that gets cleared at the end of every session. That seems to fool the macro engine into thinking that input is from a file rather than via the CARDS/DATALINES method, to which it takes grave exception. The trailing @ signs hold an input record for the execution of the next INPUT statement within the same iteration of the DATA step, so that the automatic variable _infile_ can be fed as part of the input process in a DO block, with the OUTPUT statement ensuring that all records from the input buffer reach the data set being created.

While this method does work, I would like to know the underlying reason why SAS Macro won't play well with in-line data entry using DATALINES or CARDS statements in a data step, particularly when it allows other methods such as SQL INSERT statements or standard variable assignment in a data step. I find it such curious behaviour that I remain on the lookout for the explanation.
