Technology Tales

Adventures & experiences in contemporary technology

%sysfunc and missing spaces

10th June 2009

Recently, I was trying something like this and noted some odd behaviour:

data _null_;
file fileref;
put "text %sysfunc(pathname(work)) more text";
run;

This is the kind of thing that I was getting:

text c:\sasworkmore text

In other words, the space after the %sysfunc call was being dropped and, since I was creating and executing a Windows batch file using SAS 8.2, the command being run wasn't doing what was expected. The fix was simple but I reckoned that I'd share what I saw anyway, in case it helped anyone else:

data _null_;
file fileref;
x="text %sysfunc(pathname(work))"||" more text";
put x;
run;

Out of memory at line: 56

28th May 2009

This is an error that I have started to see a lot in the last few weeks. First, it was with Piwik and latterly with WordPress.com Stats. For the record, I have never seen it on up-to-date systems but always with IE6 and at page unloading time. The CPU usage hits 100% before the error is produced and that had me blaming JavaScript in error; it isn't the cause of all ills. In fact, the cause seems to be a bug in a certain release of Adobe Flash 9, though I am of the opinion that the inclusion of certain features in a Flash movie is needed to trigger it too. I don't have the exact details but WordPress.com Stats worked without fault until a recent update, and that is what has led me to the conclusion that I have reached. The observation makes me wonder whether we are coming to a point where Flash compatibility is something that needs to be factored into the use of that technology in a website or web application. Updating Flash will solve the problem on the client but it might be better if what is served didn't trigger it in the first place.

About Perl’s Binding Operator

20th May 2009

This piece is as much an aide-mémoire for myself as anything else but putting it here seems worthwhile if it answers questions for others. The binding operators, =~ and !~, come in handy when you are framing conditional statements in Perl using regular expressions, for example, testing whether $x =~ /\d+/ is true or not. The =~ variant is also used for changing strings using the s/[pattern1]/[pattern2]/ regexp construct (the "s" stands for "substitute"). What has brought this to mind is that I wanted to ensure that something was done for strings that did not contain a certain pattern and that's where the !~ binding operator came in useful; ^~ might have come to mind for some reason but it wasn't what I needed.
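By way of a quick illustration, here is a small sketch of both operators in action; the string and the patterns are made up for the purpose:

use strict;
use warnings;

my $x = "order 123";

# =~ is true when the pattern matches the string on its left
if ($x =~ /\d+/) {
    print "contains digits\n";
}

# !~ is true when the pattern does not match
if ($x !~ /^ERROR/) {
    print "not an error line\n";
}

# =~ also binds s/// to a variable for substitution
(my $y = $x) =~ s/\d+/456/;
print "$y\n";    # prints "order 456"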

Are ten seconds enough?

27th April 2009

Fasthosts, the hosting provider for what you find here, has, in their wisdom, decided to limit the execution time for ASP scripts to 15 seconds and to 10 seconds for any others. I haven't used Perl sufficiently in this shared hosting set-up to determine how that is affected but I can share my experiences on the PHP side, and you may have noticed occasional glitches. They have also disabled the set_time_limit PHP function, so you cannot easily address the matter yourself when you need to do so. You almost get the feeling that they don't trust the abilities, actions and oversight of their users. Personally, I reckon that the ten-second limit is too short and that something of the order of 20 or 30 seconds would be better. If it all gets too restrictive, I suppose that there are other providers, though I think that I would avoid resellers after a previous less than glorious experience. There's the dedicated server option too if I were feeling flush, which is not so likely given the economic times in which we live.
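For what it's worth, this is the sort of line that would normally go near the top of a long-running PHP script were the function not disabled; the value of 30 is only an example:

<?php
// raise the limit for this script alone; where the host has disabled
// the function, the call achieves nothing beyond a warning
set_time_limit(30);
?>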

Work locally, update remotely

4th December 2008

Here's a trick that might have its uses: using a local WordPress instance to update your online blog (yes, there are plenty of applications that promise to edit your online blog but these need file permissions on the likes of xmlrpc.php to be opened up). Along with the right database access credentials and the ability to log in to the database remotely, adding the following two lines to wp-config.php does the trick:

define('WP_SITEURL', 'http://localhost/blog');

define('WP_HOME', 'http://localhost/blog');

These two constants override what is in the database and allow you to update the online database from your own PC using WordPress running on a local web server (Apache or otherwise). One thing to remember here is that the online and offline directory structures need to match. For example, if your online WordPress files are in blog in the root of the online web server file system (typically htdocs for Linux), then they need to be contained in a directory of the same name in the root of the offline server too. Otherwise, things could get confusing and perhaps messy. Another thing to consider is that you are modifying your online blog, so the usual rules about care and attention apply, particularly with respect to using the same version of WordPress both locally and remotely. This is especially a concern if you, like me, run development versions of WordPress to see if there are any upheavals ahead of us, like the overhaul that is coming in with WordPress 2.7.

An alternative use of this same trick is to keep a local copy of your online database in case of any problems while using a local WordPress instance to work with it. I used to have to edit the database backup directly (on my main Ubuntu system), first with GEdit but then using a sed command like the following:

sed -e 's/www\.onlinewebsite\.com/localhost/g' backup.sql > backup_l.sql

The -e switch applies the regular expression substitution that follows it to the input, with the output being directed to a new file. It's slicker than the interactive GEdit route but has been made redundant by defining constants for a local WordPress installation as described above.

Harnessing the power of ImageMagick

26th October 2008

Using the command line to process images might sound senseless but the tools offered by ImageMagick certainly prove that it has its place. I have always been wary of using bulk processing for my digital photo files (some digitised from film prints with a scanner) but I do agree that some of it is needed to free up some time for other more necessary things. With this in mind, it is encouraging to see the results from ImageMagick and I can see it making a major difference to how I maintain my online photo gallery.

For instance, making thumbnail images for the gallery certainly seems to be one of those operations where command line bulk processing comes into its own and ImageMagick’s own convert command is heaven sent for this one. For resizing images, all that’s needed is the following:

convert -resize 40% input.jpg output.jpg

Add a spot of further shell scripting, and even a dash of Perl, and the possibilities for this sort of thing become clearer, and this is but the tip of the proverbial iceberg. The -rotate switch will do what the name suggests and there is a whole plethora of other options on tap. So long as you have Ghostscript on your system, conversion of graphics to PostScript (and Encapsulated PostScript too) and PDF files is possible, with the -page option controlling the margin around the image itself in the resulting outputs. Unfortunately, portrait is the sole orientation on offer but a bit of judicious post-processing will turn things around. Here's a command that'll do the trick:

convert -page 792x612+72+72 input.png ps2:output.ps
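To give a flavour of the bulk processing and the -rotate switch mentioned above, here is a rough sketch; the file names and the angle of rotation are only for illustration:

# make 40% thumbnails of every JPEG in the current folder
for f in *.jpg; do
    convert -resize 40% "$f" "thumb_$f"
done

# turn an image through 90 degrees
convert -rotate 90 input.png rotated.png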

For retrieving image metadata, like its resolution and size, the identify command comes into play. The -verbose option outputs all manner of image metadata, so using grep or egrep is perhaps advisable, especially for bulk processing with the likes of Perl. Having the ability to stream image metadata makes loading databases like MySQL less of a chore than the manual data entry that has been my way of doing things until now.
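Here's the sort of thing that I have in mind; the field names such as Geometry and Resolution come from identify's verbose listing and the file names are only examples:

# pull out just the geometry and resolution lines from the full listing
identify -verbose photo.jpg | egrep 'Geometry|Resolution'

# or ask for exactly the fields wanted, one image per line
identify -format "%f %wx%h\n" *.jpg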

A quick way to create a blank text file

27th June 2008

The primary job done by the touch command in UNIX or Linux is to update the time stamps on files. However, it also has another function: creating an empty text file when you are "touching" a file that doesn't exist. This has its uses, particularly when you want to reduce the amount of pointing and clicking that you need to do or when you want to generate a series of empty files in a shell script. Whatever you do with it is up to you.
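For example (the file names are arbitrary):

# create a single empty file
touch notes.txt

# create a series of empty files from a shell script
for i in 1 2 3 4 5; do
    touch "log_part_$i.txt"
done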

Automating FTP II: Windows

15th April 2008

Having thought about automating command line FTP on UNIX/Linux, the same idea came to me for Windows too and you can achieve much the same results, even if the way of getting there is slightly different. The first route to consider is running a script file with the ftp command at the command prompt (you may need %windir%\system32\ftp.exe to call the right FTP program in some cases):

ftp -s:script.txt

The contents of script.txt are something like the following:

open ftp.server.host
user
password
lcd destination_directory
cd source_directory
prompt
get filename
bye

It doesn't take much to turn your script into a batch file that takes the user name as its first input and your password as its second for the sake of enhanced security, deleting any record thereof afterwards for the same reason:

echo open ftp.server.host > script.txt
echo %1 >> script.txt
echo %2 >> script.txt
echo cd htdocs >> script.txt
echo prompt >> script.txt
echo mget * >> script.txt
echo bye >> script.txt
%windir%\system32\ftp.exe -s:script.txt
del script.txt
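Assuming that the above was saved as something like getfiles.bat (a name of my own choosing, not anything standard), running it is just a matter of supplying the user name and password as arguments at the command prompt:

rem getfiles.bat is a stand-in name for the batch file above
getfiles.bat myusername mypassword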

The feel of the Windows command line (in Windows 2000, it feels very primitive but Windows XP is better and there’s PowerShell now too) can leave a lot to be desired by someone accustomed to its UNIX/Linux counterpart but there’s still a lot of tweaking that you can do to the above, given a bit of knowledge of the Windows batch scripting language. Any escape from a total dependence on pointing and clicking can only be an advance.

A collection of lessons learnt about web hosting

28th March 2008

Putting this blog back on its feet after a spot of web hosting bother caused me to learn a bit more about web hosting than I otherwise might have done. Here's a selection of the lessons and they are in no particular order:

  1. Store your passwords securely and where you can find them because you never know how a foul up of your own making can strike. For example, a faux pas with a configuration file is all that’s needed to cause havoc for a database site such as a WordPress blog. After all, nobody’s perfect and your hosting provider may not get you out of trouble as quickly as you might like.
  2. Get a MySQL database or equivalent as part of your package rather than buying one separately. If your provider allows a trial period, then changing from one package to another could be cheaper and easier than if you bought a separate database and needed to jettison it because you changed from, say, a Windows package to a Linux one or vice versa.
  3. It might be an idea to avoid a reseller unless the service being offered is something special. Going for the sake of lower cost can be a false economy and it might be better to cut out the middleman altogether and go direct to their provider. Being able to distinguish a reseller from a real web host would be nice but I don’t see that ever becoming a reality; it is hardly in resellers’ interests, after all.
  4. Should you stick with a provider that takes several days to resolve a serious outage? The previous host of this blog had a major MySQL server outage that lasted for up to three days and seeing that was one of the factors that made me turn tail to go to a more trusted provider that I have used for a number of years. The smoothness of the account creation process might be another point worthy of consideration.
  5. Sluggish system support really can frustrate, especially if there is no telephone support provided and the online ticketing system seems to take forever to deliver solutions. I would advise strongly that a host who offers a helpline is a much better option than one who doesn't. Saying all of that, I think that it's best to be patient and, when your website is offline, that might not be as easy as you'd hope it to be.
  6. Setting up hosting or changing from one provider to another can take a number of days because of all that needs doing. So, it’s best to allow for this and plan ahead. Account creation can be very quick but setting up the website can take time while domain name transfer can take up to 24 hours.
  7. It might not take the same amount of time to set up Windows hosting as its Linux equivalent. I don’t know if my experience was typical but I have found that the same provider set up Linux hosting far quicker (within 30 minutes) than it did for a Windows-based package (several hours).
  8. Be careful about which package you select; it can be easy to pick the wrong one depending on how your host's site is laid out and what they are promoting at the time.
  9. You can have a Perl/PHP/MySQL site working on Windows, even with IIS being used in place of Apache. The Linux/Apache/Perl/PHP/MySQL approach might still be better, though.
  10. The Windows option allows for ASP, .Net and other such Microsoft technologies to be used. I have to say that my experience and preference is for open source technologies, so Linux is my mainstay, but learning about the other side can never hurt from a career point of view. After all, I am writing this on a Windows Vista-powered laptop to see how the other half lives as much as anything else.
  11. Domains serviced by hosting resellers can be visible to the systems of those from whom they buy their wholesale hosting. This frustrated my initial attempts to move this blog over because I couldn’t get an account set up for technologytales.com because a reseller had it already on the same system. It was only when I got the reseller to delete the account with them that things began to run more smoothly.
  12. If things are not going as you would like them, getting your account deleted might be easier than you think so don’t procrastinate because you think it a hard thing to do. Of course, it goes without saying that you should back things up beforehand.

Setting up a test web server on Ubuntu

1st November 2007

Installing all of the bits and pieces is painless enough so long as you know what's what; Synaptic does make it thus. Interestingly, Ubuntu's default installation is a lightweight affair, with any additional components being downloaded from the web as needed. The whole process is all very well integrated and doesn't make you sweat every time you want to install additional software. In fact, it resolves any dependencies for you so that those packages can be put in place too; it lists them, you select them and Synaptic does the rest.

Returning to the job in hand, my shopping list included Apache, Perl, PHP and MySQL, the usual suspects in other words. Perl was already there as it is on many UNIX systems so installing the appropriate Apache module was all that was needed. PHP needed the base installation as well as the additional Apache module. MySQL needed the full treatment too, though its being split up into different pieces confounded things a little for my tired mind. Then, there were the MySQL modules for PHP to be set in place too.

The addition of Apache preceded all of these but I have left it until now to describe its configuration, something that took longer than for the others; the installation itself was as easy as it was for the rest. However, what surprised me were the differences in its configuration set-up when compared with Windows. Same software, different operating system, and yet they have set up the configuration files differently. I have no idea why they did this and it makes no sense at all to me; we are only talking about text files after all. The first difference is that the main configuration file is called apache2.conf in Ubuntu rather than httpd.conf as in Windows. Like its Windows counterpart, Ubuntu's Apache does use subsidiary configuration files. However, there is an additional layer of configurability added courtesy of a standard feature of UNIX operating systems: symbolic links. Rather than having a single folder with all of the configuration files stored therein, there are two pairs of folders, one pair for module configuration and another for site settings: mods-available/mods-enabled and sites-available/sites-enabled, respectively. In each pair, there is a folder containing all of the files and another containing symbolic links. It is the presence of a symbolic link for a given configuration file in the latter that activates it. I learned all this when trying to get mod_rewrite going and when changing the web server folder from the default to somewhere less susceptible to wrecking during a re-installation or, heaven forbid, a destructive system crash. It's unusual but it does work, even if it takes that little bit longer to get things sorted out when you first meet up with it.
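For instance, getting mod_rewrite going comes down to creating the right symbolic link and reloading Apache afterwards, either by hand or with the a2enmod helper that comes with the packaging; a sketch of both routes follows, using the standard Ubuntu locations:

# by hand: link the module's load file into the enabled folder
sudo ln -s /etc/apache2/mods-available/rewrite.load /etc/apache2/mods-enabled/rewrite.load

# or let the helper do the same job
sudo a2enmod rewrite

# either way, Apache needs a reload before the change takes effect
sudo /etc/init.d/apache2 reload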

Apart from the Apache set up and finding the right things to install, getting a test web server up and running was a fairly uneventful process. All’s working well now and I’ll be taking things forward from here; making website Perl scripts compatible with their new world will be one of the next things that need to be done.

  • All the views that you find expressed on here in postings and articles are mine alone and not those of any organisation with which I have any association, through work or otherwise. As regards editorial policy, whatever appears here is entirely of my own choice and not that of any other person or organisation.

  • Please note that everything you find here is copyrighted material. The content may be available to read without charge and without advertising but it is not to be reproduced without attribution. As it happens, a number of the images are sourced from stock libraries like iStockPhoto so they certainly are not for abstraction.

  • With regards to any comments left on the site, I expect them to be civil in tone of voice and reserve the right to reject any that are either inappropriate or irrelevant. Comment review is subject to automated processing as well as manual inspection but whatever is said is the sole responsibility of the individual contributor.