Technology Tales

Adventures & experiences in contemporary technology

Something to watch with the SYSODSESCAPECHAR automatic SAS macro variable

10th October 2021

Recently, a client of mine updated one of their systems from SAS 9.4 M5 to SAS 9.4 M7. In spite of performing due diligence regarding changes between the maintenance releases, a change in the behaviour of the SYSODSESCAPECHAR automatic macro variable surprised them. The macro variable captures the assignment of the ODS escape character used to prefix RTF codes for page numbering and other things. That setting is made using an ODS ESCAPECHAR statement like the following:

ods escapechar="~";

In the M5 release, the tilde character in this example was output by the automatic macro variable, but that changed in the M7 release to 7E, the hexadecimal code for the same character, and this tripped up one of their validated macro programs used in output production. The adopted solution was to use the escape sequence (*ESC*), which gave the same outcome as before the change. That was less verbose than the alternative of changing the hexadecimal code back into the expected ASCII character, as the following code does.

data _null_;
/* INPUT with the HEX. informat reads the code (such as 7E) as a number, */
/* BYTE converts that number to the corresponding ASCII character and */
/* CALL SYMPUT stores the result in a macro variable named new */
call symput("new",byte(input("&sysodsescapechar.",hex.)));
run;

The above supplies a hexadecimal code to the BYTE function for correct rendering with the SYMPUT routine assigning the resulting value to a macro variable named new. Just using the escape sequence is far more succinct though there is now an added validation need once user pilot testing has completed. In my line of business, the updating of code is the quickest part of many such changes; documentation and testing always take longer.
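
For illustration, here is a minimal sketch of the escape-sequence approach in use; the RTF destination and the title text are my assumptions rather than the client's actual code:

ods escapechar="~";
ods rtf file="example.rtf";

/* (*ESC*) is recognised regardless of the declared escape character, */
/* so it is unaffected by what SYSODSESCAPECHAR returns */
title "Page (*ESC*){thispage} of (*ESC*){lastpage}";

proc print data=sashelp.class;
run;

ods rtf close;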

Improving a website contact form

23rd April 2018

On another website, I have had a contact form, but it was missing some functionality. For instance, it stored the input in files on a web server instead of emailing it. That was fixed more easily than expected using the PHP mail function. Even so, it remains useful to survey the corresponding documentation on the w3schools website.
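
As a rough sketch of the idea, assuming hypothetical field names and email addresses rather than the form's actual code:

<?php
// Hypothetical handler: the field names and addresses are placeholders
$name = strip_tags($_POST["name"] ?? "");
$email = filter_var($_POST["email"] ?? "", FILTER_VALIDATE_EMAIL);
$message = strip_tags($_POST["message"] ?? "");

if ($email && $message) {
    // mail() hands the message to the server's mail transfer agent
    $headers = "From: webform@example.com\r\nReply-To: " . $email;
    mail("owner@example.com", "Contact form message from " . $name, $message, $headers);
}
?>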

The other changes affected the way the form looked to a visitor. There was a reset button, and that was removed on finding that such things are out of favour these days. Thinking again, there hardly was any need for it anyway.

Newer additions that came with HTML5 had their place too. Including user hints using the placeholder attribute should add some user friendliness, although I have avoided experimenting with browser-powered input validation for now. The required attribute has its uses for telling a visitor that they have forgotten something, but I need to check how that is handled in CSS more thoroughly before I go with it, since there are new :required, :optional, :valid and :invalid pseudo-classes that can be used to help; a sketch of these follows below.
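
By way of illustration, here is a minimal sketch of those additions; the field names and the styling choices are hypothetical:

<form action="contact.php" method="post">
  <!-- placeholder supplies the hint; required blocks empty submissions -->
  <input type="text" name="name" placeholder="Your name" required>
  <input type="email" name="email" placeholder="you@example.com" required>
  <textarea name="message" placeholder="Your message" required></textarea>
  <input type="submit" value="Send">
</form>

<style>
/* The new pseudo-classes allow styling by validation state */
input:required, textarea:required { border: 1px solid orange; }
input:valid, textarea:valid { border: 1px solid green; }
input:invalid, textarea:invalid { border: 1px solid red; }
</style>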

It seems that there is much more to learn about setting up forms since I last checked. This is perhaps a hint that a few books need reading as part of catching up with how things are done these days. There always is something new to learn.

Overriding replacement of double or triple hyphenation in WordPress

7th June 2016

On here, I have posts with example commands that include double hyphens, and they have been displayed merged together, something that has resulted in a comment posted by a visitor to this part of the web. All the while, I had been blaming the fonts that I have been using, only for it to be the fault of WordPress itself.

Changing multiple dashes to something else has been a feature of Word's autocorrect, but I never expected to see WordPress aping that behaviour, and it has been doing so for a few years now. The culprit is wptexturize, and that cannot be disabled, for it does many other useful things.

What happens is that the wptexturize filter changes '--' (double hyphens) to '–' (&#8211; in web entity encoding) and '---' (triple hyphens) to '—' (&#8212; in web entity encoding). The solution is to add another filter to the content that changes these back to the way they were, and the following code does this:

add_filter( 'the_content', 'mh_un_en_dash', 50 );
function mh_un_en_dash( $content ) {
    // Reverse wptexturize: put back the hyphens that were typed
    $content = str_replace( '–', '--', $content );
    $content = str_replace( '—', '---', $content );
    return $content;
}

The first line of the segment adds the new filter that uses the function defined below it. The third and fourth lines above do the required substitution before the function returns the post content for display in the web page. The whole code block can be used to create a plugin or placed in the theme's functions.php file. Either way, things appear without the substitution confusing your readers. It makes me wonder if a bug report has been created for this, because the behaviour looks odd to me.

Changing the working directory in a SAS session

12th August 2014

It appears that PROC SGPLOT and other statistical graphics procedures create image files even if you are creating RTF or PDF files. By default, these are PNG files, but there are other possibilities. When working with PC SAS, I have seen them written to the current working directory, and that could clutter up your folder structure, especially if they are unwanted.

Being unable to track down a setting that controls this behaviour, I resolved to find a way around it by sending the files to the SAS work directory so they are removed when a SAS session is ended. One option is to set the session’s working directory to be the SAS work one and that can be done in SAS code without needing to use the user interface. As a result, you get some automation.

The method is indirect though, in that you need to use an X statement to tell the operating system to change folder for you. Here is the line of code that I have used:

x "cd %sysfunc(pathname(work))";

The X statement passes commands to the operating system's command line, and they are enclosed in quotes. %SYSFUNC is a macro function that allows certain data step functions or call routines, as well as some SCL functions, to be executed. An example of the latter is PATHNAME, which resolves library or file references; here, it is interrogating the location of the SAS work library so that this can be passed to the operating system's cd (change directory) command for processing. This method works on Windows and UNIX, so Linux should be covered too, offering a certain amount of automation since you don't have to specify the location of the SAS work library in every session, its folder name changing all the while as it does.
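
Putting it together, here is a minimal sketch of how this might be used; the output path and the plot are my assumptions, purely for illustration:

/* Change the working directory to the SAS work folder */
x "cd %sysfunc(pathname(work))";

/* The RTF file is sent to a permanent folder; only the scratch */
/* PNG files land in the work folder and vanish with the session */
ods rtf file="c:\reports\example.rtf";

proc sgplot data=sashelp.class;
scatter x=height y=weight;
run;

ods rtf close;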

Of course, if someone were to tell me of another way to declare the location of the generated PNG files that works with the RTF and PDF ODS destinations, then I would be all ears. Direct output without image file creation would be better still. Until then though, the above will do nicely.

Some SAS Macro code for detecting the presence or absence of a variable in a dataset

4th December 2013

Recently, I needed to put in place some code to detect the presence or absence of a variable in a dataset and I chose SAS Macro programming as the way to do what I wanted. The logic was based on a SAS sample that achieved the same result in a data step and some code that I had for detecting the presence or absence of a dataset. Mixing the two together gave me something like the following:

%macro testvar(ds=,var=);

/* Open the dataset and look up the variable's position within it */
%let dsid=%sysfunc(open(&ds,in));
%let varexist=%sysfunc(varnum(&dsid,&var));
%if &dsid > 0 %then %let rc=%sysfunc(close(&dsid));

/* VARNUM returns 0 when the variable is absent */
%if &varexist gt 0 %then %put Info: Variable &var is in the &ds dataset;
%else %put Info: Variable &var is not in the &ds dataset;

%mend testvar;

%testvar(ds=dataset,var=var);

What this does is open up a dataset and look for the variable number in the dataset. In datasets, variables are numbered from left to right, with 1 for the first one, 2 for the second and so on. If the variable is not in the dataset, the result is 0, so you know that it is not there. All of this is what the VARNUM SCL function within the %SYSFUNC macro function does. In the example, this resolves to %sysfunc(varnum(&dsid,&var)), with no quotes around the variable name like you would use in data step programming. Once you have the variable number or 0, you can put in place some conditional logic that makes use of the information, like what you see in the simple example above. Of course, that would be expanded to something more useful in real life, but I hope it helps to show you the possibilities here.
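
As a hedged sketch of one such expansion, the macro could be recast in a function style that emits 1 or 0 for use in other macro logic; the names here are illustrative rather than production code:

%macro varexist(ds=,var=);
%local dsid varnum rc;
%let varnum=0;
%let dsid=%sysfunc(open(&ds,in));
%if &dsid > 0 %then %do;
%let varnum=%sysfunc(varnum(&dsid,&var));
%let rc=%sysfunc(close(&dsid));
%end;
/* Emit 1 or 0 as the macro's return value */
%if &varnum > 0 %then 1;
%else 0;
%mend varexist;

%put Result: %varexist(ds=sashelp.class,var=age);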

Surveying changes coming in GNOME 3.10

20th October 2013

GNOME 3.10 came out last month, but it took until its inclusion in the Arch and Antergos repositories for me to see it in the flesh. Apart from the risk of instability, this is the sort of thing at which rolling distributions excel: they can give you a chance to see the latest software before it is included anywhere else. For the GNOME desktop environment, it otherwise might have meant awaiting the next release of Fedora in order to glimpse what is coming. Ubuntu GNOME seems to be sticking with the release behind the latest version, and with many GNOME Shell extension writers not updating their extensions until Fedora has caught up with the latest stable release of GNOME, this is no bad thing; it means that a version of the desktop environment has been well bedded in by the time it reaches the world of Ubuntu too. Debian takes this even further by using a stable version from a few years ago, and there is an argument in favour of that from a solidity perspective.

Being in the habit of kitting out GNOME Shell with extensions, I have a special interest in seeing which ones still work, or could work with a little tweaking, and which have fallen from favour. In the top panel, the major change has been to replace the sound and user menus with a single aggregate menu. The user menu in particular has been in receipt of the attentions of extension writers, and their efforts either need re-work or dropping after the latest development. The GNOME project seems to have picked up an annoying habit from WordPress in that the GNOME Shell API keeps changing and breaking extensions (plugins in the case of WordPress). There is one habit from WordPress that needs copying though, and that is documentation, especially of that API, for it is hardly anywhere to be found.

GNOME Shell theme developers don't escape either, and a large border appeared around the panel when I used Elementary Luna 3.4, so I turned to XGnome Enhanced (found via GNOME-Look.org) instead. The former no longer is being maintained since the developer no longer uses GNOME Shell and has not got the same itch to scratch; maybe someone else could take it over, because it worked well enough until 3.8. So far, the new theme works for me, so that will be an option should there be a move to GNOME 3.10 on one of my PCs at some point in the future.

Returning to the subject of extensions, I had a go at seeing how the included Applications Menu extension works now, since it wasn't the most stable of items before. That has improved and it looks very usable too, so I am not awaiting the updating of the Frippery equivalent. That the GNOME Shell backstage view has not moved on much from how it was in 3.8 could be seen as a disappointment, but the workaround will do just fine. Aside from the Frippery Applications Menu, there are other extensions that I use heavily that have yet to be updated for GNOME Shell 3.10. After a spot of success ahead of a possible upgrade to Ubuntu GNOME 13.10 and GNOME Shell 3.8 (though I remain with version 13.04 for now), I decided to see if I could port a number of these to the latest version of the user interface. Below, you'll find the results of my labours, so feel free to make use of these updated items if you need them before they are updated on the GNOME Shell Extensions website:

Frippery Bottom Panel

Frippery Move Clock

Remove App Menu

Show Desktop

There have been more changes coming in GNOME 3.10 than GNOME Shell, which essentially is a JavaScript construction. The consolidation of application title bars in GNOME applications continues, but a big exit button has appeared in the affected applications that wasn't there before. Also, there remains the possibility of applying the previously shared modifications to Nautilus (also known as Files), and a number of these usefully extend themselves to other applications such as Gedit too. Speaking of Gedit, this gains a very useful x of y numbering for the string searching functionality, with x being the number of the occurrence of a certain piece of text in a file and y being its total number of occurrences. GNOME Tweak Tool has got an overhaul too and lost the setting that makes a folder path entry box appear in Nautilus instead of the location breadcrumb; opening Dconf-Editor, going to org > gnome > nautilus > preferences and completing the tick box for always-use-location-entry will do the needful.
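
The same preference should be settable from a terminal with gsettings too, assuming the usual schema path:

# Equivalent to ticking the box in Dconf-Editor
gsettings set org.gnome.nautilus.preferences always-use-location-entry true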

Essentially, the GNOME project is continuing along the path on which it set out a few years ago. Though I would rather that GNOME Shell were more mature, invasive changes are coming still, and it leaves me wondering if or when this might stop. Maybe that is the consequence of mounting a controversial experiment when users were happy with what was there in GNOME 2. The arrival of Fedora 20 should bring with it an increase in the number of GNOME Shell extensions that have been updated. So long as it remains stable, Antergos is a good way to have a look at the latest version of GNOME for now, and Cinnamon fans may be pleased that Cinnamon 2.0 is another desktop option for the Arch-based distribution. An opportunity to say more about that may arrive yet once the Antergos installer stops failing at a troublesome package download; a separate VM is being set aside for a look at Cinnamon, because it destabilised GNOME during a previous look.

Restoring GNU Parallel Functionality in Ubuntu GNOME 13.04

31st July 2013

There is a handy command line utility called GNU Parallel that allows you to run Linux commands on more than one CPU core at a time to perform parallel processing of the task at hand. Here is a form of the command that is similar to one that I often use:

ls *.* | parallel gm convert -sharpen 1x3 {} sharpened_images/{}

What it does is pipe a list of files in a folder to GraphicsMagick for sharpening and outputting to a sharpened_images directory. The {} in the command is where the filenames go in the sharpening command.

This worked fine in Ubuntu GNOME 12.10 but stopped doing so after I upgraded to the next version. A look on the web set me to running the following command:

parallel --version

That produced output that included the following line:

WARNING: YOU ARE USING --tollef. IF THINGS ARE ACTING WEIRD USE --gnu.

Rerunning the original command with the --gnu option worked but there was a more permanent solution than using something like this:

ls *.* | parallel --gnu gm convert -sharpen 1x3 {} sharpened_images/{}

That was editing /etc/parallel/config with root privileges to delete the --tollef option from there. With that completed, all was as it should be again, and it makes me wonder why the change was made in the first place. Perhaps because of it, there even is a discussion about the possibility of removing the --tollef option altogether, since it is raising more questions than it answers.
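
For what it's worth, a one-line way to make that edit, assuming the option sits on a line of its own in the file, might be:

# Remove any line mentioning --tollef from the configuration file
sudo sed -i '/--tollef/d' /etc/parallel/config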

A display of brand loyalty

12th July 2013

Since 2007, my main camera has been a Pentax K10D DSLR, and it has gone on many journeys with me. In fact, more than 15,000 images have been captured with it, and I have classed it as an unfailing servant. The autofocus may not be the fastest, but my subjects tend to be stationary: landscapes, architecture, flora and transport. Even any bus and train photos have included parked vehicles rather than moving ones, so there never have been issues. The hint of underexposure in any photos always can be sorted because DNG files are what I create, with all the raw capture information that it is possible to retain. In fact, it has been hard to justify buying another SLR because the K10D has done so well for me.

In recent months, I have been looking at processed photos and asking myself if time has moved on for what is not far from being a six-year-old camera. At various times, I have been looking at higher members of the Pentax range while wondering if an upgrade would be a good idea. First, there was the K7, and then the K5, before the K5 II got launched. Even though its predecessor is still to be found on sale, it was the newer model that became my choice.

Pentax K5-II

My move to Pentax in 2007 was a case of brand disloyalty, since I had been a Canon user from when I acquired my first SLR, an EOS 300. Even now, I still have a PowerShot G11 that finds itself slipped into a pocket many a time. Nevertheless, I find that Canon images feel a little washed out prior to post-processing, and that hasn't been the case with the K10D. In fact, I have been hearing good things about Nikon cameras delivering punchy results, so one of them would be a contender were it not for how well the Pentax has performed.

So, what has my new K5 II body gained me that I didn't have before? For one thing, the autofocus is a major improvement on that in the K10D. It may not stop me persevering with manual focusing for most of the time, but there are occasions when the option of solid autofocus is good to have. Other advances include a 16.3 megapixel sensor with a much larger ISO range. The advances in sensor technology since the K10D appeared may give me better quality photos, and noise is something that my eyes may have begun to detect in K10D photos, even at my usual ISO of 400.

There have been innovations that I don't need too. Live View is something that I use heavily with the PowerShot G11 because it has such a pitiful optical viewfinder. The K5 II has a very bright and sharp one, so that function lies dormant, especially after I witnessed dodgy autofocus performance with it in use; manual focusing should be OK, I reckon. By default too, the screen stays on all the time, and that's a nuisance for an optical viewfinder user like me, so I looked through the manual and the menus to switch off the thing. My brief flirtation with the image level display met an end for much the same reason, though it's good that it's there. There is some horizon auto-correction available as a feature, and this is left on to see what it offers, since there have been a multitude of times when I needed to sort out crooked horizons caused by my handholding the camera.

The K5 II may have a 3″ screen on its back, but that has done nothing to increase the size of the camera. If anything, it is smaller than the K10D, which usefully means that I am not on the lookout for a new camera holster. Not having a bigger body also means there is little change in how the camera feels in the hand compared with the older one.

In many ways, the K5 II works very like the K10D once I took control of settings that didn't suit me. Both have Shake Reduction in their camera bodies, though the setting has been moved into the settings menu on the new camera where the older one had a separate switch on its body. Since I'd be inclined to leave it on all the time and prefer not to have it knocked off accidentally, this is not an issue. Otherwise, many of the various switches are in the same places, so it's not that hard to find my way around them.

That's not to say that there aren't other changes, like the addition of a lock to the mode dial, but I have used Canon EOS camera bodies with that feature, so I do not consider it a step backwards. The exposure compensation button has been moved to the top of the camera, where I found it very easily and have been using it perhaps more than on the K10D; it's also something that I use on the G11, so that experimentation is being brought across to the K5 II now as well. Beside it, there's a new ISO button, so further experimentation can be attempted with that to see how it does.

If I have any criticism, it's about the clutter of the menus on the K5 II. The long lists through which you scrolled on the K10D have been replaced with a series of extra tabs, so that on-screen scrolling is not needed as before. However, I reckon that this breaks things up too much and makes working through the settings look more forbidding to anyone who is not so technical in mindset. Nevertheless, settings such as the type of file to capture are there, and I continue to use raw DNG files as is usual for me, though JPEG and Pentax's own raw format also are available. For a while, I forgot to set the date, but soon found out what I had done and remedied the situation. The same sort of thing applied to storing files in different folders according to the capture date; for my own reasons, I turned this off to put everything into a single PENTX directory to suit my own workflow. My latest discovery among the menus was the ability to add photographer and copyright holder information to the EXIF metadata attached to the image files created by the camera. With legislative proposals that dilute the automatic rights of copyright holders going through the U.K. parliament, this seems a very timely inclusion, even if most would prefer that there were no change to copyright law.

Of course, the worth of any camera is in the images that it produces, and I have been happy with what I have been getting so far. The bigger files mean that fewer images fit on a memory card than before; thankfully, SDHC card capacities have grown, even if I don't wish to machine-gun my photography altogether. While out and about, I was surprised to see apertures like F/14 and F/18 when I was more accustomed to a progression like F/11, F/13, F/16, F/19, F/22, etc. Most of those older values still are there though, so there hasn't been a complete break with convention. The same comment applies to shutter speeds, where ones like 1/100 and 1/160 made their appearance where I might have expected just ones like 1/90, 1/125, 1/250 and so on. The extra possibilities, and that is what they are, do allow more flexibility, I suppose, and may even make it easier to make correct exposures, though any judgement of correctness has to be in the eye of a photographer and not what a computer algorithm in a camera determines. For much of the time until now, I have stuck with an ISO of 400, apart from a little testing in a woodland area of an evening soon after the camera arrived.

Since the K5 II came my way a few months ago, I have been meaning to collect my thoughts on here, and there has been a delay while I brought my thinking to a sensible close. At one point, it felt like there was so much to say that the piece became larger in my mind than even what you have been reading now. After all, there are other things that I can adjust to see how the resulting images look, and white balance is but one of these. The K10D isn't beyond experimentation either, especially since I discovered that its shake reduction was switched off, and that has me asking if the lack of quality that I mentioned earlier has another explanation. Of course, actually making use of my tripod would be another good suggestion, so it's safe to say that yet more photographic explorations await.

Calculating geometric means using SAS

19th March 2012

Recently, I needed to calculate geometric means after a break of a number of years and ended up racking a weary brain until it all came back to me. So that I am not in the same situation again, I am recording it here, and sharing it always is good too.

The first step is to take the natural log (to base e, the mathematical constant whose irrational value is approximately 2.718281828) of the actual values in the data. Since you cannot take the log of zero, the solution is either to exclude those values or to substitute a small value that will not affect the overall result, as is done in the data step below. In SAS, the LOG function uses the number e as its base, and you need the LOG10 equivalent when base 10 logs are required.

data temp;
set temp;
/* Zero values get a small substitute since log(0) is undefined */
if result=0 then _result=0.000001;
else _result=result;
/* LOG is the natural (base e) logarithm in SAS */
ln_result=log(_result);
run;

Next, the mean of the log values is determined, and you can use any method of doing that so long as it gives the correct values. PROC MEANS is used here with the NOPRINT option because it is an output data set that is wanted, not printed output. PROC SUMMARY (identical to MEANS except that it defaults to data set creation while MEANS generates printed output by default), PROC UNIVARIATE or even the MEAN function in PROC SQL would do the job just as well.

proc means data=temp noprint;
output out=mean mean=mean;
var ln_result;
run;

With the mean of the log values obtained, we need to take the exponential of the obtained value(s) using the EXP function. This returns values of the same magnitude as in the original data, using the formula e^mean.

data gmean;
set mean;
gmean=exp(mean);
run;
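
For what it's worth, the three steps can be condensed into a single PROC SQL query; a minimal sketch, assuming the same TEMP dataset and excluding zero values rather than substituting for them:

proc sql;
/* The exponential of the mean of the logs is the geometric mean */
select exp(mean(log(result))) as gmean
from temp
where result > 0;
quit;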

ERROR 22-322: Syntax error, expecting one of the following: a name, *.

14th June 2010

This is one of the classic SAS errors that you can get from PROC SQL, and it can be thrown by a number of things. Missing out a comma in a list of variables on a SELECT statement is one situation that will do it, as will having an extraneous one. As I discovered recently, an ill-defined SAS function nesting like LEFT(TRIM(PERIOD,BEST.)) will have the same effect; notice the missing PUT function in the example. The latter surprised me because I might have expected something more descriptive, as would be the case in data step code. In the event, it took some looking before the problem hit me, because it's amazing how blind you can become to things that are staring you in the face. Familiarity really can make you pay less attention.
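
To illustrate, here is a hedged reconstruction of the sort of query involved; the dataset and variable names are invented for the example:

proc sql;
/* Broken: TRIM is handed two arguments, triggering ERROR 22-322 */
/* select left(trim(period, best.)) as period_c from periods; */

/* Corrected: PUT converts the numeric value to text first */
create table periods_c as
select left(trim(put(period, best.))) as period_c
from periods;
quit;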
