An avalanche of innovation?
23rd September 2010
It seems that, almost despite the uncertain times or maybe because of them, this feels like an era of change on the technology front. Computing is the domain of many of the postings on this website, and a hell of a lot seems to be going mobile at the moment. For a good while, I managed to stay clear of the attractions of smartphones until a change of job convinced me that having a BlackBerry was a good idea. Though the small size of the thing really places limitations on the sort of web surfing experience that you can have with it, you can keep an eye on the weather, news, traffic, bus and train times, so long as the website in question is built for mobile browsing. Otherwise, browsing becomes more of a nuisance than even a patchy phone network (in the U.K., T-Mobile could do better on this score, as I have discovered for myself; thankfully, a merger with the Orange network is coming next month).
Speaking of mobile websites, it almost feels as if a free-for-all has returned for web designers. Just when the desktop and laptop computing situation had more or less stabilised, along came a whole pile of mobile phone platforms to make things interesting again. Familiar names like Opera, Safari, Firefox and even Internet Explorer are popping up on handheld devices these days, along with less familiar ones like Web 'n' Walk or BOLT. The operating system choices vary too, with iOS, Android, Symbian, Windows and others all competing for attention. It is the sort of flowering of innovation that makes one wonder if a time will come when things begin to consolidate, but it doesn't look like that at the moment.
The transformation of mobile phones into handheld computers isn't the only big change in computing, with the traditional formats of desktop and laptop PCs being flexed in all sorts of ways. First, there's the appearance of netbooks, and I have succumbed to the idea of owning an Asus Eee. Though you know that these are not full-size laptops, it didn't hit me just how small they are until I owned one. They are undeniably portable, while tablets look even more interesting in the aftermath of Apple's iPad. Though you may call them over-sized mobile photo frames, the idea of making a touchscreen do the work has made the concept fly for many. Even so, I cannot say that I'm overly tempted, though I have said that before about other things.
Another area of interest for me is photography, and it is around this time of year that all sorts of innovations are revealed to the public. It's a long way from what we thought was the digital photography revolution, when digital imaging sensors started to take the place of film in otherwise conventional compact and SLR cameras, making the former far more versatile than they used to be. Now, we have SLD cameras from Olympus, Panasonic, Samsung and Sony that eschew the reflex mirror and prism arrangement of an SLR in favour of a digital sensor and electronic viewfinder, while offering lens interchangeability and better quality than might be expected from such small cameras. Lately, Sony has offered SLR-style cameras with translucent mirror technology instead of the conventional mirror that is flipped out of the way when an image is captured. Change doesn't end there, with movie-making capabilities being part of the tool set of many a newly launched compact, SLD and SLR camera. The pixel race also seems to have slowed, though increases still happen, as with the Pentax K-5 and Canon EOS 60D (both otherwise conventional offerings that have caught my eye, though so much comes on the market at this time of year that waiting is better for the bank balance).
The mention of digital photography brings to mind the subject of digital image processing, and Adobe Photoshop Elements 9 has just been announced after Photoshop CS5 appeared earlier this year. It almost feels as if a new version of Photoshop or its consumer cousin is released every year, causing me to skip releases when I don't see the point. Elements 6 and 8 were such versions for me, so I'll be in no hurry to upgrade to 9 either, even if the prospect of using content-aware filling to eradicate unwanted objects from images is tempting. Nevertheless, that shouldn't stop anyone trying to exclude them in the first place. In fact, I may need to reduce the overall number of images that I collect in favour of coming away with only the better ones. The outstanding question is whether I can slow down and calm my eagerness to bring at least one good image away from an outing, instead of capturing anything that seems promising at the time. Being a little more choosy may take some experimentation, but it can save work later on.
While back on the subject of software, I'll venture into the world of the web before bringing these meanderings to a close. It almost feels as if web-based applications follow web-based applications these days, when Twitter and Facebook have nearly become household names and cloud computing is a phrase that turns up all over the place. In fact, the former seems to have encouraged a whole swathe of applications all by itself. Applications written using technologies well used on the web must fill many a mobile phone app store too, and that brings me full circle, for it is these that put so much functionality on our handsets, with Java seemingly powering those that I use on my BlackBerry. Then there's the spat between Apple and Adobe regarding the former's stance on Flash.
To close this mental amble, there may be technologies that didn't come to mind while I was pondering this piece, but they doubtless enliven the technological landscape too. However, what I have described is enough to take me back more than ten years, to when desktop computing and the world of the web were a lot more nascent than is the case today. The changes that were ongoing then feel a little exciting now that I look back on them, and it does feel as if the same sort of thing is recurring, though with things like phones creating the interest in place of new developments in desktop computing, such as a new version of Windows (though 7 was anticipated after Vista). Web designers may complain about a lack of standardisation, and they're not wrong, yet this may be an era of technological change that in time comes to be remembered with its own fondness too.
Exploring the option of mobile broadband
20th September 2010
Last week, I decided to buy and experiment with a Vodafone PAYG mobile broadband dongle (the actual device is a ZTE K3570-Z), partly as a backup for my usual broadband (it has had its moments recently) and partly to allow me to stay more connected while on the move. Thoughts of blogging and checking up on email or the real-time web while travelling to and from different places must have swayed me.
Hearing that the device was meant for use with Windows or OS X had me attempting to hook it up to Windows 7 running within a VirtualBox virtual machine on my main home computer. When that proved too big a request for the software setup, I went googling out of curiosity and found that there was a way to get the thing going with Linux. While I am not so sure that it works with Ubuntu without any further adjustments, downloading a copy of the Sakis3G script was enough to do the needful, and I was online from my main OS after all. So much for what is said on the box...
More success was had with Windows 7 as loaded on my Toshiba Equium notebook, with setup and connection being as near to effortless as these things can be. Ubuntu is available on there too, courtesy of Wubi, and the Sakis3G trick didn't fail for that either.
That's not to say that mobile broadband doesn't have its limitations, as I found. For instance, Subversion protocols and Wubi installations aren't supported, though that may be more a result of the lack of IPv6 support than anything else. Nevertheless, connection speeds are good as far as I can see, though I have yet to test the persistence of Vodafone's network while constantly on the move. Having seen how flaky T-Mobile's network can be in the U.K. as I travel around using my BlackBerry, that is something that needs doing, yet all seems painless enough so far. The fact that Vodafone uses the more usual mobile phone frequencies may be a help too.
Creating a Data Set Containing Confidence Intervals Using PROC UNIVARIATE
5th September 2010
While you could generate data sets containing means and confidence intervals using PROC SUMMARY or PROC MEANS, curiosity and the need to verify a program using a different technique were what drove me to consider using PROC UNIVARIATE for the task. For the record, the PROC SUMMARY code is below, and the only difference between it and MEANS is that it doesn't produce printed output by default, something that's not needed in this case anyway. Quite why there are two SAS procedures doing the same thing is beyond me, though I do wonder if the NOPRINT option was a later addition than the two procedures themselves. The LCLM and UCLM keywords are what trigger the calculation of confidence limits, and the ALPHA option controls the confidence interval used: 0.05 specifies a 95% interval, 0.1 a 90% one and so on.
proc summary data=sashelp.class mean lclm uclm alpha=0.05;
var age;
output out=sasuser.lims mean=mean lclm=lclm uclm=uclm;
run;
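For comparison, here is a rough sketch of the equivalent PROC MEANS call; the NOPRINT option is what suppresses the printed table that MEANS would otherwise produce, and the output data set name is just an arbitrary choice for illustration.
/* Equivalent call using PROC MEANS; NOPRINT suppresses the printed output */
proc means data=sashelp.class noprint mean lclm uclm alpha=0.05;
var age;
output out=sasuser.lims2 mean=mean lclm=lclm uclm=uclm;
run;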
Given that I have had PROC UNIVARIATE producing statistics that MEANS/SUMMARY didn't in previous versions of SAS (I believe that it was the standard deviation that was absent from MEANS/SUMMARY), I might have expected the calculation and export of confidence limits to a data set to be straightforward. Sadly, it's not a case of simply adding LCLM and UCLM keywords to the OUTPUT statement for the procedure; ODS OUTPUT is needed to create the data set instead. An ODS SELECT statement is needed to pick out the BasicIntervals output object (UNIVARIATE creates quite a few, it seems), which is produced when the CIBASIC and ALPHA options (the latter performing the same role as it does for PROC MEANS/SUMMARY) are specified on the PROC UNIVARIATE statement. The reason for the ODS LISTING and ODS RTF statements below is to stop output being sent to the output window in a standard SAS session. For some reason, it appears that output needs to be sent to one of the LISTING, HTML or RTF destinations or there will be no data in the data set; I met the same behaviour when using ODS PS, an ODS PRINTER destination. The data set will contain statistics for the mean, standard deviation and variance, which is why there is a WHERE clause on the ODS OUTPUT statement.
ods listing close;
ods rtf body="c:\temp\uni_eg.doc";
ods select BasicIntervals;
ods output BasicIntervals=sasuser.stats(where=(lowcase(parameter)="mean") );
proc univariate cibasic alpha=0.05 data=sashelp.class;
var age;
run;
ods output close;
ods rtf close;
ods listing;
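If, like me, you like to check what a newly created data set actually contains before relying on it, a quick look with PROC CONTENTS and PROC PRINT will confirm the variable names that the BasicIntervals object supplies (I would expect something along the lines of Parameter, Estimate and the lower and upper limits, but verifying beats assuming).
/* Inspect the structure and the rows of the data set created by ODS OUTPUT above */
proc contents data=sasuser.stats;
run;
proc print data=sasuser.stats;
run;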
Using ODS Graphics to Create Plots Using PROC LIFETEST
3rd September 2010
One of the nice things about SAS 9.2 is that the creation of statistical graphics is enhanced using the Output Delivery System (ODS). One of the beneficiaries of this is PROC LIFETEST, a procedure that gained a lot when data sets could be created from it using ODS OUTPUT statements. Before that, it was a matter of creating text output and converting it to a SAS data set using the Data Step, and that was a nuisance on a system that attached special significance to output destinations set up using PROC PRINTTO. What you'll find below is a sample of the type of code for creating a Kaplan-Meier survival plot for time to adverse events resulting in discontinuation of study treatment, with actual and censored times. The IMAGENAME parameter on the ODS GRAPHICS statement controls the name of the image file, and it is possible to change the file type using the IMAGEFMT parameter too.
ods graphics on / imagename="fig5";
proc lifetest data=km3 method=km plots=survival;
time timetoae*cens_ae(0);
run;
ods graphics off;
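As a follow-on, here is a sketch of the same call with the image format set explicitly and the survival estimates captured in a data set via ODS OUTPUT; it assumes the same km3 data set and variables as above, with ProductLimitEstimates being the output object that holds the Kaplan-Meier estimates.
ods graphics on / imagename="fig5" imagefmt=png;
/* Capture the Kaplan-Meier estimates in a data set as well as drawing the plot */
ods output ProductLimitEstimates=sasuser.km_est;
proc lifetest data=km3 method=km plots=survival;
time timetoae*cens_ae(0);
run;
ods output close;
ods graphics off;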
On Making PROC REPORT Work Harder
1st September 2010
In the early years of my SAS programming career, there seemed to be just the one procedure to use if you wanted to create a summary table. That was TABULATE, and it was great for generating columns according to the value of a variable, such as the treatment received by a subject in a clinical study. To a point, it could generate statistics for you too, and I often used it to sum frequency and percentage variables. Since then, it seems to have been enhanced a little, and it surprised me with the statistics that it could produce when I had a recent play. Here's the code:
proc tabulate data=sashelp.class;
class sex;
var age;
table age*(n median*f=8. mean*f=8.1 std*f=8.1 min*f=8. max*f=8. lclm*f=8.1 uclm*f=8.1),sex
/ misstext="0";
run;
When you compare that with the idea of creating one variable per column and then defining them in PROC REPORT, as many do, it looks more elegant and the results aren't bad either, though they can be tweaked further from the quick example that I generated. That last comment brings me to the point that PROC REPORT seems to have taken over from TABULATE wherever I care to look these days, and I do ask myself whether it is the right tool for the jobs to which it is being put, or whether it is being used in the best way.
While using the Data Step to create one variable per column in a PROC REPORT output doesn't strike me as the best way to write reusable code, there are ways to make PROC REPORT do more for you. For example, by defining GROUP, ACROSS and ANALYSIS columns in an output, you can persuade the procedure to do the summarising for you, and there's some example code below, with the comma nesting height under sex in the resulting table. Sums are created by default if you do this, and forgoing an analysis column definition means that you get a frequency table, not at all a useless thing in numerous instances.
proc report data=sashelp.class nowd missing;
columns age sex,height;
define age / group "Age";
define sex / across "Sex";
define height / analysis mean f=8.1 "Mean Height";
run;
For those times when you need to create more heavily formatted statistics (summarising a range as min-max rather than showing min and max separately, for example), you might feel that the GROUP/ACROSS set-up's non-display of character values puts a stop to using that approach. However, I found that making every value combination unique and attaching a cell ID helps to work around the problem. Then, you can create a format control data set from the data, as in the code below, and create a format from that, which you can apply to the cell IDs to display things as you need them. This method makes things more portable from situation to situation than adding or removing columns depending on the values of a classification variable.
proc sql noprint;
create table cntlin as
select distinct "fmtname" as fmtname, cellid as start, cellid as end, decode as label
from report;
quit;
proc format lib=work cntlin=cntlin;
run;
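To round off the idea, here is a minimal sketch of how the resulting format might be applied; it assumes a report data set with one row per combination of a statistic label and sex, a numeric cellid unique to each combination, a character decode holding the display text, and cellfmt being the format created above. With only one row per cell, the sum of cellid is just the cellid itself, and the format turns it into the text that you want to see.
/* Sketch under the assumptions above: statistic, sex, cellid and decode are illustrative names */
proc report data=report nowd missing;
columns statistic sex,cellid;
define statistic / group "Statistic";
define sex / across "Sex";
define cellid / analysis sum format=cellfmt. " ";
run;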
A look at Emacs
10th August 2010
It's remarkable what work can bring your way in terms of technology. For me, (GNU) Emacs has proved to be such a thing recently. It has been around in one form or another since the mid-1970s, long before my adventures in computing ever started, in fact, but I am asking myself why I never really used it much. There are vague recollections of my being aware of its existence in the early days of my using UNIX over a decade ago. Was it a shortcut card with loads of seemingly esoteric keyboard shortcuts and commands that put me off it back then? The truth may have been that I got bedazzled with the world of Microsoft Windows instead, and so began a distraction that lingered until very recently. As unlikely as it looks now, Word and Office would have been part of the allure of what some consider the dark side these days. Oh, how OpenOffice.org and its ilk have changed that state of affairs...
The unfortunate part of the Emacs story might be that its innovations were never taken up as conventions by mainstream computing. If its counterparts elsewhere used the same keyboard shortcuts, it wouldn't feel like learning such an unfamiliar tool. Still, it's not as if there isn't logic behind it, because it will work both in a terminal session (where I may have met it for the first time) and as a desktop GUI application. The latter is the easier to learn, and the menus list the equivalent keyboard shortcuts for many of their entries too. For a fuller experience though, I can recommend the online manual, and you can buy it in paper form too if you prefer.
One thing that I discovered recently is that external factors can sour the impressions of a piece of software. For instance, I was using a UNIX session where the keyboard mapping wasn't optimal. There's nothing like an unfamiliar behaviour for throwing you off track, because you feel that your usual habits are being obstructed. Finding that a Backspace key is behaving like a Delete one is such an obstruction. It wasn't the fault of Emacs, and I have found that using Ctrl+K (C-k in the documentation) to kill the rest of a line is invaluable.
Apart from keyboard mapping niggles, Emacs has to be respected as a powerful piece of software in its own right. It may not have the syntax highlighting capabilities of some, like gedit or NEdit for instance, but I have a hunch that a spot of Lisp programming would address that need. What you get instead is support for version control systems like RCS or CVS, along with integration with GDB for debugging programs written in a number of languages. Then, there are features like file management, email handling, newsgroup browsing, a calendar and a calculator that make you wonder if someone tried to turn a text editor into something like an operating system. With Google trying to use Chrome as the basis of one, it almost feels as if Emacs was ahead of its time, though that may have been more due to its needing to work within a UNIX shell in those far-off pre-GUI days. It really is saying something that it has stood the test of time when so much has fallen by the wayside. Like Vi, this estimable piece of software is showing no signs of going away just yet. Maybe it was well designed in the beginning, and it certainly is more than a text editor with all of its extras. Well, it must offer good reasons for making its way into Linux distributions too...
On web browsers for BlackBerry devices
8th August 2010
The browser with which my BlackBerry Curve 8520 came is called Web 'n' Walk and, while it does have its limitations, it works well enough for much of what I want to do. Many of the sites that I wish to visit while away from a PC have mobile versions that are sufficiently functional for much of what I need to do. Names like GMail, Google Reader, Met Office and National Rail come to mind here, and the first two are regularly visited while on the move. They work well to provide what I need too. Nevertheless, one of the things that I have found with mobile web browsing is that I am less inclined to follow every link that might arouse my interest. Sluggish response times might have something to do with it, but navigating the web on a small screen is more work too. Therefore, I have been taking a more functional approach to web usage on the move, rather than the more expansive one that tends to happen on a desktop PC.
For those times when the default browser was not up to the task, I installed Opera Mini. It certainly has come in very useful for keeping an eye on the Cheshire East bus tracker and for looking at websites without mobile versions when I decide to look at such things. Downloading any of these does take time, and there's the reality of navigating a big page on a small screen. However, I have discovered that the browser has an annoying tendency to crash, which it did on one occasion while I was awaiting a bus. The usual solution, rightly or wrongly, has been to delete the thing and reinstall it, with the time and device restarts that entails. While I got away with it once, it seems to mean losing whatever bookmarks or favourites you have set up too, a real nuisance. Because of this, I am not going to depend on it as much any more. Am I alone in experiencing this type of behaviour?
Because of Opera's instability, I decided to seek alternative approaches. One of these was to set up bookmarks for the aforementioned bus tracker in Web 'n' Walk. What is delivered is the WAP version of the site, and it's not that user-friendly at all. When it comes to selecting a bus stop to monitor, it asks for a stance number and, were it not for my own nous, I wouldn't have been able to find the IDs that I needed. That's not brilliant, but I worked around it to make things work for me. The observation is one for those who design mobile versions of websites for public use.
Another development is the discovery of the Bolt Browser and, so far, it seems a worthy alternative to Opera Mini too. There are times when it lives up to the promise of faster web page loading, but that is dependent on the strength of the transmission signal. A trial with the Met Office website showed it to be capable, though there were occasions when site navigation wasn't as smooth as it could have been. Up to now, there have been no crashes like those that afflicted Opera Mini, so it looks promising. If there is any criticism, it is that it took me a while to realise how to save favourites (or bookmarks). While the others that I have used have a button on the screen for doing so, Bolt needs you to use the application menu. Other than that, the software seems worthy of further exploration.
All in all, surfing the mobile web remains an area of continued exploration for me. Having found my feet with it, I remain on the lookout for other web browsers for the BlackBerry platform. While it is true that OS 6 features a WebKit-powered browser, I'm not buying another device just to find out how good that is. What I am after are alternatives that work on the device that I have. Though a port of Firefox's mobile edition would be worthwhile, its availability seems to be limited to Nokia's handsets for now. Only time will reveal where things are going.
Changing Outlook usage habits
2nd August 2010
Given that I have been using it for so long, I shouldn't be discovering new things in Outlook. However, there is one thing that I have been doing for years: leaving messages set as unread until I have dealt with them. Now that I look at it, it seems a terrible habit compared with an alternative that I recently found.
Quite why I haven't been flagging messages for follow-up instead is beyond me. Is it because I worked with Outlook 2000 at my place of work for so long, and the arrival of Outlook 2007 into my life wasn't sufficient to force a change of habits? In fact, it has taken a downgrade to Outlook 2003 to make it dawn on me; it was the sight of a search folder for messages marked for follow-up that triggered the realisation.
Speaking of old habits, there is another that I'll be dropping: setting up loads of rules, allegedly for organising messages. Given that they were the cause of my missing emails quite a few times, it's one more nuisance that needs to be left behind.
A little thing with Outlook
24th July 2010
When you start working somewhere new, as I have done, various software settings that you had at your old place of work don't automatically come with you, leaving you to scratch your head as to how you had things working that way in the first place. That's how it was with the Outlook set-up on my new work PC. It was marking messages as read the first time that I selected them, and I was left wondering how to set things up as I wanted them.
From the menus, it was a matter of going to Tools > Options and poking around the dialogue box that was summoned. What was then needed was to go to the Other tab and click on the Reading Pane button. That action produced another dialogue box with a few check boxes on there. My next step was to clear the one with this label: Mark item as read when selection changes. There's another tick box that I left unchanged, Mark items as read when viewed in Reading Pane, but that's inactive by default anyway.
From my limited poking around, these points are as relevant to Outlook 2007 as they are to the version that I have at work, Outlook 2003. Going further back, it might have been the same with Outlook 2000 and Outlook XP too. While I have yet to try Outlook 2010, the settings should be in there as well, though the Ribbon interface might have placed them somewhere different. It might be interesting to see if a big wide screen like the one that I now use at home would be as useful to the latest version as it is to its immediate predecessor.
Worth the attention?
21st July 2010
The latest edition of Web Designer has features and tutorials on modern trends, including new ways to use fonts and typography in websites. One thing that's at the heart of the attention is the @font-face CSS rule. It's what allows you to break away from the limitations of whatever fonts your visitors might have on their PCs and use something made available remotely.
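As a rough illustration of how it works, a declaration along the following lines tells the browser to fetch a font file and makes it available under a name that can then be used like any other font; the font name and URL here are hypothetical placeholders rather than anything from the magazine piece.
/* Hypothetical example: "Example Sans" and its URL stand in for whatever font you choose to serve */
@font-face {
  font-family: "Example Sans";
  src: url("http://www.example.com/fonts/example-sans.ttf") format("truetype");
}
body {
  /* Fall back to local faces if the download is slow or fails */
  font-family: "Example Sans", Arial, sans-serif;
}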
In principle, that sounds like a great idea, yet there are caveats. The first of these is browser support for @font-face in the first place, though the modern browsers that I have tested handle it reasonably well. These include the latest versions of Firefox, Internet Explorer, Opera and Chrome. While the new fonts may render fine, there's a short delay in the full loading of a web page. With Firefox, the rendering seems to treat the process like an interlaced image, so you may see fonts from your own PC before the remote ones come into place, a not too ideal situation in my opinion. I have also found that this is more noticeable on the Linux variant of the browser than on its Windows counterpart. Loading a page that is predominantly text is another scenario where you'll see the behaviour more clearly; having a sizeable image file loading seems to make things less noticeable. Otherwise, you may see a short delay to the loading of a web page because the fonts have to be downloaded first. Opera is a particular offender here, with IE8 loading things rather quickly and Chrome not being too bad either.
In the main, I have been using Google's Font Directory but, in the interests of supposedly getting a better response, I tried using font files stored on a test web server, only to discover that there was more of a lag that way. While I do not know what Google has done with its set-up, using its font delivery service appears to deliver better performance in my testing, so it'll be my choice for now. Though there's Typekit too, I'll be hanging onto my money in the light of my recent experiences.
After my brush with remote font loading, I am inclined to wonder if the current hype about fonts applied using @font-face is deserved, at least until browsers get better and faster at loading them. As things stand, they may be better than before, but the jury's still out for me, with Firefox's rendering being a particular irritant. Of course, things can get better...