Technology Tales

Adventures in consumer and enterprise technology

TOPIC: MICROSOFT

Ditching PC Plus?

28th June 2007

When I start to lose interest in the features in a magazine that I regularly buy, it's only a matter of time before I stop buying it altogether. Such a predicament now faces PC Plus, a magazine that I have bought every month for the last ten years. That fate has already befallen titles like Web Designer, Amateur Photographer and Trail, all of which I now buy only sporadically.

Returning to PC Plus, I get the impression that it feels more of a lightweight these days. Over the last decade, Future Publishing has been adding titles to its portfolio that actually take away from its long-established stalwart. Linux Format and .Net are two that come to mind, while there are titles covering Windows Vista and computer music as well. In short, there may be sense in having just a single title for all things computing.

Being a sucker for punishment, I did pick up this month's PC Plus, only for the issue to be as good an example of the malaise as any. Reviews, once a mainstay of the title, are now less prominent than they were. In place of comparison tests, we now find discussions of topics like hardware acceleration, with some reviews mixed in. Topics such as robotics and artificial intelligence do rear their heads in feature articles, though I cannot say that I have a great deal of time for such futurology. The section containing tutorials remains, even if it has been hived off into a separate mini-magazine, and it seemingly fails to escape the drift towards lightness either.

All this is leading me to dump PC Plus in favour of PC Pro from Dennis Publishing. The latter feels reassuringly more heavyweight and, while its basic format has remained unchanged over the years, it still manages to remain fresh. Reviews, of both software and hardware, are very much in evidence, while it also carries those value-adding feature articles; this month, digital photography and rip-off Britain come under the spotlight. Add the Real World Computing section, and it all makes for a good read in these times of behemoths like Microsoft, Apple and Adobe delivering new things on the technology front. While I don't know if I am the one who has changed, PC Pro does seem better than PC Plus these days.

IE6 and JavaScript performance

22nd June 2007

Having been exposed to an application at work that uses a lot of JavaScript, I fully appreciate what some mean when they discuss IE6's inefficient handling of JavaScript. After you have seen a web page take an age to reload and your CPU take a hammering because of JavaScript processing, the penny does tend to drop...

Needless to say, this very much impacts the world of AJAX-driven web applications with their heavy dependence on client-side JavaScript. While IE7 does come to the rescue, there remain plenty of IE6 users out there, and this is reflected in website statistics. It demonstrates a certain level of inertia in the browser market that afflicts not only the uptake of IE7 but also the likes of Mozilla, Opera and Safari. It also means that anyone developing AJAX applications very much needs to continue testing in IE6, especially if the product of their labours is for wider public use.

An example of such an application is Zimbra, an open-source web application for messaging and collaboration, and the people behind it have generously shared the results of their browser performance benchmarking. They did comparisons of IE6 vs. IE7 and Firefox 2 vs. IE7. IE6 easily came out as the worst of these, while Firefox 2 was the best.

The next question to be asked could centre around the type of code that is processed inefficiently by IE6. While I wouldn't be at all surprised if a list emerged, here's one candidate: using Microsoft's proprietary innerHTML property to update the DOM of a web page. A quick trawl on Google brings it up as a cause of memory leaks. It is also a Microsoft innovation that never got taken up by those overseeing web standards, hardly a surprise since a spot of DOM scripting achieves the same end. It may be faster to code than any alternatives, and it does have some support from other browsers, but it does seem to have acquired a bad name, so it should be avoided if possible. That said, it would be interesting to see a performance comparison between innerHTML and DOM methods in IE6.
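
For anyone curious enough to run that comparison themselves, a rough sketch of the two approaches might look something like the following; the element id and list content are invented purely for illustration, and wrapping each call in a pair of new Date() timings would give a crude benchmark.

// Two ways of adding 1,000 list items to an existing <ul id="results"> element.
// Both the element id and the item text are hypothetical.

// Approach 1: build a string and assign it to innerHTML in one go.
function fillWithInnerHTML() {
    var html = "";
    for (var i = 0; i < 1000; i++) {
        html += "<li>Item " + i + "</li>";
    }
    document.getElementById("results").innerHTML = html;
}

// Approach 2: standard DOM scripting with createElement and appendChild.
function fillWithDOM() {
    var list = document.getElementById("results");
    for (var i = 0; i < 1000; i++) {
        var item = document.createElement("li");
        item.appendChild(document.createTextNode("Item " + i));
        list.appendChild(item);
    }
}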

Exploring AJAX

7th June 2007

My online photo gallery started out simply as a set of interlinked HTML pages. Over time, I discovered frames (yes, them!) and started to make use of JavaScript to make the slideshows slicker. In those days, I was working off free webspace provided by my ISP, and client-side scripting was the only tool that I had for enhancing functionality. Having tired of the vagaries of client-side scripting while the browser wars were in full swing and incompatibilities reigned supreme, I went with paid hosting to get access to tools like Perl and PHP for server-side processing. Because their flexibility compared to JavaScript was a breath of fresh air to me, I remain a fan of the server-side approach.

The journey that I have just described is one that I now know was followed by many website builders around the same time. Nevertheless, I have still held on to JavaScript for some things, particularly for updating the DOM as part of making the pages more responsive to user interaction. In the last few years, a hybrid approach has been gaining currency: AJAX. This offers the ability to modify parts of a page without needing to reload the whole thing, generating a considerable amount of interest among web application developers.

The world of AJAX is evidently a complex one, though the underlying principle can be explained in simple terms. The essential idea is that you use JavaScript to call a server-side script (PHP is as good an example as any) that returns either text or XML, which can then be used to update part of a web page in situ without reloading the whole thing, as the traditional way of working would require. It has opened up so many possibilities from the interface design point of view that AJAX became a hot topic that still receives much attention today. One bugbear is efficiency, because I have seen an AJAX application lock up a PC with a little help from IE6. There will always remain times when server-side processing is the best route, even if that needs to be balanced against the client-side approach.
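
To make that idea concrete, here is a minimal sketch of the pattern in plain JavaScript; the latest-news.php script and the news element are hypothetical, and the cross-browser wrinkles around creating the request object are glossed over for now.

// Fetch a fragment of text from a server-side script and drop it into the page.
// The URL and the element id are made up for illustration.
function updateNews() {
    var xhr = new XMLHttpRequest();  // IE6 needs an ActiveX object instead (see below)
    xhr.onreadystatechange = function () {
        if (xhr.readyState === 4 && xhr.status === 200) {
            document.getElementById("news").innerHTML = xhr.responseText;
        }
    };
    xhr.open("GET", "latest-news.php", true);
    xhr.send(null);
}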

Like its forebear DHTML, AJAX is really a development approach that uses a number of different technologies in combination. The DHTML elements such as (X)HTML, CSS, the DOM and JavaScript are very much part of the AJAX world, but server-side elements such as HTTP, PHP, MySQL and XML are also very much part of the fabric of the landscape. In fact, while AJAX can use plain text as the transfer format, XML is the one implied by the AJAX acronym, and XSLT can be used to transform XML into HTML. However, AJAX is not limited to the aforementioned technologies; for instance, I cannot see why Perl could not play a role in place of PHP or ASP, both of which can be used for the same things.
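
As an aside on the transfer formats, the sketch above could just as easily consume XML: if the hypothetical PHP script sent its data back with a text/xml content type, the onreadystatechange callback might read responseXML instead of responseText and pull values out with ordinary DOM methods. The headline element name here is, again, invented for illustration.

// Drop-in replacement for the earlier callback, for an XML response
// shaped something like <news><headline>...</headline></news>.
xhr.onreadystatechange = function () {
    if (xhr.readyState === 4 && xhr.status === 200) {
        var items = xhr.responseXML.getElementsByTagName("headline");
        var html = "";
        for (var i = 0; i < items.length; i++) {
            html += "<p>" + items[i].firstChild.nodeValue + "</p>";
        }
        document.getElementById("news").innerHTML = html;
    }
};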

Even in these standards-compliant days, browser support for AJAX remains diverse, to say the least, and it is akin to having MSIE in one corner and the rest in the other. Mind you, Microsoft did introduce the underlying tools in the first place, only for them to depend on ActiveX, while Mozilla created a new native object type rather than continue with that method of operation. Given that ActiveX is a Windows-only technology, I can see why Mozilla did what it did, and it was a sensible decision. In fact, IE7 appears to have picked up the Mozilla way of doing things.
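
In practice, that divergence usually gets papered over with a small factory function along the following lines; this is a common sketch of the workaround rather than anything lifted from a particular library.

// Use the native object where it exists (Mozilla, Opera, Safari and now IE7)
// and fall back to the ActiveX control on older versions of Internet Explorer.
function createRequest() {
    if (window.XMLHttpRequest) {
        return new XMLHttpRequest();
    } else if (window.ActiveXObject) {
        return new ActiveXObject("Microsoft.XMLHTTP");  // covers IE5 and IE6
    }
    return null;  // no AJAX support at all
}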

Even with the apparent convergence, there will continue to be a need for the AJAX JavaScript libraries that are currently out there; incidentally, Adobe has included one called Spry with Dreamweaver CS3. Nevertheless, I still like to find out how things work at the basic level and feel somewhat obstructed when I cannot do this. I remember perusing Wrox’s Professional AJAX and finding the constant references to the associated function library rather grating; the writing style didn’t help either.

Taking a more granular approach has got me reading SAMS Teach Yourself AJAX in 10 Minutes as a means of getting my foot in the door. As with their Teach Yourself … in 24 Hours series, the title is a little misleading, since there are 22 lessons of 10 minutes' duration (the 24 Hours moniker refers to there being 24 lessons, each an hour in length). Anything composed of 10-minute lessons, even 22 of them, is never going to be comprehensive but, as a means of getting started, I have to say that the approach seems effective on the basis of this volume. It has certainly whetted my appetite for giving AJAX a go, and it’ll be interesting to see how things progress from here.

Windows Sysinternals

5th June 2007

In an earlier post, I wondered about command-line management of Windows processes. Well, I have since located the sort of tools that I was after as part of the Windows Sysinternals toolkit. It began as an independent endeavour and continued as such until Microsoft acquired the company behind it in 2006. You can find out more about the process utilities here, and the whole Sysinternals suite can be downloaded in a single package.

Outdoors enthusiasts blogging in the U.K.

10th May 2007

What we call walking or hillwalking in the U.K. goes under the banners of hiking, tramping and yomping in other parts of the world. One term that we do share with other parts is backpacking, and this is much bigger in the U.S. than it is in the U.K. My hillwalking blog has come to the attention of members of the hillwalking and backpacking community, and WordPress’s logging of who visited my blog has alerted me to this and allowed me to find other similar blogs.

Why have I mentioned this here? The reason is that it has allowed me to see what blogging software others have been using. Blogger seems to be a very popular choice, with a number using Windows Live Spaces, which in the process made me aware that Microsoft has dipped its toes into the hosted blogging space. Other than this, I have also seen TypePad being used, and one or two self-hosted operations to boot, mine included. Intriguingly, I have yet to encounter a fellow hillwalking fan in the U.K. using WordPress.com to host a hill blog, though I do know of a German backpacker having one. Video blogging is used by some, with the ever-pervasive YouTube becoming a staple for this, at least in the cases that I have seen.

It’s an intriguing survey that leaves me wondering how things will develop…

Twin-pane Windows file managers

7th May 2007

When Microsoft moved away from its two-pane file manager with the advent of Windows 95, I was one of those who thought it a retrograde step. While two Windows Explorer instances can be tiled on a desktop, the old two-pane paradigm still has its uses, and there are third-party purveyors of such things. Salamander from ALTAP is one such option, as is SpeedProject's SpeedCommander. I have been using the latter for most of a year now and would gladly pay for it, but for the fact that SpeedProject's payment system isn't working; it's just as well that the demo continues to function fully after the expiry of its evaluation period. It even takes the twin-pane paradigm further by adding sub-panes within each of these, but that isn't all there is to this major update of the Norton Commander concept. Recently, I downloaded the free version of Salamander to have a look and, though basic, it does a lot of what I ask of it, so I might continue with it to see how it performs and may even evaluate the commercial version in due course.

More on Office 2007

31st March 2007

Since today was to have been the last day of my Office 2007 trial, I headed over to Amazon.co.uk at the start of the week to bag both Office Home and Student 2007 and Outlook 2007. Both arrived yesterday, so I set about ridding my system of all things Office before adding the new software. Thus, the 2007 trial had to go, as did Office XP and any reference to Office 97; Office XP was an upgrade. From this, you might think that I am on a five-year upgrade cycle for Office, and it certainly does appear that way, though Office 95 was the first version that I had on a PC; it came with my then more than acceptable Dell Dimension XPS133 (Pentium 133, 16MB RAM, 1.6GB hard drive… it all looks so historical now).

Returning to the present, the 2007 installations went well and all was well with my system. Curiously, Microsoft seems to label the components of Office Home and Student as being for “non-commercial use”. While I accept that the licence is that way inclined, they could be a little more subtle than emblazoning the application title bars with the said wording. Nevertheless, I suppose that it is a minor irritation when you consider that you are allowed a three-machine licence for what are the full versions of Word, Excel, PowerPoint and OneNote. It must be the presence of OpenOffice on the scene that is inducing such benevolence.

Curiously, Outlook isn’t included in Office Home and Student, hence my getting the full version of that application separately. That means that there is no nefarious wording about the purpose for which it should be used. While on the subject of Outlook, my purge of previous Office versions thankfully didn’t rid my system of the PST files that I was using with Outlook 2007’s predecessors. In fact, the new version just picked up where its predecessors had left off without any further ado. As I have been getting used to the new interface, changed from Outlook 2002 but not as dramatically as the likes of Word, Excel or PowerPoint, there is a certain amount of continuity from what has gone before in any case. The three-pane window is new to me, as I never encountered Outlook 2003, and that may explain why it took a little time to find a few things. An example is that all calendars appear in the same place when I had expected the association between calendars and their PST files to be retained. Nevertheless, it is not at all a bad way to do things, even if it does throw you when you first encounter it. Its RSS feed reader is a nice touch, as are the translucent pop-ups that appear when a new message arrives; these tell you the subject and the sender, so you can decide whether a message needs reading without having to interrupt what you are doing.

In a nutshell, all seems well with Office 2007 on my machine, and I am set up to go forward without the headache of an upgrade cycle since I have started again from a clean slate. Though I have heard of some problems with Office 2007 on Windows Vista, I am running Windows XP and have had no problems so far. In fact, I plan to sit out the Vista saga for a while to see how things develop and, who knows, I might not bother with Vista at all and go for Vienna, its replacement due in 2009/2010, since XP support is to continue for a good while yet.

Is Windows 2000 support finished?

30th March 2007

At work, we still use Windows 2000 on our desktop and laptop PCs. This may (or may not) surprise you, but an XP upgrade seems to have been considered a premature move, only for Vista to turn up later than might have been expected. Now that Microsoft is winding down support for Windows 2000, thoughts have started to turn to a Vista upgrade, but the realisation soon dawned that a move to Vista would be a major one, and it now looks as if we will be on Windows 2000 for a little while yet, until 2008 at least.

I, too, have Windows 2000 lurking around at home as a testing platform, not a work copy I hasten to add, and software vendors are increasingly dropping support for the operating system. Symantec is one of these, with the 2006 versions of its products being the last to support Windows 2000. Initially, I was left with the impression that Kaspersky was the same, but this does not seem to be the case. While the open-source community can continue to supply productivity applications such as OpenOffice, the GIMP and so on, it is the security side that is of most concern as regards the future of Windows 2000. Admittedly, it is no longer the prime target for cracking that its successors are, but shared code could mean that it falls foul of the same exploits.

I have yet to notice it with the hardware that I am using, but hardware advances may yet put paid to Windows 2000 like they did to members of the Windows 9x line, especially when you consider that the operating system dates from 1999. Then again, you may find that you don’t need the latest hardware, so this might not affect you. This is not all that unreasonable given that the pace of technological progress is less frenzied these days than it was in the nineties, when Windows 95 was more or less out of date by the turn of the millennium. Having the gold OEM version of Windows 95 as the basis for a Windows 9x upgrade treadmill meant that my move into the world of NT-based operating systems was a clean break with a full version of my new operating system and not its upgrade edition.

Nevertheless, there remains a feeling that Windows 2000 is being cut off prematurely and that it could last a while longer with a bit of support, even if there is a feel of the late nineties about the thing. After all, Windows 2000 probably still supports a lot of what people want to do and without the Big Brother tendencies of Vista too.

Adobe CS3 Launch

28th March 2007

Last night, I sat through part of Adobe’s CS3 launch and must admit that I came away intrigued. Products from the Macromedia stable have been very much brought under the Adobe umbrella and progressed to boot. One of these that attracts my interest is Dreamweaver, and Adobe is promoting its AJAX capabilities (using the Spry library), its browser compatibility checking facility and its integration with Photoshop, among other things. Dreamweaver’s CSS support also gets taken forward. In addition, Dreamweaver can now integrate with Adobe Bridge and Adobe Device Central. The latter allows you to preview how your site might look on a plethora of WAP-enabled mobile phones, while the former, unless I have been missing something, seems to have become a media manager supporting all of CS3 and not just Photoshop.

Speaking of Photoshop, this now gets such new features as smart filters, which I think of as adjustment layers for things like sharpening, monochrome conversion and much more. Raw image processing now has a non-destructive element, with Photoshop Lightroom being touted as a companion for the main Photoshop. Speaking of new additions to the Photoshop family, there is a new Extended edition for those working with digital imaging that has a 3D aspect, and this is targeted at scientists, engineers, medical professionals and others. It appears that data analysis and interpretation are becoming part of the Photoshop remit now as well.

Dreamweaver and Photoshop are the components of the suite in which I have most interest, while I also note that the Contribute editor now has blogging capabilities; it would be interesting to see how these work, especially given Word 2007’s support for blogging tools like WordPress and Blogger. Another member of note is Version Cue, which adds version control to the mix and makes CS3 feel more like a group of platforms than a collection of applications.

Unsurprisingly, the changes are rung across the rest of the suite, with integration being a major theme, and this very much encompasses Flash too. The sight of an image selection being copied straight into Dreamweaver was wondrous in its own way, and the rendering of Photoshop files into 3D images was also something to behold. The latter was used to demonstrate the optimisations that have been added for the Mac platform, a major selling point, apparently.

I suppose that the outstanding question is this: do I buy into all of this? It’s a good question, because the computer enthusiast seems to be getting something of a sidelining lately. That is the impression left by Windows Vista, which gives the appearance of Microsoft trying to be system administrator to the world. There is no doubt that CS3 is very grown up now, centred around workflows and processes. These have always been professional tools, with the present level of sophistication and pricing* very much reflecting this. That said, enthusiasts like me have been known to use them too, at least for learning purposes. The latter point may yet cause me to get my hands on Photoshop CS3 with its powerful tools for digital imaging, while Dreamweaver is another story. Given that it doesn’t fit how I work now, this is an upgrade that I may give a miss, as impressive as it looks. For a learning experience, I might download a demo, but that would be a separate matter from updating my web presence. This time next month may tell a tale…

*Pricing remains the bugbear for the U.K. market that it always has been. At the present exchange rates, we should be getting a much better deal on Adobe products than we do. For instance, Amazon.com has the upgrade to the Web Premium CS3 suite from Macromedia Studio 8 priced at $493.99, while it is £513.99 on Amazon.co.uk. Using the exchange rate current as I write this, with £1 buying $1.96605, the U.K. price is a whopping $1010.53 in U.S. terms. To me, this looks like price gouging; Microsoft has been slated for this too, so I wonder what will be said to Adobe on this one.

Is Vista’s DRM a step too far?

16th February 2007

If it isn’t enough that Vista’s licensing legalese has been causing raised blood pressure, its use of DRM technology is arousing passionate outbursts and outpourings of FUD. The fact that DRM has been included in Windows since the 1990s does nothing to quell the storm. One thing that needs to be pointed out is that the whole furore centres on the delivery of protected content to consumers. Microsoft would no doubt approve of the line that, if there were no protected content, there would be no need to worry. However, there is a sizeable number of people who do not trust Microsoft to keep to its word, and they are making their feelings known.

The embodiment of the issue is Microsoft’s incorporation of HDCP into 64-bit Vista. HDCP is an Intel standard that is already on the market, and users have had bad experiences with it. The problems surround the need to ensure that protected video is not intercepted while a movie is being played, and this involves the hardware as much as the software. The result is that you need a compatible monitor with the correct inputs so that DRM can be employed. Some also suggest that this is not the end of the matter as regards hardware compatibility, and the list can grow long enough that a whole new PC starts to look like a good idea.

At the heart of this debate is a paper written by Peter Gutmann of the University of Auckland, in which the consequences of Microsoft’s implementation are examined. The idea of a system with an agenda other than your own is hardly enthralling: neither using CPU time to police DRM nor locking down hardware is particularly attractive. Such is the exposure that this article has received that even Microsoft has had to respond to it. The point that they try to make is that decoding of protected content occurs in a sandbox and does not affect anything else that might be going on in the system. Unfortunately for them, many of those adding comments to the piece take the chance to launch a broadside on the company; some of the vitriol is certainly successful when it comes to putting me off Vista. To Microsoft’s credit, the negative comments remain, but they do little to help its attempted rebuttal of Gutmann.

The main fuel for the negativity is not Gutmann’s paper per se but a lack of trust in Microsoft itself, all of this despite its Trustworthy Computing initiative. The question goes like this: if the company uses DRM for video and audio, where else could it use the technology? The whole licensing debate also furthers this, and it is at this point that the fear, uncertainty and doubt really go into overdrive, no matter how much effort is expended by people like Ed Bott on debunking the myths. Users generally do not like software taking it upon itself to decide what can and cannot be done. Personally, I have experience of Word’s habits of this nature from the past, and they were maddening: producing my doctoral thesis with it went fine until I tried pulling the whole thing together using a master document; I backtracked and made PRN files for each chapter so that nothing could change; LaTeX would never have done this….

What is the point of all of this DRM? Microsoft clearly feels that it needs to pitch the PC as an entertainment content delivery device in order to continue growing its revenues in the home user market. Some would take this idea even further: that it is control of the entertainment industry that Microsoft wants. However, to do so, it has gone with strong DRM at a time when there is a growing backlash against the technology. And then there’s the spectre of the technology getting cracked. In fact, Alex Ionescu has found a potential way to fool the Protected Media Path (called Protected Video Path in a ComputerWorld Security article) into working with unsigned device drivers. Needless to say, given the furore that has been generated, there are others who are more than willing to take the idea of cracking Vista’s DRM even further. A recent remark from a senior Microsoft executive will only encourage this.

I must admit that I remain unconvinced by the premise of using a PC as my only multimedia entertainment device. Having had problems playing DVDs on my PC in the past, I nowadays stick to using a standalone DVD player to do the honours. And I suspect that I’ll do the same with HD video should I decide to watch it; it’s not that high on my list of priorities. In fact, I would be happier if Microsoft made versions of Vista with and without protected HD capability, and it does: 32-bit Vista will not play protected HD video. That version also avoids the technology that has raised so many hackles, allowing an easier upgrade in the process. The downsides are that the security model isn’t as tough as it is in the 64-bit world and that maximum memory is limited to 4 GB; while that is not an issue right now, it more than likely will become one. If you are keen on Vista, the 32-bit option does give you time to see how the arguments about the 64-bit world run, and whether hardware catches up. As for me, I’ll stick with XP for now.
