Technology Tales

Adventures in consumer and enterprise technology

TOPIC: ADOBE

Taking a camera on a walk…

24th July 2007

On Saturday, I happened to be in a branch of Jessops only to overhear a salesman emphatically state that you don't buy a camera for its specifications but for the photos that it produces. While his tone of voice was a touch condescending, and he seemed to be putting down a DSLR, he was essentially right. Nevertheless, the specifications do help you get the images, so they have to be seen in that light.

For instance, having on-board sensor cleaning may save you from having to clean the thing yourself or send the camera away for the professionals to do the needful; the latter is a much safer option in my view. And there are occasions where image stabilisation is very useful, low-light wildlife photography for instance. Yes, there are features that I consider surplus to requirements, like live viewing and movie capture, and that is very much because I buy cameras to make photos. The salesman in question would surely have agreed...

Field Testing in the Lakeland Fells

Sunday saw me head to the Lakeland Fells for some walking and a spot of testing of my new Pentax K10D. The details of the walk itself are not for here but for my hillwalking blog, and that is where you will find them. While making my way from Crewe to Windermere, I perused the manual, looking particularly for information pertaining to the functions that I actually use. I should really have done this beforehand, but distractions meant that I hadn't got around to it. I had to wade through material designed for a new SLR user before I got to what I consider the important stuff. Though this may be a bit irritating, I can understand and accept why they do it this way; we were all new users once, and newcomers are hardly likely to want to know about things like aperture priority, raw file capture, ISO control and suchlike straight away.

First Impressions of the Pentax K10D

What do I think of it then? Let's start with first impressions. It is definitely smaller than the Canon EOS 10D it accompanies in my possession. That said, it is not too small and there is a decent grip hosting the shutter release button and the camera on/off switch. It also feels well-assembled and reassuringly weighty, an important consideration given that it will see the outdoors a lot. A discussion of the features most relevant to me follows.

Power Management and Response Time

On the subject of switching on and off, the camera is set to go into a sleep mode after a second of inactivity, but it reawakens quickly when needed, the trigger being half-depression of the shutter release button. In fact, it reawakens much faster than my Canon, whose delay on waking is a constant source of some irritation.

Key Controls and Features

Even if it might sound strange, the on/off switch is also used to activate the depth of field preview, something that no SLR should be without. The location may be unusual, but maybe the designers thought that having the shutter release and the depth of field preview next to each other was a logical way to do it. From a camera operation point of view, there is certainly something to that way of thinking.

Behind the shutter release, you'll find a screen that is a reminder of film SLRs. This conveys information such as battery life, number of exposures remaining on the card and exposure details (aperture & shutter speed).

Display and Menu System

Staying on the subject of screens, the one on the back of the camera is larger than that on the Canon. As is customary for these, it allows replay of photos taken and access to the various menus required to control the camera's operation. In comparison to the Canon, which is essentially a one-menu affair with a thumb wheel controlling scrolling and an OK button at its centre to perform operations, the Pentax has a more elaborate system of submenus: one each for recording, playback and set-up.

The playback menu is where I found the setting that makes the camera highlight areas of underexposure and overexposure during image playback. This is something that I missed on the Canon until I happened upon it. Sensor cleaning is located on the set-up menu, and the camera is now set to clean the sensor every time that it is turned on. Why this is not enabled by default is a little beyond me, but the designers might have thought that a vibration from the camera on turning it on could have resulted in a load of support calls. The same submenu also hosts memory card formatting.

File Format and Navigation Options

The recording submenu is where I set the camera to deliver RAW DNG files, an Adobe innovation, rather than the default JPEGs. There are other options like RAW PEF files, Pentax's own format, or RAW and JPEG simultaneously, but my choice reflects my workflow in Photoshop Elements; I have yet to stop the said software editing the DNG files, however.

With all these options, it is fortunate that there is a navigation control with arrow buttons for getting about. While on the subject of the back screen, there are further settings that are accessed with the FN button rather than the Menu one. These include ISO, white balance, shooting mode (single, continuous, timed and so on) and flash. The only setting that I changed out of this lot was the ISO, which I set to 400; I prefer to feel that I am in control.

Exposure Modes and Controls

Returning to the camera's top plate, the exposure mode dial is on the left-hand side, which is no hardship to me as this is in the same place as on the Canon. There are no scene modes, but the available exposure modes are more than sufficient: fully automatic, program, sensitivity priority, shutter priority, aperture priority, shutter and aperture priority, manual, bulb and external flash synchronisation.

A few of these need a spot of explaining. Sensitivity priority is a new one on me, but it is a consequence of the ability of DSLRs to offer a range of ISO settings; the aperture and the shutter speed are varied according to the ISO setting. Shutter and aperture priority is like manual exposure and is the inverse of sensitivity priority: set both aperture and shutter speed, and the camera will vary the ISO setting. Both of the foregoing assume that you let the camera set the ISO, so my setting it myself may have put paid to these functions.

Shutter priority and aperture priority are, as far as I can tell, their usual selves. For all exposure modes, the thumb wheels at the front and back of the shutter release handgrip set apertures and shutter speeds where appropriate, and this arrangement works well.

Metering Options

The metering mode selector sits on the same column as the exposure dial, offering more options than my Canon, which provides only full and partial multi-segment metering. The Pentax expands these choices to include spot and centre-weighted metering alongside the default multi-segment option.

Spot metering is particularly valuable for precise exposure control, but the implementation requires some dexterity—you must simultaneously half-press the shutter button while fully pressing the AE lock button. This contrasts with Canon's more streamlined approach, where partial metering requires just a single button operation to meter and retain the reading. This is one area where Pentax could certainly improve by adopting Canon's more intuitive design.

Focus Controls

The focussing mode selector is found on the left of the body, next to the lens coupling. I am used to having this on the lenses themselves, so this is a new arrangement for me and one to which I can easily become accustomed. In fact, it is easy to find it while composing a picture. The modes themselves are manual focus, one-time autofocus and continuous autofocus; the last of these is for focussing on moving objects.

While I could go further, perhaps overboard, with a discussion of the features of this camera, I draw a line at what's here. Yes, it is useful to set the focussing point and activate image stabilisation, but the above are what matter to me, and its performance in the photo-making department is the most important aspect.

Performance in the Field

That neatly brings me to my appraisal of how it performs. On inspecting the first few images on the review screen, I was a little disappointed to see how dark the foreground was in comparison to the sky. When I brought everything home, as I always do, I found that things weren't necessarily as they appeared in the field. The Pentax more usefully offers histogram review and highlighting of any areas that are either underexposed or overexposed. It is these functions that I will be using in reshooting decisions while out and about with the Pentax, and the same can be said for how I currently use the Canon.

In fast-changing lighting, the AE lock technique was a bit irritating, yet I am certain that I will get better at it. The autofocus doesn't always lock onto the subject, especially in tricky lighting, so manual focussing is a definite necessity and is, in fact, more useful for landscape photography anyway. Nevertheless, the autofocus did do well most of the time, and my Sigma lenses have done worse to me in the past.

Conclusion

Overall, I'm satisfied with the K10D and plan to continue using it. My recent excursion yielded some quality photographs, which, as the Jessops salesperson would agree, is the ultimate purpose of any camera.

Ditching PC Plus?

28th June 2007

When I start to lose interest in the features in a magazine that I regularly buy, it's only a matter of time before I stop buying the magazine altogether. Such a predicament is facing PC Plus, a magazine that I have been buying every month over the last ten years. That fate has already befallen titles like Web Designer, Amateur Photographer and Trail, all of which I now buy only sporadically.

Returning to PC Plus, I get the impression that it feels more of a lightweight these days. Over the last decade, Future Publishing has been adding titles to its portfolio that actually take away from its long-established stalwart. Linux Format and .Net are two that come to mind, while there are titles covering Windows Vista and computer music as well. In short, there may be sense in having just a single title for all things computing.

Being a sucker for punishment, I did pick up this month's PC Plus, only for the issue to be as good an example of the malaise as any. Reviews, once a mainstay of the title, are now less prominent than they were. In place of comparison tests, we now find discussions of topics like hardware acceleration, with some reviews mixed in. Topics such as robotics and artificial intelligence do rear their heads in feature articles, when I cannot say that I have a great deal of time for such futurology. The section containing tutorials remains, even if it has been hived off into a separate mini-magazine, and seemingly fails to escape the lightweight revolution.

All this is leading me to dump PC Plus in favour of PC Pro from Dennis Publishing. This feels reassuringly more heavyweight and, while the basic format has remained unchanged over the years, it still manages to remain fresh. Reviews, of both software and hardware, are very much in evidence, and it still finds room for those value-adding feature articles; this month, digital photography and rip-off Britain come under the spotlight. Add the Real World Computing section, and it all makes a good read in these times of behemoths like Microsoft, Apple and Adobe delivering new things on the technology front. While I don't know if I have changed, PC Pro does seem better than PC Plus these days.

Exploring AJAX

7th June 2007

My online photo gallery started out simply as a set of interlinked HTML pages. Over time, I discovered frames (yes, them!) and started to make use of JavaScript to make the slideshows slicker. In those days, I was working off free webspace provided by my ISP, and client-side scripting was the only tool that I had for enhancing functionality. Having tired of the vagaries of client-side scripting while the browser wars were in full swing and incompatibilities reigned supreme, I went with paid hosting to get access to tools like Perl and PHP for server-side processing. Because their flexibility compared to JavaScript was a breath of fresh air to me, I am still a fan of the server-side approach.

The journey that I have just described is one that I now know was followed by many website builders around the same time. Nevertheless, I have still held on to JavaScript for some things, particularly for updating the DOM as part of making the pages more responsive to user interaction. In the last few years, a hybrid approach has been gaining currency: AJAX. This offers the ability to modify parts of a page without needing to reload the whole thing, generating a considerable amount of interest among web application developers.

The world of AJAX is evidently a complex one, though the underlying principle can be explained in simple terms. The essential idea is that you use JavaScript to call a server-side script (PHP is as good an example as any) that returns either text or XML, which can then be used to update part of a web page in situ without the need to reload it as per the traditional way of working. It has opened up so many possibilities from the interface design point of view that AJAX became a hot topic and still receives much attention today. One bugbear is efficiency, because I have seen an AJAX application lock up a PC with a little help from IE6. There will always remain times when server-side processing is the best route, even if that needs to be balanced against the client-side approach.
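To give a flavour of the mechanics, here is a minimal sketch of the sort of JavaScript involved; the script name (latest.php) and the element id (news) are invented purely for illustration:

    // Ask a server-side script for a fragment of text and drop it into the page in situ.
    function refreshNews() {
        var request = new XMLHttpRequest();   // older MSIE needs an ActiveX object instead; more on that below
        request.onreadystatechange = function () {
            if (request.readyState === 4 && request.status === 200) {
                document.getElementById("news").innerHTML = request.responseText;
            }
        };
        request.open("GET", "latest.php", true);
        request.send(null);
    }

Call refreshNews() from a link, a button or a timer, and only the contents of the chosen element change; the rest of the page stays put.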

Like its forebear DHTML, AJAX is really a development approach using a number of different technologies in combination. The DHTML elements such as (X)HTML, CSS, DOM and JavaScript are very much part of the AJAX world, but server-side elements such as HTTP, PHP, MySQL and XML are also very much part of the fabric of the landscape. In fact, while AJAX can use plain text as the transfer format, XML is the one implied by the AJAX acronym, and XSLT is used to transform XML into HTML. However, AJAX is not limited to the aforementioned technologies; for instance, I cannot see why Perl cannot play a role in place of PHP or ASP, both of which can be used for the same things.

Even in these standards-compliant days, browser support for AJAX remains diverse, to say the least, and it is akin to having MSIE in one corner and the rest in the other. Mind you, Microsoft did introduce the tools in the first place only for them to use ActiveX, while Mozilla created a new object type rather than continue this method of operation. Given that ActiveX is a Windows-only technology, I can see why Mozilla did what they did, and it is a sensible decision. In fact, IE7 appears to have picked up the Mozilla way of doing things.
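The divergence shows up most clearly in how the request object gets created in the first place. The function below is a sketch of the sort of thing the cross-browser libraries do on your behalf: try the Mozilla (and now IE7) object first and fall back to ActiveX for older versions of MSIE.

    // Create a request object in a cross-browser fashion.
    function createRequest() {
        if (window.XMLHttpRequest) {
            return new XMLHttpRequest();                         // Mozilla, Opera, Safari, IE7
        } else if (window.ActiveXObject) {
            try {
                return new ActiveXObject("Msxml2.XMLHTTP");      // newer MSXML on older MSIE
            } catch (e) {
                return new ActiveXObject("Microsoft.XMLHTTP");   // older MSXML fallback
            }
        }
        return null;                                             // no AJAX support at all
    }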

Even with the apparent convergence, there will continue to be a need for the AJAX JavaScript libraries that are currently out there. Incidentally, Adobe has included one called Spry with Dreamweaver CS3. Nevertheless, I still like to find out how things work at the basic level and feel somewhat obstructed when I cannot do this. I remember perusing Wrox’s Professional AJAX and finding the constant references to the associated function library rather grating; the writing style didn’t help either.

My taking a more granular approach has got me reading SAMS Teach Yourself AJAX in 10 Minutes as a means for getting my foot in the door. As with their Teach Yourself … in 24 Hours series, the title is a little misleading since there are 22 lessons of 10 minutes in duration (the 24 Hours moniker refers to there being 24 lessons, each of one hour in length). Anything composed of 10-minute lessons, even 22 of them, is never going to be comprehensive but, as a means for getting started, I have to say that the approach seems effective based on this volume. It has certainly whetted my appetite for giving AJAX a go, and it’ll be interesting to see how things progress from here.

Adobe Digital Editions

3rd June 2007

For now, I still have my eye on Photoshop CS3, and it was with interest that I noticed that Peachpit Press had published a book, entitled Adobe Photoshop CS3: Up to Speed, exploring the changes made from CS2. The plot thickened when I found that I could download it as an e-book.

However, it was then that I discovered a major change made to Adobe Reader for its eighth version: it no longer reads what Adobe titles Digital Editions. For that, you need either the previous version of Reader or a download of the beta version of Adobe Digital Editions (yes, it does rather appear that they couldn't tie up the final release dates), which appears to be a Flash front end to Reader.

As it happens, I am not so convinced by this development: the thing is in essence coloured black and the mouse pointer disappears a lot of the time. Having menus and navigational screen furniture constantly disappearing and reappearing doesn't do much to enhance the reputations of Adobe's user interface designers. While it wouldn't be too bad if you could customise the colours, you can't; a light grey has to be better than black. Its taking over the whole screen when maximised is another irritation, especially when it covers up your task bar and the Alt+Tab shortcut is needed to leave it without having to hit the minimise button.

Yes, it does do the job, but I still find myself hankering after an Adobe Reader-style interface, and I have no idea why this has been foisted upon us when the previous approach was a perfectly good one. All in all, I can only say that it seems a premature roll-out of the approach. Now, where's Reader 7 when I need it?

Adobe CS3 Launch

28th March 2007

Last night, I sat through part of Adobe’s CS3 launch and must admit that I came away intrigued. Products from the Macromedia stable have been very much brought under the Adobe umbrella and progressed to boot. One of these that attracts my interest is Dreamweaver, and Adobe is promoting its AJAX capabilities (using the Spry library), its browser compatibility checking facility and its integration with Photoshop, among other things. Dreamweaver’s CSS support also gets taken forward. In addition, Dreamweaver can now integrate with Adobe Bridge and Adobe Device Central. The latter allows you to preview how your site might look on a plethora of WAP-enabled mobile phones, while the former, unless I have been missing something, seems to have become a media manager supporting all of CS3 and not just Photoshop.

Speaking of Photoshop, this now gets such new features as smart filters; I think of these as adjustment layers for things like sharpening, monochrome conversion and much more. Raw image processing now has a non-destructive element, with Photoshop Lightroom being touted as a companion for the main Photoshop. Speaking of new additions to the Photoshop family, there is a new Extended edition for those working with digital imaging with a 3D aspect, and this is targeted at scientists, engineers, medical professionals and others. It appears that data analysis and interpretation are becoming part of the Photoshop remit now as well.

Dreamweaver and Photoshop are the components of the suite in which I have most interest, while I also note that the Contribute editor now has blogging capabilities; it would be interesting to see how these work, especially given Word 2007’s support for blogging tools like WordPress and Blogger. Another member of note is Version Cue, adding version control to the mix and making CS3 more like a group of platforms than collections of applications.

Unsurprisingly, the changes are rung for the rest of the suite, with integration being a major theme, and this very much encompasses Flash too. The sight of an image selection being copied straight into Dreamweaver was wondrous in its own way, and the rendering of Photoshop files into 3D images was also something to behold. The latter was used to demonstrate the optimisations that have been added for the Mac platform, a major selling point, apparently.

I suppose that the outstanding question is this: do I buy into all of this? It’s a good question, because the computer enthusiast seems to be getting something of a sidelining lately. That seems to be the impression left by Windows Vista, which gives the appearance that Microsoft is trying to be system administrator to the world. There is no doubt that CS3 is very grown up now and centred on workflows and processes. These have always been professional tools, with the present level of sophistication and pricing* very much reflecting this. That said, enthusiasts like me have been known to use them too, at least for learning purposes. The latter point may yet cause me to get my hands on Photoshop CS3 with its powerful tools for digital imaging, while Dreamweaver is another story. Given that it doesn’t fit how I work now, this is an upgrade that I may give a miss, as impressive as it looks. For a learning experience, I might download a demo, but that would be a separate matter from updating my web presence. This time next month may tell a tale…

*Pricing remains the bugbear for the U.K. market that it always has been. At the present exchange rates, we should be getting a much better deal on Adobe products than we do. For instance, Amazon.com has the upgrade from Macromedia Studio 8 to the CS3 Web Premium suite priced at $493.99, while it is £513.99 on Amazon.co.uk. Using the exchange rate current as I write this, £1 buying $1.96605, the U.K. price is a whopping $1010.53 in U.S. terms. To me, this looks like price gouging, and Microsoft has been slated for this too. Thus, I wonder what will be said to Adobe on this one.

A web development toolbox

23rd March 2007

Having been on a web-building journey from Geocities to having a website with my own domain hosted by Fasthosts, it should come as no surprise that I have encountered a number of tools and technologies over this time and that my choices and knowledge have evolved too. I’ll muse over the technologies first before going on to the tools that I use.

Technologies

XHTML

When I started building websites, it was not long after HTML 4 was released, and I devoured most if not all of Elizabeth Castro’s Peachpit Visual Quickstart guide to the language within a weekend. Having previously used fairly primitive WYSIWYG tools like Netscape Composer and Claris Home Page, it was an empowering experience, and the first edition (it is now on its third) of Jennifer Niederst Robbins’ Web Design in a Nutshell took things much further, becoming something of a bible for a number of years.

When it first appeared, XHTML 1.0 wasn’t a major change from HTML 4, but its stricter, more XML-compliant syntax was meant to point the way to the future, and semantic markup was at its heart at least as much as it was for HTML 4. XHTML 2.0 is on the horizon and, after the modular approach of XHTML 1.1 (which I have never used), it will be interesting to see how it develops. Nevertheless, there is a surprising development in that some people are musing over the idea of having an HTML 5. Let’s hope that the (X)HTML apple cart doesn’t get completely overturned after some years of relative stability. I still bear scars from the browser wars that raged in the 1990s and don’t want to see standards wars supplanting the relative peace that we have now. That said, I don’t mind peaceful progression.

CSS

CSS only seems to have come into its own in the last few years, yet it is truly a remarkable technology despite the hobbles that MSIE places on our ambitions. CSS Zen Garden has been a major source of ideas; I wouldn’t have been able to customise this blog as much as I have without them. I was an early adopter of the technology and got burnt by inconsistent browser support; Netscape 4 was the proverbial bête noire back then, fulfilling the role that MSIE plays today. In those days, it was the idea of controlling text display and element backgrounds from a single place that appealed. Since then, I have progressed to using CSS to replace table-based layouts and to control element positioning. It can do more…

JavaScript

Having had a JavaScript-powered photo gallery before my current Perl-driven one, I can say that I have definitely sampled this ever-pervasive scripting language. Being a client-side language rather than a server-side one, it does place you rather at the mercy of the browser purveyors, and it never ceases to amaze me that there is a buzz around AJAX because of this. In fact, the abundance of AJAX cross-browser function libraries is testimony to the need for browser-specific code. Despite my preferences for server-side scripting, I still find a use for JavaScript, and its main use for me these days is to dynamically control CSS elements to do such things as control the height of a page element or whether it is shown or not. Apparently, CSS may get some dynamic capabilities in the future and reduce my dependence on JavaScript. Meanwhile, Jeremy Keith’s DOM Scripting (Friends of Ed) will prove as much of an asset as it has done.
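As a small illustration of the kind of thing I mean, the function below shows or hides a page element by driving its CSS display property from script; the element id passed in is whatever you have given the block in question:

    // Toggle a block-level element between hidden and shown by adjusting its CSS from script.
    function toggleElement(id) {
        var el = document.getElementById(id);
        if (el) {
            el.style.display = (el.style.display === "none") ? "block" : "none";
        }
    }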

XML

These days, a lot of the raw data underlying my personal website is stored in XML. I did try to dynamically transform the display of the XML into something meaningful with CSS and XSLT when I first scaled its dizzy heights, but I soon resorted to other techniques. Browser support and the complexity of what I required were the major contributors to this. The new strategy involved two different approaches. The first was to create PHP/XHTML pages from the precursor XML offline, and this is how I generate the website’s directory pages. The other one is to process the XML as text to dynamically supply an XHTML page as the user visits it; this is the way that the photo gallery works.

Perl

This still powers all of my photo gallery. While thoughts of changing it all to PHP linger, there is a certain something about the Perl language that keeps it there. I suppose it is that PHP is entangled in the HTML while Perl encases the whole business, and I am reasonably familiar with its syntax these days, which is why it still does a lot of the data processing grunt work that I need.

PHP

PHP is everywhere these days, though it doesn’t attract quite the level of hype that used to be the case. It still appears with its sidekick MySQL in many website applications. Blogging software such as WordPress and content management systems like Drupal, Mambo and Joomla! wouldn’t exist without the pair. It appears on my website as the glue that holds my visitor directories together and is the processing engine of my WordPress blog. And if I ever add a Drupal element to the site, by no means a foregone conclusion even though I am spending a lot of time with Drupal at the moment, PHP will continue its presence in my website scripting, as it powers that too.

Applications

Macromedia HomeSite

I have a liking for hand coding, so this does most of what I need. When Macromedia (itself since taken over by Adobe, of course) took over Allaire, HomeSite sadly lost its WYSIWYG capability, but the application still soldiers on, even though Dreamweaver offers a lot to code cutters these days. Nevertheless, it does have certain advantages over Dreamweaver: it is a fleeter beast to start up and it colour-codes Perl syntax.

Macromedia Dreamweaver

There was a time when Dreamweaver was solely a tool for visual web page development, but the advent of Dreamweaver UltraDev added server-side development capabilities to the Dreamweaver family. These days, there is only one Dreamweaver version, but UltraDev’s capabilities still live on in the latest version and I would not be surprised if they were taken further in these database-driven times.

Nowadays, Dreamweaver isn’t an application where I spend a great deal of time. In former times, when my site was made up of static HTML pages, I used Dreamweaver a lot, even if its rendering capabilities were a step behind the then-current browser versions. I suppose that it didn’t fit the way in which I worked, but its template-driven workflow would have been a boon back then.

However, my move from a static site to a dynamic one, starting with my photo gallery, has meant that I haven’t used it as much since then. That said, now that PHP/MySQL components feature on my site, its server-side abilities could yet get the level of investigation that they deserve.

Altova XMLSpy Professional

Adding MySQL databases to my web hosting costs money, not a lot, but it could be spent on other (more important?) things. Hence, I use XML as the data store for my photo gallery and XML files are pre-processed into XHTML/PHP pages for my visitor directories before uploading onto the server.

I use XMLSpy to edit and manage the XML files that I use: its ability to view XML in grid format is a killer feature as far as I am concerned, and XML validation also proves very useful, particularly when it comes to ensuring that DTDs and XML files are in step and that XSLT files are coded correctly. There are other features that I need to explore and that would also take my knowledge of XML further to boot, not at all a bad thing.

Saxon

For processing XML into another file format such as XHTML, you need an XSLT processor, and I use the free version of Saxon to do the needful; Saxonica offers commercial versions of it. There is, I believe, a processor built into XMLSpy, but I don’t use it because Saxon’s command-line interface fits better into my workflow. This is a Perl-driven process where XML files are read and XSLT files, one per XML file, are built before both are fed to Saxon for transforming into XHTML/PHP files. It all works smoothly, and updating the XML inputs is all that is required.

AceFTP

If I were looking for an FTP client now, it would be FileZilla, but AceFTP has served me well over the last few years, and it looks as if that will continue. It does have some extra features over FileZilla: transfers between remote sites, and scheduling, for example. I have yet to use either, but they look valuable.

Hutmil

In bygone days when I had loads of static HTML files, making changes was a bit of a chore if they affected every single file. An example is changing the year on the copyright message on the page footers. Hutmil, which I found on a magazine cover-mounted disc, was a great time saver in those days. Today, I achieve this by putting this information into a single file and getting Perl or PHP to import that when building the page. The same “define once, use anywhere” approach underlies CSS as well, and scripting very usefully allows you to take that into the XHTML domain.

Apache

Apache is ubiquitous these days, and both the online and offline versions of my site are powered by it. It does require some configuration, but it is a powerful piece of kit. The introduction of 2.2.x meant a big change in the way that configuration files were modularised and while most things were contained in a single file for 2.0.x, the settings are broken up into different files in 2.2.x, and it can take a while to find things again. Without having it on my home PC, I would not be able to use Perl, PHP or MySQL. Apart from this, I especially like its virtual site capability; very useful for offline development.

WordPress

My hosting supplier offers blogs on Blogware, but that didn’t offer the level of configuration that I would have liked. That is probably true of any blog host; I can’t speak for Blogger, but WordPress.com does have its restrictions too. To make my hillwalking blog fit in with the appearance of my photo gallery, I popped over to WordPress.org to download WordPress so that I could host a blog myself and have maximum control over its appearance. WordPress supports themes, so I created my own and got my blog pages looking as if they are part of my website, rather than looking like something that was bolted on. Now that I think of it, what about WordPress.com supporting user-created themes? I suppose that there is the worry of insecure PHP code, but what about it?

MySQL

I am in two minds on whether this is a technology or a tool. SQL certainly would be a technology standard, but I am not so clear on what MySQL would be. In any case, I have classed it as a tool, and a very useful one at that. It is the linchpin for my WordPress blogs and, if I go for a content management system like Drupal, its role would surely grow. While I do have a lot of experience with using SAS SQL, and this helps me to deal with other varieties, there is still a learning curve with MySQL that gets me heading for a good book, and Kofler’s The Definitive Guide to MySQL 5 (Apress) seems to perform more than adequately in this endeavour.

Paint Shop Pro

As someone who hosts an online photo gallery, it won’t come as a surprise that I have had exposure to image editors. Despite various other flirtations, Paint Shop Pro has been my tool of choice over the years, but it is now set to be usurped by a member of Adobe’s Photoshop family. Paint Shop Pro does have books devoted to it, but it appears that Photoshop gets better coverage, and I feel that my image processing needs to be taken up a gear, hence the potential move to Photoshop.

Is Photoshop CS3 imminent?

16th March 2007

We have seen the beta come out, an unprecedented move for Adobe, and now we are hearing about the new professional editions of Photoshop: Photoshop CS3 for digital imaging and Photoshop CS3 Extended with tools for processing digital video. Together with Photoshop Lightroom for digital photography and Photoshop Elements for the consumer market, it appears that Photoshop is moving from a single application to becoming a big family of them. Adobe is hosting an online launch for the CS3 suite on March 27th, so the appearance on the market of the new Photoshop must be very imminent. In the light of this, I think I’ll hold off on a decision to purchase either Elements 5 or Photoshop CS2 until I have tried out the latter’s successor.

Update: I’ve just perused both .Net’s and Advanced Photoshop’s initial appraisals of Photoshop CS3. Since they seemed impressed, it should be worth a look then. Another tempting idea is to have a taste of Lightroom, so I went and downloaded the 30-day trial version. I may well have a go with it in my own time; I don’t want to install it and let the 30 days run out before I get to use it in anger.

Why are there no savings on buying software using electronic distribution, Adobe?

15th March 2007

If you ever potter over to Adobe's online software store, a curious anomaly awaits you: electronic download editions of their software are never cheaper than the equivalent boxed versions. In fact, there are cases where the electronic version costs more than the boxed one. One would have thought that ditching the box, the disc(s) and whatever accompanies them would save Adobe money, and that they would pass this on to you, yet it does not seem to make its way into the pricing for some reason. Another thing is that selling direct should allow Adobe to undercut retailers and make more money from their software, but it is the likes of Amazon that have the better prices. Whatever way you look at it, you have to admit that this pricing model doesn't make a lot of sense.

More thoughts on learning to use digital imaging software

14th March 2007

If you ever go into a bricks-and-mortar newsagent and peruse its shelves with an eye out for references to digital imaging software, you might find Adobe’s Photoshop as predominant there as it is in the digital imaging world. And the same trend seems to continue into the bricks-and-mortar bookshops as well. Online, especially within the vaults of Amazon, it is not as much a matter of what gets stocked as what gets published, and my impression is that the bias, if that’s the right word, continues there. That said, I didn’t realise until recently that Elsevier’s Focal Press has been covering Paint Shop Pro, once branded the poor man’s Photoshop, since at least version 7. That discovery, had it come earlier, might have made a big difference to how I have been using PSP. Then again, I have seen some opinions that PSP is easy to use, and that may explain the lack of attention from publishers. Future Publishing did put out a monthly guide to PSP, but that seems to have disappeared from the shelves, which does lend weight to that argument. Or it could have been Corel’s purchase of JASC that changed things…

Of course, without books and magazines, it is not as easy to see the possibilities and it is here where Photoshop really scores. The digital photography revolution has ensured the software’s escape from the world of computing and the digital arts into photography magazines and beyond. These days, even conventional photography titles feature Photoshop how-to articles. In fact, such is the level of digital content in titles such as Photography Monthly, Practical Photography and Outdoor Photography that you hardly need to pursue the specialist digital photography titles at all.

Speaking of photography, this is and has been my main use of digital imaging technology, be it the scanner that I use for digitising the output of my efforts in film photography or processing RAW files from my digital SLR. I have been using scanners since 1998 and am on my second, a CanoScan 5000F. The colour rendition in the output from its predecessor, a UMAX 1212U, deteriorated to the point where a replacement needed to be sought. As it happened, the Canon proved to be light years ahead of the UMAX, even with the latter operating properly. Incidentally, my first scanning outing was in the then current version of Photoshop (I booked some time on a scanner at the graphics centre of the university I was attending at the time and sneaked in the scanning of a photo alongside the journal graphic that I needed to do) - a limited affair, it has to be said - but I then reverted to things like Corel PhotoPaint and Paint Shop Pro. And PSP was what I was using in the main, even after encountering the copy of Photoshop Elements 2 bundled with my EOS 10D. Elements’ cloning capabilities did tempt me, though, and I did acquire a Focal Press volume on the application, but I somehow never took it further.

At the end of last year, Corel and Adobe launched new versions of PSP and Elements, respectively. That got me tempted by the idea of giving the whole business another look, this time in detail. My look at PSP XI regrettably suffered from the lack of time that I could devote to it and to seeing what a book on it might have to say. I had more of a chance with Photoshop Elements and came away impressed with the way that it worked. Since then, I have been making my way through Scott Kelby’s latest Elements book, and the ideas are building up. At the same time, I have been making good use of a Photoshop CS2 try-out, and I am on the horns of a dilemma: do I splash out on CS2, do I get Elements 5, or do I await the now imminent CS3? You’ll notice that PSP doesn’t feature here; the amount of literature pertaining to Photoshop is simply too much to ignore, and I have loads more to learn.

PSP file gotcha

29th January 2007

Corel Paint Shop Pro Photo XI

Having completed my evaluation of Corel's Paint Shop Pro (a.k.a. PSP) Photo XI, I dutifully uninstalled it from my system. However, on catching up with some files that I had acquired through the application, I found that I could not open them with its forebear PSP 9. From this, I would have to conclude that Corel made a change to PSP's native PSPIMAGE file format along the way. Having Windows 2000 installed in a VMware virtual machine, I brought PSP XI back there to batch convert the files into PSD (Photoshop's own file format) and TIFF files for the future. Carrying out the conversion was easy enough thanks to being able to select files according to their file type, something that Adobe could do with bringing into Photoshop Elements; it's not there even in the latest version.

Batch processing with Corel Paint Shop Pro Photo XI
