Technology Tales

Adventures & experiences in contemporary technology

Rethinking photo editing

17th April 2018

Photo editing has been something that I have been doing since my first-ever photo scan in 1998 (I believe it was in June of that year but cannot be completely sure nearly twenty years later). Since then, I have been using a variety of tools for the job and wondering how others’ photos can look better than my own. What cannot be excluded is my preference for being active in the middle of the day, when light is at its bluest, as well as a penchant for using a higher ISO of 400. In other words, what I do when making photos affects how they look afterwards as much as the weather that I encountered.

My reason for mentioning the above aspects of photographic craft is that they affect what you can do in photo editing afterwards, even with the benefits of technological advancement. My tastes have changed over time, so the appeal of re-editing old photos fades when you realise that you are only going around in circles and that there are always new ones to share, which may be a better way to improve.

When I started, I was a user of Paint Shop Pro but have gone over to Adobe since then. First, it was Photoshop Elements, but an offer in 2011 lured me into having Lightroom and the full version of Photoshop. Nowadays, I am a Creative Cloud photography plan subscriber so I get to see new developments much sooner than once was the case.

Even though I have had Lightroom for all that time, I never really made full use of it and preferred a Photoshop-based workflow. Lightroom was used to select photos for Photoshop editing, mainly using adjustments for such things as tones, exposure, levels, hue and saturation. Removal of dust spots, resizing and sharpening were other parts of a still minimalist approach.

What changed all this was a day spent pottering about the 2018 Photography Show at the Birmingham NEC during a cold snap in March. That was followed by my checking out the Adobe YouTube channel, where there were videos of the talks featured on each day of the four-day event. Here are some shortcuts if you want to do some catching up yourself: Day 1, Day 2, Day 3, and Day 4. Be warned though that these videos are long in that they feature the whole day, and there are enough gaps that you may wish to fast-forward through them. Even so, there is quite a variety of things to see.

Of particular interest were the talks given by the landscape photographer David Noton, who sensibly has a philosophy of doing as little to his images as possible. It helps that his starting points are so good that adjusting black and white points with a little tonal adjustment does most of what he needs. Vibrancy, clarity and sharpening adjustments are kept to a minimum, while some work with graduated filters evens out exposure differences between skies and landscapes. Helpfully, all this can be done in Lightroom, so that set me thinking about trying it out for size; the trick of using the backslash (\) key to switch between raw and processed views is a bonus granted by non-destructive editing. Others may have demonstrated the creation of composite imagery, but simplicity is more like my way of working.

Confusingly, we now have the cloud-based Lightroom CC, while the previous desktop counterpart is known as Lightroom Classic CC. Though the former may allow for easy dust spot removal among other things, it is the latter that I prefer because the idea of wholesale image library upload does not appeal to me for now, and I already have other places for off-site image backup like Google Drive and Dropbox. The mobile app does look interesting since it allows capturing images on such a device in Adobe’s raw image format, DNG. Still, my workflow is set to be more Lightroom-based than it once was and I quite fancy what new technology offers, especially since Adobe is progressing its Sensei artificial intelligence engine. The fact that it has access to many images on its systems due to Lightroom CC and its own stock library (Adobe Stock, formerly Fotolia) must mean that it has plenty of data for training this AI engine.

An upgrade treadmill starts again…

25th September 2007

No sooner have we passed the CS3 circus than Adobe and Corel have started bringing out new releases of their consumer digital imaging software. This autumn has brought us Paint Shop Pro X2 and Photoshop Elements 6. I’ll wait for the reviews, but I can’t say that very much about the new Elements strikes me as a compelling upgrade, and I definitely have left the PSP fold, even if the latest release seems to have some interesting features on offer. In any case, I am not sure who is going to upgrade their software on an annual basis. With so many other calls on my cash, I definitely am not of that ilk.

Amateur Photographer reviews…

19th July 2007

Amateur Photographer seem to have had a run of reviews recently. First off were the Olympus E-410 and E-510, which they seemed to like. Then, they moved onto the Ricoh Caplio GX100 and they seemed to like that too, though they did say that its quality wasn’t up to SLR standards. But then again, it is a compact and that might be expecting a bit too much. This week, Paint Shop Pro comes under the spotlight, as does Epson’s V350 scanner. I have yet to read these, but I have been engaging in a spot of equipment acquisition anyway.

My CanoScan 5000F scanner has been usurped by Epson’s Perfection Photo 4490 and very happy I am with it too. The quality of the scans that I have been doing of prints has been good and the presence of an on/off switch is a welcome one. None of the other scanners that I have had possessed one, and having to plug something in and out at the power socket is inconvenient to say the least.

I have also gone and got myself a new DSLR. Seeing Pentax’s K10D going with an 18-55 mm lens for £499 at Jessop’s overrode my better judgement and put paid to ideas of purchasing any other electronic goods for the rest of this year. It’s an award-winning gadget and Photography Monthly’s Will Cheung seemed to get on fine with it. Which Digital Camera said it was heavy, but it has to stand up to use in the great outdoors. The sensor is a 10 megapixel affair, so this will be an upgrade from my Canon EOS 10D; that has a sensor in need of a clean right now (I plan to get it done by the professionals) and every time that I want to use an image that it has made, Photoshop’s healing brush has to be pressed into service. Pentax does boast about all of the seals that it has added to the K10D, a good thing if they cut down on the dust entering the camera. And if dust does get in, the sensor cleaning feature will hopefully see it off from the photos. Image stabilisation, another value-adding feature, is also there and may prove interesting. Strangely, there’s some motion picture capture as well and I hope that it doesn’t get the EU coming after me to collect retrospective camcorder duty. In any case, it’s not a feature that I really need and the Live View functions on the equivalent Olympus offerings fall into the same category anyway. It’ll be interesting to see how the K10D performs and it’s a change from the Canon/Nikon hegemony that seems to dominate digital photography these days.

Pentax K10D

Update: I have since perused the current issue of Amateur Photographer and seen that Paint Shop Pro suffered from performance issues on computers that worked fine with Photoshop. Otherwise, it compared well with Adobe’s offerings even if the interface wasn’t seen to be as slick. Epson’s V350 was well received though it was apparent that spending more got you a better scanner but that’s always the way with these things.

A web development toolbox

23rd March 2007

Having been on a web-building journey from Geocities to having a website with my own domain hosted by Fasthosts, it should come as no surprise that I have encountered a number of tools and technologies over this time and that my choices and knowledge have evolved too. I’ll muse over the technologies first before going on to the tools that I use.

Technologies

XHTML

When I started building websites, HTML 4 was not long in existence and I devoured most, if not all, of Elizabeth Castro’s Peachpit Visual QuickStart guide to the language within a weekend. Having previously used fairly primitive WYSIWYG tools like Netscape Composer and Claris Home Page, I found it an empowering experience, and the first edition (it is now on its third) of Jennifer Niederst Robbins’ Web Design in a Nutshell took things much further, becoming something of a bible for a number of years.

When it first appeared, XHTML 1.0 wasn’t a major change from HTML 4, but its stricter, more XML-compliant syntax was meant to point the way to the future, and semantic markup was at its heart at least as much as it was for HTML 4. XHTML 2.0 is on the horizon and, after the modular approach of XHTML 1.1 (which I have never used), it will be interesting to see how it develops. Nevertheless, there is a surprising development in that some people are musing over the idea of having an HTML 5. Let’s hope that the (X)HTML apple cart doesn’t get completely overturned after some years of relative stability. I still bear scars from the browser wars that raged in the 1990s and don’t want to see standards wars supplanting the relative peace that we have now. That said, I don’t mind peaceful progression.

CSS

CSS only seems to have come into its own in the last few years, and it is truly an amazing technology in spite of the hobbles that MSIE places on our ambitions. CSS Zen Garden has been a major source of ideas; I wouldn’t have been able to customise this blog as much as I have without them. I was an early adopter of the technology and got burnt by inconsistent browser support; Netscape 4 was the proverbial bête noire back then, fulfilling the role that MSIE plays today. In those days, it was the idea of controlling text display and element backgrounds from a single place that appealed. Since then, I have progressed to using CSS to replace table-based layouts and to control element positioning. It can do more…

JavaScript

Having had a JavaScript-powered photo gallery before my current Perl-driven one, I can say that I have definitely sampled this ever-pervasive scripting language. Being a client-side language rather than a server-side one, it does place you rather at the mercy of the browser purveyors, and it never ceases to amaze me that there is a buzz around AJAX because of this. In fact, the abundance of cross-browser AJAX function libraries is testimony to the need for browser-specific code. Despite my preference for server-side scripting, I still find a use for JavaScript, and its main use for me these days is to dynamically control CSS elements, doing such things as setting the height of a page element or toggling whether it is shown at all. Apparently, CSS may get some dynamic capabilities in the future and reduce my dependence on JavaScript. In the meantime, Jeremy Keith’s DOM Scripting (Friends of Ed) will prove as much of an asset as it has done.

XML

These days, a lot of the raw data underlying my personal website is stored in XML. I did try to dynamically transform the display of the XML into something meaningful with CSS and XSLT when I first scaled its dizzy heights but I soon resorted to other techniques. Browser support and the complexity of what I required were the major contributors to this. The new strategy involved two different approaches. The first was to create PHP/XHTML pages from the precursor XML offline and this is how I generate the website’s directory pages. The other one is to process the XML as text to dynamically supply an XHTML page as the user visits it; this is the way that the photo gallery works.
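
As a rough sketch of that second approach, and nothing more than that, a small CGI script can slurp the XML in as text, pick out the fields it needs with pattern matching and wrap XHTML around them as the page is served. The file name, element names and markup below are invented for illustration rather than taken from the gallery itself:

    #!/usr/bin/perl
    # Minimal sketch: read an XML file as plain text and emit an XHTML list.
    # The file name, element names and markup are hypothetical.
    use strict;
    use warnings;

    open my $xml, '<', 'gallery.xml' or die "Cannot open gallery.xml: $!";
    my $content = do { local $/; <$xml> };    # slurp the whole file
    close $xml;

    print "Content-type: text/html\n\n";
    print "<ul>\n";
    # Treat the XML as text: grab each <photo> element and pull out two fields.
    while ($content =~ m{<photo>(.*?)</photo>}gs) {
        my $photo = $1;
        my ($file)  = $photo =~ m{<file>(.*?)</file>}s;
        my ($title) = $photo =~ m{<title>(.*?)</title>}s;
        next unless defined $file and defined $title;
        print qq{<li><img src="$file" alt="$title" /></li>\n};
    }
    print "</ul>\n";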

Perl

This still powers all of my photo gallery. While thoughts of changing it all to PHP linger, there is a certain something about the Perl language that keeps it there. I suppose it is that PHP is entangled in the HTML while Perl encases the whole business, and I am reasonably familiar with its syntax these days, which is why it still does a lot of the data processing grunt work that I need.

PHP

PHP is everywhere these days, though it doesn’t attract quite the level of hype that used to be the case. It still appears with its sidekick MySQL in many website applications. Blogging software such as WordPress and content management systems like Drupal, Mambo and Joomla! wouldn’t exist without the pair. It appears on my website as the glue that holds my visitor directories together and is the processing engine of my WordPress blog. And if I ever add a Drupal element to the site, by no means a foregone conclusion though I am spending a lot of time with it at the moment, PHP will continue its presence in my website scripting as it powers that too.

Applications

Macromedia HomeSite

I have a liking for hand coding, so this does most of what I need. When Macromedia (itself since taken over by Adobe, of course) took over Allaire, HomeSite sadly lost its WYSIWYG capability, but the application still soldiers on even though Dreamweaver offers a lot to code cutters these days. Nevertheless, it does have certain advantages over Dreamweaver: it is a fleeter beast to start up and colour codes Perl syntax.

Macromedia Dreamweaver

There was a time when Dreamweaver was solely a tool for visual web page development, but the advent of Dreamweaver UltraDev added server-side development capabilities to the Dreamweaver family. These days, there is only one Dreamweaver version, but UltraDev’s capabilities still live on in the latest version and I would not be surprised if they were taken further in these database-driven times.

Nowadays, Dreamweaver isn’t an application where I spend a great deal of time. In former times, when my site was made up of static HTML pages, I used Dreamweaver a lot even if its rendering capabilities were a step behind the then-current browser versions. I suppose that it didn’t fit the way in which I worked, but its template-driven workflow would have been a boon back then.

However, my move from a static site to a dynamic one, starting with my photo gallery, has meant that I haven’t used it as much since then. That said, with PHP/MySQL components now in use on my site, its server-side abilities could yet get the level of investigation that its PHP/MySQL capabilities would allow.

Altova XMLSpy Professional

Adding MySQL databases to my web hosting costs money, not a lot but it could be spent on other (more important?) things. Hence, I use XML as the data store for my photo gallery and XML files are pre-processed into XHTML/PHP pages for my visitor directories prior to uploading onto the server.

I use XMLSpy to edit and manage the XML files that I use: its ability to view XML in grid format is a killer feature as far as I am concerned, and XML validation also proves very useful, particularly with regard to ensuring that DTDs and XML files are in step and that XSLT files are coded correctly. There are other features that I need to explore and that would also take my knowledge of XML further to boot, not at all a bad thing.

Saxon

For processing XML into another file format such as XHTML, you need an XSLT processor and I use the free version of Saxon to do the needful; Saxonica offers commercial versions of it. There is, I believe, a processor built into XMLSpy, but I don’t use it because Saxon’s command-line interface fits better into my workflow. This is a Perl-driven process where XML files are read and XSLT files, one per XML file, are built before both are fed to Saxon for transformation into XHTML/PHP files. It all works smoothly and updating the XML inputs is all that is required.
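
To give a flavour of what that loop looks like, here is a much simplified sketch; the directory layout, file naming and the Saxon jar location are assumptions rather than the real thing:

    #!/usr/bin/perl
    # Sketch only: hand each XML file and its matching XSLT stylesheet to the
    # command-line Saxon so that an XHTML/PHP page comes out the other end.
    # Paths and the jar name are made up for illustration.
    use strict;
    use warnings;

    for my $xml (glob 'xml/*.xml') {
        (my $name = $xml) =~ s{^xml/(.*)\.xml$}{$1};
        my $xsl = "xsl/$name.xsl";     # one stylesheet per XML file
        my $out = "site/$name.php";    # the transformed output
        system('java', '-jar', 'saxon/saxon8.jar', '-o', $out, $xml, $xsl) == 0
            or warn "Saxon transformation failed for $xml\n";
    }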

AceFTP

If I were looking for an FTP client now, it would be FileZilla but AceFTP has served me well over the last few years and it looks as if that will continue. It does have some extra features over FileZilla: transfers between remote sites, and scheduling, for example. I have yet to use either but they look valuable.

Hutmil

In bygone days when I had loads of static HTML files, making changes was a bit of a chore if they affected every single file. An example is changing the year on the copyright message on the page footers. Hutmil, which I found on a magazine cover-mounted disc, was a great time saver in those days. Today, I achieve this by putting this information into a single file and getting Perl or PHP to import that when building the page. The same “define once, use anywhere” approach underlies CSS as well and scripting very usefully allows you to take that into the XHTML domain.
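
To make the idea a little more concrete, here is a tiny Perl sketch of the “define once, use anywhere” approach; the file name and variable names are made up purely for the sake of illustration:

    #!/usr/bin/perl
    # Sketch of "define once, use anywhere": shared details live in one file
    # (site_info.pl here, a made-up name) and get written into every page
    # footer as the pages are built.
    use strict;
    use warnings;

    # site_info.pl might contain something like:
    #   our $copyright_year = 2007;
    #   our $site_name      = 'Technology Tales';
    #   1;
    our ($copyright_year, $site_name);
    require './site_info.pl';

    my $footer = qq{<p id="footer">&copy; $copyright_year $site_name</p>\n};
    print $footer;    # a real build script would drop this into each page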

Apache

Apache is ubiquitous these days and both the online and offline versions of my site are powered by it. It does require some configuration, but it is a very powerful piece of kit. The introduction of 2.2.x meant a big change in the way that configuration files were modularised: while most things were contained in a single file for 2.0.x, the settings are broken up into different files in 2.2.x and it can take a while to find things again. Without having it on my home PC, I would not be able to use Perl, PHP or MySQL. Apart from this, I especially like its virtual site capability; very useful for offline development.

WordPress

My hosting supplier offers blogs on Blogware, but that didn’t offer the level of configuration that I would have liked, and the same is probably true of any hosted blog service. I can’t speak for Blogger, but WordPress.com does have its restrictions too. To make my hillwalking blog fit in with the appearance of my photo gallery, I popped over to WordPress.org to download WordPress so that I could host a blog myself and have maximum control over its appearance. WordPress supports themes, so I created my own and got my blog pages looking as if they are part of my website, rather than looking like something that was bolted on. Now that I think of it, what about WordPress.com supporting user-created themes? I suppose that there is the worry of insecure PHP code, but what about it?

MySQL

I am in two minds about whether this is a technology or a tool. SQL certainly would count as a technology standard, but I am not so clear on what MySQL would be. In any case, I have classed it as a tool and a very useful one at that. It is the linchpin for my WordPress blogs and, if I go for a content management system like Drupal, its role would surely grow. While I do have a lot of experience with using SAS SQL and this helps me to deal with other varieties, there is still a learning curve with MySQL that gets me heading for a good book, and Kofler’s The Definitive Guide to MySQL 5 (Apress) seems to perform more than adequately in this endeavour.

Paint Shop Pro

As someone who hosts an online photo gallery, I have unsurprisingly had plenty of exposure to image editors. Despite various other flirtations, Paint Shop Pro has been my tool of choice over the years, but it is now set to be usurped by a member of Adobe’s Photoshop family. Paint Shop Pro does have books devoted to it, but it seems that Photoshop gets better coverage and I feel that my image processing needs to be taken up a gear, hence the potential move to Photoshop.

More thoughts on learning to use digital imaging software

14th March 2007

If you ever go into a bricks and mortar newsagent and peruse its shelves with an eye out for references to digital imaging software, you might find Adobe’s Photoshop as predominant there as it is in the digital imaging world. And the same trend seems to continue into the bricks and mortar bookshops as well. Online, especially within the vaults of Amazon, it is not so much a matter of what gets stocked as what gets published, and my impression is that the bias, if that’s the right word, continues there. That said, I didn’t realise until recently that Elsevier’s Focal Press has been covering Paint Shop Pro, once branded the poor man’s Photoshop, from at least version 7. That discovery, had it come earlier, might have made a big difference to how I have been using PSP. Then again, I have seen some opinions that PSP is easy to use and that may explain the lack of attention from publishers. Future Publishing did put out a monthly guide to PSP, but that seems to have disappeared from the shelves and it does lend weight to that argument. Or it could have been Corel’s purchase of JASC that changed things…

Of course, without books and magazines, it is not as easy to see the possibilities and it is here where Photoshop really scores. The digital photography revolution has ensured the software’s escape from the world of computing and the digital arts into photography magazines and beyond. These days, even conventional photography titles feature Photoshop how-to articles. In fact, such is the level of digital content in titles such as Photography Monthly, Practical Photography and Outdoor Photography that you hardly need to pursue the specialist digital photography titles at all.

Speaking of photography, this is and has been my main use of digital imaging technology, be it the scanner that I use for digitising the output of my efforts in film photography or processing RAW files from my digital SLR. I have been using scanners since 1998 and am on my second, a CanoScan 5000F. The colour rendition in the output from its predecessor, a UMAX 1212U, deteriorated to the point where a replacement needed to be sought. As it happened, the Canon proved to be light years ahead of the UMAX, even with the latter operating properly. Incidentally, my first scanning outing was in the then current version of Photoshop (I booked some time on a scanner at the graphics centre of the university I was attending at the time and sneaked in the scanning of a photo along with the journal graphic that I needed to do), a limited affair it has to be said, but I then reverted to things like Corel PhotoPaint and Paint Shop Pro. And PSP was what I was using in the main even after encountering the copy of Photoshop Elements 2 bundled with my EOS 10D. Elements’ cloning capabilities did tempt me, though, and I did acquire a Focal Press volume on the application, but I somehow never took it further.

At the end of last year, Corel and Adobe launched new versions of PSP and Elements, respectively. That got me tempted by the idea of giving the whole business another look, this time in detail. My look at PSP XI regrettably suffered from the lack of time that I could devote to it and to seeing what a book on it might have to say. I had more of a chance with Photoshop Elements and came away impressed with the way that it worked. Since then, I have been making my way through Scott Kelby’s latest Elements book and the ideas are building up. At the same time, I have been making good use of a Photoshop CS2 try-out and I am on the horns of a dilemma: do I splash out on CS2, do I get Elements 5 or do I await the now imminent CS3? You’ll notice that PSP doesn’t feature here; the amount of literature pertaining to Photoshop simply is too much to ignore and I have loads more to learn.

PSP file gotcha

29th January 2007

Corel Paint Shop Pro Photo XI

Having completed my evaluation of Corel’s Paint Shop Pro (a.k.a. PSP) Photo XI, I dutifully uninstalled it from my system. However, on catching up with some files that I had acquired through the application, I found that I could not open them with its forebear, PSP 9. From this, I would have to conclude that Corel made a change to PSP’s native PSPIMAGE file format along the way. Since I had Windows 2000 installed in a VMware virtual machine, I brought PSP XI back there to batch convert the files into PSD (Photoshop’s own file format) and TIFF files for the future. Carrying out the conversion was easy enough thanks to being able to select files according to their file type, something that Adobe could do with bringing into Photoshop Elements; it’s not there even in the latest version.

Batch processing with Corel Paint Shop Pro Photo XI

Photoshop Elements 5 on trial

26th January 2007


After having a trial version of Corel’s Paint Shop Pro Photo XI on evaluation, I have now moved on to the latest version of Adobe’s Photoshop Elements. Thanks to the digital photography revolution, image editing has become a lot more user-friendly these days. This is no doubt down to companies realising that investing in the development of this class of software does yield a return and the development effort is progressing things very nicely indeed.

My first exposure to image editing was with Corel’s PhotoPaint, a low-profile application that did what I asked of it while guzzling much of what little memory my PC had in those days (32 MB at first, then 64 MB). Paint Shop Pro 5, a popular tool of shareware origins, replaced this, though I must admit that I did briefly encounter the ubiquitous Photoshop at this stage. The PSP commitment continued through versions 7 and 9, though it is only in the later versions that photo processing began to be a significant part of the functionality. PSP XI has taken this further and has add-on tools like SnapFire for downloading photos from cameras. It does feature screen calibration as well, but I found that photos appeared very pale when using the calibrated workspace; maybe I set it up wrong. That said, its assistance in setting the brightness and contrast of my monitor was most useful. Until then, I hadn’t realised the details that I had been missing.

Nevertheless, Adobe’s Photoshop Elements has been able to inspire a certain level of confidence that PSP doesn’t. Even PE 2, which I got bundled with my Canon EOS 10D SLR, had that little extra when compared with PSP 9. For instance, it was PE where I saw the real power of the clone stamping tool. That difference in air of confidence also extends to the latest generation. I have found PE 5 very quick and easy when it comes to generating good results. Features like the levels tool and “Save for Web” are things that I have found very useful. There are a few minor disappointments, such as its not showing the pixel size of the image being edited, a very useful feature of PSP. A bit of clarity around image resizing would also be nice, but I suspect that I may just need to learn a little more. Overall, its speed when it comes to creating nice results swings me away from PSP and may ultimately put paid to any loyalty that I may have had to Corel’s image editor for the masses. I think that I’ll have a go with its big brother, but it looks as if I may well acquire PE 5 on a more permanent basis.

  • All the views that you find expressed on here in postings and articles are mine alone and not those of any organisation with which I have any association, through work or otherwise. As regards editorial policy, whatever appears here is entirely of my own choice and not that of any other person or organisation.

  • Please note that everything you find here is copyrighted material. The content may be available to read without charge and without advertising but it is not to be reproduced without attribution. As it happens, a number of the images are sourced from stock libraries like iStockPhoto so they certainly are not for abstraction.

  • With regards to any comments left on the site, I expect them to be civil in tone of voice and reserve the right to reject any that are either inappropriate or irrelevant. Comment review is subject to automated processing as well as manual inspection but whatever is said is the sole responsibility of the individual contributor.