Rethinking photo editing

Photo editing has been something that I have been doing since my first ever photo scan in 1998 (I believe it was in June of that year but cannot be completely sure nearly twenty years later). Since then, I have used a variety of tools for the job and wondered how other photos can look better than my own. What cannot be excluded is my tendency to be out and about in the middle of the day, when light is at its bluest, as well as a penchant for using a higher ISO of 400. In other words, what I do when making photos affects how they look afterwards as much as the weather that I encountered.

My reason for mentioning the above aspects of photographic craft is that they affect what you can do in photo editing afterwards, even with the benefits of technological advancement. My tastes have changed over time, so the appeal of re-editing old photos fades when you realise that you are only going around in circles; there always are new ones to share, so that may be a better way to improve.

When I started, I was a user of Paint Shop Pro but have gone over to Adobe since then. First, it was Photoshop Elements but an offer in 2011 lured me into having Lightroom and the full version of Photoshop. Nowadays, I am a Creative Cloud photography plan subscriber so I get to see new developments much sooner than once was the case.

Even though I have had Lightroom for all that time, I never really made full use of it and preferred a Photoshop-based workflow. Lightroom was used to select photos for Photoshop editing, mainly using adjustments for such things as tones, exposure, levels, hue and saturation. Removal of dust spots, resizing and sharpening were other parts of a still minimalist approach.

What changed all this was a day spent pottering about the 2018 Photography Show at the Birmingham NEC during a cold snap in March. That was followed by my checking out the Adobe YouTube Channel afterwards, where there were videos of the talks featured every day of the four-day event. Here are some shortcuts if you want to do some catching up yourself: Day 1, Day 2, Day 3, and Day 4. Be warned though that these videos are long in that they feature the whole day, and there are enough gaps that you may wish to fast forward through them. Even so, there is quite a variety of things to see.

Of particular interest were the talks given by the landscape photographer David Noton, who sensibly has a philosophy of doing as little to his images as possible. It helps that his starting points are so good that adjusting black and white points with a little tonal adjustment does most of what he needs. Vibrance, clarity and sharpening adjustments are kept to a minimum, while some work with graduated filters evens out exposure differences between skies and landscapes. It helps that all this can be done in Lightroom, so that set me thinking about trying it out for size; the trick of using the backslash (\) key to switch between raw and processed views is a bonus granted by non-destructive editing. Others may have demonstrated the creation of composite imagery, but simplicity is more like my way of working.

Confusingly, we now have the cloud-based Lightroom CC while the previous desktop counterpart is known as Lightroom Classic CC. Though the former may allow for easy dust spot removal among other things, it is the latter that I prefer, because the idea of wholesale image library upload does not appeal to me for now and I already have other places for offsite image backup like Google Drive and Dropbox. The mobile app does look interesting since it allows you to capture images on such a device in Adobe’s raw image format, DNG. Still, my workflow is set to be more Lightroom-based than it once was, and I quite fancy what new technology offers, especially since Adobe is progressing its Sensei artificial intelligence engine. The fact that it has access to many images on its systems due to Lightroom CC and its own stock library (Adobe Stock, formerly Fotolia) must mean that it has plenty of data for training this AI engine.

Trying out a new way to upgrade Linux Mint in situ while going from 17.3 to 18.1

There was a time when the only recommended way to upgrade Linux Mint from one version to another was to do a fresh installation with back-ups of data and a list of the installed applications created from a special tool.

Even so, it never stopped me doing my own style of in situ upgrade though some might see that as a risky option. More often than not, that actually worked without causing major problems in a time when Linux Mint releases were more tightly tied to Ubuntu’s own six-monthly cycle.

In recent years, Linux Mint’s releases have kept in line with Ubuntu’s Long Term Support (LTS) editions instead. That means that any major change comes only every two years with minor releases in between those. The latter are delivered through Linux Mint’s Update Manager so the process is a simple one to implement. Still, upgrades are not forced on you so it is left to your discretion as to when you need to upgrade since all main and interim versions get the same extended level of support. In fact, the recommendation is not to upgrade at all unless something is broken on your own installation.
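
If you ever need to confirm which release a machine is running before making any decision, Linux Mint keeps the details in a small text file that can be displayed like this:

cat /etc/linuxmint/info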

For a number of reasons, I heeded that advice by staying with Linux Mint 17.3 on my main machine instead of upgrading to Linux Mint 18. The fact that I broke things on another machine using an older method of upgrading provided even more encouragement.

However, I subsequently discovered another means of upgrading between major versions of Linux Mint that has some endorsement from the project. There still are warnings about testing a live DVD version of Linux Mint on your PC first and backing up your data beforehand. Another task is ensuring that you are upgrading from a fully up-to-date Linux Mint 17.3 installation.
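
Getting the existing system fully up to date can be done through the Update Manager or from the command line; as a quick sketch, the usual pair of commands looks like this:

sudo apt-get update
sudo apt-get dist-upgrade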

When you are ready, you can install mintupgrade using the following command:

sudo apt-get install mintupgrade

When that is installed, there is a sequence of tasks that you need to do. The first of these is to simulate an upgrade, testing for the appearance of untoward messages and resolving them. Repeating the check until all is well is the recommended practice. The command is as follows:

mintupgrade check

Once you are happy that the system is ready, the next step is to download the updated packages so they are on your machine ahead of their installation. Only then should you begin the upgrade process. The two commands that you need to execute are below:

mintupgrade download
mintupgrade upgrade

Once these have completed, you can restart your system. In my case, the whole process worked well, with only my PHP installation needing attention. A clash between different versions of the scripting interpreter was addressed by removing the older one, since PHP 7 is the version best kept for the sake of testing. Aside from that, a reinstallation of VMware Player and the move from version 18 to 18.1, there was hardly anything more to do and there was next to no real disruption. That is just as well since I depend heavily on my main PC these days. The backup option of a full installation would have left me clearing up things for a few days afterwards since I use a bespoke selection of software.
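
For anyone hitting the same clash, something like the following first lists what PHP packages are present and then clears out the older ones; this is only a sketch that assumes the leftovers carry a php5 prefix, so do check what the listing shows before purging anything:

dpkg -l | grep php
sudo apt-get purge php5-common php5-cli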

Upgrading from OpenMediaVault 1.x to OpenMediaVault 2.x

Having an older PC about, I decided to install OpenMediaVault on there earlier in the year after adding in a 6 TB hard drive for storage, a Gigabit network card to speed up backups and a new BeQuiet! power supply to make it quieter. It has been working smoothly since then and the release of OpenMediaVault 2.x had me wondering how to upgrade to it.

Usefully, I enabled an SSH service for remote logins and set up an account on there for anything that I needed to do. This includes upgrades, taking backups of what is on my NAS drives and even shutting down the machine when I am done with what I need to do on there.
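
For illustration, logging in looks like the following, with the account name and hostname being made-up stand-ins for whatever you have configured, while the second command is the one that shuts the machine down at the end of a session once you have root privileges:

ssh myuser@nas.local
shutdown -h now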

Using an SSH session, the first step was to switch to the administrator account and issue the following command to ensure that my OpenMediaVault 1.x installation was as up to date as it could be:

omv-update

Once that had completed what it needed to do, the next step was to do the upgrade itself with the following command:

omv-release-upgrade

With that complete, it was time to reboot the system. I then fired up the web administration interface and spotted a kernel update, which I applied. Again the system was restarted, further updates were noticed and these too were applied through the web interface. The whole thing remains based on Debian 7.x, but I am not complaining since it quietly does exactly what I need of it.

More thoughts on Windows 10

Now that I have left Windows 8.x behind me and there are a number of my machines running Windows 10, I have decided to revisit my impressions of the operating system. The first Technical Preview was something that I installed in a virtual machine, and I have been keeping an eye on how things have developed since then. I intend to retain a Windows Insider installation to see what might be heading our way as Windows 10 continues to evolve, as is now expected.

After elaborating on the all-important upgrade process earlier, I am now moving on to other topics. The Start Menu is a big item, but there are others as you will see below.

Start Menu

Let’s start with an admission: the prototype Start Menu that we got in the initial Windows 10 Technical Preview was more to my liking. Unpinning all the tiles allowed the menu to collapse back to the sort of width that anyone familiar with Windows 7 would have liked. If there had been a setting to expunge all tiles at once and produce this state, I would have been well happy.

It was later that we got to learn that Microsoft was not about to consign the Windows 8 Modern interface entirely to history as many would have wanted. Some elements remain with us, such as a Start Menu with a mandatory area for tiles and the ability to have it display full screen. Some tiles are live, but this can be turned off on a tile-by-tile basis and unneeded ones can be removed altogether. It is even possible to uninstall most apps by right-clicking on a tile or other Start Menu entry and selecting the required option from the resulting context menu. For others, there is a command line alternative that uses PowerShell to do removals, as sketched below. After this pruning, things were left in such a state that I have not been moved to restore Classic Shell so far.
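
As an illustration of the PowerShell route, the first command below lists what app packages are installed while the second removes a given one; the *zunemusic* pattern (which matches the Groove Music app) is only an example, so substitute whatever you want rid of and do check the listing first:

Get-AppxPackage | Select-Object Name
Get-AppxPackage *zunemusic* | Remove-AppxPackage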

The Start Menu settings used to be in the same place as those for the taskbar, but they now are found in the new Settings tool. Some are in the Personalisation section, which has its own Start subsection for setting full screen mode or highlighting of new apps among other things. The equivalent Colours subsection is where you find other settings, like assigning background colours based on those in a desktop background image, which itself is assigned in its own subsection of the Personalisation area.

Virtual Desktops

Initially, I failed to see the point in how Microsoft implemented these and favoured VirtuaWin instead. My main complaint was that the taskbar showed buttons for all open apps regardless of the desktop on which they were opened. However, that has been changed so that the taskbar shows different buttons for each virtual desktop, just like the way that Linux and UNIX do things. Switching between desktops may not be as smooth as on those systems yet, but the default setting is a move in the right direction and you can change it if you want.

Cortana

This was presented to the world as a voice-operated personal assistant like Apple’s Siri, but I cannot say that I am keen on such things, so I decided to work as I usually do instead. Keyboard interaction works fine, and I have neutered things by turning off web searches on Bing so as to use it in much the same way as the search box on the Windows 7 Start Menu. It may be able to do more than that, but I am more than happy to keep my workflow unchanged for now. Cortana’s settings are available via its pop-up menu. Collapsing the search box to an icon to save space for your pinned and open applications is available from the Search section of the taskbar context menu (right-clicking the taskbar produces this).

Settings

In Windows 8.x, the Control Panel was not the only area for settings, but it remained feature complete; the same is not the case for Windows 10, where the new Settings panel is starting to take over from it. The two co-exist for now, but it seems clear that Settings is where everything is headed.

The Personalisation section of the tool has been mentioned in relation to the Start Menu, but there are plenty of others. For instance, the Privacy section definitely needs reviewing, and I found myself changing a lot of the default settings in there. Naturally, there are some other sections in Settings that need hardly any attention from most of us, and these include Ease of Access (accessibility), Time & language, Devices and Network & Internet. The System section has a few settings, like tablet mode, that may need review, and the Update & security section has backup and recovery subsections that may be of interest. The latter of these is where you find the tools for refreshing the state of the system following instability, or for returning to a previous Windows version (7 or 8.x) within thirty days of the upgrade.

Copying only new or updated files by command line in Linux or Windows

With a growing collection of photographic images, I often find myself making backups of files using copy commands, and the data volumes are such that I don’t want to keep copying the same files over and over again, so incremental file transfers are what I need. Thus, commands like the following often get issued from a Linux command line:

cp -pruv [source] [destination]

Because this is in Linux, it is the bash shell that I use, so the switches may not apply to others like zsh, fish or ksh. In my case, the p switch preserves file properties such as time and date, something that the cp command does not always do, so it needs adding. The r switch is useful because the copy is then recursive, so only a directory needs to be specified as the source; the destination needs to be one level up from a folder of the same name there so as to avoid file duplication. It is the u switch that makes the file copy incremental, and the v one issues messages to the shell that show how the copying is going. Seeing a file name issued by the latter does tell you how much more needs to be copied and that the files are going where they should.
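
To make that concrete, here is a worked example with made-up paths: any new or changed files inside ~/Pictures get copied to /media/backup/Pictures, with the destination being the level above the folder of the same name, just as described:

cp -pruv ~/Pictures /media/backup/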

What inspired this post, though, is my need to do the same in a Windows session, and issuing xcopy commands will achieve the same end. Here are two that will do the needful:

xcopy [source] [destination] /d /s

xcopy [source] [destination] /d /e

In both cases, it is the d switch that ensures that the copy is incremental; you can add a date too, with a colon between it and the /d, if you see fit. The s switch copies only directories that contain files, while the e one copies even empty directories. Using the d switch without either of those did not trigger any copying action when I tried it, so I reckon that you cannot do without one or the other. By default, both of these commands issue output to the command line so you can keep an eye on what is happening, and this is especially useful when ensuring that files are going to the right destination, because the behaviour differs from that of the bash shell in Linux.
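
Again, a worked example with made-up paths may help; this one copies anything changed on or after the given date, where the date format shown is month-day-year (though it can depend on your system locale), and the added /i switch stops xcopy asking whether a destination that does not exist yet is a file or a directory:

xcopy C:\Users\Me\Pictures E:\Backup\Pictures /d:01-03-2018 /e /i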