Technology Tales

Adventures & experiences in contemporary technology

Online learning

18th April 2021

Recently, I shared my thoughts on learning new computing languages by oneself using books, online research and personal practice. As successful as that can be, there remains a place for getting some actual instruction as well. Maybe that is why so many turn to YouTube, where there is a multitude of video channels offering such possibilities without cost. What I have also discovered is that this is complemented by a host of other providers whose services attract a fee, and a few of those will be mentioned later in this post. Paying for online courses does mean that you get the benefit of curation and an added assurance of quality in what appears to be a growing market.

The variation in quality can dog the YouTube approach, and it can also be tricky to find something good, even if the platform does suggest new videos based on what you have been watching. Much of what is found there takes the form of webinars from the likes of the Why R? Foundation, Posit or the NHS-R Community. These can be useful, and there are shorter videos from such providers as the Association for Computing Machinery or SAS Users. These help more if you already have some knowledge of the topic area being discussed, so they may not make the best starting points for someone who is starting from scratch.

Of course, working your way through a good book will help, and that is something I have been known to do, but supplementing it with one or more video courses really adds to the experience; I have done a few of these on LinkedIn. That part of the professional platform came from the acquisition of Lynda.com, and the topic areas range from soft skills like time management through to computing skills, with R, SAS and Python all seeing coverage in the data science portfolio. Even O'Reilly has ventured into the area, expanding from the book publishing activities for which so many of us know the organisation.

The available online instructor community does not stop there, since there are others like Degreed, Baeldung, Udacity, Programiz, Udemy, Business Science and Datanovia. Some of these tend towards provision that feels more like an online university course, and there are plenty of those too, as you will find through Data Science Central or KDnuggets. Both of these earn income from advertising to pay for featured blog posts and newsletters, while the former also organises regular webinars and was my first port of call when I became curious about the world of data science during the autumn of 2017.

My point of approach into the world of online training has been as a freelance information professional needing to keep up to date with a rapidly changing field. The mix of free content and content that attracts a fee is one that can work; the two kinds complement each other while possessing their own advantages and disadvantages. The need to continually expand skills and knowledge never goes away, so it is well worth spending some time working out what you are after, since you need to be sure that any training adds to your own knowledge and skill level.

Rethinking photo editing

17th April 2018

Photo editing has been something that I have been doing since my first-ever photo scan in 1998 (I believe it was in June of that year but cannot be completely sure nearly twenty years later). Since then, I have used a variety of tools for the job and wondered why others' photos can look better than my own. What cannot be excluded is my preference for being active in the middle of the day, when light is at its bluest, as well as a penchant for using a higher ISO of 400. In other words, what I do when making photos affects how they look afterwards as much as the weather that I encountered.

My reason for mentioning the above aspects of photographic craft is that they affect what you can do in photo editing afterwards, even with the benefits of technological advancement. My tastes have changed over time, so the appeal of re-editing old photos fades when you realise that you are only going around in circles; there are always new ones to share, and making those may be a better way to improve.

When I started, I was a user of Paint Shop Pro but have gone over to Adobe since then. First, it was Photoshop Elements, but an offer in 2011 lured me into having Lightroom and the full version of Photoshop. Nowadays, I am a Creative Cloud photography plan subscriber so I get to see new developments much sooner than once was the case.

Even though I have had Lightroom for all that time, I never really made full use of it and preferred a Photoshop-based workflow. Lightroom was used to select photos for Photoshop editing, mainly using adjustments for such things as tones, exposure, levels, hue and saturation. Removal of dust spots, resizing and sharpening were other parts of a still minimalist approach.

What changed all this was a day spent pottering about the 2018 Photography Show at the Birmingham NEC during a cold snap in March. That was followed by my checking out the Adobe YouTube channel, where there were videos of the talks featured on each day of the four-day event. Here are some shortcuts if you want to do some catching up yourself: Day 1, Day 2, Day 3, and Day 4. Be warned, though, that these videos are long in that each covers a whole day, with enough gaps that you may wish to fast-forward through them. Even so, there is quite a bit of variety in what there is to see.

Of particular interest were the talks given by the landscape photographer David Noton, who sensibly has a philosophy of doing as little to his images as possible. It helps that his starting points are so good that adjusting black and white points with a little tonal adjustment does most of what he needs. Vibrancy, clarity and sharpening adjustments are kept to a minimum, while some work with graduated filters evens out exposure differences between skies and landscapes. It helps that all of this can be done in Lightroom, so that set me thinking about trying it out for size, and the trick of using the backslash (\) key to switch between raw and processed views is a bonus granted by non-destructive editing. Others may have demonstrated the creation of composite imagery, but simplicity is more like my way of working.

Confusingly, we now have the cloud-based Lightroom CC, while the previous desktop counterpart is known as Lightroom Classic CC. Though the former may allow for easy dust spot removal among other things, it is the latter that I prefer, because the idea of wholesale image library upload does not appeal to me for now and I already have other places for off-site image backup, like Google Drive and Dropbox. The mobile app does look interesting, since it allows capturing images on such a device in Adobe's raw image format, DNG. Still, my workflow is set to be more Lightroom-based than it once was, and I quite fancy what new technology offers, especially since Adobe is progressing its Sensei artificial intelligence engine. The fact that it has access to many images on its systems, due to Lightroom CC and its own stock library (Adobe Stock, formerly Fotolia), must mean that it has plenty of data for training this AI engine.

Carrying out a hard reset of a home KVM switch

20th March 2017

During a recent upgrade from Linux Mint 18 to Linux Mint 18.1 on a secondary machine, I ran into bother with my StarTech KVM switch, which shares keyboard, video, mouse and audio. The PC failed to recognise the attachment of my keyboard and mouse, so an internet search began.

Nothing promising came from it apart from resetting the KVM switch. In other words, the solution was to turn it off and back on again. That was something that I did try, without success. What I had overlooked was that the USB connections to the PCs fed the device with a certain amount of power, and that was enough to keep it on.

Unplugging those USB cables as well as the power cable was needed to switch off the device completely. That provided the reset that I needed and all was well again. Otherwise, I would have been baffled enough to resort to buying a replacement KVM switch, so the extra information avoided a purchase that could have cost in the region of £100. In other words, a little research had saved me money.

A need to update graphics hardware

16th June 2013

Not being a gaming enthusiast, upgrading graphics cards in PCs is not something that I do very often or even rate as a priority. However, two PCs in my possession have had that very piece of hardware upgraded, and it's not because anything was broken either. My backup machine has seen quite a few Linux distros on there since I built it nearly four years ago. The motherboard is an ASRock K10N78 sourced from MicroDirect, and it has an onboard NVIDIA graphics chip that has performed well, if not spectacularly. One glitch that always existed was less than optimal text rendering in web browsers, but that never was enough to get me to add a graphics card to the machine.

More recently, I ran into trouble with Sabayon 13.04: only the 2D variant of the Cinnamon desktop environment would work on it, and things became totally non-functional when a full re-installation of the GNOME edition was attempted. Everything went fine until I added the latest updates to the system, when a reboot revealed that it was impossible to boot into a desktop environment. Some will relish this as a challenge, but I need to admit that I am not one of those. In fact, I tried out two Arch-based distros on the same PC and got the same results following a system update on each. So, my explorations of Antergos and Manjaro have to continue in virtual machines instead.

To get a working system, I gave Linux Mint 15 Cinnamon a go and that worked a treat. However, I couldn't ignore that the cutting-edge distros I tried before it all took exception to the onboard NVIDIA graphics. systemd has been implemented in all of these, and it seems reasonable to think that it is coming to Linux Mint at some stage in the future, so I went about getting a graphics card to add to the machine. Having had good experiences with ATi's Radeon in the past, I stuck with it, even though it is now in the hands of AMD. Not being that fussed so long as there was Linux driver support, I picked up a Radeon HD 6450 card from PC World.

Adding it to the PC was a simple matter of switching off the machine, slotting in the card, closing it up and powering it on again. Only later did I set the BIOS to look for PCI Express graphics before anything else, and I could have got away without doing that. Then, I used the Linux Mint Additional Drivers applet in its settings panel to add the proprietary driver before restarting the machine to see if there were any visual benefits. To sort out the web browser font rendering, I used the Fonts applet in the same settings panel and selected full RGBA hinting. The improvement was unmissable, even if the results still did not quite match the appearance of fonts on my main machine. Overall, there had been an improvement and a spot of future-proofing too.
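The Fonts applet is the easy way to make that change, but similar preferences can also be set per user through fontconfig, which is not tied to any one desktop environment. What follows is a minimal sketch rather than a description of what the applet does behind the scenes; the file location is the usual per-user fontconfig path and the property values mirror the full RGBA hinting choice, so treat them as assumptions to adapt rather than a recipe to follow blindly.

# create the per-user fontconfig directory if it does not already exist
mkdir -p ~/.config/fontconfig

# ask for anti-aliasing, full hinting and RGB subpixel rendering for all fonts
cat > ~/.config/fontconfig/fonts.conf <<'EOF'
<?xml version="1.0"?>
<!DOCTYPE fontconfig SYSTEM "fonts.dtd">
<fontconfig>
  <match target="font">
    <edit name="antialias" mode="assign"><bool>true</bool></edit>
    <edit name="hinting" mode="assign"><bool>true</bool></edit>
    <edit name="hintstyle" mode="assign"><const>hintfull</const></edit>
    <edit name="rgba" mode="assign"><const>rgb</const></edit>
  </match>
</fontconfig>
EOF

Applications need to be restarted (or a log out and in) before any change shows, and anything written here overrides the desktop's own settings, which is worth remembering in the light of the update further down.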

That tinkering with the standby machine got me wondering about what I had on my main PC. As well as onboard Radeon graphics, it also gained a Radeon 4650 card, for which 3D support wasn't being made available to VMware Player by Ubuntu GNOME 12.10 or 13.04, and the player wasn't happy about this when a virtual machine was set to have 3D support. Adding the latest fglrx driver only ensured that I got a command line instead of a graphical interface. Issuing one of the following commands and rebooting was the only remedy:

sudo apt-get remove fglrx

sudo apt-get remove fglrx-updates

Looking at the AMD website revealed that they no longer support 2000, 3000 or 4000 series Radeon cards with their latest Catalyst driver, and the last version that did support them would not install on my machine since it was built for version 3.4.x of the Linux kernel. A new graphics card then was in order if I wanted 3D graphics in VMware VMs, and both GNOME and Cinnamon appear to need this capability. Another ASUS card, a Radeon HD 6670, was duly acquired and installed in a manner similar to the Radeon HD 6450 on the standby PC. Apart from not needing to alter the font rendering (there is a Fonts tab in GNOME Tweak Tool where this can be set), the only real difference was the need to add the Jockey software to my main PC for installation of the proprietary Radeon driver. The following command does this:

sudo apt-get install jockey-kde

When that was done, I issued the jockey-kde command and selected the first entry on the list. The machine worked as it should on restarting, apart from an AMD message in the bottom right-hand corner bemoaning unrecognised hardware. There had been two entries on that Jockey list with exactly the same name, so it was time to select the second of these and see how it went. On restarting, the incompatibility message had gone and all was well. VMware even started virtual machines with 3D support without any messages, so the upgrade did the needful there.

Hearing of someone doing two PC graphics card upgrades in a single weekend may make you see them as an enthusiast, but my lack of interest in computer gaming belies this. Maybe it highlights that Linux operating systems need 3D acceleration more than might be expected. The Cinnamon desktop environment now issues messages if it is operating in 2D mode with software 3D rendering, and GNOME always had the tendency to fall back to classic mode, as it had been doing when Sabayon was installed on my standby PC. However, there remain cases where Linux can rejuvenate older hardware: I installed Lubuntu onto a machine with ten-year-old technology (an 1100 MHz Athlon CPU, 1 GB of RAM and 60 GB of hard drive space in a case dating from 1998) and it works surprisingly well too.

It seems that having fancier desktop environments like GNOME Shell and Cinnamon means having the hardware on which they can run. For a while, I have been tempted by the possibility of a new PC, since even my main machine is not far from four years old either. However, I also spied a pre-built CPU, motherboard and RAM bundle featuring an Intel Core i5-4670 CPU, 8 GB of Corsair Vengeance Pro Blue memory and a Gigabyte Z87-HD3 ATX motherboard (with a heatsink and fan for the CPU) for around £420. Even for someone who has used AMD CPUs since 1998, that does look tempting, but I'll hold off before making any such upgrade decisions. Apart from exercising sensible spending restraint, waiting for Linux UEFI support to mature a little more may be no bad idea either.

Update 2013-06-23: The new graphics card in my main machine is working as it should and has reduced the number of system error report messages turning up too; maybe Ubuntu GNOME 13.04 didn't fancy the old graphics card all that much. A rogue .fonts.conf file was found in my home area on the standby machine, and removing it has improved how fonts are displayed there immeasurably. If you find one on your system, it's worth removing or renaming it to see if it helps; a quick way of doing that from a terminal is sketched below. Otherwise, tinkering with the font rendering settings is another beneficial act, and it even helps on Debian 6, which uses GNOME 2! Seeing what happens on Debian 7.1 could be something that I test sometime.
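The commands below are only a sketch of that check; the backup name is a suggestion, and the change takes effect only in applications started afterwards.

# see whether a user-level fontconfig override is present in the home area
ls -l ~/.fonts.conf

# set it aside rather than deleting it outright, in case it needs to come back
mv ~/.fonts.conf ~/.fonts.conf.disabled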

A bigger screen?

23rd February 2010

A recent bit of thinking has caused me to cast my mind back over all the screens that have sat in front of me while working with computers over the years. Well, things have come a long way from the spare television that I used with a Commodore 64 that I occasionally got to explore. Needless to say, a variety of dedicated CRT screens ensued as I started to make use of Apple and IBM-compatible PCs provided in computing labs and other such places, before I bought an example of the latter as my first-ever PC of my own. That sported a 15″ display, which stood out a little at a time when 14″ ones were mainstream, but a 17″ Iiyama followed it when its operational quality deteriorated. That Iiyama came south with me from Edinburgh as I moved to where the work was and offered sterling service before it too started to succumb to ageing.

During the time that the Iiyama CRT screen was my mainstay at home, there were changes afoot in the world of computer displays. A weighty 21″ Philips screen was what greeted me on my first day at work, but 21″ Eizo LCD displays were set to replace those behemoths and remain in use, as if to prove the longevity of LCD panels and the validity of using what had been sufficient for laptops for a decade or so. In fact, the same comment regarding reliability applies to the screen that I now use at home, a 17″ Iiyama LCD panel (yes, I stuck with the same brand when I changed technologies longer ago than I like to remember).

However, that hasn’t stopped me wondering about my display needs and it’s screen size that is making me think rather than the reliability of the current panel. That is a reflection on how my home computing needs have changed over time and they show how my non-computing interests have evolved too. Photography is but one of these and the move the digital capture has brought with a greater deal of image processing, so much that I wonder if I need to make less photos rather than bringing home so many that it can be hard to pick out the ones that are deserving of a wider viewing. That is but one area where a bigger screen would help but there is another and it arises from my interest in exploring countryside on foot or on my bike: digital mapping. When planning outings, it would be nice to have a wider field of view to be able to see more at a larger scale.

None of the above is the kind of showstopper that an unreliable screen would be, so I am going to take my time on this one. The prospect of spreading a desktop across two screens is another idea, but that needs some thought about where it all would fit; the room that I have set aside for working at my computer isn't the largest, but it'll need to do. After the space side of things, there's the matter of setting up the hardware. Quite how a dual display is going to work with a KVM setup is something to explore, as is the addition of extra video cards to existing machines. After the hardware fiddling, the software side of things is not a concern, because I used a laptop as my main machine for a while last year. That confirmed that Windows (Vista, but it has been possible since 2000 anyway…) and Ubuntu (other modern Linux distributions should work too…) can cope with dual displays out of the box.
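Should the automatic arrangement ever need adjusting on the Linux side, xrandr offers a command-line route to extending a desktop across two screens. The sketch below assumes output names of VGA1 and HDMI1, which are placeholders; the query in the first command shows the real names to use.

# list the connected outputs and their supported modes
xrandr -q

# extend the desktop, placing the second screen to the right of the first
xrandr --output VGA1 --auto --output HDMI1 --auto --right-of VGA1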

Apart from the nice thought of having more desktop space, the other tempting side to all of this is what you can get for not much outlay. It isn't impossible to get a 22″ display for less than £200, and the prices for 24″ ones are tempting too. That's a far cry from paying close to £300 (if my memory serves me correctly) for that 17″ Iiyama, and I'd hope that the quality is as good as ever.

It’s all very well talking about pricing but you need to sit down and choose a make and model when you get to deciding on a purchase. There is plenty of choice so that would take a while but magazine reviews will come in handy here. Saying that, last year’s computing misadventures have me questioning the sense of going for what a magazine places on its A-list. They also have me minded to go to a nearby computer shop to make a purchase rather than choosing a supplier on the web; it is easier to take back a faulty unit if you don’t have far to go. Speaking of faulty units, last year has left me contemplating waiting until the year is older before making any acquisitions of computer kit. All of that has put the idea of buying a new screen on the low priority list, nice to have but not essential. For now, that is where it stays but you never know what the attractions of a shiny new thing can do…

Temptations, temptations…

19th August 2009

Pentax K-7

The last time that I went out and bought a new camera was over two years ago, and I am minded not to make another purchase for a while. Apart from damage to the battery cover arising from a fall, my Pentax K10D has survived so far without a problem, and I admit to being satisfied with the photos that it makes. Following a professional sensor clean, my Canon EOS 10D has been pressed into service over the past few months too. In the meantime, 6 and 10 megapixel sensors generate nowhere near the attention that might have been the case a few years back, but that's by the by. In fact, the megapixel race seems to have stalled, with features like video being added to stills cameras over the last year and live view screens coming to prominence also. Neither would make me rush out to buy a new DSLR anyway, perhaps because having things the old way suits me just fine and megapixel counts never moved me in the first place either.

That's not to say that the likes of Pentax's K-7 or Canon's EOS 50D and 5D Mark II don't capture my attention with their promises of better quality. However, with things the way they are in the world, I am more likely to hold onto my cash or maybe invest in new photo processing software for making the most of what I already have. Ideas for photography projects creep into my head when I get to looking over my online photo gallery and realise that not only have my tastes changed but my photographic eye has developed too. That seeing of things in a new light may mean that old subjects get revisited, and I don't need a new camera to do that.

Olympus E-P1

High-end compact cameras such as Canon's G11 and Ricoh's GR Digital III do hold my attention for a while, but a quick look at their prices proves that you really have to need the portability, and I never can justify the outlay when a DSLR will do all that I want from it, perhaps even for less money. While I admit to pondering the purchase of a GR Digital to cover for the EOS 10D while it was away for cleaning, the Pentax came to be acquired when I realised that the versatility of a DSLR was too much to lose, even for a while. Olympus' E-P1 may have bridged the gap, but the old question of going miniature for the price of a full-sized article recurs.

All in all, I am going to stick with what I have right now. We are coming to a time of year when things appear more golden, and that combination of lighting and colour is what really matters, not how many megapixels are in your camera sensor, unless you are making large prints or supplying stock libraries. As long as my cameras continue to deliver pleasing results, I'll stick with improving my skills and taking my time over that task, even with all the announcements of new cameras at various exhibitions and shows.

Microsoft: Audio and Video Streaming Devices

6th March 2007

Here's a link to where some of the documents at the heart of the Vista DRM controversy live: Audio and Video Streaming Devices.
