Technology Tales

Adventures & experiences in contemporary technology

A desktop Markdown editing environment

8th November 2022

Earlier this year, I converted two websites from dynamic ones using content management systems to static ones built by Hugo from Markdown files. That meant looking at how I edit Markdown, even if it is a fairly simple file format. For one thing, Grammarly can be incorporated into WordPress, so I did not want to lose that kind of checking.

The latter point steered me away from plain text editors. There are online options like StackEdit and Dillinger, but the Firefox Grammarly plugin only appears to work on the first of these, and even then only partially in my experience. Dillinger does offer connections to online file storage providers like Google Drive, Dropbox and OneDrive, but I wanted to store files on my desktop for upload to a web server. It also works with GitHub, though I prefer to use another web hosting provider.

There are various specialised Markdown editors for desktop use like Typora, ReText, Formiko or Ghostwriter, but I chose none of these. My actual choice may surprise many: it was Visual Studio Code. The availability of a Grammarly plug-in was what swayed it for me, even if it did need to be switched on for Markdown files. In some ways, it does not work as smoothly as elsewhere because it gets fooled by links and other code-like pieces of text. Also, the ability to add words to a custom dictionary would be ideal. Some rule overriding is available, but I am not sure that everything is covered, even if the list of options is lengthy. Some time is needed to inspect all of them before I proceed any further. Thus far, things are working well enough for me.

Limiting Google Drive upload & synchronisation speeds using Trickle

9th October 2021

Having had a mishap that lost me some photos in the early days of my dalliance with digital photography, I have been far more careful since then and that now applies to other files as well. Doing regular backups is a must that you find reiterated by many different authors and the current computing climate makes doing that more vital than it ever was.

So, as well as having various local backups, I also have remote ones in the form of OneDrive, Dropbox and Google Drive. These are more correctly file synchronisation services, but disciplined use can make them useful as additional storage facilities in the interests of added resilience. There also are dedicated backup services that I have seen reviewed in the likes of PC Pro magazine, but I have yet to make use of those.

Insync

Part of my process for dealing with new digital photo files is to back them up to Google Drive and I did that with a Windows client in the early days but then moved to Insync running on Linux Mint. One drawback to the approach is that this hogs the upload bandwidth of an internet connection that has yet to move to fibre from copper cabling. Having fibre connections to a local cabinet helps but a 100 KiB/s upload speed is easily overwhelmed and digital photo file sizes keep increasing. It does not help that I insist on using more flexible raw formats like DNG, CR2 or CR3 either.

Making fewer images could help to cut the load but I still come away from an excursion with many files because I get so besotted with my surroundings. This means that upload sessions take numerous hours and can extend across calendar days. Ultimately, this makes my internet connection far less usable so I want to throttle upload speed much like what is possible in the Transmission BitTorrent client or in the Dropbox client. Unfortunately, this is not available in Insync so I have tried using the trickle command instead and an example is below:

trickle -d 2000 -u 50 insync

Here, the upload speed is limited to 50 KiB/s while the download speed is limited to 2000 KiB/s. In my case, the latter of these hardly matters while the former leaves me with acceptable internet usability. Insync does not work smoothly with this, however, so occasional restarts are needed to keep file uploads progressing and CPU load also is higher. As rough as the user experience feels, uploads can continue in parallel with other work.
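Since I have not found a cleaner answer, a crude watchdog loop is one way to automate those restarts. This is only a sketch, and it assumes that the Insync client can be stopped with a quit subcommand, something that may vary between versions:

#!/bin/bash
# Rough watchdog sketch: run Insync under trickle and restart it every hour.
# The "insync quit" call is an assumption about the client's command line;
# adjust it to however your version of the client is stopped.
while true; do
    trickle -d 2000 -u 50 insync &
    sleep 3600
    insync quit
    sleep 10
done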

gdrive

One other option that I am exploring is the use of the command-line tool gdrive and this appears to work well with trickle. After downloading and installing the tool, getting going is a matter of issuing the following command and following the instructions:

gdrive about

On web servers, I even have the tool backing up things to Google Drive on a scheduled basis. Because of a Google Drive limitation that I have encountered not only with gdrive but also with Insync and Google’s own Windows Google Drive client, synchronisation only can happen with two new folders, one local and the other remote. Handily, gdrive supports the usual bash style commands for working with remote directories so something like the following will create a directory on Google Drive:

gdrive mkdir ttdc [ID for parent folder]

Here, the ID for the parent folder may be omitted but it can be obtained by going to Google Drive online and getting a link location by right-clicking on a folder and choosing the appropriate context menu item. This gets you something like the following and the required identifier is found between the last slash and the first question mark in the address string (so as not to share any real links, I made the address more general below):

https://drive.google.com/drive/folders/[remote folder ID]?usp=sharing

Then, synchronisation uses a command like the following:

gdrive sync upload [local folder or file path] [remote folder ID]

There also is the option to do a one-way upload and this is the form of the command used:

gdrive upload [local folder or file path] -p [remote folder ID]

Because every file or folder object has its own ID on Google Drive, it is possible to create two objects on there that appear to have the same name though that is sure to cause confusion even if you know what is happening. It is possible in each of the above to throttle them using trickle as well:

trickle -d 2000 -u 50 gdrive sync upload [local folder or file path] [remote folder ID]
trickle -d 2000 -u 50 gdrive upload [local folder or file path] -p [remote folder ID]

Handily, this works without the added drama seen with Insync and lends itself to scripting as well, so it could be something that I will incorporate into my current workflow. One thing that needs to be watched is file upload failures, but there may be ways to catch those and retry them, so that would be another thing that needs doing. This is built into Insync, and it would be a learning opportunity if I were to stick with gdrive instead.
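As a first stab at the scripting idea, something like the following might do. It is only a sketch that retries a throttled upload a few times before giving up, with the bracketed placeholders standing in for real paths and folder IDs as in the commands above:

#!/bin/bash
# Retry a throttled gdrive sync a few times if it fails.
for attempt in 1 2 3; do
    if trickle -d 2000 -u 50 gdrive sync upload [local folder path] [remote folder ID]; then
        break
    fi
    echo "Upload attempt $attempt failed; retrying in a minute..." >&2
    sleep 60
done

Dropping a script like that into cron would also cover the sort of scheduled web server backups mentioned earlier.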

Rethinking photo editing

17th April 2018

Photo editing has been something that I have been doing since my first-ever photo scan in 1998 (I believe it was in June of that year, but cannot be completely sure nearly twenty years later). Since then, I have been using a variety of tools for the job and wondering why other people's photos can look better than my own. What cannot be excluded is my preference for being active in the middle of the day, when light is at its bluest, as well as a penchant for using a higher ISO of 400. In other words, what I do when making photos affects how they look afterwards as much as the weather that I encountered.

My reason for mentioning the above aspects of photographic craft is that they affect what you can do in photo editing afterwards, even with the benefits of technological advancement. My tastes have changed over time, so the appeal of re-editing old photos fades when you realise that you are only going around in circles; there always are new ones to share, and that may be a better way to improve.

When I started, I was a user of Paint Shop Pro but have gone over to Adobe since then. First, it was Photoshop Elements, but an offer in 2011 lured me into having Lightroom and the full version of Photoshop. Nowadays, I am a Creative Cloud photography plan subscriber so I get to see new developments much sooner than once was the case.

Even though I have had Lightroom for all that time, I never really made full use of it and preferred a Photoshop-based workflow. Lightroom was used to select photos for Photoshop editing, mainly using adjustments for such things as tones, exposure, levels, hue and saturation. Removal of dust spots, resizing and sharpening were other parts of a still minimalist approach.

What changed all this was a day spent pottering about the 2018 Photography Show at the Birmingham NEC during a cold snap in March. That was followed by my checking out the Adobe YouTube channel afterwards, where there were videos of the talks featured on every day of the four-day event. Here are some shortcuts if you want to do some catching up yourself: Day 1, Day 2, Day 3, and Day 4. Be warned though that these videos are long, in that they cover a whole day each, and there are enough gaps that you may wish to fast-forward through them. Even so, there is quite a variety of things to see.

Of particular interest were the talks given by the landscape photographer David Noton who sensibly has a philosophy of doing as little to his images as possible. It helps that his starting points are so good that adjusting black and white points with a little tonal adjustment does most of what he needs. Vibrancy, clarity and sharpening adjustments are kept to a minimum while some work with graduated filters evens out exposure differences between skies and landscapes. It helps that all this can be done in Lightroom, so that set me thinking about trying it out for size and the trick of using the backslash (\) key to switch between raw and processed views is a bonus granted by non-destructive editing. Others may have demonstrated the creation of composite imagery, but simplicity is more like my way of working.

Confusingly, we now have the cloud-based Lightroom CC, while the previous desktop counterpart is known as Lightroom Classic CC. Though the former may allow for easy dust spot removal among other things, it is the latter that I prefer because the idea of wholesale image library upload does not appeal to me for now, and I already have other places for off-site image backup like Google Drive and Dropbox. The mobile app does look interesting, since it allows capturing images on such a device in Adobe's raw image format, DNG. Still, my workflow is set to be more Lightroom-based than it once was, and I quite fancy what new technology offers, especially since Adobe is progressing its Sensei artificial intelligence engine. The fact that it has access to many images on its systems due to Lightroom CC and its own stock library (Adobe Stock, formerly Fotolia) must mean that it has plenty of data for training this AI engine.

Batch conversion of DNG files to other file types with the Linux command line

8th June 2016

At the time of writing, Google Drive is unable to accept DNG files, the Adobe file type for RAW images from digital cameras. The uploads themselves work fine but the additional processing at the end that I believe is needed for Google Photos appears to be failing. Because of this, I thought of other possibilities like uploading them to Dropbox or enclosing them in ZIP archives instead; of these, it is the first that I have been doing and with nothing but success so far. Another idea is to convert the files into an image format that Google Drive can handle and TIFF came to mind because it keeps all the detail from the original image. In contrast, JPEG files lose some information because of the nature of the compression.

Handily, a one-line command does the conversion for all files in a directory once you have all the required software installed:

find -type f | grep -i "DNG" | parallel mogrify -format tiff {}

The find and grep commands are standard, with the first getting you a list of all the files in the current directory and its subdirectories, and sending (piping) these to the grep command so that the list only retains the names of DNG files. The last part uses two commands for which I found installation was needed on my Linux Mint machine. The parallel package is the first of these; it distributes the heavy workload across all the cores in your processor, and this command will add it to your system:

sudo apt-get install parallel

The mogrify command is part of the ImageMagick suite along with others like convert and this is how you add that to your system:

sudo apt-get install imagemagick

In the command at the top, the parallel command works through all the files in the list provided to it and feeds them to mogrify for conversion. Without the use of parallel, the basic command is like this:

mogrify -format tiff *.DNG

In both cases, the -format switch specifies the output file type, with tiff triggering the creation of TIFF files. The *.DNG portion captures all DNG files in a directory, while {} does the same job in the main command at the top of this post. If you wanted JPEG output, you would replace tiff with jpg. Should you ever need it, a full list of supported file types is produced using the identify command (also part of ImageMagick) as follows:

identify -list format
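For what it is worth, the JPEG variant of the main command might look like the following; the -quality setting is my own addition here and can be left out if the default suits:

find -type f | grep -i "DNG" | parallel mogrify -format jpg -quality 95 {}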

Compressing a VirtualBox VDI file for a Linux guest

6th June 2016

In a previous posting, I talked about compressing a virtual hard disk for a Windows guest system running in VirtualBox on a Linux system. Since then, I have needed to do the same for a Linux guest following some housekeeping. The Linux distribution used is Debian so the instructions are relevant to that and maybe its derivatives such as Ubuntu, Linux Mint and their kind.

While there are other alternatives like dd, I am going to stick with a utility named zerofree to overwrite the newly freed-up disk space with zeroes so as to aid compression later in the process. The first step is to install it using the following command:

sudo apt-get install zerofree

Once that has been completed, the next step is to unmount the relevant disk partition. Luckily for me, what I needed to compress was an area that I reserved for synchronisation with Dropbox. If it was the root area where the operating system files are kept, a live distro would be needed instead. In any event, the required command takes the following form with the mount point being whatever it is on your system (/home, for instance):

sudo umount [mount point]

With the disk partition unmounted, zerofree can be run by issuing a command that looks like this:

sudo zerofree -v /dev/sdxN

Above, the -v switch tells zerofree to display its progress and a continually updating percentage count tells you how it is going. The /dev/sdxN piece is generic with the x corresponding to the letter assigned to the disk on which the partition resides (a, b, c or whatever) and the N is the partition number (1, 2, 3 or whatever; before GPT, the maximum was 4). Putting all this together, we get an example like /dev/sdb2.
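If there is any doubt about which device name applies, a quick listing like the one below (not part of the original routine, just a check that I find useful) shows the disks, partitions and mount points on the system:

lsblk -o NAME,FSTYPE,SIZE,MOUNTPOINT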

Once that has completed, the next step is to shut down the VM and execute a command like the following on the host Linux system ([file location/file name] needs to be replaced with whatever applies on your system):

VBoxManage modifyhd [file location/file name].vdi --compact

With the zero filling in place, there was a lot of space released when I tried this. While it would be nice for dynamic virtual disks to reduce in size automatically, I accept that there may be data integrity risks with those so the manual process will suffice for now. It has not been needed that often anyway.

Migrating to Windows 10

10th August 2015

While I have had preview builds of Windows 10 in various virtual machines for the best part of twelve months, actually upgrading physical and virtual devices that you use for more critical work is a very different matter. Also, Windows 10 is set to be a rolling release with enhancements coming on an occasional basis, so I would like to see what comes before it hits the actual machines that I need to use. That means that a VirtualBox instance of the preview build is being retained to see what happens to it over time.

Some might call it incautious, but I have taken the plunge and completely moved from Windows 8.1 to Windows 10. The first machine that I upgraded was more expendable, and success with that encouraged me to move on to others, eventually including a Windows 7 machine to see how that went. The 30-day restoration period allows an added degree of comfort when doing all this. The machines that I upgraded were a VMware VM with 32-bit Windows 8.1 Pro (itself part of a 32-bit upgrade cascade involving Windows 7 Home and Windows 8 Pro), a VirtualBox VM with 64-bit Windows 8.1, a physical PC that dual-booted Linux Mint 17.2 and 64-bit Windows 8.1, and an HP Pavilion dm4 laptop (Intel Core i3 with 8 GB RAM and a 1 TB SSHD) with Windows 7.

The main issue that I uncovered with the virtual machines is that the Windows 10 update tool that is downloaded onto Windows 7 and 8.x does not accept the graphics capability on there. This is a bug because the functionality works fine on the Windows Insider builds. The solution was to download the appropriate Windows 10 ISO image for use in the ensuing upgrade. There are 32-bit and 64-bit disk images with Windows 10 and Windows 10 Pro installation files on each. My own actions used both disk images.

During the virtual machine upgrades, most of the applications that I considered important were carried over from Windows 8.1 to Windows 10 without a bother. Anyone would expect Microsoft's own software like Word, Excel and others to make the transition, but others like Adobe's Photoshop and Lightroom made it too, as did Mozilla's Firefox, albeit requiring a trip to Settings to set it as the default option for opening web pages. Less well-known desktop applications like Zinio (digital magazines) or Mapyx Quo (maps for cycling, walking and the like) were the same. Classic Shell was an exception, but the Windows 10 Start Menu suffices for now anyway. Also, there was a need to reinstate Bitdefender Antivirus Plus using its new Windows 10 compatible installation file. Still, the experience was a big change from the days when you had to reinstall nearly all your software following a Windows upgrade.

The Windows 10 update tool worked well for the Windows 8.1 PC, so no installation disks were needed. Neither was the bootloader overwritten, so the Windows option needed selecting from GRUB every time there was a system reboot during the installation process, a temporary nuisance that was tolerated since booting into Linux Mint was preserved. Again, no critical software was lost in the process apart from Kaspersky Internet Security, which needed the Windows 10 compatible version installed much like Bitdefender, and some Epson scanning software that I found easy to reinstall anyway. Usefully, Anquet's Outdoor Map Navigator (again used for working with walking and cycling maps) continued to function properly after the changeover.

For the Windows 7 laptop, it was much the same story, albeit with the upgrade being delivered using Windows Update. Then, the main Windows account could be connected to my Outlook account to get everything tied up with the other machines for the first time. Before the obligatory change of background picture, the browns in the one that I was using were causing interface items to appear in red, not exactly my favourite colour for application menus and the like. Now they are in blue, and all the upheaval surrounding the operating system upgrade had no effect on the Dropbox or Kaspersky installations that I had in place before it all started. If there is any irritation, it is that unpinning application tiles from the Start Menu or turning off live tiles is not always as instantaneous as I would have liked, but that is all done now anyway.

While writing the above, I could not help thinking that more observations on Windows 10 may follow, but these will do for now. Microsoft had to get this upgrade process right, and it does appear that they have, so credit is due to them for that. So far, I have found Windows 10 to be stable and will be seeing how things develop from here, especially when those new features arrive from time to time as has been promised to us users. Hopefully, that will be as painless as it needs to be to ensure trust is retained.

A little look at Debian 7.0

12th May 2013

Having a virtual machine with Debian 6 on there, I was interested to hear that Debian 7.0 is out, so I decided to give it a go in another VM. Installing it using the Net Install CD image took a little while but proved fairly standard with my choice of the GUI-based option. GNOME was the desktop environment with which I went, and everything started up without any real fuss after the installation was complete; it even disconnected the CD image from the VM before rebooting, a common failing in many Linux operating system installations that lands you in the installation cycle again unless you kill the virtual machine.

Though the GNOME desktop looked familiar, a certain amount of conservatism reigned too, since the version was 3.4.2. That was no bad thing, since it made raiding the GNOME Extensions site for a set of mature extensions all the easier. In fact, a number of these were included in the standard installation anyway, and the omission of a power off entry on the user menu was corrected as a matter of course without needing any intervention from this user. Adding to what already was there made for a friendlier desktop experience in a short period of time.

Debian's variant of Firefox, Iceweasel, is version 10, so a bit of tweaking is needed to get the latest version. LibreOffice is there now too, though it is version 3.5 rather than 4. Shotwell too is the older 0.12 and not the 0.14 that is found in the likes of Ubuntu 13.04. As it happens, GIMP is about the only piece of software with a current version, 2.8, though a slower release cycle may be the cause of that. All in all, the general sense is that older versions of current software are being included for the sake of stability, and that is sensible, so I am not complaining very much about this at all.

The reason for not complaining is that the very reason for having a virtual machine with Debian 6 on there is to have Zinio and Dropbox available too. Adobe's curtailment of support for Linux means that any application needing Adobe AIR may not work on a more current Linux distribution. That affects Zinio, so I'll be retaining a Debian 6 instance for a while yet unless a bout of testing reveals that a move to the newer version is possible. As for Dropbox, I am not sure that I can recall why I moved it onto Debian, but it's working well on there, so I am in no hurry to move it either. There are times when slower software development cycles are better…

Getting rid of a Dropbox error message on a Linux-powered PC

24th September 2012

One of my PCs has ended up becoming a testing ground for a number of Linux distributions. The list has included openSUSE, Fedora, Arch and LMDE, with Sabayon being the latest incumbent. From Arch onwards in that list though, a message has appeared on loading the desktop with every one of these when I have the Dropbox client set up on there:

Unable to monitor entire Dropbox folder hierarchy. Please run "echo 100000 | sudo tee /proc/sys/fs/inotify/max_user_watches" and restart Dropbox to correct the problem.

Even applying the remedy that the message suggests won’t permanently fix the problem. For that, you need to edit /etc/sysctl.conf with superuser access and add the following line to it:

fs.inotify.max_user_watches = 100000
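If you would rather not open an editor, appending the line from the shell does the same job; this is just a convenience and assumes that your account has sudo rights:

echo "fs.inotify.max_user_watches = 100000" | sudo tee -a /etc/sysctl.conf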

With that in place, you can issue the following command to fix the problem in the current session (assuming your user account is listed in /etc/sudoers):

sudo sysctl -p && dropbox stop && dropbox start

A reboot should demonstrate that the message no longer appears. For a good while, I had ignored it, but curiosity eventually got me to find out how it could be stopped, which led to what you find above.

A little more freedom

10th December 2011

A few weeks ago, I decided to address the fact that my Toshiba laptop has next to useless battery life. The arrival of an issue of PC Pro that included a review of lower-cost laptops was another spur, and I ended up looking on the web to see what was in stock at nearby chain stores. In the end, I plumped for an HP Pavilion dm4, and it was Argos that supplied yet another piece of computing kit to me. In fact, they seem to have a wider range of laptops than PC World!

The Pavilion dm4 seems to come in two editions, and I opted for the heavier of these, though it still is lighter than my Toshiba Equium, as I found on a recent trip away from home. Its battery life is a revelation for someone who never has got anything better than three hours from a netbook. Having more than five hours certainly makes it suitable for those longer train journeys away from home, and I have seen remaining battery life quoted as exceeding seven hours from time to time, though I wouldn't depend on that.

Of course, having longer battery life would be pointless if the machine didn't do what else was asked of it. It comes with the 64-bit edition of Windows 7, and this taught me that this edition of the operating system also runs 32-bit software, a reassuring discovery. There's a trial version of Office 2010 on there too and, having a licence key for the Home and Student edition, I fully activated it. Otherwise, I added a few extras to make myself at home, such as Dropbox and VirtuaWin (for virtual desktops like I would have in Linux). While I have toyed with the idea of adding Ubuntu using Wubi, I am not planning to set up dual booting of Windows and Linux like I have on the Toshiba. Little developments like this can wait.

Regarding the hardware, the CPU is an Intel Core i3 affair and there's 4 GB of memory on board. The screen is a 14″ one, which makes for a more compact machine without making it too diminutive. The keyboard is of the scrabble-key variety and works well, as does the trackpad. There's a fingerprint scanner for logging in and out without using a password, but I haven't got around to checking how this works so far. It all zips along without any delays, and that's all that anyone can ask of a computer.

There is one eccentricity in my eyes though: the function keys need to be used in combination with Fn for them to work like they would on a desktop machine. That makes things like changing the brightness of the screen, adjusting the volume of the speakers and turning the Wi-Fi on and off more accessible. My Asus Eee PC netbook and the Toshiba Equium both have things the other way around, so I found this state of affairs unusual, but it's just a point to remember rather than a nuisance.

HP may have had its wobbles regarding its future in the PC-making business, but the Pavilion feels well put together and very solidly built. It commanded a little premium over the others on my shortlist, but it seems to have been worth it. If HP does go down the premium laptop route as has been reported recently, this is the kind of quality that they would need to deliver to justify higher prices. Saying that, is this the time to do such a thing, with other devices challenging the PC's place in consumer computing? It would be a shame to lose the likes of the Pavilion dm4 from the market to an act of folly.

Adding Software to Arch Linux from the AUR

3rd December 2011

There are packages absent from the Arch Linux repositories that could come in useful. When you are after one of these, it's time to search the Arch User Repository (AUR). In here, I have found the likes of Microsoft Core Fonts, Adobe Reader and Dropbox. There may be others, but these examples are what come to mind as I write this. In time, it may be that packages make it from the AUR into the Arch community repository, but you have to use the former if you cannot wait.

Just search the AUR for what you want and download the tarball (tar.gz file) from the webpage where you find it. Then, I recommend extracting it to /tmp, where clearance at boot time means that you don't need to do it yourself. After that, go into the appropriate subfolder in /tmp (acroread for Adobe Reader, for instance) and issue the following command:

makepkg

This will attempt to create a package file where you are working for installation by pacman. If dependencies are absent, you will be told, and these may need another AUR search in some cases, though most are included in the repositories. Once dependencies have been sorted, just issue the makepkg command again to create the xz file that pacman needs to perform the installation. To do so, issue the following command from the same directory, either as root or by using sudo if your user account has such privileges:

pacman -U *.xz

There usually is but one xz archive in a package folder so I have been taking the easy route of not looking up the name all of the time. Of course, you can do that for safety if you want.
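Putting those steps together for the Adobe Reader example gives something like the session below; the tarball name is only illustrative, since it depends on what you download from the AUR:

cd /tmp
tar -xzf acroread.tar.gz    # extract the downloaded tarball (name is illustrative)
cd acroread                 # move into the extracted package folder
makepkg                     # build the package; repeat after sorting any dependencies
sudo pacman -U *.xz         # install the resulting package file with pacman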

With pacman not looking at the AUR, you have to do more work to get upgrades to happen if you want to avoid repeating the above process all of the time. There is a package in the AUR called yaourt that needs package-query from the same place as well. Before either of these, yajl needs to be installed from one of the default repositories. Once yaourt is in place, the following does the updates for you:

yaourt -Syu --aur

Again, it might be best to run this as root or using sudo, though that produces messages from makepkg about not running it as a privileged user. However, I reckon that those can be ignored. When I tried it, the Citrix update failed though the Dropbox one succeeded, an experience that might be worth bearing in mind. Saying that, I have found installing and updating software from the AUR not to be too onerous a process so far. Anything that gives a little more freedom only can be a good thing.
