TOPIC: DROPBOX
Getting rsync to resolve symbolic links
11th September 2024
Dropbox changed its handling of symbolic links in 2019: internal links within a Dropbox file hierarchy got fixed in place, while links leading outside the Dropbox area no longer worked. Thankfully, the rsync utility found in many Linux and UNIX settings does not behave like that, as long as you call it correctly.
By default, symbolic links are synchronised as links, much as Dropbox does now. To get rsync to resolve a link, whether it points to a single file or, more likely, a folder containing several files, the -L switch (or --copy-links option) needs to be added to the command. When that is present, whatever the link points to gets synchronised instead, which honours the point of having these links in the first place: allowing more flexibility with folder structures and avoiding any duplication of files and folders.
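To make that concrete, a sketch of such a command, copying a Dropbox folder to a backup drive, might look like the one below. The paths are made up for illustration; -a handles recursion while preserving attributes, and -L resolves the links:

rsync -avL ~/Dropbox/ /media/backup/Dropbox/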
A desktop Markdown editing environment
8th November 2022
Earlier this year, I changed over two websites from dynamic versions using content management systems to static ones by using Hugo to build them from Markdown files. That meant that I needed to look at the editing of Markdown, even if it is a fairly simple file format. For one thing, Grammarly can be incorporated into WordPress, so I did not want to lose something like that.
The latter point meant that I was steered away from plain text editors. Otherwise, there are online ones like StackEdit and Dillinger, but the Firefox Grammarly plugin only appears to work on the first of these, and even then, only partially in my experience. While Dillinger does offer connections to online file storage providers like Google, Dropbox and OneDrive, I wanted to store files on my desktop for upload to a web server. It also works with GitHub, but I prefer to use another web hosting provider.
There are various specialised Markdown editors for desktop usage, like Typora, ReText, Formiko or Ghostwriter, yet I chose none of these. My actual choice may surprise many: it was Visual Studio Code. The availability of a Grammarly plug-in was what swayed it for me, even if it did need to be switched on for Markdown files. In many ways, it does not work as smoothly as elsewhere because it gets fooled by links and other code-like pieces of text. Also, having the ability to add words to a custom dictionary would be ideal. Some rule overriding is available, but I am not sure that everything is covered, even if the list of options is lengthy. Some time is needed to inspect all of them before I proceed any further. Thus far, things are working well enough for me.
Limiting Google Drive upload & synchronisation speeds using Trickle
9th October 2021
Having had a mishap that lost me some photos in the early days of my dalliance with digital photography, I have been far more careful since then, and that now applies to other files as well. Doing regular backups is a must that you find reiterated by many different authors, and the current computing climate makes doing that more vital than it ever was.
So, as well as having various local backups, I also have remote ones in the form of OneDrive, Dropbox and Google Drive. While these more correctly are file synchronisation services, disciplined use can make them useful as additional storage facilities in the interests of added resilience. There also are dedicated backup services that I have seen reviewed in the likes of PC Pro magazine, but I have yet to make use of those.
Insync
Part of my process for dealing with new digital photo files is to back them up to Google Drive, and I did that with a Windows client in the early days but then moved to Insync running on Linux Mint. One drawback to the approach is that this hogs the upload bandwidth of an internet connection that has yet to move to fibre from copper cabling. While having fibre connections to a local cabinet helps, a 100 KiB/s upload speed is easily overwhelmed and digital photo file sizes keep increasing. It does not help that I insist on using more flexible raw formats like DNG, CR2 or CR3 either.
While making fewer images could help to cut the load, I still come away from an excursion with many files because I get so besotted with my surroundings. This means that upload sessions take numerous hours and can extend across calendar days. Ultimately, this makes my internet connection far less usable; hence I want to throttle upload speed, much like what is possible in the Transmission BitTorrent client or in the Dropbox client. Since this is not available in Insync, I have tried using the trickle command instead, and an example is below:
trickle -d 2000 -u 50 insync
Here, the upload speed is limited to 50 KiB/s while the download speed is limited to 2000 KiB/s. In my case, the latter of these hardly matters, while the former leaves me with acceptable internet usability. Insync does not work smoothly with this, though, so occasional restarts are needed to keep file uploads progressing and CPU load also is higher. As rough as the user experience feels, uploads can continue in parallel with other work.
gdrive
One other option that I am exploring is the use of the command-line tool gdrive and this appears to work well with trickle. After downloading and installing the tool, getting going is a matter of issuing the following command and following the instructions:
gdrive about
On web servers, I even have the tool backing up things to Google Drive on a scheduled basis. Because of a Google Drive limitation that I have encountered not only with gdrive but also with Insync and Google's own Windows Google Drive client, synchronisation only happens between two new folders, one local and the other remote. Handily, gdrive supports the usual bash-style commands for working with remote directories, so something like the following will create a directory on Google Drive:
gdrive mkdir ttdc [ID for parent folder]
Here, the ID for the parent folder may be omitted, though it can be obtained by going to Google Drive online and getting a link location by right-clicking on a folder and choosing the appropriate context menu item. This gets you something like the following and the required identifier is found between the last slash and the first question mark in the address string (so as not to share any real links, I made the address more general below):
https://drive.google.com/drive/folders/[remote folder ID]?usp=sharing
Then, synchronisation uses a command like the following:
gdrive sync upload [local folder or file path] [remote folder ID]
There also is the option to do a one-way upload, and this is the form of the command used:
gdrive upload [local folder or file path] -p [remote folder ID]
Because every file or folder object has its own ID on Google Drive, it is possible to create two objects on there that appear to have the same name, though that is sure to cause confusion even if you know what is happening. It is possible in each of the above to throttle them using trickle as well:
trickle -d 2000 -u 50 gdrive sync upload [local folder or file path] [remote folder ID]
trickle -d 2000 -u 50 gdrive upload [local folder or file path] -p [remote folder ID]
Handily, this works without the added drama seen with Insync and lends itself to scripting as well, so it could be something that I will incorporate into my current workflow. One thing that needs to be watched is file upload failures, but there may be ways to catch those and retry them, which would be another thing that needs doing. This is built into Insync, and it would be a learning opportunity if I were to stick with gdrive instead.
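To give a flavour of the scripting possibility, a rough retry loop might look something like the one below. It is only a sketch: the bracketed path and folder ID are placeholders as before, and the number of attempts and the pause between them are arbitrary choices:

for attempt in 1 2 3; do
    trickle -d 2000 -u 50 gdrive sync upload [local folder or file path] [remote folder ID] && break
    echo "Upload attempt $attempt failed; retrying..."
    sleep 60
done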
Rethinking photo editing
17th April 2018
Photo editing has been something that I have been doing since my first-ever photo scan in 1998 (I believe it was in June of that year, but cannot be completely sure nearly twenty years later). Since then, I have been using a variety of tools for the job and have wondered how other photos can look better than my own. What cannot be excluded is my preference for being active in the middle of the day, when light is at its bluest, as well as a penchant for using a higher ISO of 400. In other words, what I do when making photos affects how they look afterwards as much as the weather that I encountered.
My reason for mentioning the above aspects of photographic craft is that they affect what you can do in photo editing afterwards, even with the benefits of technological advancement. My tastes have changed over time, so the appeal of re-editing old photos fades when you realise that you are only going around in circles; there always are new ones to share, and that may be a better way to improve.
When I started, I was a user of Paint Shop Pro but have gone over to Adobe since then. First, it was Photoshop Elements, but an offer in 2011 lured me into having Lightroom and the full version of Photoshop. Nowadays, I am a Creative Cloud photography plan subscriber, so I get to see new developments much sooner than once was the case.
Even though I have had Lightroom for all that time, I never really made full use of it and preferred a Photoshop-based workflow. Lightroom was used to select photos for Photoshop editing, mainly using adjustments for such things as tones, exposure, levels, hue and saturation. Removal of dust spots, resizing and sharpening were other parts of a still minimalist approach.
What changed all this was a day spent pottering about the 2018 Photography Show at the Birmingham NEC during a cold snap in March. That was followed by my checking out the Adobe YouTube Channel afterwards, where there were videos of the talks featured on every day of the four-day event. Here are some shortcuts if you want to do some catching up yourself: Day 1, Day 2, Day 3, and Day 4. Be warned, though, that these videos are long in that they feature the whole day, and there are enough gaps that you may wish to fast-forward through them. Even so, there is quite a variety of things to see.
Of particular interest were the talks given by the landscape photographer David Noton, who sensibly has a philosophy of doing as little to his images as possible. It helps that his starting points are so good that adjusting black and white points with a little tonal adjustment does most of what he needs. Vibrancy, clarity and sharpening adjustments are kept to a minimum, while some work with graduated filters evens out exposure differences between skies and landscapes. It helps that all this can be done in Lightroom, so that set me thinking about trying it out for size, and the trick of using the backslash (\) key to switch between raw and processed views is a bonus granted by non-destructive editing. Others may have demonstrated the creation of composite imagery, but simplicity is more like my way of working.
It is confusing that we now have cloud-based Lightroom CC, while the previous desktop version is called Lightroom Classic CC. Although the former offers easy dust spot removal and other features, I prefer the latter because I do not want to upload my entire image library, and I already use Google Drive and Dropbox for off-site backup. The mobile app is interesting since it allows capturing images on mobile devices in Adobe's raw DNG format. My workflow is now more Lightroom-based than before, and I appreciate the new technology, especially as Adobe develops its Sensei artificial intelligence engine. Because Adobe has access to numerous images through Lightroom CC and Adobe Stock (formerly Fotolia), it has abundant data to train this AI system.
Getting rid of a Dropbox error message on a Linux-powered PC
24th September 2012
One of my PCs has ended up becoming a testing ground for a number of Linux distributions. The list has included openSUSE, Fedora, Arch and LMDE, with Sabayon being the latest incumbent. From Arch onwards in that list, though, a message has appeared on loading the desktop with every one of these when I have Dropbox's client set up on there:
Unable to monitor entire Dropbox folder hierarchy. Please run "echo 100000 | sudo tee /proc/sys/fs/inotify/max_user_watches" and restart Dropbox to correct the problem.
Even applying the remedy that the message suggests won't permanently resolve the issue. For that, you need to edit /etc/sysctl.conf with superuser access and add the following line to it:
fs.inotify.max_user_watches = 100000
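If you would rather not open an editor, a one-line alternative (assuming your account has sudo rights) would be to append the setting like this:

echo "fs.inotify.max_user_watches = 100000" | sudo tee -a /etc/sysctl.conf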
With that in place, you can issue the following command to sort out the problem in the current session (assuming your user account is listed in /etc/sudoers):
sudo sysctl -p && dropbox stop && dropbox start
A reboot should demonstrate that the message no longer appears. For a good while, I had ignored it, but curiosity eventually got me to find out how it could be stopped, which led to what you find above.
Adding Software to Arch Linux from the AUR
3rd December 2011
There are packages absent from the Arch Linux repositories that could come in useful. When you are after one of these, then it's time to search the Arch User Repository (AUR). In here, I have found the likes of Microsoft Core Fonts, Adobe Reader and Dropbox. While there may be others, these examples are what comes to mind as I write this. In time, it may be that packages make it from the AUR into the Arch community repository, but you have to use the former if you cannot wait.
Just search the AUR for what you want and download the compressed tarball (tar.gz file) from the webpage where you find it. Then, I recommend extracting it to /tmp, where clearance at boot time means that you don't need to tidy up yourself. After that, go into the appropriate subfolder in /tmp (acroread for Adobe Reader, for instance) and issue the following command:
makepkg
This will attempt to create a package file where you are working, for installation by pacman. If dependencies are absent, you will be told, and these may need another AUR search in some cases, though most are included in the repositories. Once dependencies have been sorted, just issue the makepkg command again to create the xz file that pacman needs to perform the installation. To do so, issue the following command from the same directory, either as root or by using sudo if your user account has such privileges:
pacman -U *.xz
There is usually just one xz archive in a package folder, so I have been taking the easy route of not looking up the name all the time. Of course, you can do that for safety if you like.
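Putting the pieces together, the whole run might look something like the following, assuming the tarball has already been downloaded to /tmp; the package name here is a hypothetical one and the exact file names will vary:

cd /tmp
tar -xzf somepackage.tar.gz
cd somepackage
makepkg
sudo pacman -U *.xz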
With pacman not looking at the AUR, you have to do more work to get upgrades to happen should you want to avoid repeating the above process all the time. There is a package in the AUR called yaourt that needs package-query from the same place as well. Before either of these, yajl needs to be installed from one of the default repositories. Once yaourt is in place, the following does the updates for you:
yaourt -Syu --aur
Again, it might be best to run this as root or using sudo, though that produces messages from makepkg about not running it as a privileged user. However, I reckon that those might need to be ignored. When I tried it, the Citrix update failed, though the Dropbox one succeeded; that experience might be worth bearing in mind. Saying that, I have found installing and updating software from the AUR not to be too onerous a process so far. Anything that gives a little more freedom can only be a good thing.
Moving from Ubuntu 10.10 to Linux Mint 10
23rd April 2011
With a long Easter weekend available to me and with thoughts of forthcoming changes in the world of Ubuntu, I got to wonder about the merits of moving my main home PC to Linux Mint instead. Though there is a rolling variant based on Debian, I went for the more usual one based on Ubuntu that uses GNOME. For the record, Linux Mint isn't just about the GNOME desktop: you can have it with Xfce, LXDE and KDE desktops as well. While I have been known to use Lubuntu and like its LXDE implementation, I stuck with the option with which I have most experience.
Once I selected the right disk for the bootloader, the main installation of Mint went smoothly. By default, Ubuntu seems to take care of this, while Mint leaves it to you. When you have your operating system files on sdc, installation on the default of sda isn't going to produce a booting system. Instead, I ended up with GRUB errors and, while I suppose that I could have resolved these, the lazier option of repeating the installation with the right bootloader location was the one that I chose. It produced the result that I wanted: a working and loading operating system.
However, there was something not right about the way that the windows were displayed on the desktop, with title bars and window management not working as they should. Creating a new account showed that the settings carried over from Ubuntu in my home area were the cause. Again, I opted for a less strenuous option and moved things from the old account to the new one. One outcome of that decision was a lot of use of the chown command to get file and folder permissions set for the new account. To make this all happen, the new account needed to be made into an Administrator just like its predecessor; by default, more restrictive desktop accounts are created using the Users and Groups application from the Administration submenu. Once I was happy that the migration was complete, I backed up any remaining files from the old user folder and removed it from the system. Some of the old configuration files were to find a new life with Linux Mint.
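For anyone curious, the sort of chown usage in question looks like the line below; the username and path are hypothetical ones for the sake of illustration:

sudo chown -R john:john /home/john/Pictures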
In the middle of the above, I also got to customise my desktop to get the feel that suits me. For example, I do like a panel at the top and another at the bottom, while Linux Mint only comes with the latter by default. The main menu was moved to the top because I have become used to having it there, and switchers for windows and desktops were added at the bottom. These were only a few from what turned out not to be a short list of things that I fancied having: clock, bin, clearance of desktop, application launchers, broken application killer, user switcher, off button for PC, run command and notification area. It all was gentle tinkering, but it still is the sort of thing that you wouldn't want to have to do over and over again. Let's hope that is the case for Linux Mint upgrades in the future. That the configuration files for all of these are stored in the home area hopefully should make life easier, especially when an in-situ upgrade like that for Ubuntu isn't recommended by the Mint team.
With the desktop arranged to my liking, the longer job of adding to the collection of software on there, while pruning a few unwanted items too, was next. Having had Apache, PHP and MySQL on the system before I popped in that Linux Format magazine cover disk for the installation, I wanted to restore them. To get the offline websites back, I had made copies of the old Apache settings that simply were copied over the defaults in /etc/apache (in fact, I simply overwrote the apache directory in /etc, but the effect was the same). Using MySQL Administrator enabled the taking of a backup of the old database too. In the interests of spring-cleaning, I only migrated a few of the old databases from the old system to the new one. In fact, there was an element of such tidying in my mind when I decided to change Linux distribution in the first place; Ubuntu hadn't been installed afresh onto the system for a while anyway, and some undesirable messages were appearing at update time, though they were far from being critical errors.
The web server reinstatement was only part of the software configuration that I was doing, and there was a lot of use of apt-get while this was in progress. A rather diverse selection was added: Emacs, NEdit, ClamAV, Shotwell (just make sure that your permissions are sorted before getting this to use older settings, because anything inaccessible just gets cleared out; F-Spot was never there at first in my case, but it may differ for you), UFRaw, Chrome, Evolution (I never have been a user of Mozilla Thunderbird, the default email client on Mint), Dropbox, FileZilla, MySQL Administrator, MySQL Query Browser, NetBeans, POEdit, Banshee (while Rhythmbox is what comes with Mint, I replaced it with this), VirtualBox and GParted. This is quite a list, and while I maybe should have engaged the services of dpkg to help automate things, I didn't do that on this occasion, though Mint seems to have a front end for it that does the same sort of thing. Given that the community favours clean installations, it's little wonder that something like this is on offer in the suite of tools in the standard installation. Still, this is the type of rigmarole that one would not want to take on too often.
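For what it's worth, the dpkg route that I had in mind works along these lines; the file name is an arbitrary one, and the commands are a sketch rather than something I actually ran at the time. The first command records the installed package set on the old system, while the second and third replay it on the new one:

dpkg --get-selections > package-selections.txt
sudo dpkg --set-selections < package-selections.txt
sudo apt-get dselect-upgrade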
With desktop tinkering and software installations complete, it was time to do a little more configuration. To get my HP laser printer going, I ran hp-setup to download the (proprietary, RMS will not be happy...) driver for it, because it otherwise wouldn't work for me. Fortune was removed from the terminal sessions because I like them to be without such things. To accomplish this, I edited /etc/bash.bashrc and commented out the /usr/games/fortune line before using apt-get to clear the software from my system. Being able to migrate my old Firefox and Evolution profiles, albeit manually, was another boon. Undoubtedly, there are more adjustments that I could be making, but I am happy to do these as and when I get to them. So far, I have a more than usable system, even if I engaged in more customisation than many users would.
Let's finish this with some of my impressions of Linux Mint. What goes without saying is that some things are done differently, which is to be expected. Distribution upgrades are just one example, while there are tools available to make clean installations that little bit easier. To my eyes, the desktop looks very clean, and font display is carried over from Ubuntu, not at all a bad thing. While it may sound like a small matter, it does appear to me that Fedora and openSUSE could learn a thing or two about how to display fonts onscreen on their systems. It is the sort of thing that adds the spot of polish that leaves a much better impression. So far, it hasn't been any hardship to find my way around; it helps that I can make the system fit my wants and needs. That it looks set to stay that way is another bonus. We have a lot of change coming in the Linux world, with GNOME 3 on the way and Ubuntu's decision to use Unity as its main desktop environment. While watching both of these developments mature, it looks as if I'll be happily using Mint. Change can refresh, while a bit of stability is good too.
On upgrading from Fedora 13 to Fedora 14
7th November 2010
My Fedora box recently got upgraded to the latest version of the distribution (14), and I stuck to a method that I have used successfully before, one that isn't that common with variants of Linux either. What I did was to go to the Fedora website, download a full DVD image, burn it to a disk and boot from that. Then, I chose the upgrade option from the menus, and all went smoothly, with only commonplace options needing selection and no data getting lost either. Apparently, this way of going about things is only offered by the DVD option, because the equivalent Live CD versions only do full installations.
However, there was another option that I fancied trying, but I was stymied by messages about a troublesome Dropbox repository. As I later discovered, that would have been easily sorted, only for me to opt for a tried and tested method instead. This was a pity, because only two commands would have needed to be issued when logged in as root, and it would have been good to have had a go with them:
yum update yum
yum --releasever=14 update --skip-broken
These may have done what I habitually do with Ubuntu upgrades, but trying them out will have to await either the release of the next version or my getting around to setting up a Fedora virtual machine to see what happens. The latter course of action might be sensible anyway, to see if all works without any problems before doing it on a real PC installation.
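As for the troublesome Dropbox repository, my understanding is that excluding it for the duration of the upgrade would have been one way around it, with something along these lines; the repository id is an assumption on my part and depends on how the .repo file names it:

yum --disablerepo=Dropbox --releasever=14 update --skip-broken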