Technology Tales

Adventures & experiences in contemporary technology

Welcome

  • Technology is pervasive these days and touches our lives in many ways we do not realise. Software matters as much as hardware, and there is more to the latter than consumer devices we use for computing, telephony, photography or videography. From all this, we should recognise that computer code is at work in many different places.

  • All the appraisals and how-tos that you find here come from the above. Devices may not feature that much, but working with computer software does, and that even extends to its creation. In addition, system management and working with data inspire new posts whenever there is something worthwhile to share. The hope is that you encounter something interesting in whatever you find here.

What to do when Tuta Mail issues this message when logging into an account on macOS: Could not access Secret Storage

24th September 2024

Two things changed before Tuta Mail stopped working as before: modifying Keychain Access settings and upgrading macOS from Sonoma to Sequoia. Either could have been the cause, or neither. The first of these was the more likely culprit.

Whichever it was, the result was the same: logging into Tuta Mail yielded an error like this: Could not access Secret Storage. The solution essentially is a two-step process: remove the app and delete its settings folder. Reinstallation then follows.

In Finder, go to Applications and move Tuta Mail to the Bin before clearing it from there. That uninstalls the app.

The next step needs you to show hidden files and folders using the Command + Shift + . shortcut. Then, go to your home folder (this may need the Command + Shift + H shortcut). Open the Library folder and find the folder called Application Support. Enter that and find the subfolder named tutanota-desktop. That needs to go to the Bin too before being expunged from there. Doing that provides the clean slate for restoration to commence. After this, using the Command + Shift + . shortcut again hides the normally hidden files and folders once more.

Nothing is resolved without the removal of /Users/[username]/Library/Application Support/tutanota-desktop. Using the rm command from the command line interface will remove it faster than Finder, though the latter may be easier for many users.
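For anyone taking the command line route, something like the following would do it; as ever with rm -rf, check the path carefully before pressing Enter:

# Remove Tuta Mail's settings folder; the quotes handle the space in "Application Support"
rm -rf "$HOME/Library/Application Support/tutanota-desktop"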

Clearing the Julia REPL

23rd September 2024

During development, there are times when you need to clear the Julia REPL. It can become so laden with content that it becomes hard to debug your code. One way to accomplish this is to issue the CTRL + L keyboard shortcut while focus is within the REPL; you may need to click on it first. Another is to issue the following in the REPL itself:

print("\033c")

Here, \033 is the escape character (ESC) written in octal format, the prefix for many terminal control sequences. Following it with c gives the sequence that resets the terminal to its initial state. Printing this sequence is what does the clearing, and variations can be used to clear other kinds of console screens too. That makes it a more generic solution.
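The same trick works outside Julia too. In a Bash session, for instance, printing the sequence with printf clears the terminal in one go:

# ESC (octal 033) followed by c is the "reset to initial state" control sequence
printf '\033c'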

Dropping to an underlying shell using the ; character is another possibility. Then, you can use the clear or cls commands as needed; the latter is for Windows systems.

One last option is to define a Julia function for doing this:

function clear_console()
    # `cls` is a cmd built-in on Windows, so it has to be run through cmd itself
    Sys.iswindows() ? run(`cmd /c cls`) : run(`clear`)
end

Calling the clear_console function then clears the screen programmatically, allowing for greater automation. The run function is the one that executes the command in backticks on the operating system. Even issuing that run call on its own in the REPL should work too.

Little helpers

22nd September 2024

This could have been a piece that appeared on my outdoors blog until I had second thoughts. One reason why I might have put it there is that I am making more use of Perplexity for searching the web and gaining more value from its output. However, that is proving more useful in writing what you find on here. Knowing the sources for a dynamically generated article adds more confidence when fact checking, and it is remarkable what comes up that you would not find so quickly with Google. There is added value with this one.

A better candidate would have been Anthropic’s Claude. That has come in handy when writing trip reports. Being able to use a stub to prototype a blog entry really has its uses. The reality is that everything gets rewritten before anything gets published; these tools are never so good as to feature everything that you want to mention, even if they do a good job of mimicking your writing tone and style. Nevertheless, being able to work with the content beyond doing a brain dump from one’s memory is an undeniable advance.

There are occasions when using Bing’s access to OpenAI models through Copilot helps with the production of images. In reality, I do have an extensive personal library of images, so they possibly should suffice in many ways. However, curiosity about the technology overrides the effort that photo processing requires.

While there may be some level of controversy surrounding the use of AI tools in content creation, using such tooling for proofing content should not raise too much ire. Grammarly comes up a lot, though it is LanguageTool that I use, to avoid excessive intrusion into my writing style. That style has changed to comply with rules that had passed me by without my noticing, but there are other suggestions that need to be turned off. Configuring the proofing tools in other ways might be better, so that is something to explore; otherwise, we could end up with too much standardisation of writing, and there needs to be room for human creativity at all times.

All of these are just a sample of what is available. Just checking in with The Rundown AI will reveal that there is an onslaught of innovation right now. Hype is a problem too, yet we need to learn to use these tools. The changeover is equivalent to the explosive increase in the availability of personal computing a generation ago. That brought its own share of challenges (some kept up with the curve while others did not) until everything settled down, and it will be the same with what is happening now.

Avoiding errors caused by missing Julia packages when running code on different computers

15th September 2024

As part of an ongoing move to multi-location working, I am sharing scripts and other artefacts via GitHub, including the Julia programs that I have. That has led me to realise that a bit of added automation would help iron out any package dependency issues that arise. Setting things up as projects could help, yet that feels like a little too much effort for what I have. Thus, I have gone for adding extra code that checks for and installs any missing packages instead of having failures.

For adding those extra packages, I first load the Pkg package as follows:

import Pkg

While it is a bit hackish, I then declare a single array that lists the packages to be checked:

pkglits = ["HTTP", "JSON3", "DataFrames", "Dates", "XLSX"]

After that, there is a function that uses a try/catch construct to find out whether a package can be loaded or not, using the built-in @eval macro to attempt a using declaration:

tryusing(pkgsym) = try
    # Attempt to load the package; success means it is already installed
    @eval using $pkgsym
    return true
catch e
    return false
end

The above function is called in a loop that tests for the existence of each package and, where one is missing, installs it:

for i in eachindex(pkglits)
    rslt = tryusing(Symbol(pkglits[i]))
    # Install anything that failed to load
    if rslt == false
        Pkg.add(pkglits[i])
    end
end

Once that has completed, using the following line to load the packages required by later processing becomes error-free, which is what I sought:

using HTTP, JSON3, DataFrames, Dates, XLSX

Saving yourself a reboot: remounting any overlooked volumes in Linux

14th September 2024

Recently, I got things a little out of order when starting up my main Linux system after an absence. Usually, I start up my NAS first so that its volumes get mounted when I start my Linux machine. However, it happened that I started them near enough together. Thus, my workstation completed its startup without having the NAS volumes mounted. A reboot would have sorted this, but there was another way: issuing the command that you see below:

sudo mount -a

This looked in my /etc/fstab file and mounted anything that was missing, as long as the noauto option was not set. Because this was executed after the NAS had completed its own boot process, its volumes were then mounted on my system and fully available for what I needed to do next. If I had wanted to see what had been mounted, then I would have needed to issue the following command instead:

sudo mount -av

In addition to the -a switch that triggers the mounting of missing volumes, there is now a -v (for verbose) one that tells you what has happened. Needless to say, all this happens only if your /etc/fstab file is set up properly. If you are adding a new volume (I was not), it does no harm to mount it manually before updating the configuration file. That should catch any errors first.
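For that manual test, something along these lines would do for an NFS share; the server address, export path and mount point here are only placeholders for illustration.

# Hypothetical NAS address, export path and mount point: adjust to your own setup
sudo mkdir -p /mnt/nas-share
sudo mount -t nfs 192.168.1.10:/volume1/share /mnt/nas-share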

What to do when a GPG signature becomes invalid for a package repository on Linux Mint

12th September 2024

During a package update on my main Linux system, I encountered the following kind of error message:

An error occurred during the signature verification. The repository is not updated and the previous index files will be used. GPG error: https://cli.github.com/packages stable InRelease: The following signatures were invalid: EXPKEYSIG <GPG Key> GitHub CLI

The message indicated a problem with the GPG signature verification for the GitHub CLI repository. The cause was that the repository’s signing key had expired (hence the EXPKEYSIG code), invalidating the signatures and preventing the package manager from updating the repository’s index files. The first step then was to remove the invalid GPG key using the following command:

sudo apt-key del <GPG Key>

With the invalid GPG key removed, the next step is to add the new GPG key for the GitHub CLI repository by issuing the following command:

curl -fsSL https://cli.github.com/packages/githubcli-archive-keyring.gpg | sudo tee /usr/share/keyrings/githubcli-archive-keyring.gpg > /dev/null
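With the key file replaced, refreshing the package indexes offers a quick check that the repository verifies cleanly again:

# Refresh the package lists; the GitHub CLI repository should now pass signature verification
sudo apt update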

Once I had the new GPG key, I was able to use my usual system update process without any problem. The error message was gone, and updates and upgrades proceeded as intended.

Getting rsync to resolve symbolic links

11th September 2024

In 2019, Dropbox changed its handling of symbolic links: internal links within a Dropbox file hierarchy were preserved, while links leading outside the Dropbox area no longer worked. Thankfully, the rsync utility found in many Linux and UNIX settings does not behave like that, as long as you have called it correctly.

With the usual archive options, symbolic links are synchronised as links rather than being followed; that is akin to what Dropbox does now. To get rsync to resolve the links, following each one to a single file or, more likely, a folder containing more than one file, it needs the -L switch or option in the command. When that is present, the linked file or files get synchronised, which honours the point of having these links in the first place: allowing more flexibility with folder structures and avoiding any duplication of files and folders.
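As a minimal sketch with placeholder paths, the call looks like this:

# -a preserves permissions and timestamps, -v reports what gets copied, and
# -L copies the files that symbolic links point to rather than the links themselves
rsync -avL /path/to/source/ /path/to/destination/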

 

Unzipping more than one file at a time in Linux and macOS

10th September 2024

I wanted to extract three zip archives in one go and, to me, it sounded like a task for shell scripting. They had come from Google Drive and contained different splits of the files that I needed: raw images from a camera. However, I found a more succinct method than the line of code that you see below (it is intended for the Bash shell):

for z in *.zip; do unzip "$z"; done

That loops through each file that matches a glob string. All I needed was something like this:

unzip '*.zip'

Before embarking on a search, I had got close to this myself, but had not quoted the wildcard string; without the quoting, it was not working for me. To be sure that I was not extracting more than I needed, I made the wildcard string more specific for my case.
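As an illustration, a narrower but still quoted pattern along these lines would do; the file name prefix here is made up:

# Only extract archives whose names begin with a known prefix; the quotes stop
# the shell from expanding the wildcard before unzip sees it
unzip 'drive-download*.zip'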

Once the extraction was complete, I moved the files into a Lightroom Classic repository for working on them later. All this happened on an iMac, but the extraction itself should work on any UNIX-based operating system, so long as the shell supports it.

A way to survey hours of daylight for locations of interest

9th September 2024

A few years back, I needed to get sunrise and sunset information for a location in Ireland. This was to help me plan visits to a rural location with a bus service going nearby, and I did not want to be waiting on the side of the road in the dark on my return journey. It ended up being a project that I undertook using the Julia programming language.

This had other uses too: one was the planning of trips to North America. This was how I learned that evenings in San Francisco were not as long as their counterparts in Ireland. Later, it had its uses in assessing the feasibility of seeing other parts of the Pacific Northwest during the month of August. Other matters meant that such designs never came to anything.

The Sunrise Sunset API was used to get the times for the start and end of daylight. That meant looping through the days of the year to get the information, but I needed to get the latitude and longitude information from elsewhere in order to fuel that process. While Google Maps has its uses with this, it is a manual and rather fiddly process. Sparing use of Nominatim’s API, which draws on OpenStreetMap data, is what helped to increase the amount of automation and user-friendliness.

Accessing the API using Julia’s HTTP package got me the data in JSON format, which I then converted into atomic vectors and tabular data. The end product is an Excel spreadsheet with all the times in UTC. A next step would be to use the solar noon information to shift things to the correct timezone. It can be done manually in Excel and its kind, but some more automation would make things smoother.
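Assuming that the service in question is the public sunrise-sunset.org API, with Nominatim supplying the coordinates, the two lookups can be sketched from a shell prompt as below; the place name, coordinates, date and user agent string are only illustrative.

# Nominatim asks for an identifying user agent; the name here is made up
curl -A "daylight-survey-example" "https://nominatim.openstreetmap.org/search?q=Westport+Ireland&format=json&limit=1"

# formatted=0 returns ISO 8601 times in UTC for the supplied coordinates and date
curl "https://api.sunrise-sunset.org/json?lat=53.80&lng=-9.52&date=2024-06-21&formatted=0"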

Pandemic camera

8th September 2024

Back at the end of 2019, I acquired a Canon EOS 90D, possibly the swansong for mid-range Canon SLR cameras. Much effort is going into mirrorless cameras, yet I retain affection for SLR cameras because of their optical viewfinders. That may have been part of the reason for the acquisition, when I already had an ageing Pentax K5 Mark II. Buying SLR cameras is one way to keep them in production.

Little did I know at that stage what lay ahead in 2020. Until recently, this was not to be a camera that travelled widely, such were the restrictions. Nevertheless, battery life is superb and handling is good too. The only absence is a level in the viewfinder, something offered by the Pentax K3 Mark III and perhaps any mirrorless camera.

The newer file type of CR3 caught me out at first until I adjusted my command line tooling to deal with that. File sizes were larger as well, which has an impact on storage. Otherwise, there was little to change in my workflow. That would take other technological changes, like the increasing amount of AI being built into Adobe software.

Outdoor photography is my mainstay, and it excelled at that. The autofocus works well on its 24 to 135 mm zoom lens, except perhaps when focussing on skyscapes at times. Metering produced acceptable results, though it differed from the Pentax output to which I had become accustomed. All in all, it slipped into a role like other cameras that I had.

Throughout 2020 and 2021, it provided the required service alongside other cameras that I had. The aforementioned Pentax remained in use, as did an Olympus and another Canon. With overseas travel curtailed, horizons shrank to local counties like Cheshire, Derbyshire, Staffordshire and Shropshire. In September 2020, it travelled to Llandudno in North Wales, an exception to the general trend of English hikes and cycles.

Since then, it has been superseded, though. A Pentax K3 Mark III made it into my possession to become my main camera, returning me near enough to my pre-2020 practice. Curiosity about Canon mirrorless options added a Canon EOS RP and a 24 to 240 mm zoom lens. That has shorter battery life than is ideal, and its level is not as helpful as that on the Pentax K3 Mark III or the aforementioned Olympus. If anything, it may get replaced while the EOS 90D remains. My getting a new base in Ireland means that the EOS 90D has gone there, saving me from carrying a camera over from England. That should give it a new lease of life.

  • All the views that you find expressed on here in postings and articles are mine alone and not those of any organisation with which I have any association, through work or otherwise. As regards editorial policy, whatever appears here is entirely of my own choice and not that of any other person or organisation.

  • Please note that everything you find here is copyrighted material. The content may be available to read without charge and without advertising but it is not to be reproduced without attribution. As it happens, a number of the images are sourced from stock libraries like iStockPhoto so they certainly are not for abstraction.

  • With regards to any comments left on the site, I expect them to be civil in tone of voice and reserve the right to reject any that are either inappropriate or irrelevant. Comment review is subject to automated processing as well as manual inspection but whatever is said is the sole responsibility of the individual contributor.