Keeping a graphical eye on CPU temperature and power consumption on the Linux command line
Following my main workstation upgrade in January, some extra monitoring has been needed. This follows on from the experience with building its predecessor more than three years ago.
Being able to do this in a terminal session keeps things lightweight, and I have produced text displays like the one below using a combination of sensors and nvidia-smi in the following command:
watch -n 2 "sensors | grep -i 'k10'; sensors | grep -i 'tdie'; sensors | grep -i 'tctl'; echo '' | tee /dev/fd/2; nvidia-smi"
Everything is done within a watch command that refreshes the display every two seconds. Then, the panels are built up by a succession of commands separated by semicolons, one for each portion of the display. The grep command picks out the desired lines from the sensors output that is piped to it, once for each temperature pattern. The next command, the echo piped to tee /dev/fd/2, adds a blank line by writing it to both standard output and standard error before the output of nvidia-smi is displayed. The result can be seen in the screenshot below.
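As an aside, the separate grep calls could be collapsed into a single pass over the sensors output using an alternation pattern. The lines below are illustrative samples in the style of sensors output, not readings from my machine:

```shell
# Sample lines in the style of sensors output (made-up values):
sample='k10temp-pci-00c3
Tctl:         +48.9°C
fan1:         0 RPM'

# One grep with an alternation pattern replaces the separate calls;
# only the lines matching one of the patterns survive:
echo "$sample" | grep -iE 'k10|tdie|tctl'
```

Only the first two sample lines match, so the fan reading stays out of the display.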

However, I also came across a more graphical way of doing things, using commands like turbostat or sensors along with AWK programming and ttyplot. Plotting the temperature output from the above needs the following:
while true; do sensors | grep -i 'tctl' | awk '{ printf("%.2f\n", $2); fflush(); }'; sleep 2; done | ttyplot -s 100 -t "CPU Temperature (Tctl)" -u "°C"
This is done in an infinite while loop to keep things refreshing; the watch command cannot be used here, since the output of the sensors command needs to be piped through awk and into ttyplot on a repeating, periodic basis. The awk command takes the second field from the input text, formats it to two decimal places and prints it, flushing the output buffer afterwards so that each value reaches ttyplot immediately. The ttyplot command then plots those numbers as seen in the screenshot below, with a y-axis scaled to a maximum of 100 (-s), units of °C (-u) and a title of CPU Temperature (Tctl) (-t).
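The awk extraction can be tried in isolation on a sample line; the temperature value here is made up, and awk's numeric conversion simply ignores the trailing °C:

```shell
# awk reads the leading numeric part of "+48.9°C" as a number,
# so printf can reformat it to two decimal places:
echo "Tctl:         +48.9°C" | awk '{ printf("%.2f\n", $2); fflush(); }'
# prints 48.90
```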

A similar thing can be done for the CPU wattage, which is how I learned of the graphical display possibilities in the first place. The command follows:
sudo turbostat --Summary --quiet --show PkgWatt --interval 1 | awk '{ printf("%.2f\n", $1); fflush(); }' | ttyplot -s 200 -t "Turbostat - CPU Power (watts)" -u "watts"
Handily, turbostat can be made to update periodically (every second in the command above), avoiding the need for an infinite while loop. Since only a summary is needed for the wattage, all other output can be suppressed. Unlike the sensors command earlier, turbostat needs superuser privileges, though the awk and ttyplot stages downstream of it do not. Then, awk is used as before to process the wattage for plotting; the first field is what is being picked out here. After that, ttyplot displays the plot seen in the screenshot below with an appropriate title, units and scaling. It all works with the output of one command acting as input to another through pipes.
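Since ttyplot simply reads one number per line from standard input, any generator will do for experimentation. This sketch emits a sawtooth ramp that could be piped into ttyplot -s 20 to see the plotting in action without touching turbostat at all:

```shell
# Emit 100 values cycling between 0 and 19; piping this into
# `ttyplot -s 20 -t "Ramp demo"` draws a repeating ramp:
seq 1 100 | awk '{ printf("%.2f\n", $1 % 20); fflush(); }'
```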

All of this offers a lightweight way to keep an eye on system load, with the top command showing the impact of different processes if required. While there are graphical tools for some things, command line possibilities cannot be overlooked either.
Avoiding repeated token requests by installing the Git credential helper on Linux Mint
On a new machine, I found myself being asked for the same access token repeatedly. Since this is a long string, that is inconvenient and does not take long to become irritating. Thus, I sought a way to streamline things. My initial attempt produced the following message:
git: 'credential-libsecret' is not a git command
The main cause for the above was the absence of the libsecret credential helper, crucial for managing credentials securely in a keyring, from my system. The solution was to install the required packages from the command line:
sudo apt install libsecret-1-0 libsecret-1-dev
Following installation, the next step was to navigate to the appropriate directory and execute the make command to compile the files within the directory, transforming them into an executable credential helper:
cd /usr/share/doc/git/contrib/credential/libsecret; sudo make
With the credential helper fully built, Git needed to be configured to use it by executing the following:
git config --global credential.helper /usr/share/doc/git/contrib/credential/libsecret/git-credential-libsecret
Since one error message is enough for any new activity, it made sense to confirm that the credential helper resided in the correct location. That was accomplished by issuing this command:
ls -l /usr/share/doc/git/contrib/credential/libsecret/git-credential-libsecret
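Another quick check is to ask Git what helper it will actually use; assuming the earlier git config command has been run, this should echo back the path that was set:

```shell
# Prints the value of credential.helper from the global configuration:
git config --global credential.helper
```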
All was well in my case, saving the need to reinstall Git or repeat the manual compilation of the credential helper. When all was done, I was ready to automate things further.
Resolving Python UnicodeEncodeError messages issued while executing scripts using the Windows Command Line
Recently, I got caught out by this message when summarising some text using Python and OpenAI's API while working within VS Code:
UnicodeEncodeError: 'charmap' codec can't encode characters in position 56-57: character maps to <undefined>
There was no problem on Linux or macOS, but it was triggered on the Windows command line from within VS Code. Unlike the Julia or R REPLs, everything in Python gets executed in the console like this:
& "C:/Program Files/Python313/python.exe" script.py
The Windows command line shell operates with cp1252 character encoding, and that was tripping up code like the following:
with open("out.txt", "w") as file:
    file.write(new_text)
The cure was to specify the encoding of the output text as utf-8:
with open("out.txt", "w", encoding='utf-8') as file:
    file.write(new_text)
After that, all was well, and text was written to the file just as on the other operating systems. One other thing to note is that the use of backslashes in Windows file paths is another gotcha. Prefixing the string with an r makes it a raw string, so the backslashes need no escaping; doubling the backslashes or using forward slashes are other options.
with open(r"c:\temp\out.txt", "w", encoding='utf-8') as file:
    file.write(new_text)
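As an alternative to adding an encoding to every open() call, Python's UTF-8 mode (PEP 540) can be switched on for the whole process, making UTF-8 the default text encoding regardless of the console code page. This is a sketch of that option rather than what I did at the time; on Windows, the interpreter may be invoked as python rather than python3:

```shell
# Run Python with UTF-8 mode enabled for a single invocation; the flag
# makes UTF-8 the default for open() and the standard streams:
python3 -X utf8 -c 'import sys; print(sys.flags.utf8_mode)'
# prints 1

# The same via an environment variable, useful when the interpreter
# is launched by something else, such as an editor plugin:
PYTHONUTF8=1 python3 -c 'import sys; print(sys.flags.utf8_mode)'
# prints 1
```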
Finding human balance in an age of AI code generation
Recently, I was asked about how I felt about AI. Given that the other person was not an enthusiast, I picked on something that happened to me, not so long ago. It involved both Perplexity and Google Gemini when I was trying to debug something: both produced too much code. The experience almost inspired a LinkedIn post, only for some of the thinking to go online here for now. A spot of brainstorming using an LLM sounds like a useful exercise.
Going back to the original question, it happened during a meeting about potential freelance work. Thus, I tapped into experiences with code generators over several decades. The first one involved a metadata-driven tool that I developed; users reported that there was too much imperfect code to debug with the added complexity that dealing with clinical study data brings. That challenge resurfaced with another bespoke tool that someone else developed, and I opted to make things simpler: produce some boilerplate code and let users take things from there. Later, someone else again decided to have another go, seemingly with more success.
It is even more challenging when you are insufficiently familiar with the code that is being produced. That happened to me with shell scripting code from Google Gemini that was peppered with some Awk code. There was no alternative but to learn a bit more about the language from Tutorials Point and seek out an online book elsewhere. That did get me up to speed, and I will return to these when I am in need again.
Then, there was the time when I was trying to get a Julia script to deal with Google Drive needing permissions to be set. This set Google Gemini off adding more and more error-checking code with try/catch blocks. Since I did not have the issue at that point, I opted to halt and wait for its recurrence. When it did recur, I opted for a simpler approach, especially with the gdrive CLI tool starting up a web server to complete the process of reactivation. While there are times when shell scripting is better than Julia for these things, I added extra robustness and user-friendliness anyway.
During that second task, I was using VS Code with the GitHub Copilot plugin. There is a need to be careful, yet that can save time when it adds suggestions for you to include or reject. The latter may apply when it adds conditional logic that needs more checking, while simple code outputting useful text to the console can be approved. While that certainly is how I approach things for now, it brings up an increasingly relevant question for me.
How do we deal with all this code production? In an environment with myriads of unit tests and a great deal of automation, there may be more capacity for handling the output than mere human inspection and review, which can overwhelm the limitations of a human context window. A quick search revealed that there are automated tools for just this purpose, possibly with their own learning curves; otherwise, manual working could be a better option in some cases.
After all, we need to do our own thinking too. That was brought home to me during the Julia script editing. To come up with a solution, I had to step away from LLM output and think creatively about something simpler. There was a tension between the two needs during the exercise, which highlighted how important it is to learn not to be distracted by all the new technology. Being an introvert, I need that solo space, and now I have to step away from technology to get it, when technology was the refuge in the first place.
Anyone with a programming hobby has to limit all this input to avoid being overwhelmed; learning a programming language could involve stripping out AI extensions from a code editor, for instance. LLM output has its place, yet it has to be at a human scale too. That perhaps is the genius of a chat interface, and we now have Agentic AI as well. It is as if the technology curve never slackens, at least not until the current boom ends, possibly when things break because they go too far beyond us. All this acceleration is fine until we need to catch up with what is happening.
Incorporating tmux in a terminal workflow
As part of a recent workstation upgrade and subsequent AI explorations to see what runs on a GPU, I got to use tmux to display two panes within a terminal session on Linux Mint, each with output from a different system monitoring command; one of these was top, for monitoring system processes in a more in-depth way. Some of that need has passed, yet I have retained tmux and even set it to open in every new terminal session by adding the following code to my .bashrc file:
if command -v tmux &> /dev/null && [ -z "$TMUX" ]; then
    tmux new
fi
This tests that tmux is installed and that the shell is not already running inside a tmux session before opening a new one. You can also attach to an existing session, falling back to creating a new default session, if you like. That changes the second line of the above code to this:
tmux attach -t default || tmux new -s default
Wanting to have everything fresh in a new session, I decided against that. While I have gone away from using tmux panes for the moment, there is a cheat sheet that could have uses if I do, and another post elsewhere describes resizing the panes too, which came in very useful for that early dalliance while system monitoring.
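Should panes come back into the workflow, a few lines in ~/.tmux.conf make splitting and resizing more memorable. These bindings are illustrative suggestions rather than anything from my own configuration:

```
# Hypothetical ~/.tmux.conf additions (the prefix is Ctrl+b by default):
bind | split-window -h       # prefix + | puts panes side by side
bind - split-window -v       # prefix + - stacks panes
bind -r H resize-pane -L 5   # repeatable resizing with prefix + H/J/K/L
bind -r J resize-pane -D 5
bind -r K resize-pane -U 5
bind -r L resize-pane -R 5
```

The -r flag makes a binding repeatable, so the prefix need only be pressed once before several resize steps.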
From convex to concave: reflections on decades of computer monitor usage
Within the last week, I changed my monitor and am without an Iiyama in my possession for the first time since 1997. The first one was a 17" CRT screen that accompanied my transition from education into work. Those old screens were not long-lasting, though. It replaced a 15" Dell screen that had started to work less well than I needed, and the larger size was an added attraction after I saw someone with a 21" Iiyama at the university where I was pursuing a research degree.
Work saw me using a 21" Philips screen myself for a time before Eizo flat screen displays were given to us as part of a migration to Windows 2000. That inspired me to get a 17" Iiyama counterpart to what I had at work. Collecting that sent me on an errand to a courier's depot on the outskirts of Macclesfield. The same effort may have been accompanied by my dropping my passport, which I was using for identification. That thankfully was handed into the police, so I could get it back from them, even if I was resigned to needing a new one. More care has been taken since then to avoid a repeat.
The screen worked well, though I kept the old one as a backup for perhaps far too long. It took some years to pass before I eventually hauled it to the recycling centre; these days, I might try a nearby charity shop before setting off on such a schlep. In those times, LCD screens lasted so well that they could accumulate if you were not careful. The 17" Iiyama accompanied my migration from Windows to Linux and a period of successful and ill-fated PC upgrades, especially a run of poor luck in 2009.
2010 saw me change my place of work, and a 24" Iiyama was acquired just before then. Again, its predecessor was retained in case anything went awry, eventually going to a charity shop from where it could start a new life. There was no issue with the new acquisition, and it went on to do nearly twelve years of work for me. A 34" Iiyama replaced it a few years ago, yet I wonder if that decision was the best; apart from more than a decade of muck on the screen, nothing else was amiss with the old one. Even a major workstation upgrade in 2021 did little to challenge it. Even so, it too went to a charity shop in search of a new home.
This year's workstation overhaul did few favours to that 34" successor. While it was always sluggish to wake, it had never before gone into the cycles of non-responsiveness that it did on numerous occasions in the last few months. Compatibility with a Mac Mini could be better, too. The result is that I am writing these words using a Philips B346C1 instead, and it has few of the issues that beset the Iiyama, save for needing to remove and reinsert an HDMI cable for a Mac Mini at times.
Screen responsiveness is a big improvement, especially when switching between machines using a KVMP switch. Wake-up times are noticeably shorter, and there is much better reliability. However, it did take a deal of time to optimise its settings to my liking. The OSD may be more convenient than the Iiyama's, yet having Windows software that does the same thing made configuration a lot easier. While getting acceptable output across Windows, Linux and macOS has been a challenge, there is a feeling that things are nearly there.
Another matter is that this is a curved screen. In some ways, that is akin to the move from a 24" screen to a 34" one, when fonts and other items needed enlarging for the bigger screen. After a burst of upheaval, things eventually settle down and acclimatisation ensues. Even though further tinkering cannot be ruled out, there is a sounder base for computing after the changeover.
Possibly a retrograde way to keep an old scanner going on Linux Mint 22?
For making a copy of a document for official purposes, I needed to get my scanner going with the new workstation. The device is an Epson Perfection 4490 Photo that I acquired in 2007 after its Canon predecessor, a CanoScan 5000F, began to malfunction. It has served me well since then, though digital photography has meant that scanning images is not something that I do frequently these days, the last time being in early 2022 to get larger images into my online photo gallery.
Its age means that software support is an issue, particularly for Windows 11. However, Linux can leave old devices behind, too. It does not help that there is little incentive for Epson to update its drivers either. Thus, Linux Mint's move from LIBSANE to LIBSANE1 makes things less straightforward when the Epson software needs the former.
While you can take apart a DEB file and re-edit its components before creating a new version, that sounds tricky to me, though it may be the way to go for others. Instead, I downloaded a DEB file for LIBSANE from Ubuntu and installed that. With it in place, the Epson software installed fully, allowing VueScan to work as I needed. Thus, the document got copied, and the rest could happen as required.
When I went looking for solutions to my conundrum on Perplexity, it kept telling me that this was not the best way to go. However, I still took the chance, knowing that I could roll things back if needed. Computers never know you that well without a multitude of data, so the safety-first approach has its merits, even if it can be overly cautious in some cases.
If I ever do need to replace the scanner, I probably would replace the printer with a multi-function device at the same time. The move would save some desk space, and I have had a good experience with such a device elsewhere. For now, though, such a move is on the long finger; securing a new freelance contract is higher up any to-do list.
Upheaval and miniaturisation
The ongoing AI boom got me refreshing my computer assets. One was a hefty upgrade to my main workstation, still powered by Linux. Along the way, I learned a few lessons:
- Processing with LLMs only works on a graphics card when everything can remain within its onboard memory. It is all too easy to revert to system memory and CPU usage, given the limited amount of memory you get on consumer graphics cards. That applies even with the latest and greatest from Nvidia, where the main use case is gaming. Things become prohibitively expensive when you go on from there.
- Even with water cooling, keeping a top of the range CPU cool and its fans running quietly remains a challenge, more so than when I last went for a major upgrade. It takes time for things to settle down.
- My Iiyama monitor now feels flaky with input from the latest technology. This is enough to make me look for a replacement; it is waking up from dormancy that is the real issue. While it was always slow, unplugging it from mains electricity and plugging it back in again is a hack that is needed all too often.
- KVM switches may need upgrading to work with the latest graphical input. The monitor may have been a culprit with the problems that I was getting, yet things were smoother once I replaced the unit that I had been using with another that is more modern.
- AMD Ryzen 9 chips now have onboard graphics, a boon when things are not proceeding too well with a dedicated graphics card. This was not available when the last major upgrade happened, though there were no issues then like the ones I faced this time around.
- Having LEDs on a motherboard to tell you what might be stopping system startup is invaluable. This helped in July 2021 and averted confusion this time around as well. While only four of them were on offer, knowing which of CPU, DRAM, GPU or system boot needs attention is a big help.
- Optical drives are not needed any longer. Booting off a USB drive was enough to get Linux Mint installed, once I got the image loaded on there properly. Rufus got used, and I needed to select the low-level writing option before things proceeded as I had hoped.
Just like 2021, the 2025 upgrade cycle needed a few weeks for everything to settle down. The previous cycle was more challenging, and this was not just because of an accompanying heatwave. The latest one was not so bedevilled.
Given the above, one might be tempted to go for a less arduous path, like my acquisition of an iMac last year for another place that I own. After all, a Mac Mini packs in quite a lot of power, and it is not the only miniature option. Now that I have one, I have moved image processing off the workstation and onto it. The images are stored on the Linux machine and edited on the Mac, which has plenty of memory and storage of its own. There is also an M4 chip, so processing power is not lacking either.
It could have been used for work affairs, yet I acquired a Geekom A8 for just that. Though seeking work as I write this, my being an incorporated freelancer means that having a dedicated machine that uses my main monitor has its advantages. Virtualisation can allow drift between personal and business matters; that is not so easy when a separate machine is involved. There is no shortage of power either, with an AMD Ryzen 9 8945HS and Radeon 780M graphics on board. Add in 32 GB of memory and 2 TB of storage, and all is commodious. It can be surprising what a small package can do.
The Iiyama's travails also popped up with these smaller machines, less so on the Geekom than with the Mac; the latter needs the HDMI cable to be removed and reinserted after a delay to sort things out. Maybe a new monitor is not such an off-the-wall idea after all.
The Evoluent VerticalMouse 4 and Linux: a worthwhile ergonomic investment
For anyone spending long hours at a computer, wrist pain is an occupational hazard that tends to arrive slowly and outstay its welcome. The Evoluent VerticalMouse 4 has built a strong following among people looking to address that discomfort, its design placing the hand upright in a natural handshake position rather than forcing the palm flat against a desk. For Linux users, however, the out-of-the-box experience leaves something to be desired, and a few terminal commands are needed before the mouse behaves as it should.
The Ergonomic Case
The fundamental problem with a conventional mouse is that it holds the hand palm-down, a position that twists the forearm bones and places sustained tension on the wrist and median nerve. Used daily over years, this is a well-established route to RSI and, in more serious cases, carpal tunnel syndrome. The Evoluent addresses this by rotating the mouse through 90 degrees, so the hand rests on its side, eliminating that forearm twist. Terence Eden, writing about his experience with several generations of the device, notes that since adopting the vertical mouse range his wrists have been free of pain, and references an evaluation by a team of physical therapists, ergonomists and medical doctors at the University of California, Berkeley health services clinic, who found that the mouse promoted a neutral wrist and forearm posture.
The VM4R (the right-handed wired version) is the same fundamental shape as the earlier generation 3, but with a wider base and a larger lip along the lower edge to prevent the little finger from dragging on the desk surface. The pointer speed button, which on the generation 3 was located on the underside of the mouse (an awkward place to reach during use), has been moved to the top on the VM4R, where it can be operated without lifting or repositioning the hand. Two thumb buttons are included, a feature that is particularly useful for anyone whose index finger is suffering the effects of years of left-clicking. The chrome finish looks striking, though it does attract fingerprints.
What Linux Gets Right Out of the Box
The basic mouse functions work without any configuration on Linux, confirmed on Ubuntu 9.10 and higher. The cursor moves, the scroll wheel scrolls, and the pointer speed button changes the tracking speed as expected. The one immediate gap is that the scroll wheel click does not perform a middle-click paste, as Linux users might expect, and the additional buttons do not behave intuitively until they are remapped. Since Evoluent provides no Linux driver or configuration utility (their software is Windows-only), all customisation goes through the terminal.
Running xinput list will show the mouse in the list of input devices, typically appearing as:
"Evoluent VerticalMouse 4" id=8 [XExtensionPointer]
Despite the mouse reporting 14 buttons, only 6 are physical buttons. The physical layout, as documented by Eden, is as follows: button 1 is the index finger (left click), button 2 is the ring finger (middle click), button 3 is the little finger (right click), buttons 4 and 5 are scroll up and scroll down respectively, button 8 is the upper thumb button, button 9 is the scroll wheel click and button 10 is the lower thumb button.
Remapping the Buttons: the Quick Method
The xinput tool allows button remapping in a single command. The following mapping, used by Eden himself, disables the index and little fingers as direct clickers, makes the ring finger the right click, the lower thumb button the primary left click, and restores middle-click paste to the scroll wheel click:
xinput set-button-map "Evoluent VerticalMouse 4" 0 3 0 4 5 6 7 0 2 1 2
The position of each number in the string corresponds to a button number, and the value at each position is the action that button will perform. Experiment with different values until the layout suits your workflow. This change takes effect immediately, but it is session-only: a reboot will reset the buttons to their defaults.
As MrEricSir notes in his guide to the mouse on Linux, the numeric device ID shown by xinput list will vary from one machine to the next and should not be relied upon in scripts. Using the full device name in quotes, as shown above, is the more reliable approach.
Making the Remapping Permanent: the Xorg Method
To make the button mapping survive a reboot, an Xorg configuration file is the recommended route. The first step is to identify the USB ID of the mouse by running lsusb and looking for the Evoluent entry in the output:
Bus 004 Device 004: ID 1a7c:0191 Evoluent VerticalMouse 4
The portion you need is 1a7c:0191, though as MrEricSir points out, this will likely differ on your own system. With that value in hand, create the configuration file:
sudo nano /usr/share/X11/xorg.conf.d/90-evoluent.conf
Paste in the following block, substituting your own USB ID and preferred button mapping string:
Section "InputClass"
    Identifier "Evoluent"
    MatchUSBID "1a7c:0191"
    Option "ButtonMapping" "0 3 0 4 5 6 7 0 2 1 2"
EndSection
Note that the button mapping string in this file does not begin with the device ID: that was only required for the xinput command. Save, close and reboot, and the mapping will be applied automatically on every subsequent login. The configuration file path above is confirmed to work on Ubuntu; on other distributions, the relevant directory may differ.
One known complication with the Xorg approach is that the USB ID can change if the mouse is plugged into a different USB port, as some systems enumerate devices differently by port. If the mapping stops working after a port change, running lsusb again will confirm whether the ID has changed, and the configuration file will need updating accordingly.
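Pulling the ID out of the lsusb output can be scripted, too, which helps when checking after a port change. The sample line below is the one from earlier, and the regular expression simply matches the vendor:product pair:

```shell
# A sample lsusb line (the ID will differ between units):
line='Bus 004 Device 004: ID 1a7c:0191 Evoluent VerticalMouse 4'

# Extract the vendor:product pair for use in the Xorg MatchUSBID option:
echo "$line" | grep -oE '[0-9a-f]{4}:[0-9a-f]{4}'
# prints 1a7c:0191
```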
Making the Remapping Permanent: the Script Method
A commenter on Eden's post, Lukas, offers an alternative that sidesteps the USB ID problem entirely by extracting the device's current session ID dynamically at startup using grep:
xinput --set-button-map $(xinput list | grep -i "evoluent verticalmouse" | grep -o "id=[0-9]*" | grep -o '[0-9]*') 1 3 3 4 5 6 7 9 2 8 11 12 13 14
This can be saved as a shell script, made executable with chmod +x, and added to the startup applications of your desktop environment. Because xinput needs the display server to be fully initialised before it can act, adding a sleep 5 line at the top of the script is advisable to prevent it from running too early on login.
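The id-extraction part of that pipeline can be sanity-checked on its own against a sample line of xinput list output. The line below is illustrative, since the id varies between sessions:

```shell
# A sample line in the style of `xinput list` output:
sample='Evoluent VerticalMouse 4                  id=8    [slave  pointer  (2)]'

# The same grep chain as in the startup one-liner, isolating the id:
echo "$sample" | grep -i "evoluent verticalmouse" | grep -o "id=[0-9]*" | grep -o '[0-9]*'
# prints 8
```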
A Note on the Scroll Wheel
One recurring theme in user reports is that the scroll wheel on the VM4R has a noticeably tactile, audible click with each scroll step. Eden remarks that he personally prefers a free-spinning wheel and found the VM4R's scroll noticeably stiffer than the generation 2 model. A separate user, James, describes a more persistent problem across three separate Evoluent mice: the scroll wheel intermittently loses stability, causing the page to jump unexpectedly rather than scrolling smoothly one step at a time. Eden's suggested remedy is to clean the wheel gently with isopropyl alcohol, as dust and debris can accumulate inside the mechanism over time.
Disabling the LED
The Evoluent logo on the mouse glows constantly during use. For those who find this distracting, particularly in a dark environment, there is a hardware method to disable it that does not require any software. Unplug the mouse, then hold down the negative (−) end of the pointer speed rocker button on the side of the mouse and plug it back in while keeping the button held. The logo light will remain off until the procedure is reversed by holding the positive (+) end of the same button during reconnection. This method is confirmed to work on later production units of the VM4R.
Using the Mouse on Wayland
The methods described above, xinput and Xorg configuration files, are specific to the X11 display server. Wayland is now the default on most major distributions, including Ubuntu and Fedora, and neither approach will work there. The situation under Wayland is, however, somewhat better than it first appears.
For basic use, several users have reported that the Evoluent VerticalMouse 4 works largely as expected under Wayland with no configuration at all, with button behaviour closer to what most users would want out of the box. If that is sufficient for your needs, nothing further is required.
For custom button remapping under Wayland, the modern approach documented by Olivier Mehani in a December 2024 guide uses the hwdb subsystem, which operates at the udev level below the display server entirely. This makes it display-server agnostic and avoids the need for any background daemons or services.
The first step is to identify the scan codes of the buttons you wish to remap. Install evtest and run it as root to observe raw input events from the mouse:
sudo evtest
Select the Evoluent device from the list and press each button you want to remap. The output will show a scan code value in the MSC_SCAN field for each button press. On the VM4R, the lower thumb button has scan code 90004 and the upper thumb button has scan code 90006.
Next, determine the key codes you wish to map those buttons to. The libinput tool can show key codes as you press them:
sudo libinput debug-events --device /dev/input/eventX --show-keycodes
Substitute the correct event device for your keyboard, which can be found by running libinput debug-events without a --device argument and looking for the keyboard in the list. With both the scan codes and the desired key codes identified, create a file in /etc/udev/hwdb.d/. The filename should end in .hwdb, for example 70-evoluent.hwdb. The following example remaps the two thumb buttons to the Shift key and the Super key respectively, as Mehani uses them:
evdev:name:Evoluent VerticalMouse 4:*
 ID_INPUT_KEY=1
 ID_INPUT_KEYBOARD=1
 KEYBOARD_KEY_90004=leftshift
 KEYBOARD_KEY_90006=leftmeta
The ID_INPUT_KEY and ID_INPUT_KEYBOARD lines mark the mouse as a device capable of generating key events, which is required for the remapping to take effect. Once the file is saved, rebuild and reload the hardware database:
sudo systemd-hwdb update
sudo udevadm trigger
Restart your Wayland session and the new mappings will be active. The key names used in the file are lowercase versions of the Linux input event constants with the KEY_ prefix removed, so KEY_LEFTSHIFT becomes leftshift and KEY_LEFTMETA becomes leftmeta. Adjust the scan codes and key names to suit your own preferred layout.
Worth the Effort
The Evoluent VerticalMouse 4 is a well-considered device for anyone whose wrists are beginning to protest at years of conventional mouse use. For Linux users, the absence of any official configuration software is a genuine gap, but the community has documented reliable workarounds thoroughly. On X11, the xinput command and Xorg configuration files give you a permanent, flexible setup with minimal effort. On Wayland, Mehani's December 2024 guide demonstrates that the hwdb approach provides an equally robust solution that works at the kernel input level, independent of any display server. As Eden observes in his review, the lack of a good mouse configuration interface on Linux is a genuine oversight, but it is one that a little patience can overcome.
Displaying superscripted text in Hugo website content
In a previous post, there was a discussion about displaying ordinal publishing dates with superscripted suffixes in Hugo and WordPress. Here, I go further with inserting superscripted text into Markdown content. Because of the default setup for the Goldmark Markdown renderer, it is not as simple as adding <sup>...</sup> constructs to your Markdown source file. That will generate a warning like this:
WARN Raw HTML omitted while rendering "[initial location of Markdown file]"; see https://gohugo.io/getting-started/configuration-markup/#rendererunsafe
You can suppress this warning by adding the following to your site configuration:
ignoreLogs = ['warning-goldmark-raw-html']
Because JavaScript can be added using HTML tags, there is an added security hazard that could be overlooked if you switch off the warning as suggested. Also, Goldmark does not interpret Markdown superscript notation without an extension, and incorporating one needs some familiarity with Go development.
That leaves using a Shortcode. These go into layouts/shortcodes under your theme area; the file containing mine got called super.html. The content is the following one-liner:
<sup>{{ .Get 0 | markdownify }}</sup>
This then is what is added to the Markdown content:
{{< super "th" >}}
What happens here is that the shortcode picks up the content within the quotes and encapsulates it with the HTML superscript tags to give the required result. This approach can be extended to subscripts and other similar ways of rendering text, too. All that is required is a use case, and the rest can be put in place.
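For instance, a matching subscript shortcode, saved hypothetically as layouts/shortcodes/sub.html, would follow the same pattern:

```
<sub>{{ .Get 0 | markdownify }}</sub>
```

It would then be invoked in Markdown content as {{< sub "2" >}}, for example within a chemical formula.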