Technology Tales

Adventures & experiences in contemporary technology

Carrying out a hard reset of a home KVM switch

20th March 2017

During a recent upgrade from Linux Mint 18 to Linux Mint 18.1 on a secondary machine, I ran into bother with my Startech KVM (keyboard, video, mouse and audio sharing) switch. The PC failed to recognise the attachment of my keyboard and mouse so an internet search began.

Nothing promising came from it apart from resetting the KVM switch; in other words, the solution was to turn it off and back on again. That was something that I did try, without success. What I had overlooked was that there were USB connections to the PCs that fed the device with a certain amount of power, and that was enough to keep it on.

Unplugging those USB cables as well as the power cable was needed to completely switch off the device. That provided the reset that I needed and all was well again. Otherwise, I would have been baffled enough to resort to buying a replacement KVM switch so the extra information avoided a purchase that could have cost in the region of £100. In other words, a little research had saved me money.

A new look

11th October 2021

Things have been changing on here. Much of that has been behind the scenes with a move to a new VPS for extra speed and all the upheaval that brings. It also gained me a better system for less money than the old upgrade path was costing me, and everything feels more responsive as well. Extra work has gone into securing the website too, and I have learned a lot as that has progressed. New lessons were added to older, and sometimes forgotten, ones.

The more obvious change for those who have been here before is that the visual appearance has been refreshed. A new theme has been applied with a multitude of tweaks to make it feel unique and to iron out any rough edges that there may be. This remains a WordPress-based website and the new theme is a variant of the Appointee subtheme of the Appointment theme. WordPress only supports child theming, not grandchild theming, so I had to make a copy of Appointee of my own so that I could modify things as I see fit.

To my eyes, things do look cleaner, crisper and brighter, so I hope that it feels the same to you. Like so many designs these days, the basis is the Bootstrap framework and that is no bad thing in my mind, though the standardisation may be too much for some tastes. What has become challenging is that it is getting harder to find new spins on more traditional layouts, with everything going for a more magazine-like appearance and summaries being shown on the front page instead of complete articles. That probably reflects how things are going for websites these days, so it may be that the next refresh could be more home-grown, and that is a while away yet.

As the website heads towards its sixteenth year, there is bound to be continuing change. In some ways, I prefer that some things remain unchanged, so I use the classic editor instead of Gutenberg because that works best for me. Block-based editing is not for me since I prefer to tinker with code anyway. Still, not all of its influences can be avoided and I have needed to figure out the new widgets interface. It did not feel that intuitive, but I suppose that I will grow accustomed to it.

My interest in technology continues even if it saddens me at times and some things do not impress me; the Windows 11 taskbar is one of those, so I will not be in any hurry to move away from Windows 10. Still, the pandemic has offered its own learning, with virtual conferencing allowing one to lurk and learn new things. For me, this has included R, Python, Julia and DevOps among other things. That proved worthwhile during a time with many restrictions. All that could yield more content yet, and some already is on the way.

As ever, it is my own direct working with technology that yields some real niche ideas that others have not covered. With so many technology blogs out there, such ideas may be getting less and less easy to find, but everyone has their own journey, so I hope to encounter more of them. There remain times when doing precedes telling and that is how it is on here. It is not all about appearances, since content matters as much as it ever did.

Firefox spell checking: getting rid of a misspelling from your dictionary

22nd October 2007

Mozilla Firefox includes a spell checker and, like any such function, it offers a chance to add words to a custom dictionary. Of course, you can add misspellings too, and these definitely need to be removed. With Word, it’s a matter of looking for custom.dic and deleting the nefarious item. With Firefox, it’s similar, at least on Windows anyway. The file that you need to edit is persdict.dat and you’ll find it in C:\Documents and Settings\[user name]\Application Data\Mozilla\Firefox\Profiles\[random name].default. My search for the relevant information took me over to Lifehacker.

Update 2012-12-11: For users of Linux, the location of the above file is as follows: /home/[user id]/.mozilla/firefox/[random name].default. Once you find persdict.dat in there, the required editing can be performed.
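
For anyone who prefers the command line, the same job can be done on Linux with something like the sketch below. The misspelt word and the wildcarded profile directory are placeholders, so adjust both to suit your own setup and close Firefox first so that it does not rewrite the file behind your back.

# Back up the custom dictionary, then delete the offending entry from it.
# "recieve" is only an example word and the profile folder name varies,
# hence the wildcard; keep Firefox closed while editing.
cp ~/.mozilla/firefox/*.default/persdict.dat ~/persdict.dat.bak
sed -i '/^recieve$/d' ~/.mozilla/firefox/*.default/persdict.dat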

Looking at a few Operating Systems

19th February 2011

The last few weeks have seen me poking around with a few different operating systems to see how they perform. None of these excursions was particularly in-depth; they were more brushes with alternatives to what I currently use for much of the time. While I am not too sure what exactly has kicked off all of this curiosity, all of the OS’s that I have examined have been of the UNIX/Linux variety. With the inclusion of Unity in the forthcoming Ubuntu “Natty Narwhal” 11.04, I am mindful of the need to keep an eye on alternative options should there ever be a need to jump ship. However, a recent brush with an alpha version has reassured me a little. Then there are interesting OS releases too, and I recently forgot the Ubuntu password (a silly thing to do, I know) for my Toshiba laptop, so I suppose that a few things are coming together.

It was that latter development that got me looking in amazement at the impressive minimalism of CrunchBang Linux before settling on Lubuntu to see how it did; these were Live CD runs, so I tried before I committed to installing. It helped that the latter is based on Ubuntu, as its name suggests, so I wasted little time in finding my way around the LXDE desktop. By default, everything supplied with the distro is lightweight, with Chromium coming in place of Firefox. There’s no sign of OpenOffice.org either, with offerings like AbiWord coming in its stead. For the sake of familiarity, I started to add the weight of things back without reducing the speed, it seems; the speedy start-up wasn’t afflicted anyway. Being an Ubuntu clone meant that it didn’t take long to add on Firefox using the apt-get command. LibreOffice was downloaded for installation using the dpkg command and it seems much more fleet-footed than its OpenOffice.org counterpart. As if these nefarious actions weren’t enough, I started to poke in the settings to up the number of virtual desktops too. All in all, nothing stopped me going against what might be termed the intent of the thing. In spite of what Linux User & Developer has had to say, I think the presentation of the LXDE desktop isn’t unpleasant either. In fact, I reckon that I quite like it and the next thing to do is to restore the entry for Windows 7 on the GRUB menu. Well, there’s always something that needs doing…
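
In case it is of use to anyone, the package commands mentioned above amount to something like the following sketch. The LibreOffice file names are placeholders, because the actual names depend on the release downloaded from its website.

# Add Firefox from the Ubuntu repositories that Lubuntu shares:
sudo apt-get update
sudo apt-get install firefox

# Install LibreOffice from downloaded .deb packages; the wildcard stands in
# for whatever file names the chosen release actually provides.
sudo dpkg -i libreoffice*.deb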

While I may have learned about it after the event, the release of Debian “Squeeze” 6.0 was of interest to me too. Well, I have used it a fair bit in the last few years and retain a soft spot for it. The new release comes on two kernels: GNU/Linux and FreeBSD. Regarding the latter, I did try having a look, but it locked up my main home PC when I tried booting it up in a VirtualBox virtual machine. Given that it’s a technical preview anyway, I think it better to leave it to mature for a while, no matter how fascinating the prospect may be. Or is it VirtualBox 4.x that hasn’t been around long enough? Debian’s latest Linux incarnation showed no such inclinations, though I found that the CD ISO image that I’d downloaded didn’t give such a complete system when I fired it up after doing the installation. Being someone who knows his way around Linux anyway, it was no problem to add the missing pieces using apt-get, though that would stop it being an option for new users unless the DVD installation yields more complete results. Other than that, it worked well and I lost no time getting to grips with the OS; it has gained a much fresher feel than version 5.x (“Lenny”). In summary, I look forward to continuing my investigations of the new Debian.

To round up my explorations of different UNIX/Linux operating systems, I have updated my test installations of Ubuntu 11.04. Initial looks at the next Ubuntu release weren’t so encouraging, but things are coming along by all accounts. For one thing, Unity can be switched off in favour of the more familiar GNOME desktop that we’ve had for the last few years. The messages that popped up telling you that there’s no 3D graphics support on your machine have been replaced by graceful degradation to the GNOME desktop, and that’s no bad thing either. In case it hasn’t been so obvious, I am one of those who needs convincing by the likes of Unity and GNOME Shell, so I’ll sit on the fence for a while. After all, there always are alternatives like LXDE if I want to decamp to something else entirely. One of the nice things about Linux is the amount of choice that we all have; it might be tricky to choose sometimes, but it always is good to be able to find a niche somewhere else when someone makes a decision that doesn’t suit you.

Creating waterfall plots in SAS using PROC GCHART

17th March 2012

Recently, I needed to create a waterfall plot but couldn’t use PROC SGPLOT since it was incompatible with publishing macros that use PROC GREPLAY on the platform that I was using; SGPLOT doesn’t generate plots in SAS catalogs but directly creates graphics files instead. Therefore, I decided that PROC GCHART needed to be given a go and it delivered what was needed.

The first step is to get the data into the required sort order:

proc sort data=temp;
by descending result;
run;

Then, it is time to add an ID variable for use in the plot’s X-axis (or midpoint axis in PROC GCHART) using an implied value retention to ensure that every record in the dataset had a unique identifier:

data temp;
set temp;
id+1;
run;

After that, axes have to be set up as needed. For instance, the midpoint (X) axis, defined by the axis1 statement below, needs to be just a line with no labels or tick marks, while the Y-axis, defined by the axis2 statement, is fully set up with these: the label is turned from vertical to horizontal, with the ANGLE option controlling the overall angle of the word(s) and the ROTATE option dealing with the letters, and a range is declared using the ORDER option.

axis1 label=none major=none minor=none value=none;
axis2 label=(rotate=0 angle=90 "Result") order=(-50 to 80 by 10);

With the axis statements declared, the GCHART procedure can be defined. Of this, the VBAR statement is the engine of the plot creation with the ID variable used for the midpoint axis and the result variable used as the summary variable for the Y-axis. The DISCRETE keyword is needed to produce a bar for every value of the ID variable or GCHART will bundle them by default. Next, references for the above axis statements (MAXIS option for midpoint axis and AXIS option for Y-axis) are added and the plot definition is complete. One thing that has to be remembered is that GCHART uses run group processing so a QUIT statement is needed at the end to close it at execution time. This feature has its uses and appears in other procedures too though SAS procedures generally are concluded by a RUN statement.

proc gchart data=temp;
vbar id / sumvar=result discrete axis=axis2 maxis=axis1;
run;
quit;

A desktop Markdown editing environment

8th November 2022

Earlier this year, I changed over two websites from dynamic versions using content management systems to static ones by using Hugo to build them from Markdown files. That meant that I needed to look at the editing of Markdown, even if it is a fairly simple file format. For one thing, Grammarly can be incorporated into WordPress, so I did not want to lose something like that.

The latter point meant that I was steered away from plain text editors. Otherwise, there are online ones like StackEdit and Dillinger, but the Firefox Grammarly plugin only appears to work on the first of these, and even then only partially in my experience. Dillinger does offer connections to online file storage providers like Google, Dropbox and OneDrive, but I wanted to store files on my desktop for upload to a web server. It also works with GitHub, but I prefer to use another web hosting provider.

There are various specialised Markdown editors for desktop usage like Typora, ReText, Formiko or Ghostwriter, but I chose none of these. My actual choice may surprise many: it was Visual Studio Code. The availability of a Grammarly plug-in was what swayed it for me, even if it did need to be switched on for Markdown files. In some ways, it does not work as smoothly as elsewhere because it gets fooled by links and other code-like pieces of text. Also, having the added ability to add words to a custom dictionary would be ideal. Some rule overriding is available, but I am not sure that everything is covered, even if the list of options is lengthy. Some time is needed to inspect all of them before I proceed any further. Thus far, things are working well enough for me.
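
Should anyone want to do likewise, the extension can be added from the command line as well as from within the editor. The identifier below is an assumption based on the unofficial listing that I used, so it is worth checking the marketplace entry before relying on it.

# Install the unofficial Grammarly extension for Visual Studio Code; the
# publisher/extension identifier is assumed and may differ, so verify it
# against the marketplace listing first.
code --install-extension znck.grammarly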

Why all the commas?

4th December 2022

In recent times, I have been making use of Grammarly for proofreading what I write for online consumption. That has applied as much to what I write in Markdown form as it has for what is authored using content management systems like WordPress and Textpattern.

The free version does nag you to upgrade to a paid subscription, but that is not my main irritation. That would be its inflexibility, because you cannot turn off rules that you think intrusive, at least in the free version. This comment is particularly applicable to the unofficial plugin that you can install in Visual Studio Code. To me, the add-on for Firefox feels less scrupulous.

There are other options though, and one that I have encountered is LanguageTool. This also offers a Firefox add-on, but there are others not only for other browsers but also Microsoft Word. Recent versions of LibreOffice Writer can connect to a LanguageTool server using in-built functionality, too. There are also dedicated editors for iOS, macOS or Windows.

The one operating system that does not get specific add-on support is Linux, but there is another option there. That uses an embedded HTTP server, which I installed using Homebrew and set to start automatically using cron. This really helps when using the LanguageTool Linter extension in Visual Studio Code because it can connect to that instead of the public API, which bans your IP address if you overuse it. The extension is also configurable, with the ability to add exceptions (both grammatical and spelling), though I appear to have enabled smart formatting only to have it mess up quotes in a Markdown file that then caused Hugo rendering to fail.
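
For completeness, the arrangement amounts to something like the sketch below. The launcher name and the port number are assumptions rather than certainties, since what the Homebrew formula installs can change between versions, so a glance at the output of brew info languagetool is advisable first.

# Install LanguageTool using Homebrew on Linux.
brew install languagetool

# Start the embedded HTTP server at boot with a cron entry; the
# languagetool-server launcher name and the port are assumptions, so check
# what the formula actually provides on your system before using this.
(crontab -l 2>/dev/null; echo '@reboot languagetool-server --port 8081') | crontab -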

Like Grammarly, there is an online editor that offers more if you choose an annual subscription. That is cheaper than the one from Grammarly, so that caused me to go for that instead to get rephrasing suggestions both in the online editor and through a browser add-on. It is better not to get nagged all the time…

The title may surprise you, but I have been using co-ordinating conjunctions without commas for as long as I can remember. Both Grammarly and LanguageTool pick up on these, so I had to do some investigation to find a gap in my education, especially since LanguageTool is so good at finding them. What I also found is how repetitive my writing style can be, which also means that rephrasing has been needed. That, after all, is the point of a proofreading tool, and it can rankle if you have fixed opinions about grammar or enjoy creative writing.

Putting some off-copyright texts from other authors through these tools triggers all kinds of messages, but you just have to ignore these. Turning off checks needs care, even if turning them on again is easy to do. There is, however, the danger that artificial intelligence tools could make writing too uniform, since there is only so much that these technologies can do. They should make you look at your text more intently, though, which is never a bad thing because computers still struggle with meaning.

Removing duplicate characters from strings using BASH scripting

30th March 2023

Recently, I wanted to extract some text from Linux command output by word number, only for multiple spaces to make things less predictable. The solution was to remove the duplicate spaces. This can be done using sed, but you add the complexity of regular expressions if you opt for that solution. Instead, the tr command offers a neater approach. For removing duplicate spaces, the command takes the following form:

echo "test   test" | tr -s " "

Since I was piping some text to the command, that is what I have above. The tr command is intended to replace or delete characters and the -s switch is a shorthand for --squeeze-repeats. The actual character to be deduplicated is passed in quotes at the end; here, it is a space but it could be anything that is duplicated. The resulting text in this example becomes:

test test

After the processing, there is now only one space separating the two words, which is the solution that I sought. It certainly cut out any variability that I was encountering in my usage.
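
To tie this back to the original motivation, squeezing the spaces first makes field positions predictable for a tool like cut. The line below is only an illustrative sketch, with a placeholder standing in for whatever command actually produced the text.

# Squeeze repeated spaces so that field numbers become predictable, then
# pick out the second word with cut; some_command is only a placeholder.
some_command | tr -s " " | cut -d " " -f 2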

Lightweight Linux

2nd November 2023

While computer hardware gets ever more capable, it is easy for software developers to use all that power and leave those on older machines behind. That probably is the driver for what you find below, and there is the added option of portability too, if you wish to boot up a public computer with your own Linux operating system.

4MLinux

The four M’s at the start of the name come from the following: maintenance, multimedia, mini-server and mystery. The last of these refers to gaming rather than to anything unique about this effort. It could just as well have been the adjective miniature, which has done nothing to stop there being three forks: Antivirus Live CD, BakAndImgCD and The Smallest Server Suite.

Alpine

Three words come up here: small, simple and secure. Expanding on these gets to the following: a security-oriented, lightweight Linux distribution based on musl libc and busybox. The intended user base has advanced knowledge and can use this in their own projects without it getting in the way.

antiX

This is an offshoot of the now discontinued Mepis project, aimed at older machines. Once, I spotted a claim that it only needed 256 MB of RAM, but I cannot be sure of that now. It is another distro based on Debian, so things should be solid, and it works off USB drives too.

Armbian

The strapline for this distro gives the game away: Linux for ARM development boards. These do not have the processing heft for heavyweight distros, though the Raspberry Pi has been advancing in this regard. You do need to know your board architecture before you download, and such is the variety that it may not be supported. The use cases are desktop, server or IoT, so you should have many bases covered.

DSL

The acronym means Damn Small Linux and it essentially is a derivative of Debian shrunk to fit within 50 MB. That it’s intended for older computers or those without much in the way of processing power goes without saying.

CRUX

A lightweight 64-bit Linux operating system is what this project intends to give to the world.

Fatdog64

Essentially, this is Puppy Linux with more default applications included. The installation footprint remains tidy at around 600 MB, so it remains on this list despite the name suggesting it belongs elsewhere.

LegacyOS

As if antiX could not do enough for older computers, along comes this project to refine things even further. Since I have some antiquated hardware around, this could be worth a look for those older laptops that I find hard to relinquish.

Porteus

Portability is the unique selling point for Porteus because it is intended to be a lightweight installation that you carry around with you on a CD, DVD or USB Flash Drive. The claim is that it boots in around 15 seconds, which is impressive if that’s the reality.

Peropesis

The description for this is small-scale, minimalist and command-line-based. There appears to be no desktop environment, which necessarily keeps things lightweight. It also live boots, and that may be useful for system repair. All in all, these are early days for this effort, which has been developed from scratch, and it remains to be seen what its unique selling proposition might be.

Puppy

Though there are heavyweight Linux distros that need reasonably up-to-date hardware, this is one of the better-known options for older machines. There is not a single distro though and you can mix your own as well using Woof-CE. The better-known bases include Ubuntu, Debian and Slackware, yet there are others as well. The disk footprint is 300 MB or less, so this certainly qualifies it as lightweight these days.

SliTaz

Not content with offering a distro for desktop use on slower systems, they also include servers as a usage option. Needing only 48 MB of RAM and 35 MB of disk space looks very impressive these days and means that a 15-year-old PC need not be consigned to recycling just yet.

Tiny Core

Linux developers often want to have a hand in creating bare-bones distros that will run on practically anything, and here’s another example whose size is measured in megabytes and not gigabytes. The numbers quoted on the website do not even exceed 10 MB, so many appliances could make use of this, along with the usual desktop and server PCs.

wattOS

This sounds like a stripped-out version of Debian, and it once used Ubuntu as its basis. Usage of memory and other computing resources is a watchword with this one, especially with its Microwatt edition, and there are LXDE and MATE versions too.

Photoshop books

28th February 2007

Having exhausted the trial time on Photoshop Elements 5, I am now having a look at its big brother, Photoshop CS2. That has got me thinking about Photoshop books so that I become more aware of the possibilities and how to use them. Having a Safari subscription as I do, that naturally became my first port of call and I seemed to find two that answered my needs. Both are by Scott Kelby and they now lie on my Safari bookshelf: The Photoshop Elements 5 Book for Digital Photographers and The Photoshop CS2 Book for Digital Photographers. Even so, I am tempted to get a dead tree version of one of them and that presents a chicken and egg dilemma: the books could help choose which software to buy and the software dictates which of them will be the more useful. That said, I suspect price and features will swing it the way of Elements 5; paying over £400 for software whose capabilities I may never need does not sound financially sensible.

Update March 5th, 2007: I have now got my mitts on the dead tree edition of Scott Kelby’s The Photoshop Elements 5 Book for Digital Photographers as well as Brad Hinkel’s Focal Easy Guide to Photoshop CS2. Now for some reading…

  • All the views that you find expressed on here in postings and articles are mine alone and not those of any organisation with which I have any association, through work or otherwise. As regards editorial policy, whatever appears here is entirely of my own choice and not that of any other person or organisation.

  • Please note that everything you find here is copyrighted material. The content may be available to read without charge and without advertising but it is not to be reproduced without attribution. As it happens, a number of the images are sourced from stock libraries like iStockPhoto so they certainly are not for abstraction.

  • With regards to any comments left on the site, I expect them to be civil in tone of voice and reserve the right to reject any that are either inappropriate or irrelevant. Comment review is subject to automated processing as well as manual inspection but whatever is said is the sole responsibility of the individual contributor.