Technology Tales

Adventures & experiences in contemporary technology

Moves to Hugo

30th November 2022

What amazes me is how things can become more complicated over time. Once you knew HTML, CSS and JavaScript, building a website was not onerous, as long as web browsers played ball. Since then, things have got easier to use but more complex at the same time. One example is WordPress: in the early days, themes were much simpler than they are now. The web also has got more insecure over time, and that adds to complexity as well. It sometimes feels as if there is a choice to make between ease of use and simplicity.

It is against that background that I reassessed the technology that I was using on my public transport and Irish history websites. The former used WordPress, while the latter used Drupal. The irony was that the simpler website was using the more complex platform, so the move to something simpler was probably overdue. I surveyed alternatives to WordPress for the first of the pair, but none had quite the flexibility, pervasiveness and ease of use that WordPress offers.

There is another approach that has been gaining notice recently. One part of this is the use of Markdown for web publishing. This is a simple and distraction-free plain text format that can be transformed into something more readable. It sees usage in blogs hosted on GitHub, but also facilitates the generation of static websites. The clutter is absent for those who have no need of the Gutenberg Editor on WordPress.

With the content written in Markdown, it can be fed to a static website generator like Hugo. Using defined templates and fixed assets like CSS, together with images and other static files, it can slot the content into HTML files very speedily; being written in the Go programming language helps with that. Once you get acclimatised, any folder structure can be used, so you get full flexibility in how you build out your website. Sitemaps and RSS feeds can be built at the same time, both using the same input as the HTML files.
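
As a sketch of how a project hangs together, these are Hugo's default folder names with my own annotations, and the two commands cover building and previewing:

content/      # Markdown files for articles and pages
layouts/      # HTML templates into which the content gets slotted
static/       # CSS, images and other files copied across unchanged
config.toml   # the main site configuration

hugo          # builds the whole site into the public/ folder
hugo server   # serves a live preview while you work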

In a nutshell, it automates what once needed manual effort with a code editor or a visual web page editor. The use of HTML snippets and layouts means that there is no necessity for hand-coding content, like there was at the start of the web. It also helps that Bootstrap can be built in using Node, so that gives a basis for any styling. Then, SCSS can take care of things, giving even more automation.
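
As a rough sketch of that part, with file locations that are my own choices rather than anything mandated, Bootstrap gets pulled in with npm:

npm install bootstrap

Then, an SCSS file such as assets/scss/main.scss can import it before any overrides of your own:

@import "node_modules/bootstrap/scss/bootstrap";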

Given that there is no database involved in any of this, the required information has to be stored somewhere, and neither the Markdown content nor the layout files contain all that is needed. The main site configuration is defined in a single TOML file, and you can have one of these for every publishing destination; I have development and production servers, which makes this a very handy feature. Otherwise, every Markdown file needs a YAML header where titles, template references, publishing status and other similar information get defined. The layouts then are linked to their components, and control logic and other advanced functionality can be added too.
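
To give a flavour, a minimal configuration and article header might look like the following; the values are illustrative rather than copied from my sites. First, config.toml:

baseURL = "https://www.example.com/"
title = "Example Site"
languageCode = "en-gb"

Then, the YAML header atop a Markdown file:

---
title: "A Sample Article"
description: "A short summary for metadata"
draft: false
---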

Because static files are being created, site searching, commenting and contact pages cannot work as they would on a dynamic web platform. Often, external services are plugged in using JavaScript. One that I use for contact forms is Getform.io. Then, Zapier has had its uses in taking the RSS feed and tweeting site updates on Twitter when new content gets added. Though I made different choices, Disqus can be used for comments and Algolia for site searching. Generally, though, you can find yourself needing to pay, particularly if you need to remove advertising or gain advanced features.
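
For the contact form, the plumbing is little more than an HTML form posting to the external endpoint; this sketch follows Getform's documented pattern, with the endpoint identifier being a placeholder:

<form action="https://getform.io/f/your-form-id" method="POST">
  <input type="text" name="name" placeholder="Your name">
  <input type="email" name="email" placeholder="Your email">
  <textarea name="message"></textarea>
  <button type="submit">Send</button>
</form>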

Some comments service providers offer open source self-hosted options, but I found these difficult to set up and ended up not offering commenting at all. That was after I tried out Cactus Comments only to find that it was not discriminating between pages, so it showed the same comments everywhere. There are numerous alternatives like Remark42, Hyvor Talk, Commento, FastComments, Utterances, Isso, Mouthful, Muut and HyperComments but trying them all out was too time-consuming for what commenting was worth to me. It also explains why some static websites even send readers to Twitter if they have something to say, though I have not followed this way of working.

For searching, I added a self-hosted JavaScript/JSON component to the transport website. However, it adds to the size of what a browser needs to download. That is not a major issue for desktop browsers, but it has a sizeable effect on mobile ones. Testing with PageSpeed and Lighthouse highlighted this, though I left things as they were. The solution works well in any case.

One thing that I have yet to work out is how to edit or add content while away from home. Editing files over an SSH connection is as much a possibility as setting up Hugo publishing on a laptop. After that, there is the question of using a tablet or phone, since content management systems make everything web-based. These are points that I have yet to explore.

As is natural with a code-based solution, there is a learning curve with Hugo. Reading a book provided some orientation, and looking on the web resolved many conundrums. There is good documentation on the project website, while forum discussions turn up on many a web search. Following any research, there was next to nothing that could not be done in some way.

Migration of content takes some forethought and took quite a bit of time, though there was an opportunity to carry out some housekeeping as well. The history website was small, so copying and pasting sufficed. For the transport website, I used Python to convert what was in the database into Markdown files before refining the result. That provided some automation, but left a lot of work to be done afterwards.

The results were satisfactory, and I like the associated simplicity and efficiency. That Hugo works so fast means that it can handle large websites, so it is scalable. The new Markdown method for content production has not been problematic so far, apart from the need to make it more portable, and it helps that I found a setup that works for me. This also avoids any potential dealbreakers that continued development of publishing platforms like WordPress or Drupal could bring. For the former, I hope to remain with the Classic Editor indefinitely, but now have another option in case things go too far.

Moving a website from shared hosting to a virtual private server

24th November 2018

This year has seen some optimisation applied to my web presences, guided by the results of GTmetrix scans. It was then that I realised how slow things were, so server loads were reduced. Anything that slowed response times, such as WordPress plugins, got removed. Usage of Matomo also was curtailed in favour of Google Analytics, while HTML, CSS and JS minification followed. What had yet to happen was a search for a faster server. Now, another website has been moved onto a virtual private server (VPS) to see how that would go.

Speed was not the only consideration, since security was a factor too. After all, a VPS is more locked away from other users than a folder on a shared server. There also is the added sense of control, so Let’s Encrypt SSL certificates can be added using the Electronic Frontier Foundation’s Certbot. That avoids the expense of using an SSL certificate provided through my shared hosting provider, and a successful transition for my travel website may mean that this one undergoes the same move.

For the VPS, I chose Ubuntu 18.04 as its operating system, and it came with the LAMP stack already in place. Having offline development websites, the mix of Apache, MySQL and PHP is more familiar to me than anything using Nginx or Python. It also means that .htaccess files become more useful than they were on my previous Nginx-based platform. Having full access to the operating system by means of SSH helps too and should mean that I have fewer calls on technical support, since I can do more for myself. Any extra tinkering should not affect others either, since this type of setup is well known to me and having an offline counterpart means that anything riskier is tried there beforehand.

Naturally, there were niggles to overcome with the move. The first to fix was making the MySQL instance accept calls from outside the server so that I could migrate data there from elsewhere; I even got my shared hosting setup to start using the new database to see what performance boost it might give. To make all this happen, I first found the location of the relevant my.cnf configuration file using the following command:

find / -name my.cnf

Once I had the right file, I commented out the following line that it contained and restarted the database service afterwards using another command to stop the appearance of any error 111 messages:

bind-address = 127.0.0.1
service mysql restart

After that, things worked as required and I moved onto another matter: uploading the requisite files. That meant installing an FTP server so I chose proftpd since I knew that well from previous tinkering. Once that was in place, file transfer commenced.

When that was done, I could do some testing to see if I had an active web server that loaded the website. Along the way, I also enabled some Apache modules like mod_rewrite using the a2enmod command, restarting Apache each time I enabled another module.
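
For the record, enabling a module and restarting Apache went along these lines, with mod_rewrite as the example:

sudo a2enmod rewrite
sudo service apache2 restart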

Then, I discovered that Textpattern needed php7.2-xml installed, so the following command was executed to do this:

apt install php7.2-xml

Then, the following line was uncommented in the correct php.ini configuration file, which I found using the same method as described for my.cnf; that was followed by yet another Apache restart:

extension=php_xmlrpc.dll

Addressing the above issues yielded enough success for me to change the IP address in my Cloudflare dashboard so it pointed at the VPS and not the shared server. The changeover happened seamlessly without having to await DNS updates as once would have been the case. It had the added advantage of making both WordPress and Textpattern work fully.

With everything working to my satisfaction, I then followed the instructions on Certbot to set up my new Let’s Encrypt SSL certificate. Aside from a tweak to a configuration file and another Apache restart, the process was more automated than I had expected, so I was ready to embark on some fine-tuning to embed the new security arrangements. That meant updating .htaccess files; Textpattern has its own, so the following addition was needed there:

RewriteCond %{HTTPS} !=on
RewriteRule ^ https://%{HTTP_HOST}%{REQUEST_URI} [R=301,L]

This complemented what was already in the main .htaccess file, while WordPress allows you to include http(s) in the address it uses, so that was another task completed. The general .htaccess only needed the following lines to be added:

RewriteCond %{SERVER_PORT} 80
RewriteRule ^(.*)$ https://www.assortedexplorations.com/$1 [R,L]

What all these achieve is redirecting insecure connections to secure ones for every visitor to the website. After that, internal hyperlinks without https needed updating, along with any forms, so that a padlock sign could be shown for all pages.

With the main work completed, it was time to sort out a lingering niggle regarding the appearance of an FTP login page every time a WordPress installation or update was requested. The main solution was to make the web server account the owner of the files and directories, but the following line was added to wp-config.php as part of the fix even if it probably is not necessary:

define('FS_METHOD', 'direct');

There also was the non-operation of WP Cron, and that was addressed using WP-CLI and a script from Bjorn Johansen. To make doubly sure of its effectiveness, the following was added to wp-config.php to turn off the usual WP Cron behaviour:

define('DISABLE_WP_CRON', true);
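
The replacement scheduling can be as simple as a crontab entry that gets WP-CLI to run any due events every so often; the path and interval here are illustrative rather than my actual values:

*/15 * * * * cd /var/www/example.com && wp cron event run --due-now >/dev/null 2>&1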

Intriguingly, WP-CLI offers a long list of possible commands that are worth investigating. A few have been examined but more await attention.

Before those, I still need to get my new VPS to send emails. So far, sendmail has been installed, the hostname changed from localhost and the server restarted. More investigations are needed, but what I have now is faster than what was there before, so the effort has been rewarded already.

Improving a website contact form

23rd April 2018

On another website, I have had a contact form, but it was missing some functionality. For instance, it stored the input in files on a web server instead of emailing it. That was fixed more easily than expected using the PHP mail function. Even so, it remains useful to survey the corresponding documentation on the w3schools website.
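
A minimal sketch of the sort of thing involved is below; the addresses are placeholders, and real code should validate the input before sending anything:

<?php
// Collect the form input; validation is omitted for brevity
$name = $_POST['name'];
$message = $_POST['message'];
$headers = 'From: webmaster@example.com';
// Hand everything over to PHP's mail function for delivery
mail('owner@example.com', "Website message from $name", $message, $headers);
?>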

The other changes affected the way the form looked to a visitor. There was a reset button, and that was removed on finding that such things are out of favour these days. Thinking again, there hardly was any need for it anyway.

Newer additions that came with HTML5 had their place too. Including user hints using the placeholder attribute should add some user friendliness, although I have avoided experimenting with browser-powered input validation for now. The required attribute has its uses for telling a visitor that they have forgotten something, but I need to check how that is handled in CSS more thoroughly before I go with it, since there are new :required, :optional, :valid and :invalid pseudo-classes that can be used to help.
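
By way of illustration, the attributes in question combine along these lines:

<input type="email" name="email" placeholder="you@example.com" required>

The pseudo-classes then allow CSS such as this to signal the state of a field:

input:required { border: 1px solid #999; }
input:valid    { border: 1px solid green; }
input:invalid  { border: 1px solid red; }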

It seems that there is much more to learn about setting up forms since I last checked. This is perhaps a hint that a few books need reading as part of catching up with how things are done these days. There also is something new to learn.

Changing to web fonts

12th February 2012

While you can add Windows fonts to Linux installations, I have found that their display can be flaky, to say the least. Linux Mint and Ubuntu display them as sharply as I’d like, but I have struggled to get the same sort of results from Arch Linux, while I am not so sure about Fedora or openSUSE either.

That has caused me to look at web fonts for my websites, with Google Web Fonts doing what I need so far via Open Sans and Arimo. There have been others with which I have dallied, such as Droid Sans, but these are the ones on which I have settled for now. Both are in use on this website, and I added calls for them to the web page headers using the following code (lines are wrapping due to space constraints):

<link href="http://fonts.googleapis.com/css?family=Open+Sans:300italic,400italic,600italic,700italic,400,300,600,700" rel="stylesheet" type="text/css">
<link href='http://fonts.googleapis.com/css?family=Arimo:400,400italic,700,700italic' rel='stylesheet' type='text/css'>

With those lines in place, it then is a matter of updating font-family and font declarations in CSS stylesheets with “Open Sans” or “Arimo” as needed, while keeping alternatives defined in case the Google font service goes down for whatever reason. A look at a development release of the WordPress Twenty Twelve theme caused me to come across Open Sans, and I like it for its clean lines; Arimo, which was found by looking through the growing Google Web Fonts catalogue, is not far behind. Looking through that catalogue now causes a round of indecision for me since there is so much choice. For that reason, I think it better to be open to the recommendations of others.
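
For illustration, the resulting declarations look something like this, with the later names acting as fallbacks should the remote fonts fail to load:

body { font-family: "Open Sans", Verdana, Arial, sans-serif; }
h1, h2, h3 { font-family: "Arimo", Helvetica, Arial, sans-serif; }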

Adding GNOME 3 to Linux Mint 11

3rd June 2011

On the surface of it, this probably sounds a very strange thing to do: choose Linux Mint because they plan to stick with their current desktop interface for the foreseeable future, and then stick a brand new one on there. However, that’s what last weekend’s dalliance with Fedora 15 caused. Not only could I find my way around GNOME Shell, but I actually got to like it so much that I missed it on returning to my Linux Mint machine.

The result was that I started to look on the web to see if anyone else had had the same brainwave. In fact, it was Mint’s being based on Ubuntu that allowed me to get GNOME 3 on there. The task could be summarised as involving three main stages: getting GNOME 3 installed, adding extensions and adding the Cantarell font that is used by default. After these steps, I gained a well-running GNOME 3 desktop on Linux Mint, and it looks set to stay that way unless something untoward emerges.

Installing GNOME 3

The first step is to add the PPA repository for GNOME 3 using the following command:

sudo add-apt-repository ppa:gnome3-team/gnome3

Then, it was a case of issuing my usual update/upgrade command:

sudo apt-get update && sudo apt-get upgrade && sudo apt-get dist-upgrade

When that had done its thing and downloaded and installed quite a few upgrades along the way, it was time to add GNOME Shell using this command:

sudo apt-get install gnome-shell

When that was done, I rebooted my system to be greeted by a login screen very reminiscent of what I had seen in Fedora. While compiling this piece, I noticed that GNOME Session could need to be added before GNOME Shell, but I do not recall doing so myself. Maybe dependency resolution kept any problems at bay; there weren’t any issues that I could remember beyond things not being configured as fully as I would have liked without further work. For the sake of safety, it might be a good idea to run the following before adding GNOME Shell to your PC:

sudo apt-get install gnome-session && sudo apt-get dist-upgrade

Configuration and Customisation

Once I had logged in, the desktop that I saw wasn’t at all unlike the Fedora one, and everything seemed stable too. However, there was still work to do before I was truly at home with it. One thing that was needed was the ever-useful GNOME Tweak Tool. This came in very handy for changing the theme on display to the standard Adwaita one that caught my eye while I was using Fedora 15. Adding buttons to application title bars for minimising and maximising their windows was another job that the tool allowed me to do. The command to get this goodness added in the first place is this:

sudo apt-get install gnome-tweak-tool

The next thing that I wanted to do was add some extensions, so I added a repository from which to do this using the command below. Downloading them via Git and compiling them just wasn’t working for me, so I needed another approach.

sudo add-apt-repository ppa:ricotz/testing

With that in place, I issued the following commands to gain the Dock, the Alternative Status Menu and the Windows Navigator. The second of these would have added a shutdown option in the me-menu, but it seems to have got deactivated after a system update. Holding down the ALT key to change the Suspend entry to Power off… will have to do me for now. Having the dock is the most important, and that, thankfully, is staying the course and works exactly as it does on Fedora.

sudo apt-get install gnome-shell-extensions-dock
sudo apt-get install gnome-shell-extensions-alternative-status-menu
sudo apt-get install gnome-shell-extensions-windows-navigator

Adding Cantarell

The default font used by GNOME 3 in various parts of its interface is Cantarell, and it was defaulting to the standard sans-serif font on my system because this wasn’t in place. That didn’t look too good, so I set about tracking down the freely available Cantarell on the web. When that search brought me to Font Squirrel, I downloaded the zip file containing the required TTF files. The next step was to install them and, towards that end, I added Fontmatrix using this command:

sudo apt-get install fontmatrix

That gave me a tool with a nice user interface but I made a mistake when using it. This was because I (wrongly) thought that it would copy files from the folder that I told the import function to use. Extracting the TTF files to /tmp meant that would have had to happen, but Fontmatrix just registered them instead. A reboot confirmed that they hadn’t been copied or moved at all and I had rendered the user interface next to unusable through my own folly; the default action on Ubuntu and Linux Mint is that files are deleted from /tmp on shutdown. The font selection capabilities of the GNOME Tweak Tool came in very handy for helping me to convert useless boxes into letters that I could read.

Another step was to change the font line near the top of the GNOME Shell stylesheet (never thought that CSS usage would end up in places like this…) so that Cantarell wasn’t being sought and text in a sans-serif font replaced the grey and white boxes. The stylesheet needs to be edited as superuser, so the following command is what’s needed; while I used sudo, gksu may well have been the more proper choice for a graphical editor.

sudo gedit /usr/share/gnome-shell/theme/gnome-shell.css
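
The line in question sits near the top of the stylesheet and, if memory serves, looks something like the first line below (the exact wording may differ between versions); removing the Cantarell reference, as in the second, leaves the sans-serif fallback to do the work until the font is in place:

font-family: cantarell, sans-serif;
font-family: sans-serif;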

Once I had extricated my system from that mess, a more conventional approach was taken, and the command sequence below was what I followed, with extensive use of sudo to get done what I wanted. A new directory was created and the TTF files moved in there.

cd /usr/share/fonts/truetype
sudo mkdir ttf-cantarell
cd ttf-cantarell
sudo mv /tmp/*.ttf .

To refresh the font cache, I resorted to the command described in a tutorial in the Ubuntu Wiki:

sudo fc-cache -f -v

Once that was done, it was time to restore the reference to Cantarell in the GNOME Shell stylesheet and reinstate its usage in application windows using the GNOME Tweak Tool. Since then, I have suffered no mishap or system issue with GNOME 3. Everything seems to be working quietly, and I am happy to see that replacement of Unity with GNOME Shell will become an easier task in Ubuntu 11.10, the first alpha release of which is out at the time of my writing these words. Could it lure me back from my modified instance of Linux Mint yet? While I cannot say for sure, it certainly cannot be ruled out at this stage.

Tinkering with Textpattern

26th April 2011

Textpattern 5 may be on the way, but that isn’t to say that work on the 4.x branch has stopped completely, though it is less of a priority at the moment. After all, version 4.4.0 was slipped out not so long ago as a security release, a discovery that I made while giving a section of my outdoors website a spring refresh. During that activity, the TinyMCE plugin started to grate with its issuing of error messages in the form of dialogue boxes needing user input to dismiss them every time an article was opened or saved. Because of that nuisance, the guilty hak_tinymce plugin was ejected, with joh_admin_ckeditor replacing it and bringing CKEditor into use for editing my Textpattern articles. It is working well, though the narrow editing area causes the editor toolbars to take up too much vertical space. You can resize the editor to solve this, though it would be better if it could be made to remember those size settings.

Another find was atb_editarea, a plugin that colour-codes (X)HTML, PHP and CSS by augmenting the standard text editing for pages and stylesheets in the Presentation part of the administration interface. If I had had this at the start of my redesign, it would have made doing the needful that bit more user-friendly than the basic editing facilities that Textpattern offers by default. Of course, the tinkering never stops, so there’s no such thing as finding something too late in the day for it to be useful.

Textpattern may not be getting the attention that some of its competitors enjoy, but it isn’t being neglected either; its users and developer community see to that. Saying that, it needs to get better at announcing new versions of the CMS so that they don’t slip by the likes of me, who isn’t looking all the time. With a major change of version number involved, curiosity is aroused as to what is coming next. So far, Textpattern appears to be taking an evolutionary course, and there’s a lot to be said for such an approach.

Worth the attention?

21st July 2010

The latest edition of Web Designer has features and tutorials on modern trends and new ways to use fonts and typography in websites. One thing that’s at the heart of the attention is the @font-face CSS rule. It’s what allows you to break away from the limitations of whatever fonts your visitors might have on their PCs and use something available remotely.
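
For reference, a basic declaration looks like this, with the font name and file location being placeholders:

@font-face {
    font-family: "My Web Font";
    src: url("/fonts/mywebfont.ttf") format("truetype");
}

body { font-family: "My Web Font", sans-serif; }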

In principle, that sounds a great idea, but there are caveats. The first of these is support for @font-face in the first place, though the modern browsers that I have tried seem to do reasonably well on this score; these include the latest versions of Firefox, Internet Explorer, Opera and Chrome. The new fonts may render fine, but there’s a short delay in the full loading of a web page because the fonts have to be downloaded first. With Firefox, the rendering seems to treat the process like an interleaved image, so you may see fonts from your own PC before the remote ones come into place, a not too ideal situation in my opinion. I also have found that this is more noticeable in the Linux variant of the browser than in its Windows counterpart. Loading a page that is predominantly text is another scenario where you’ll see the behaviour more clearly; having a sizeable image file loading seems to make things less noticeable. Opera is a particular offender here, with IE8 loading things quite quickly and Chrome not being too bad either.

In the main, I have been using Google’s Font Directory but, in the interests of supposedly getting a better response, I tried using font files stored on a test web server, only to discover that there was more of a lag with the fonts on the web server. While I do not know what Google has done with its setup, using its font delivery service appears to deliver better performance in my testing, so it’ll be my choice for now. There’s Typekit too, but I’ll be hanging onto my money in the light of my recent experiences.

After my brush with remote font loading, I am inclined to wonder if the current hype about fonts applied using the @font-face directive is deserved until browsers get better and faster at loading them. As things stand, they may be better than before but the jury’s still out for me with Firefox’s rendering being a particular irritant. Of course, things can get better…

Easier to print?

20th February 2010

One matter that really came to light was how well, or not, the pages on here and on my hill walking and photography website came out on the printed page. After spotting a WordPress Codex article and with an eye on making things better, I have made a distinction between screen and print stylesheets. The code in the XHTML looks like this:

<link rel="stylesheet" href="/style.css" type="text/css" media="screen" />
<link rel="stylesheet" href="/style_print.css" type="text/css" media="print" />

The media attribute seems to be respected by the browsers that I have been using for testing (latest versions of Firefox, MSIE and Opera), so it then was a matter of using CSS to control what was shown and how it was displayed. Extraneous items like sidebars were excluded from the printed page in favour of the real content that visitors would be wanting anyway, and everything else was made as monochrome as possible, with images being the only things to escape; a sketch of the sort of rules involved appears below. After all, people don’t want to be wasting paper and ink in these cash-strained times, and there’s no need for any more colour than necessary either. Then, there’s the distraction caused by non-functioning hyperlinks, something that has inspired the sharing of some wisdom on A List Apart. Returning to my implementation, please let me know in the comments what you think of what I have done on here and if there remains any room for improvement.
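
In case it helps anyone attempting something similar, the print stylesheet rules are of this sort; the element names come from my own templates, so yours will differ:

#sidebar, #navigation { display: none; }
body { color: #000; background: #fff; }
a { color: #000; }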

Another look at Drupal

20th January 2010

Early on in the first year of this blog, I got to investigating the use of Drupal for creating an article-based subsite. In the end, the complexities of its HTML and CSS thwarted my attempts to harmonise the appearance of web pages with other parts of the same site, and I discontinued my efforts. It was Textpattern that suited my needs, and I have stuck with it for the aforementioned subsite. However, I recently spotted someone very obviously using Drupal in its out-of-the-box state for a sort of blog (there is even an extension for importing WXR files containing content from a WordPress blog); they hadn’t even removed the Drupal logo.

With my interest rekindled, I took another look for the sake of seeing where things have gone in the last few years. First impressions are that it now looks like a blogging tool with greater menu control and the facility to define custom content types. There are plenty of nice themes around too, though that highlights an idiosyncrasy: content editing is not fully integrated into the administration area where I’d expect it to be. The consequence of this situation is that pages, posts (or story, as the content type is called) and any content types that you have defined yourself are created and edited with the front page theme controlling the appearance of the user interface. It is made even more striking when you use a different theme for the administration screens. That oddity aside, there is a lot to recommend Drupal, though I’d try setting up a standalone site with it rather than attempting to shoehorn it into part of an existing one, as I was trying to do when I last looked.

DePo Masthead

6th November 2009

There is a place on WordPress.com where I share various odds and ends about public transport in the U.K. It’s called On Trains and Buses, and I try not to go tinkering with the design side of things too much. You only have the ability to change the CSS, and my previous experience of doing that with this edifice while it lived on there taught me not to expect too much, even if there are sandbox themes for anyone to turn into something presentable. Not that I really would want to go doing that in full view of everyone; doing it offline first and copying the CSS across when it’s done is my preferred way of going about it. Besides, I wanted to see how WordPress.com fares these days anyway.

While my public transport blog has been around for only a little over a year, it has worn a few themes in that time, ranging from the minimalist The Journalist v1.9 and Vigilance through to Spring Reloaded. After the last of these, I am back to minimalism again with DePo Masthead, albeit with a spot of my own colouring to soften its feel a little. I must admit growing to like it, but it came to my attention that it was a bespoke design from Derek Powazek that Automattic’s Noel Jackson turned into reality. The result, it would appear, is that you cannot get it anywhere except from the WordPress.com Subversion theme repository. For those not versed in the little bit of Subversion action that is needed to get it, I did it for you and put it all into a zip file without making any changes to the original, hoping that it might make life easier for someone.

Download DePo Masthead
