Technology Tales

Notes drawn from experiences in consumer and enterprise technology

TOPIC: WORDPRESS

Busying oneself with website alterations

27th March 2026

Much has been happening on this little part of the web over the last few weeks. Firstly, it has gone from a pure WordPress instance to a hybrid with WordPress and Grav coexisting on here. The latter makes the Coding Notebook, AI & Data Science Jottings and Collected Snippets sections a bit more friendly than they used to be. Instead of one long page, which used an older live blogging plugin, everything is broken over multiple pages with navigation between these. That has not brought too much variation in how things appear, so everything comes together cohesively.

For a time, there were link scrapbooks accompanying the rest of the content, and all that has gone. Turning those links into more substantive content became a task that leaned on the capabilities of AI more than might have been ideal. The main bugbear was that I got lured into a gargantuan task that I would not have attempted otherwise. Expansions into longer descriptions take a while when done en masse, and some of these became much longer articles on the main blog. At the beginning of all this, though, a local AI model served through Ollama became pivotal when migrating content from WordPress to Grav, especially when it came to renaming Markdown files to something more meaningful than a number.

While a lot is made of AI's impact on human employment, hardly anything gets said about the capabilities that it offers you. Articles that would not get written come into being, running the risk of producing something without that much of a human touch; orchestration is never the same as composing things all by yourself, and may cause one to become less attentive than should be the case. The possibility of solely human origination of entries on here is something that appeals to me now, not least because it adds a spot of individuality to what is there to be read. After all, AI is not sentient and cannot experience its surroundings like we can.

Hardening WordPress on Ubuntu and Apache: A practical layered approach

1st March 2026

Protecting a WordPress site rarely depends on a single control. Practical hardening layers network filtering, a web application firewall (WAF), careful browser-side restrictions and sensible log-driven tuning. What follows brings together several well-tested techniques and the precise commands needed to get them working, while also calling out caveats and known changes that can catch administrators out. The focus is on Ubuntu and Apache with ModSecurity and the OWASP Core Rule Set for WordPress, but complementary measures round out a cohesive approach. These include a strict Content Security Policy, Cloudflare or Nginx rules for form spam, firewall housekeeping for UFW and Docker, targeted network blocks and automated abuse reporting with Fail2Ban. Where solutions have moved on, that is noted so you do not pursue dead ends.

The Web Application Firewall

ModSecurity and the OWASP Core Rule Set

ModSecurity remains the most widespread open-source web application firewall and has been under the custodianship of the OWASP Foundation since January 2024, having previously been stewarded by Trustwave for over a decade. It integrates closely with the OWASP Core Rule Set (CRS), which aims to shield web applications from a wide range of attacks including the OWASP Top Ten, while keeping false alerts to a minimum. There are two actively maintained engines: 2.9.x is the classic Apache module and 3.x is the newer, cross-platform variant. Whichever engine you pick, the rule set is the essential companion. One important update is worth stating at the outset: CRS 4 replaces exclusion lists with plugins, so older instructions that toggle CRS 3's exclusions no longer apply as written.

Installing ModSecurity on Ubuntu

On Ubuntu 24.04 LTS, installing the Apache module is straightforward. The universe repository ships libapache2-mod-security2 at version 2.9.7, which meets the 2.9.6 minimum required by CRS 4.x, so no third-party repository is needed. You can fetch and enable ModSecurity with the following commands:

sudo apt install libapache2-mod-security2
sudo a2enmod security2
sudo systemctl restart apache2

It is worth confirming the module is loaded before you proceed:

apache2ctl -M | grep security

The default configuration runs in detection-only mode, which does not block anything. Copy the recommended file into place and then edit it so that SecRuleEngine On replaces SecRuleEngine DetectionOnly:

sudo cp /etc/modsecurity/modsecurity.conf-recommended /etc/modsecurity/modsecurity.conf

Open /etc/modsecurity/modsecurity.conf and make the change, then restart Apache once more to apply it.
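
If you prefer to script that edit, a sed one-liner does the same job on the stock file:

sudo sed -i 's/^SecRuleEngine DetectionOnly/SecRuleEngine On/' /etc/modsecurity/modsecurity.conf
sudo systemctl restart apache2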

Pulling in the Core Rule Set

The next step is to pull in the latest Core Rule Set and wire it up. A typical approach is to clone the upstream repository, move the example setup into place and move the directory named rules into /etc/modsecurity:

cd
git clone https://github.com/coreruleset/coreruleset.git
cd coreruleset
sudo mv crs-setup.conf.example /etc/modsecurity/crs-setup.conf
sudo mv rules/ /etc/modsecurity/

Now adjust the Apache ModSecurity include so that the new crs-setup.conf and all files in /etc/modsecurity/rules are loaded. On Ubuntu, that is governed by /etc/apache2/mods-enabled/security2.conf. Edit this file to reference the new paths, remove any older CRS include lines that might conflict, and then run:

sudo systemctl restart apache2
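
After the edit, the include might look broadly like this (a sketch; the SecDataDir line ships with the Ubuntu package):

<IfModule security2_module>
        SecDataDir /var/cache/modsecurity
        # picks up modsecurity.conf and the new crs-setup.conf
        IncludeOptional /etc/modsecurity/*.conf
        # load the cloned rule set; remove any older CRS include lines here
        Include /etc/modsecurity/rules/*.conf
</IfModule>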

On Ubuntu 26.04 (due for release in April 2026), the default installation includes a pre-existing CRS configuration at /etc/modsecurity/crs/crs-setup.conf. If this is left in place alongside your own cloned CRS, Apache will fail to start with a Found another rule with the same id error. Remove it before restarting:

sudo rm -f /etc/modsecurity/crs/crs-setup.conf

WordPress-Specific Allowances in CRS 3

WordPress tends to work far better with CRS when its application-specific allowances are enabled. With CRS 3, a variable named tx.crs_exclusions_wordpress can be set in crs-setup.conf to activate those allowances. The commented "exclusions" block in that file includes a template SecAction with ID 900130 that sets application exclusions. Uncomment it and reduce it to the single line that enables the WordPress flag:

SecAction 
 "id:900130,
  phase:1,
  nolog,
  pass,
  t:none,
  setvar:tx.crs_exclusions_wordpress=1"

Reload Apache afterwards with sudo service apache2 reload. If you are on CRS 4, do not use this older mechanism. The project has replaced exclusions with a dedicated WordPress rule exclusions plugin, so follow the CRS 4 plugin documentation instead. The WPSec guide to ModSecurity and CRS covers both the CRS 3 and CRS 4 approaches side by side if you need a reference that bridges the two versions.

Log Retention and WAF Tuning

Once the WAF is enforcing, logs become central to tuning. Retention is important for forensics as well as for understanding false positives over time, so do not settle for the default two weeks. On Ubuntu, you can extend Apache's logrotate configuration at /etc/logrotate.d/apache2 to keep weekly logs for 52 weeks, giving you a year of history to hand.

If you see Execution error – PCRE limits exceeded (-8) in the ModSecurity log, increase the following in /etc/modsecurity/modsecurity.conf to give the regular expression engine more headroom:

SecPcreMatchLimit 1000000
SecPcreMatchLimitRecursion 1000000

File uploads can generate an Access denied with code 403 (phase 2). Match of "eq 0" against "MULTIPART_UNMATCHED_BOUNDARY" required error. One remedy used in practice is to comment out the offending check around line 86 of modsecurity.conf and then reload. The built-in Theme Editor can trigger Request body no files data length is larger than the configured limit. Bumping SecRequestBodyLimit to 6000000 addresses that, again followed by a reload of Apache.
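
That last directive ends up reading like this in modsecurity.conf:

SecRequestBodyLimit 6000000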

Whitelisting Rule IDs for Specific Endpoints

There are occasions where whitelisting specific rule IDs for specific WordPress endpoints is the most pragmatic way to remove false positives without weakening protection elsewhere. Creating a per-site or server-wide include works well; on Ubuntu, a common location is /etc/apache2/conf-enabled/whitelist.conf. For the Theme Editor, adding a LocationMatch block for /wp-admin/theme-editor.php that removes a small set of well-known noisy IDs can help:

<LocationMatch "/wp-admin/theme-editor.php">
  SecRuleRemoveById 300015 300016 300017 950907 950005 950006 960008 960011 960904 959006 980130
</LocationMatch>

For AJAX requests handled at /wp-admin/admin-ajax.php, the same set with 981173 added is often used. This style of targeted exclusion mirrors long-standing community advice: find the rule ID in logs, remove it only where it is truly safe to do so, and never disable ModSecurity outright. If you need help finding noisy rules, the following command (also documented by InMotion Hosting) summarises IDs, hostnames and URIs seen in errors:

grep ModSecurity /usr/local/apache/logs/error_log | grep '\[id' |
  sed -E -e 's#^.*\[id "([0-9]+)".*\[hostname "([A-Za-z0-9_.-]+)"\].*\[uri "([^"]*)"\].*$#\1 \2 \3#' |
  sort -n | uniq -c | sort -n

Add a matching SecRuleRemoveById line in your include and restart Apache. Note that the log path above is typical of cPanel-style builds; on Ubuntu, look in /var/log/apache2/error.log instead.
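
For the admin-ajax endpoint mentioned earlier, the equivalent include block might look like this (the same set plus 981173; verify the IDs against your own logs before removing anything):

<LocationMatch "/wp-admin/admin-ajax.php">
  SecRuleRemoveById 300015 300016 300017 950907 950005 950006 960008 960011 960904 959006 980130 981173
</LocationMatch>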

Browser-Side Controls: Content Security Policy

Beyond the WAF, browser-side controls significantly reduce the harm from injected content and cross-site scripting. A Content Security Policy (CSP) is both simple to begin and very effective when tightened. An easy starting point is a report-only header that blocks nothing but shows you what would have been stopped. Adding the following to your site lets you open the browser's developer console and watch violations scroll by as you navigate:

Content-Security-Policy-Report-Only: default-src 'self'; font-src 'self'; img-src 'self'; script-src 'self'; style-src 'self'

From there, iteratively allowlist the external origins your site legitimately uses and prefer strict matches. If a script is loaded from a CDN such as cdnjs.cloudflare.com, referencing the exact file or at least the specific directory, rather than the whole domain, reduces exposure to unrelated content hosted there. Inline code is best moved to external files. If that is not possible, hashes can allowlist specific inline blocks and nonces can authorise dynamically generated ones, though the latter must be unpredictable and unique per request. The 'unsafe-inline' escape hatch exists but undermines much of CSP's value and is best avoided.
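
Building on the starting point above, a tightened report-only policy that allowlists a single CDN directory might look like this (the cdnjs path is illustrative):

Content-Security-Policy-Report-Only: default-src 'self'; font-src 'self'; img-src 'self'; script-src 'self' https://cdnjs.cloudflare.com/ajax/libs/jquery/; style-src 'self'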

Once the console is clean, you can add real-time reporting to a service such as URIports (their guide to building a solid CSP is also worth reading) by extending the header:

Content-Security-Policy-Report-Only: default-src 'self'; ...; report-uri https://example.uriports.com/reports/report; report-to default

Pair this with a Report-To header so that you can monitor and prioritise violations at scale. When you are satisfied, switch the key from Content-Security-Policy-Report-Only to Content-Security-Policy to enforce the policy, at which point browsers will block non-compliant content.
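
A matching Report-To header, using the same illustrative endpoint as above, might be declared like this:

Report-To: {"group":"default","max_age":10886400,"endpoints":[{"url":"https://example.uriports.com/reports/report"}]}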

Server Fingerprints and Security Headers

While working on HTTPS and header hardening, it is useful to trim server fingerprints and raise other browser defences, and this Apache security headers walkthrough covers the rationale behind each directive clearly. Apache's ServerTokens directive can be set in /etc/apache2/apache2.conf to mask version details. Options range from Full to Prod, with the latter sending only Server: Apache. Unsetting X-Powered-By in /etc/apache2/httpd.conf removes PHP version leakage. With mod_headers enabled (sudo a2enmod headers), adding the following directives in the same configuration file keeps responses out of hostile frames, asks older browsers to block detected XSS (modern browsers ignore X-XSS-Protection, so treat it as legacy hardening) and prevents MIME type sniffing:

Header unset X-Powered-By
Header always set X-Frame-Options "SAMEORIGIN"
Header always set X-XSS-Protection "1; mode=block"
Header always set X-Content-Type-Options "nosniff"

These are not replacements for fixes in application code, but they do give the browser more to work with. If you are behind antivirus products or corporate HTTPS interception, bear in mind that these can cause certificate errors such as SEC_ERROR_UNKNOWN_ISSUER or MOZILLA_PKIX_ERROR_MITM_DETECTED in Firefox. Disabling encrypted traffic scanning in products like Avast, Bitdefender or Kaspersky, or ensuring enterprise interception certificates are correctly installed in Firefox's trust store, resolves those issues. Some errors cannot be bypassed when HSTS is used or when policies disable bypasses, which is the intended behaviour for high-value sites.

Contact Form Spam

Contact form spam is a different but common headache. Analysing access logs often reveals that many automated submissions arrive over HTTP/1.1 while legitimate traffic uses HTTP/2 with modern browser stacks, and this GridPane analysis of a real spam campaign confirms the pattern in detail. That difference gives you something to work with.

Filtering by Protocol in Cloudflare

You can block or challenge HTTP/1.x access to contact pages at the edge with Cloudflare's WAF by crafting an expression that matches both the old protocol and a target URI, while exempting major crawlers. A representative filter looks like this:

(http.request.version in {"HTTP/1.0" "HTTP/1.1" "HTTP/1.2"}
  and http.request.uri eq "/contact/"
  and not http.user_agent contains "Googlebot"
  and not http.user_agent contains "Bingbot"
  and not http.user_agent contains "DuckDuckBot"
  and not http.user_agent contains "facebot"
  and not http.user_agent contains "Slurp"
  and not http.user_agent contains "Alexa")

Set the action to block or to a managed challenge as appropriate.

Blocking Direct POST Requests Without a Valid Referrer

Another useful approach is to cut off direct POST requests to /wp-admin/admin-ajax.php and /wp-comments-post.php when the Referer does not contain your domain. In Cloudflare, this becomes:

(http.request.uri contains "/wp-admin/admin-ajax.php"
  and http.request.method eq "POST"
  and not http.referer contains "yourwebsitehere.com")
or
(http.request.uri contains "/wp-comments-post.php"
  and http.request.method eq "POST"
  and not http.referer contains "yourwebsitehere.com")

The same logic can be applied in Nginx with small site includes that set variables based on $server_protocol and $http_user_agent, then return 403 if a combination such as HTTP/1.1 on /contact/ by a non-whitelisted bot is met. It is sensible to verify with Google Search Console or similar that legitimate crawlers are not impeded once rules are live.
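
As a sketch of that Nginx approach (variable names such as $legacy_http and $block_contact are illustrative, and the bot list should match your own whitelist), the maps live in the http context with the location block in the relevant server:

# http context include: flag legacy protocols and whitelisted bots
map $server_protocol $legacy_http {
    default        0;
    "HTTP/1.0"     1;
    "HTTP/1.1"     1;
}

map $http_user_agent $known_bot {
    default          0;
    "~*Googlebot"    1;
    "~*Bingbot"      1;
    "~*DuckDuckBot"  1;
}

# block only legacy-protocol requests from non-whitelisted agents
map "$legacy_http$known_bot" $block_contact {
    default  0;
    "10"     1;
}

# inside the server block
location = /contact/ {
    if ($block_contact) {
        return 403;
    }
    # normal handling for the page continues here (e.g. your existing PHP config)
}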

Complementary Mitigations Inside WordPress

Three complementary tools work well alongside the server-side measures already covered. The first is WP Armour, a free honeypot anti-spam plugin that uses JavaScript to add a hidden field to comment forms, contact forms and registration pages. Because most spambots do not execute JavaScript, the field never carries the expected value in an automated submission, and any such submission is rejected silently. No CAPTCHA, API key or subscription is required, and the plugin is GDPR-compliant with no external server calls.

The second measure is entirely native to WordPress. Navigate to Settings, then Discussion and tick "Automatically close comments on articles older than X days." Spammers disproportionately target older content because it tends to be less actively monitored, so setting this to 180 days significantly reduces spam without affecting newer posts where discussion is still active. The value can be adjusted to suit the publishing cadence of the site.

The third layer is Akismet, developed by Automattic. Akismet passes each comment through its cloud-based filter and marks likely spam before it ever appears in the moderation queue. It is free for personal sites and requires an API key obtained from the Akismet website. Used alongside WP Armour, the two cover different vectors: WP Armour stops most bot submissions before they are processed at all, while Akismet catches those that reach the comment pipeline regardless of origin. Complementing both, reCAPTCHA v3 or hCaptcha (where privacy demands it) and simple "bot test" questions remain useful additions, though any solution that adds heavy database load warrants testing before large-scale deployment.

Host-Level Firewalls: UFW and Docker

Host-level firewalls remain important, particularly when Docker is in the mix. Ubuntu's UFW is convenient, but Docker's default iptables rules can bypass UFW and expose published ports to the public network even when ufw deny appears to be in place. One maintained solution uses the kernel's DOCKER-USER chain, so UFW regains control without disabling Docker's iptables management.

Appending a short block to /etc/ufw/after.rules that defines ufw-user-forward, a ufw-docker-logging-deny target and a DOCKER-USER chain, then jumps from DOCKER-USER into ufw-user-forward, allows UFW to govern forwarded traffic. Returning early for RELATED,ESTABLISHED connections, dropping invalid ones, accepting docker0-to-docker0 traffic and returning for RFC 1918 source ranges preserves internal communications. New connection attempts from public networks destined for private address ranges are logged and dropped, with a final RETURN handing off to Docker's own rules for permitted flows.
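
A sketch of that block, modelled on the ufw-docker project's published rules (check its README for the canonical current version before relying on this), appends to /etc/ufw/after.rules:

# BEGIN UFW AND DOCKER
*filter
:ufw-user-forward - [0:0]
:ufw-docker-logging-deny - [0:0]
:DOCKER-USER - [0:0]
-A DOCKER-USER -j ufw-user-forward
-A DOCKER-USER -m conntrack --ctstate RELATED,ESTABLISHED -j RETURN
-A DOCKER-USER -m conntrack --ctstate INVALID -j DROP
-A DOCKER-USER -i docker0 -o docker0 -j ACCEPT
-A DOCKER-USER -j RETURN -s 10.0.0.0/8
-A DOCKER-USER -j RETURN -s 172.16.0.0/12
-A DOCKER-USER -j RETURN -s 192.168.0.0/16
-A DOCKER-USER -p tcp -m tcp --tcp-flags FIN,SYN,RST,ACK SYN -d 10.0.0.0/8 -j ufw-docker-logging-deny
-A DOCKER-USER -p tcp -m tcp --tcp-flags FIN,SYN,RST,ACK SYN -d 172.16.0.0/12 -j ufw-docker-logging-deny
-A DOCKER-USER -p tcp -m tcp --tcp-flags FIN,SYN,RST,ACK SYN -d 192.168.0.0/16 -j ufw-docker-logging-deny
-A DOCKER-USER -j RETURN
-A ufw-docker-logging-deny -m limit --limit 3/min --limit-burst 10 -j LOG --log-prefix "[UFW DOCKER BLOCK] "
-A ufw-docker-logging-deny -j DROP
COMMIT
# END UFW AND DOCKER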

Restart UFW to activate the change:

sudo systemctl restart ufw
# or
sudo ufw reload

From that point, you can allow external access to a container's service port:

ufw route allow proto tcp from any to any port 80

Or scope to a specific container IP if needed:

ufw route allow proto tcp from any to 172.17.0.2 port 80

UDP rules follow the same pattern. If you prefer not to edit by hand, the ufw-docker helper script can install, check and manage these rules for you. It offers options to auto-detect Docker subnets, handles IPv6 by enabling ip6tables and a ULA (Unique Local Address) range in /etc/docker/daemon.json, and can manage Swarm service exposure from manager nodes.

Should you instead use Firewalld, note that it provides a dynamically managed firewall with zones, a D-Bus API and runtime versus permanent configuration separation. It is the default in distributions such as RHEL, CentOS, Fedora and SUSE, and it also works with Docker's iptables backend, though the interaction model differs from UFW's.

Keeping Firewall Rules Tidy

Keeping firewall rules tidy is a small but useful habit. UFW can show verbose and numbered views of its state, as Linuxize's UFW rules guide explains in full:

sudo ufw status verbose
sudo ufw status numbered

Delete rules safely by number or by specification:

sudo ufw delete 4
sudo ufw delete allow 80/tcp

If you are scripting changes, the --force flag suppresses the interactive prompt. Take care never to remove your SSH allow rule when connected remotely, and remember that rule numbers change after deletions, so it is best to list again before removing the next one.

Logging Abusers with Fail2Ban and AbuseIPDB

Logging abusers and reporting them can reduce repeat visits. Fail2Ban watches logs for repeated failures and bans IPs by updating firewall rules for a set period. It can also report to AbuseIPDB via an action that was introduced in v0.10.0 (January 2017), which many installations have today.

Confirm that /etc/fail2ban/action.d/abuseipdb.conf exists and that your /etc/fail2ban/jail.local defines action_abuseipdb = abuseipdb. Within each jail that you want reported, add the following alongside your normal ban action, using categories that match the jail's purpose, such as SSH brute forcing:

%(action_abuseipdb)s[abuseipdb_apikey="my-api-key", abuseipdb_category="18,22"]
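
A representative jail stanza in jail.local, using the SSH categories from above (the continuation line must stay indented), might read:

[sshd]
enabled  = true
maxretry = 5
action   = %(action_)s
           %(action_abuseipdb)s[abuseipdb_apikey="my-api-key", abuseipdb_category="18,22"]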

Reload with fail2ban-client reload and watch your AbuseIPDB reported IPs page to confirm submissions are flowing. If reports do not arrive, check /var/log/fail2ban.log for cURL errors and ensure your API key is correct, bearing in mind default API limits and throttling. Newer Fail2Ban versions (0.9.0 and above) use a persistent database, so re-reported IPs after restart are less of a concern. If you run older releases, a wrapper script can avoid duplicates by checking ban times before calling the API.

Blocking Provider Ranges

Occasionally, administrators choose to block traffic from entire provider ranges that are persistent sources of scanning or abuse. There are scripts such as the AWS-blocker tool that fetch the official AWS IPv4 and IPv6 ranges and insert iptables rules to block them all, and community posts such as this rundown of poneytelecom.eu ranges that shares specific ranges associated with problematic hosts for people who have seen repeated attacks from those networks. Measures like these are blunt instruments that can have side effects, so they warrant careful consideration and ongoing maintenance if used at all. Where possible, it is preferable to block based on behaviour, authentication failures and reputation rather than on broad ownership alone.

Final ModSecurity Notes: Chasing False Positives

Two final ModSecurity notes help when chasing false positives. First, WordPress comments and posting endpoints can trip generic SQL injection protections such as rule 300016 when text includes patterns that appear dangerous to a naive filter, a well-documented occurrence that catches many administrators out. Watching the ModSecurity audit log (at /var/log/apache2/modsec_audit.log on Ubuntu, or /etc/httpd/logs/modsec_audit.log on Red Hat-style systems) or the Apache error log immediately after triggering the offending behaviour, and then scoping SecRuleRemoveById lines to the relevant WordPress locations such as /wp-comments-post.php and /wp-admin/post.php, clears real-world issues without turning off protections globally. Second, when very large responses are legitimately expected in parts of wp-admin, increasing SecResponseBodyLimit in an Apache or Nginx ModSecurity context can be more proportionate than whitelisting many checks at once. Always restart or reload Apache after changes so that your edits take effect.
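
As a sketch of that scoping, with rule 300016 as the example from above (confirm the IDs from your own audit log first):

<LocationMatch "/(wp-comments-post\.php|wp-admin/post\.php)">
  SecRuleRemoveById 300016
</LocationMatch>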

Defence in Depth

Taken together, these layers complement each other well. ModSecurity with CRS gives you broad, configurable protection at the HTTP layer. CSP and security headers narrow the browser's attack surface and put guardrails in place for any client-side content issues. Targeted edge and server rules dampen automated spam without hindering real users or crawlers. Firewalls remain the bedrock, but modern container tooling means integrating UFW or Firewalld with Docker requires a small amount of extra care. Logs feed both your WAF tuning and your ban lists, and when you report abusers you contribute to a wider pool of threat intelligence. None of this removes the need to keep WordPress core, themes and plugins up to date, but it does mean the same attacks are far less likely to succeed or even to reach your application in the first place.

Lessons learned during migrations to Grav CMS from Textpattern

21st February 2026

After the best part of four years since the arrival of Textpattern 4.8.8, Textpattern 4.9.0 was released. Since I was running two subsites using the software, I tried one of them out with the new version. That broke an elderly navigation plugin that no longer appears in any repository, prompting a rollback, which was successful. Even so, it stirred some curiosity about alternatives to this veteran of the content management system world, which is pushing on for twenty-two years of age. That might have been just as well, given the subsequent release of Textpattern 4.9.1 because of two reported security issues, one of which affected all preceding versions of the software.

Well before that came to light, there had been a chat session in the Gemini app on a mobile phone while travelling on a bus. This started with a simple question about alternatives to Textpattern. The ensuing interaction led me to choose Grav CMS after one other option turned out to involve a subscription charge; a personal website side hustle generating no revenue was not going to become a more expensive sideline than it already was, the same reasoning that stops me paying for WordPress plugins.

Exploring Known Options

Without any recourse to AI capability, I already had options. While WordPress was one of those that was well known to me, the organisation of the website was such that it would be challenging to get everything placed under one instance, and I never got around to exploring the multisite capability in much depth. Either way, it would have involved quite an amount of restructuring. Even having multiple instances would mean added maintenance, though I do automate things heavily. The larger attack surface that comes with database dependence is another matter.

In the past, I have been a user of Drupal, though its complexity and the steepness of the associated learning curve meant that I never exploited it fully. Since those were pre-AI days, I wonder how things would differ now. Nevertheless, the need to make parts of a website fit in with each other was another challenge that I failed to overcome in those days, so this content framework approach was not one that I wished to use again. In short, it is an enterprise-grade tool that sits above the level of personal web tinkering, and I never did use its capabilities to their full extent.

The move away from Drupal brought me to Hugo around four years ago. That too presents a learning curve, though its inherent flexibility meant that I could do anything that I wanted with it once I had navigated its documentation and ironed out oversights using web searches. This static website generator is what rebuilds a public transport website, a history website composed of writings by my late father and a website for my freelancing company. There is no database involved, and you can avoid having any dynamic content creation machinery on the web servers too. Using Git, it is possible to facilitate content publishing from anywhere as well.

Why Grav?

Of the lot, Hugo had a lot going for it. Its inherent flexibility would not obstruct a website-wide consistency of appearance, and there is nothing to stop one structuring things however one wants. However, Grav has one key selling point in comparison: straightforward remote production of content, without local builds needing to be uploaded to a web server. That removes the need to propagate build machinery across different locations.

Like Hugo, Grav had an active project behind it, a decent supply of plugins and an architecture that bested Textpattern and its usually languid release cycle. The similarity also extended as far as not having to buy into someone else's theme: any theming can be done from scratch for consistency of appearance across different parts of a website. In Grav's case, that means using the Twig PHP templating engine, another thing to learn, reminiscent as much of Textpattern's Textile as of what Hugo itself offers.

The centrality of Markdown files was another area of commonality, albeit with remote editing. If you are conversant from Hugo with page files that combine YAML front matter and subsequent page content, Grav will not seem so alien to you, even if it has a web interface for editing that front matter. This could help if you need to edit the files directly for any reason.

That is not to say that there was nothing to learn, for there was plenty of that. For example, Grav has its own way of setting up modular pages, an idea that I was to retrofit into a Hugo website afterwards. This means care with module naming, as well as with caching, editor choice and content collections, each with their own uniqueness that rewards some prior reading. A learning journey was in the offing, not the most attractive part of the experience in any event.

Considerations

There have been a number of other articles published here regarding the major lessons learned during the transitions from Textpattern to Grav. Unlike previous experiences with Hugo, another part of this learning was the use of AI as part of any debugging. At times, there was a need to take things step by step, interacting with the AI instead of trying out a script that it had put my way. There are times when one's own context window gets overwhelmed by the flow of text, meaning that such behaviour needs to be taken in hand.

Another thing to watch is that human consultation of the official documentation is not neglected in a quest for speed that lets the AI do that for you; after all, this machinery is fallible; nothing we ever bring into being is without its flaws. Grav itself also comes from a human enterprise that usefully includes its own Discord community. The GitHub repository was not something to which I had recourse, even if the Admin plugin interface has prompts for reporting issues on there. Here, I provide a synopsis of the points to watch that may add to the help provided elsewhere.

Choosing an Editor

By default, Grav Admin uses CodeMirror as its content editor. While CodeMirror is well-suited to editing code, offering syntax highlighting, bracket matching and multiple cursors, it renders its editing surface in a way that standard browser extension APIs cannot reach. Grammar checkers and spell-check extensions such as LanguageTool rely on native editable elements to detect text, and CodeMirror does not use these. The result is that browser-based writing tools produce no output in Grav Admin at all, a confirmed architectural incompatibility documented in the LanguageTool issue tracker rather than a configuration issue.

This can be addressed by replacing CodeMirror using the TinyMCE Editor Integration plugin, installable directly from the Admin plugin interface, which brings a familiar style of editor that browser extensions can access normally. Thus, LanguageTool functionality is restored, the writing workflow stays inside Grav Admin and the change requires only a small amount of configuration to prevent TinyMCE from interfering with Markdown files in undesirable ways. Before coming across the TinyMCE Editor plugin, I was seriously toying with the local editing option centred around a Git-based workflow. Here, using VS Code with the LanguageTool extension like I do for Hugo websites remained a strong possibility. The plugin means that the need to do this is not as pressing as it otherwise might be.

None of this extends to editing Twig templates and other configuration files unless one makes use of the Editor plugin. My brief dalliance with that revealed a clunky interface and interference with the appearance of the website, something that I never appreciated when I saw the same with Drupal. Thus, the plugin was quickly removed, and I do not miss it. As it happened, editing and creating files over an SSH connection with a lightweight terminal editor worked well enough for me during the setup phase anyway. If I wanted a nicer editing experience, a Git-based approach would allow local editing in VSCode before pushing the files back onto the server.

Grav Caching

Unlike WordPress, which requires plugins to do so, Grav maintains its own internal cache for compiled pages and assets. Learning to work with it is part of understanding the platform: changes to CSS, JavaScript and other static assets are served from this cache until it is refreshed. That can be accomplished using the admin panel or by removing the contents of the cache directory directly. Once this becomes second nature, it adds very little overhead to the development process.
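
From an SSH session, the bundled CLI performs the same refresh from the Grav folder:

bin/grav clear-cache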

Modular Pages

On one of the Textpattern subsites, I had set up the landing page in a modular fashion. This carried over to Grav, which has its own way of handling modular pages. There, the modular page system assembles a single page from the files found within a collection of child folders, each presenting a self-contained content block with its own folder, Markdown file and template.

All modules render together under a single URL; they are non-routable, meaning visitors cannot access them directly. When the parent folder contains a modular.md file, the name tells Grav to use the modular.html.twig template, and its front matter defines which modules to include and in what order.

Module folders are identified by an underscore at the start of their base name, and numeric prefixes control the display sequence. The prefix must come before the underscore: 01._main is the correct form, not _01.main. For a home page with many sections, this structure scales naturally, with folder names such as 01._title, 04._ireland or 13._practicalities-inspiration making the page architecture immediately readable from the file system alone.

Each module's Markdown filename determines which template renders it: a file named text.md looks for text.html.twig in the theme's modular templates folder. The parent modular.md assembles the modules using @self.modular to collect them, with a custom order list giving precise control over sequence. Once the folder naming convention and the template matching relationship are clear, the system is very workable.
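
As a sketch, the front matter of modular.md for such a page might read like this, with the custom order echoing the folder examples above (prefixes are dropped in the order list):

---
title: Home
content:
    items: '@self.modular'
    order:
        custom:
            - _title
            - _ireland
            - _practicalities-inspiration
---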

Building Navigation

Given that the original impetus for leaving Textpattern was a broken navigation plugin, ensuring that Grav could replicate the required menu behaviour was a matter of some importance. Grav takes a different approach to navigation from database-driven systems, deriving its menu structure directly from the content directory tree using folder naming conventions and front matter flags rather than a dedicated menu editor.

Top-level navigation is driven by numerically prefixed subfolders within the content directory (pages), so a structure such as 01.home, 02.about and 03.blog yields an ordered working menu automatically. Visibility can be fine-tuned without renaming folders by setting visible: true or visible: false in a page's YAML front matter, and menu labels can be shortened for navigation purposes using the menu: field while retaining a fuller title for the page itself.

The primary navigation loop iterates over the visible children of the pages tree and uses the active and activeChild flags on each page object to highlight the current location, whether the visitor is on a given top-level page directly or somewhere within its subtree. A secondary menu for the current section is assembled by first identifying the active top-level page and then rendering its visible children as a list. Testing for activeChild as well as active in the secondary menu is important, as omitting it means that visitors to grandchild pages see no item highlighted at all. The approach differs from what was possible with Textpattern, where a single composite menu could drill down through the full hierarchy, but displaying one menu for pages within a given section alongside another showing the other sections proves to be a workable and context-sensitive alternative.
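
A minimal sketch of that primary loop in Twig, with class names assumed, could read:

<ul class="main-menu">
{% for p in pages.children.visible %}
    <li class="{{ (p.active or p.activeChild) ? 'current' : '' }}">
        <a href="{{ p.url }}">{{ p.menu }}</a>
    </li>
{% endfor %}
</ul>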

Setting Up RSS Feeds

Because Grav does not support the generation of RSS feeds out of the box, it needs a plugin and some extra configuration. The latter means that you need to get your head around the Grav concept of a collection because without it, you will not see anything in your feed. In contrast, database-driven platforms like WordPress or Drupal push out the content by default, which may mean that you are surprised when you first come across how Grav needs you to specify the collections explicitly.

There are two details that make configuring a feed straightforward once understood. The first is that Grav routes do not match physical folder paths: a folder named 03.deliberations on disk is referenced in configuration as /deliberations, since the numeric prefix controls navigation ordering but does not appear in the route, which is the actual web address. The second is the choice between @page.children, which collects only the immediate children of a folder, and @page.descendants, which collects recursively through all subdirectories. The collection definition belongs in the feed page's front matter, specifying where the content lives, how it should be ordered and in which direction.
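
Putting those details together, the feed page's front matter might read like this for the folder mentioned above (title and ordering are illustrative):

---
title: Deliberations Feed
content:
    items:
        '@page.descendants': '/deliberations'
    order:
        by: date
        dir: desc
---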

Where All This Led

Once I got everything set in place, the end results were pleasing, with much learned along the way. Web page responsiveness was excellent, an experience enhanced by the caching of files. In the above discussion, I hardly mentioned the transition of existing content. For one subsite, this was manual because the scale was smaller, and the Admin plugin's interface made everything straightforward, such that all was in place after a few hours of work. In the case of the other, the task was bigger, so I fell back on an option used for a WordPress to Hugo migration: Python scripting. That greatly reduced the required effort, allowing me to focus on other things like setting up a modular landing page. The whole migration took around two weeks, all during time outside my client work. There are other places where I can use Grav, which surely will add to what I have already learned. My dalliance with Textpattern is feeling a little like history now.

Enhancing grammar checking for proofing written content in Grav

18th February 2026

For text proofing, I have used LanguageTool in my browser for a while now. It has always performed flawlessly in WordPress and Textpattern, catching errors as I type. When I began to use Grav as a CMS, I expected the same experience in its content editor. However, the project chose CodeMirror, with which the LanguageTool extension does not work at all, prompting a search for a better alternative.

Why CodeMirror Needed Replacing

Browser extensions such as LanguageTool and Grammarly rely on standard editable elements: <textarea> or elements with contenteditable="true". Both expose text directly in the Document Object Model (DOM), where extensions can access and analyse it.

In contrast, CodeMirror takes a different approach. Built for code editing rather than the writing of prose, it renders text through a JavaScript-managed DOM structure whilst hiding the actual textarea. While I can see how Markdown editing might fit this mode for some, and it claims to facilitate collaborative editing which also has its appeal, the match with content production is uneasy when you lose the functionality of browser spell-check and grammar extensions.

Returning to the Familiar with TinyMCE

Thankfully, there is a way to replace CodeMirror with something that works better for content writing. Moving to the TinyMCE Editor Integration plugin brings a traditional WYSIWYG editor that browser extensions can access. That restores LanguageTool functionality whilst remaining within the Admin interface.

It helps that installation is simple via the Admin plugin interface. For command line installation, make your way to the Grav folder on your web server and issue the following command:

bin/gpm install tinymce-editor

To make TinyMCE treat your Markdown content as plain text, add these parameters in the plugin settings. You will find them by going to Admin → Plugins → TinyMCE Editor Integration. Once there, proceed to the Parameters section of the screen, where the Add Item button creates places for the information to go:

Name                 Value
forced_root_block    false
verify_html          false
clean-up             false
entity_encoding      raw

These settings should prevent forced paragraph tags and automatic HTML clean-up that can change your Markdown files in ways that are not desirable. If this still remains a concern, there is another option.

Using VSCode for Editing

The great thing about having access to files is that they can be edited directly, something that is not possible with a database-focussed system like WordPress. Thus, you can use VSCode to create and update any content. This approach may seem unconventional for a code editor, but the availability of the LanguageTool extension makes it viable for this kind of task. In a nutshell, this offers distraction-free writing and real-time grammar checking, with Git integration that eliminates the need for separate SFTP or rsync uploads, which suits authors who prefer working directly with source files rather than relying on visual editors.

Rounding Things Off

From my experience, it appears that the incompatibility between CodeMirror and browser extensions stems from a fundamental mismatch between code editing and content writing. When CodeMirror abstracts text into a JavaScript model to enable features like syntax highlighting and multiple cursors, browser extensions lose direct DOM access to text fields. These approaches cannot coexist.

For configuration or theme files involving Twig logic or complex modular structures, using the nano editor in an SSH session on a web server remains sufficient; it retains direct control with little overhead, and it is difficult to see how CodeMirror would help with this activity.

Usefully, we can replace CodeMirror with TinyMCE using the TinyMCE Editor Integration plugin. This restores browser extension compatibility, enables real-time grammar checking and provides a familiar editing interface. The advantages are gained by a quick installation, a little configuration and no workflow changes. If more control is needed, mixing VSCode and Git will facilitate that way of working. It is not as if we do not have options.

WordPress Starter Themes: From bare foundations to modern workflows

1st February 2026

Starter themes have long occupied an important place in WordPress development. They sit between a completely blank project and a fully styled off-the-shelf theme, offering enough structure to speed up work without dictating how the finished site must look. For agencies, freelancers and in-house teams, that balance can save considerable time, allowing developers to begin with a lean codebase and concentrate on the parts that make each site distinct.

That broad appeal helps explain why starter themes continue to evolve in different directions. Some remain deliberately minimal and close to WordPress core conventions, while others embrace modern tooling such as Composer, Vite, Tailwind CSS and component-based templating. Alongside these are starter themes intended for visual builders and users who want a gentler route into customisation. Taken together, they demonstrate that there is no single definition of a WordPress starter theme, with the common thread being that each provides a starting point rather than a finished product.

wd_s: A Generator-Driven Approach

wd_s is a generator-driven starter theme from WebDevStudios. The wd_s generator makes the setup process more tailored by asking for project details: name, URL, description, namespace, text domain, author, author URL, author email and development URL. Once those details are entered, the script performs a find-and-replace and delivers a ZIP file ready to extract into wp-content/themes. The process is straightforward, though it also reflects a more structured approach to project setup than many starter themes provide.

The generator highlights details that matter in real-world theme development, but are sometimes overlooked when beginning from a generic scaffold. Name-spacing, text domains and project metadata all play a part in maintainability and localisation, and wd_s brings those choices to the fore from the very beginning. The example values shown by the generator, such as "Acme Inc." for the name field and a namespace using underscores, are illustrative rather than prescriptive. What stands out more than any one field is the intention to reduce repetitive manual setup and encourage consistency from the very start of a project.

Sage: A Modern Development Workflow

Roots Sage represents a significant shift towards a modern development workflow. Sage is a Tailwind CSS WordPress starter theme with Laravel Blade templating, currently at version 11.1.0 and with 13,191 GitHub stars at the time of writing. Setup uses Composer and NPM rather than a simple ZIP download, and a typical installation begins in wp-content/themes with composer create-project roots/sage my-theme, followed by npm install and npm run build.
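
Put together, that sequence looks like this, with my-theme standing in for whatever name suits the project:

cd wp-content/themes
composer create-project roots/sage my-theme
cd my-theme
npm install
npm run build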

That workflow signals the audience Sage is aimed at. Rather than merely wrapping WordPress templates in a minimal theme shell, Sage introduces tooling and conventions familiar to developers who work with Laravel and modern front-end stacks. The Vite build process generates files including a manifest, compiled CSS and JavaScript and a theme.json, completing the whole build in a matter of seconds and demonstrating that WordPress development can be integrated into contemporary asset pipelines without giving up compatibility with the CMS.

Blade Templating and Component-Driven Design

Blade templating is central to Sage's proposition. The base layout in resources/views/layouts/app.blade.php shows a clean separation of structure and content using directives such as @include, @yield and @hasSection. Header and footer hooks still call familiar WordPress functions including wp_head, wp_body_open and wp_footer, but the surrounding syntax is closer to Laravel than traditional PHP-heavy WordPress templates. This gives developers access to template inheritance, reusable components and directives, making larger codebases considerably easier to organise.
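
An abridged sketch of such a layout, with partial names and markup assumed here rather than copied from the shipped theme, might read:

{{-- a sketch of a Sage-style layouts/app.blade.php; names are illustrative --}}
<!doctype html>
<html @php(language_attributes())>
  <head>
    <meta charset="utf-8">
    @php(wp_head())
  </head>
  <body @php(body_class())>
    @php(wp_body_open())
    @include('sections.header')
    <main class="main">
      @yield('content')
    </main>
    @hasSection('sidebar')
      <aside class="sidebar">
        @yield('sidebar')
      </aside>
    @endif
    @include('sections.footer')
    @php(wp_footer())
  </body>
</html>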

Reusable components illustrate this style compactly. Properties define type and message, a PHP match expression selects classes based on alert type and the component merges those classes into its final markup. The result is not merely an isolated snippet but a demonstration of how Sage encourages component-driven design, reducing repetition and making presentation logic easier to follow, particularly in projects with many shared interface elements.
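
A minimal alert component along those lines, with names assumed rather than taken from Sage's shipped code, might look like this (using PHP 8's match expression):

{{-- resources/views/components/alert.blade.php (a sketch) --}}
@props(['type' => 'info', 'message' => ''])

@php
    // pick presentation classes from the alert type
    $classes = match ($type) {
        'success' => 'alert alert-success',
        'error'   => 'alert alert-error',
        default   => 'alert alert-info',
    };
@endphp

<div {{ $attributes->merge(['class' => $classes]) }}>
    {{ $message }}
</div>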

Tailwind CSS and the Block Editor

Sage places strong emphasis on integrating Tailwind CSS with the WordPress block editor. It automatically generates theme.json from the Tailwind configuration, making colours, font families and sizes immediately available in the block editor with zero additional configuration. The sample app.css imports Tailwind and points at views and app files as content sources, while the generated theme.json includes settings for layout, background, colour palettes, spacing and typography. The palette includes eleven shades of grey along with black and white, and the typography settings mirror Tailwind's familiar scale from xs through 9xl.

This addresses a long-standing friction point on WordPress theming: keeping front-end design systems in sync with the editing experience. In older workflows, editor styles and front-end styles often drifted apart, creating extra maintenance work and inconsistency for content editors. Sage's approach narrows that gap by deriving editor settings directly from the same Tailwind configuration used for the front end, with theme.json generated during the build process rather than maintained by hand.

Theme Structure and the Roots Ecosystem

The theme structure for Sage reinforces its emphasis on organisation. The app directory contains providers, view composers, filters.php and setup.php, while resources holds CSS, JavaScript and Blade views grouped into components, layouts, partials and sections. Public assets, composer.json, package.json, theme.json and vite.config.js complete the structure, paired with PSR-4 autoloading, service providers and Acorn, which brings Laravel-style patterns into WordPress. The Vite configuration includes Tailwind, the Laravel Vite plugin and Roots plugins for WordPress support and theme.json generation, plus aliases for scripts, styles, fonts and images.

Another notable feature is hot module replacement in the WordPress block editor, with style changes updating instantly without page refreshes. Sage sits within the broader Roots ecosystem, which also includes Bedrock (a WordPress boilerplate for Composer and Git-based projects), Trellis (a server provisioning and deployment tool), Acorn, Radicle (which bundles the entire Roots stack into a single starting point) and WP Packages (a Composer repository for WordPress plugins and themes). Testimonials on the Roots website emphasise that many developers regard this ecosystem as a route to a more structured and modern WordPress experience, with Sage having been actively maintained for over a decade.

Visual Composer Starter Theme: A Builder-Friendly Option

Not every starter theme is aimed at developers working with command-line tooling and component-based templates. The Visual Composer Starter Theme occupies a different place in the landscape, described as a free bundle of a lightweight theme and a powerful WordPress page builder. It is aimed at building blogs, WooCommerce stores, business sites and personal websites, and the language surrounding it stresses ease of use, intuitive theme options and layout customisation tools, presented as a free resource intended to support the WordPress community.

Its feature set reflects that broader audience. The theme is easy to customise through the WordPress customiser, SEO-friendly and responsive by default, covering use cases such as personal blogs, landing pages, business sites, portfolios, startups and online stores. WooCommerce compatibility receives particular emphasis, with support for adjusting design preferences via the customiser and building online shops at no cost. Hero and featured images, unlimited colour options, page-level design controls and a choice between regular and mobile sandwich-style menus are all included.

There is also a strong focus on compatibility. The theme is fully translation-ready and compatible with WPML, qTranslate and Polylang, while support for Advanced Custom Fields and Toolset for custom post type development is highlighted. It is also presented as ready to combine with the Visual Composer website builder and is developed openly on GitHub, where anyone can contribute. This is less about offering an unvarnished code scaffold and more about giving users a flexible visual base with a broad range of built-in options, though it remains part of the starter theme conversation because it is designed to be extended rather than merely installed and left untouched.

Bones: Speed, Control and Pragmatism

Bones, designed and developed by Eddie Machado, returns more closely to the classic developer-oriented concept while retaining a distinctive voice. It is described as an HTML5, mobile-first starter theme for rapid WordPress development, and it makes clear that it is not a framework. Frameworks can introduce their own conventions and complexity, whereas Bones is designed to be as bare and minimalistic as possible, intended to be used on a per-project basis with no child themes.

The mobile-first emphasis is one of Bones' defining characteristics. Its Sass setup serves minimal resources to smaller screens before scaling up for larger viewports, an approach tied to performance as well as responsiveness, and Bones includes extensive comments and examples to help developers get started with Sass. It also provides a well-documented example for custom post types and functions to customise the WordPress admin area for clients, though these are entirely optional and can be removed if not needed. The project is released under the WTFPL, one of the most permissive licences available, and takes pride in removing unnecessary elements from the WordPress header to keep output clean and lightweight. The philosophy is to keep what is useful and discard the rest, building from a solid and speedy foundation.

Selecting a Starter Theme to Match Your Workflow

When viewed together, these themes reveal how varied the starter theme category has become. A Speckyboy roundup of top starter and bare-bones themes for WordPress development in 2026 (last updated on the 8th of March 2026) places Sage alongside newer and more editor-focused options including Blockbase, GeneratePress, Air, WDS BT, Byvex and Flynt. The roundup notes that every website serves different goals and that WordPress is flexible enough to support them all, but also makes clear that starting each project from scratch leads to repeated work, with starter themes offering a way to avoid that repetition while preserving freedom over design and functionality.

The same roundup provides a useful framework for evaluating starter themes. Ongoing maintenance matters because themes need to keep pace with WordPress and surrounding technologies, and themes that have not been updated in years should be avoided. The distinction between classic and block themes is important, since developers need a starting point that aligns with their preferred editing model. Features that genuinely speed development, whether block patterns, a comprehensive settings panel or development tools, can make a significant difference over time. A starter theme should also stay out of the way rather than burden projects with an opinionated design direction, and compatibility with a preferred editor or page builder remains central to choosing well.

Whether the preference is for the generator-based setup of wd_s, the modern tooling of Sage, the builder-friendly versatility of Visual Composer Starter Theme or the stripped-back classic structure of Bones, each represents a different answer to the same challenge. Developers and site builders often need a head start rather than a finished design, and a good starter theme provides exactly that, while leaving enough room for the final result to become something entirely its own.

Rendering Markdown in WordPress without plugins by using Parsedown

4th November 2025

Much of what GenAI generates as articles is output as Markdown, meaning that you need to convert the content when using it in a WordPress website. Naturally, this kind of thing should be done with care to ensure that you are the creator and that it is not all the work of a machine; orchestration is fine, but regurgitation does not add that much. Fact-checking is another need as well.

Writing plain Markdown has secured its own following as well, with WordPress plugins switching over the editor to facilitate such a mode of editing. When I tried Markup Markdown, I found it restrictive when it came to working with images within the text, and it needed a workaround for getting links to open in new browser tabs as well. Thus, I got rid of it, only to realise that it had not converted any Markdown as I had expected; it merely rendered it at post or page display time. Rather than attempting to update the affected text, I decided to see if another solution could be found.

This took me to Parsedown, which proved to be handy for accomplishing what I needed once I had everything set in place. First, that meant cloning its GitHub repo onto the web server. Next, I created a directory called includes under that of my theme and copied Parsedown.php into it. When all was done, I ensured that file and directory ownership were assigned to www-data to avoid execution issues.

Then, I could set to updating the functions.php file. The first line to get added there included the parser file:

require_once get_template_directory() . '/includes/Parsedown.php';

After that, I found that I needed to disable the WordPress rendering machinery because that got in the way of Markdown rendering:

remove_filter('the_content', 'wpautop');
remove_filter('the_content', 'wptexturize');

The last step was to add a filter that parsed the Markdown and passed its output to WordPress rendering to do the rest as usual. This was a simple affair until I needed to deal with code snippets in pre and code tags. Hopefully, the included comments tell you much of what is happening. A possible exception is $pre_matches[0], which itself is an array of entire <pre>...</pre> blocks including the containing tags, with $i => $block doing a $key (not the same variable as in the code, by the way) => $value lookup of the values in the array nesting.

add_filter('the_content', function($content) {
    // Prepare a store for placeholders
    $placeholders = [];

    // 1. Extract pre blocks (including nested code) and replace with safe placeholders
    preg_match_all('/<pre\b[^>]*>.*?<\/pre>/si', $content, $pre_matches);
    foreach ($pre_matches[0] as $i => $block) {
        $key = "§PREBLOCK{$i}§";
        $placeholders[$key] = $block;
        $content = str_replace($block, $key, $content);
    }

    // 2. Extract standalone code blocks (not inside pre)
    preg_match_all('/<code\b[^>]*>.*?<\/code>/si', $content, $code_matches);
    foreach ($code_matches[0] as $i => $block) {
        $key = "§CODEBLOCK{$i}§";
        $placeholders[$key] = $block;
        $content = str_replace($block, $key, $content);
    }

    // 3. Run Parsedown on the remaining content
    $Parsedown = new Parsedown();
    $content = $Parsedown->text($content);

    // 4. Restore both pre and code placeholders
    foreach ($placeholders as $key => $block) {
        $content = str_replace($key, $block, $content);
    }

    // 5. Apply paragraph formatting
    return wpautop($content);
}, 12);

All of this avoided dealing with extra plugins to produce the required result. Handily, I still use the Classic Editor, which makes this work a lot more easily. There still is a Markdown import plugin that I am tempted to remove as well to streamline things. That can wait, though. It is best not to add any more of them anyway, not least to avoid clashes between them and what is now in the theme.

Fixing WordPress 502 errors caused by incorrect WP-Optimise plugin permissions

31st August 2025

When I started getting a 502 status error on saving a WordPress post, it naturally was concerning, even if I found that the post was saved in the background. Inspection of the Nginx error log was in order, given that an invalid response was triggering the Bad Gateway message. That revealed that the WP-Optimise plugin was triggering the error when trying to access a location on the server. The cause was having incorrect permissions, which was sorted by something like the following:

chown -R www-data:www-data /var/www/html/wp-content/uploads/wpo
chmod -R 755 /var/www/html/wp-content/uploads/wpo

If it were to happen to you, the location may be different, as mine is; what you see above is the default. The chown command is assigning www-data as the owner and group for the specified location, with the -R switch triggering recursion for this action. After that, the chmod command is assigning full permissions for the owner, as well as read and execute for everyone else. All of this is to enable WP-Optimise to update its logs, the failure of which was causing the 502 messages in the first place.

How complexity can blind you

17th August 2025

Visitors may not have noticed it, but I was having a lot of trouble with this website. Intermittent slowdowns beset any attempt to add new content or perform any other administration. This was not happening on any other web portal that I had, even one sharing the same publishing software.

Even so, WordPress did get the blame at first, at least when deactivating plugins had no effect. Then, it was the turn of the web server, resulting in a move to something more powerful and my leaving Apache for Nginx at the same time. Redis caching was another suspect, especially when things got in a twist on the new instance. As if that were not enough, MySQL came in for some scrutiny too.

Finally, another suspect emerged: Cloudflare. Either some settings got mangled or something else was happening, but cutting out that intermediary was enough to make things fly again. Now, I use bunny.net for DNS duties instead, and the simplification has helped enormously; the previous layering was no help with debugging. With a bit of care, I might add some other tools behind the scenes while taking things slowly to avoid confusion in the future.

Adding a new domain or subdomain to an SSL certificate using Certbot

11th June 2025

On checking the Site Health page of a WordPress blog, I saw errors that pointed to a problem with its SSL setup. The www subdomain was not included in the site's certificate and was causing PHP errors as a result, though they had no major effect on what visitors saw. Still, it was best to get rid of them, so the certificate needed updating. Execution of a command like the following did the job:

sudo certbot --expand -d example.com -d www.example.com

Using a Let's Encrypt certificate meant that I could use the certbot command, since that was already installed on the server. The --expand option and the -d switches ensured that the listed domains were added to the certificate to sort out the observed problem. In the above, a dummy domain name is used; the real one was substituted to produce the desired effect and make things as they should have been.

Add Canonical Tags to WordPress without plugins

31st March 2025

When there is any duplication, search engines cannot know which is the real content unless you tell them; that is where canonical tags come in handy. By default, WordPress adds these for posts and pages, which makes sense. However, you can add them for other places too. While a plugin can do this for you, adding some code to your theme's functions.php file also does the job. This is how it could look:

function add_canonical_link() {
    global $post;

    // Check if we're on a single post/page
    if (is_singular()) {
        $canonical_url = get_permalink($post->ID);
    } 
    // For the homepage
    elseif (is_home() || is_front_page()) {
        $canonical_url = home_url('/');
    }
    // For category archives
    elseif (is_category()) {
        $canonical_url = get_category_link(get_query_var('cat'));
    }
    // For tag archives
    elseif (is_tag()) {
        $canonical_url = get_tag_link(get_query_var('tag_id'));
    }
    // For other archive pages
    elseif (is_archive()) {
        $canonical_url = get_permalink();
    }
    // Fallback for other pages
    else {
        $canonical_url = get_permalink();
    }

    // Output the canonical link
    echo '<link rel="canonical" href="' . esc_url($canonical_url) . '" />' . "\n";
}

// Hook the function to wp_head
add_action('wp_head', 'add_canonical_link');

// Remove default canonical link
remove_action('wp_head', 'rel_canonical');

The first part defines a function to determine the canonical URL and create the tag to be added. With that completed, the penultimate piece of code hooks it into the wp_head part of the web page, while the last function gets rid of the default link to avoid any duplication of output.
