Technology Tales

Notes drawn from experiences in consumer and enterprise technology

TOPIC: JAVASCRIPT

Enhancing grammar checking for proofing written content in Grav

18th February 2026

For text proofing, I have used LanguageTool in my browser for a while now. It has always performed flawlessly in WordPress and Textpattern, catching errors as I type. When I began to use Grav as a CMS, I expected the same experience in its content editor. However, the Grav project chose CodeMirror for its editor, and the LanguageTool extension does not work with that at all, so I went searching for a better alternative.

Why CodeMirror Needed Replacing

Browser extensions such as LanguageTool and Grammarly rely on standard editable elements: <textarea> or elements with contenteditable="true". Both expose text directly in the Document Object Model (DOM), where extensions can access and analyse it.
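To illustrate, any script, including an extension's content script, can read and rewrite such an element directly. This minimal sketch shows all the access a grammar checker needs:

// A minimal sketch: standard editable elements expose their text to any script,
// which is what grammar-checking extensions depend upon.
const area = document.querySelector('textarea');
console.log(area.value);         // the text can be read directly...
area.value = 'Corrected text';   // ...and written back just as easily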

In contrast, CodeMirror takes a different approach. Built for code editing rather than prose writing, it renders text through a JavaScript-managed DOM structure whilst hiding the actual textarea. While I can see how Markdown editing might suit this model for some, and its support for collaborative editing also has its appeal, the match with content production is an uneasy one when you lose browser spell-checking and grammar extensions.

Returning to the Familiar with TinyMCE

Thankfully, there is a way to replace CodeMirror with something that works better for content writing. Moving to the TinyMCE Editor Integration plugin brings a traditional WYSIWYG editor that browser extensions can access. That restores LanguageTool functionality whilst remaining within the Admin interface.

It helps that installation is simple via the Admin plugin interface. For command line installation, make your way to the Grav folder on your web server and issue the following command:

bin/gpm install tinymce-editor

To make TinyMCE treat your Markdown content as plain text, add these parameters in the plugin settings. You will find them by going to Admin → Plugins → TinyMCE Editor Integration. Once there, proceed to the Parameters section of the screen, where you can specify each one using the Add Item button to create places for the information to go:

Name                 Value
forced_root_block    false
verify_html          false
cleanup              false
entity_encoding      raw
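For reference, these map onto options in TinyMCE's own JavaScript configuration. What follows is a minimal sketch, assuming the plugin passes the parameters straight through to tinymce.init and that a pre-6 release of TinyMCE is in use (later versions dropped some of these options):

tinymce.init({
    selector: 'textarea',        // hypothetical target; the plugin supplies its own
    forced_root_block: false,    // stop content being wrapped in <p> tags
    verify_html: false,          // skip HTML validation and rewriting
    cleanup: false,              // no automatic clean-up of the content
    entity_encoding: 'raw'       // keep characters as typed rather than as entities
});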

These settings should prevent forced paragraph tags and automatic HTML clean-up that can change your Markdown files in ways that are not desirable. If this remains a concern, there is another option.

Using VSCode for Editing

The great thing about having access to files is that they can be edited directly, something that is not possible with a database-focussed system like WordPress. Thus, you can use VSCode to create and update any content. This approach may seem unconventional for a code editor, but the availability of the LanguageTool extension makes it viable for this kind of task. In a nutshell, it offers distraction-free writing and real-time grammar checking, with Git integration that eliminates the need for separate SFTP or rsync uploads; that suits authors who prefer working directly with source files rather than relying on visual editors.

Rounding Things Off

From my experience, it appears that the incompatibility between CodeMirror and browser extensions stems from a fundamental mismatch between code editing and content writing. Because CodeMirror abstracts text into a JavaScript model to enable features like syntax highlighting and multiple cursors, browser extensions lose direct DOM access to the text fields they expect. The two approaches cannot coexist.

For configuration or theme files involving Twig logic or complex modular structures, using the nano editor in an SSH session on the web server remains sufficient. It is difficult to see how CodeMirror would help with this activity, and nano retains direct control with little overhead.

Usefully, we can replace CodeMirror with TinyMCE using the TinyMCE Editor Integration plugin. This restores browser extension compatibility, enables real-time grammar checking and provides a familiar editing interface. The advantages come at the cost of only a quick installation and a little configuration, with no workflow changes. If more control is needed, mixing VSCode and Git facilitates that way of working. It is not as if we do not have options.

Related Reading

Resolving a glitch in the ChatGPT interface on Firefox using Tampermonkey

19th July 2025

It may have been caused either by a new version of Firefox or an update on the OpenAI side, but the ChatGPT prompt box lost its ability to show a cursor while I was entering text. The Ask anything placeholder text also disappeared. In Brave, all looked well, yet the problem persisted in clean Firefox sessions with no extensions loaded. Thus, it was a case of either moving browser or finding a fix in Firefox.

The latter has not been needed because I found a fix of sorts. For that, I needed to install the Tampermonkey extension. Then, I could add a new script to override the behaviour that I was seeing:

// ==UserScript==
// @name         ChatGPT Prompt Box Fix (Firefox)
// @namespace    http://tampermonkey.net/
// @version      1.0
// @description  Forces the ChatGPT prompt box textarea to remain visible in Firefox
// @author       You
// @match        https://chatgpt.com/*
// @grant        none
// ==/UserScript==

(function () {
    'use strict';

    const waitForTextarea = () => {
        const textarea = document.querySelector('textarea');
        if (textarea) {
            textarea.style.display = 'block';

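            // Watch for later attempts to hide the textarea by changing its style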
            const observer = new MutationObserver(() => {
                if (textarea.style.display === 'none') {
                    console.log('Textarea display:none overridden');
                    textarea.style.display = 'block';
                }
            });

            observer.observe(textarea, { attributes: true, attributeFilter: ['style'] });
        } else {
            setTimeout(waitForTextarea, 300); // Keep retrying until textarea appears
        }
    };

    waitForTextarea();
})();

In short, this deals with a rogue display: none; line in the CSS, which equally well could have been inserted by JavaScript from somewhere that I cannot track down. The extra code is executed within a self-contained function to prevent interference with other elements and is restricted to the ChatGPT domain, which avoids unwanted impacts on the display of other websites.

The first step is to search for the relevant element on the page, retrying at intervals if necessary. Once located, the element is made visible by explicitly setting its display property to block. Continued monitoring of the element then thwarts any dynamic attempt to hide it again by changing its style: when such a change is detected, the script overrides it to keep the element visible and consistently accessible.

However, challenges with locating the affected element leave the advisory text duplicated, so I see two instances of Ask anything. Still, that is a small price to pay for having a flashing cursor telling me where I am in the interface. Such is the nature of modern web coding that its complexity hinders debugging, which poses the question of why we are making such things so complicated in the first place.

Redirecting a WordPress site to its home page when its loop finds no posts

5th November 2022

Since I created a bespoke theme for this site, I have been tweaking things as I go. The basis came from the WordPress Theme Developer Handbook, which gave me a simpler starting point shorn of all sorts of complexity that is encountered with other themes. Naturally, this means that there are a few rough edges that need tidying over time.

One of these is dealing with errors on the site, like when content is not found. This could be a wrong address or a search query that finds no matching posts. When that happens, there is a redirection to the home page using some simple JavaScript within the loop fallback code, enclosed within script start and end tags (including the complete code would trigger the redirect from this very post, so it cannot be shown in full here):

location.href="[blog home page ]";

The bloginfo function can be used with the url keyword to find the home page, avoiding hard-coding; a sketch of this appears below. For now, this works so long as JavaScript is enabled, but a more robust approach may come in time. A PHP redirect is not possible because of the nature of HTTP: once headers have been sent, the server can no longer issue a redirect. At that stage, things become client side, so using JavaScript is one way to go instead.
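As a minimal sketch, and assuming a typical theme template, the fallback branch of the loop might look like this; the exact markup will vary by theme:

<?php if (have_posts()) : ?>
    <?php // the usual loop output goes here ?>
<?php else : ?>
    <script>
        location.href = "<?php bloginfo('url'); ?>";
    </script>
<?php endif; ?>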

Out of memory at line: 56

28th May 2009

This is an error that I have started to see a lot in the last few weeks. First, it was with Piwik and latterly with WordPress.com Stats. For the record, I have never seen it on up-to-date systems but always with IE6 and at page unloading time. The CPU usage hits 100% before the error is produced, which had me blaming JavaScript in error. However, it isn't the cause of all ills.

In fact, the cause seems to be a bug in a certain release of Adobe Flash 9, though I reckon that the inclusion of certain features in a Flash movie is needed to trigger it too. While I don't have the exact details, WordPress.com Stats worked without fault until a recent update, which is what led me to this conclusion. That observation also makes me wonder whether we are coming to a point where Flash compatibility needs to be factored into the use of that technology on a website or web application. Updating Flash will solve the problem on the client, but it would be better if the problem were not triggered from the server side in the first place.

Self-hosted web analytics tracking

24th April 2009

It amazes me now to think how little tracking I used to do on my various web "experiments" only a few short years ago. However, there was a time when a mere web counter, perhaps displayed on web pages themselves, was enough to yield some level of satisfaction, or dissatisfaction in many a case. Things have come a long way since then, and we now seem to have analytics packages all around us. In fact, we don't even have to dig into our pockets to get our hands on the means to peruse this sort of information, either.

At this point, I need to admit that I am known to use a few of them simultaneously, though thoughts about reducing their number are coming to mind; there'll be more on that later. Given that this site is hosted using WordPress software, it should come as no surprise that Automattic's own plugin has been set into action to see how things are going. The main focus is on the total number of visits by day, week and month, with a breakdown showing what pages are doing well, together with an indication of how people came to the site and what links they followed while there. Don't go expecting details of your visitors, such as the software that they are using or the country from which they are accessing the site; treat it as the minimalist option that it is, and satisfaction should head your way.

There is next to no way of discussing the subject of website analytics without mentioning Google's offering in the area. You have to admit that it is comprehensive, with perhaps the only bugbear being the lack of live tracking. That need has been addressed very effectively by Woopra, even if its WordPress plugin will not work with IE6. Otherwise, you need the desktop application, which, being written in Java, is a cross-platform affair that I have had going in both Windows and Linux, and it works well too. Apart maybe from the lack of campaign tracking, Woopra supplies as good as all the information that its main competitor provides. It certainly does what I would need from it.

However, while they can be free as in beer, there are some costs associated with using external services like Google Analytics and Woopra. Their means of tracking your web pages is to execute a piece of JavaScript that needs to be added to every page; the sketch below shows the typical shape. If you have everything set to use a common header or footer, that shouldn't be too laborious, especially when there are plugins for publishing platforms like WordPress too. This way of working means that if anyone has JavaScript disabled, or decides not to enable it for the requisite hosts while using the NoScript extension with Firefox, then your numbers are scuppered. Saying that, the same concern applies to any JavaScript code that you may want to execute, but there is another cost again: calls to external websites can, even with the best attention in the world, slow down the loading of your own pages. When you add in the latency caused by servers having to communicate across the web, it is not all about executing JavaScript code.
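For illustration, such a tag usually amounts to a script element pointing at the vendor's host. This is a generic sketch rather than any vendor's actual snippet, with analytics.example.com as a placeholder:

var s = document.createElement('script');
s.type = 'text/javascript';
s.src = 'http://analytics.example.com/tracker.js'; // external host: extra DNS lookups and round trips
document.getElementsByTagName('head')[0].appendChild(s);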

A self-hosted analytics package would avoid the latter, and I found one recently through Lifehacker: Piwik, formerly known as PHPMyVisites. Usefully, it turns out that it does next to everything that Google Analytics does. While I'd prefer that it used PHP, JavaScript is its means of tracking web pages too. Nevertheless, page loading is still faster than with Google Analytics and/or Woopra, and Firefox/NoScript users would only have to allow JavaScript for one site. If you have had experience with installing PHP/MySQL-powered publishing platforms like WordPress, Textpattern and suchlike, then putting Piwik in place is no ordeal: uploading the required files, changing folder access, specifying database credentials and adding an administration user is all fairly standard stuff. After all that, I have the thing tracking this edifice as well as my outdoor activities (hillwalking/cycling/photography) web presence, and I cannot say that I have any complaints, so we'll see how it goes from here.

JavaScript: write it yourself or use a library?

3rd July 2008

I must admit that I have never been a great fan of JavaScript. For one thing, its need to interact with browser objects places you at the mercy of the purveyors of such pieces of software. Debugging is another fine art that can seem opaque to the uninitiated, since the amount and quality of the logging is determined by an interpreter not provided by the language's overseers. All in all, it seems to present a steep and obstacle-strewn learning curve to newcomers. As it happens, I have always found server-side scripting languages like PHP and Perl to be more to my taste, and I have no aversion at all to writing SQL.

In the late 1990s, when I was still using free web hosting, JavaScript probably was the best option for my then new online photo gallery. Whatever the truth, it certainly was the way that I went. While learning Java or Flash might have been useful, I never managed to devote sufficient time to the task, so JavaScript turned out to be the way forward until I got a taste of server-side scripting. Moving to paid hosting allowed that to develop, and the JavaScript option took a back seat.

Based on my experience of the browser wars and working with JavaScript throughout their existence, I was more than a little surprised at the buzz surrounding AJAX. Ploughing part of the way through Wrox's Beginning AJAX did nothing to sell the technology to me; it came across as a very dry, jargon-blighted read. Nevertheless, I do see the advantages of web applications being as responsive as their desktop equivalents, but AJAX doesn't always guarantee that; as someone who has seen such applications crawling on IE6, I can certainly vouch for it. In fact, I suspect that may be behind the appearance of technologies such as AIR and Silverlight, so JavaScript may get usurped yet again, just like my move to a photo gallery powered on the server side.

Even with these concerns, using JavaScript to add a spot more interactivity is never a bad thing, even if it can be overdone, hence the speed problems that I have witnessed. In fact, I have been known to use DOM scripting, but I need to have a use in mind before I can experiment with a technology; I cannot do it the other way around. Nevertheless, I am keen to see what JavaScript libraries such as jQuery and Prototype might have to offer (both have been used in WordPress). Having happened upon their respective websites, those might make good places to start, and who knows where my curiosity might take me?

When JavaScript table generation becomes a performance bottleneck

3rd July 2007

Recently, I have seen a web application that displays thousands of records in a scrollable table (please bear with me, there is a decent reason for this). From the appearance of the table, it would be reasonable to assume that the table is generated by the server and output directly to the screen, but this isn't the case. What actually happens is that the server more or less outputs JavaScript code that is then executed.

This takes the form of large arrays that are slotted into the DOM as the contents of the required table by a JavaScript function. With the large amounts of data involved, the browser saturates the client CPU while the JavaScript processing takes place, something that can take up to a minute to complete.
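To make the pattern concrete, here is a hypothetical sketch of the approach described, with data standing in for the server-emitted arrays and results for the tbody of the target table:

// A sketch of the pattern: the server writes out a huge array as JavaScript,
// and client-side code builds the table one cell at a time.
var data = [ /* thousands of rows emitted by the server */ ];
var tbody = document.getElementById('results'); // assumed to be a tbody element
for (var i = 0; i < data.length; i++) {
    var tr = document.createElement('tr');
    for (var j = 0; j < data[i].length; j++) {
        var td = document.createElement('td');
        td.appendChild(document.createTextNode(data[i][j]));
        tr.appendChild(td);
    }
    tbody.appendChild(tr); // thousands of separate DOM insertions keep the CPU pinned
}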

Admittedly, the browser is IE6, but this was all on a PC with a 2.53 GHz Pentium 4 and 512 MB of memory. Getting the server to deliver standards-compliant (X)HTML for what is needed in the first place seems a much, much better approach to me.

IE6 and JavaScript performance

22nd June 2007

Having been exposed to an application at work that uses a lot of JavaScript, I fully appreciate what some mean when they discuss IE6's inefficient handling of JavaScript. After seeing a web page taking an age to reload and your CPU taking a hammering because of JavaScript processing, the penny does tend to drop...

Needless to say, this very much impacts the world of AJAX-driven web applications with their heavy dependence on client-side JavaScript. While IE7 does come to the rescue, plenty of IE6 users remain out there, and this is reflected in website statistics. This demonstrates a certain level of inertia in the browser market that not only afflicts the uptake of IE7 but also the likes of Mozilla, Opera and Safari. It also means that anyone developing AJAX applications very much needs to continue testing in IE6, especially if the product of their labours is for wider public use.

An example of such an application is Zimbra, an open-source web application for messaging and collaboration, and the people behind it have generously shared the results of their browser performance benchmarking. They did comparisons of IE6 vs. IE7 and Firefox 2 vs. IE7. IE6 easily came out as the worst of these, while Firefox 2 was the best.

The next question to be asked could centre on the type of code that is processed inefficiently by IE6. While I wouldn't be at all surprised if a list emerged, here's one candidate: using Microsoft's proprietary innerHTML property to update the DOM of a web page. A quick trawl on Google brought it up as a cause of memory leaks. It is also a Microsoft innovation that never got taken up by those overseeing web standards, hardly a surprise since a spot of DOM scripting achieves the same end. It may be faster to code than the alternatives, and it does have some support from other browsers, but it does seem to have got a bad name, so it should be avoided if possible. That said, it would be interesting to see a performance comparison between innerHTML and DOM methods in IE6; a sketch of the two styles follows below.
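To show what that comparison would cover, here are the two styles side by side in a small sketch; the status element is hypothetical:

// innerHTML: terse, but proprietary in origin and reputed to leak memory in IE6
document.getElementById('status').innerHTML = '<strong>Done</strong>';

// DOM scripting equivalent: more verbose, yet standards-based
var status = document.getElementById('status');
while (status.firstChild) {
    status.removeChild(status.firstChild);
}
var strong = document.createElement('strong');
strong.appendChild(document.createTextNode('Done'));
status.appendChild(strong);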

Perl vs. PHP: A Personal Experience

11th June 2007

Ever since I converted it from a client-side JavaScript-powered affair, my online photo gallery has been written in Perl. There have been some challenges along the way, figuring out how to use hash tables being one, but everything has worked as expected. However, I am now wondering if it is better to write things in PHP for the sake of consistency with the rest of the website. I had a go at rewriting the random photo page and, unless I have been missing something in the Perl world, things do seem more succinct with PHP. For instance, actions that formerly involved several lines of code can now be achieved in one: reading the contents of a file into an array and stripping HTML/XML tags from a string both fall into this category, and seeing the number of lines of code halve is a striking observation. I am not going to abandon Perl completely, since it's a very nice language, but I do rather suspect that there is now an increased chance of my having a website whose server-side processing needs are served entirely by PHP.
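By way of illustration, the two operations just mentioned are single built-in calls in PHP; a minimal sketch with hypothetical file and variable names:

$lines = file('photos.txt');        // read a whole file straight into an array of lines
$clean = strip_tags($description);  // strip HTML/XML tags from a string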

Exploring AJAX

7th June 2007

My online photo gallery started out simply as a set of interlinked HTML pages. Over time, I discovered frames (yes, them!) and started to make use of JavaScript to make the slideshows slicker. In those days, I was working off free webspace provided by my ISP, and client-side scripting was the only tool that I had for enhancing functionality. Having tired of the vagaries of client-side scripting while the browser wars were in full swing and incompatibilities reigned supreme, I went with paid hosting to get access to tools like Perl and PHP for server-side processing. Because their flexibility compared to JavaScript was a breath of fresh air to me, I am still a fan of the server-side approach.

The journey that I have just described is one that I now know was followed by many website builders around the same time. Nevertheless, I have still held on to JavaScript for some things, particularly for updating the DOM as part of making the pages more responsive to user interaction. In the last few years, a hybrid approach has been gaining currency: AJAX. This offers the ability to modify parts of a page without needing to reload the whole thing, generating a considerable amount of interest among web application developers.

The world of AJAX is evidently a complex one, though the underlying principle can be explained in simple terms. The essential idea is that you use JavaScript to call a server-side script (PHP is as good an example as any) that returns either text or XML, which can then be used to update part of a web page in situ without reloading the whole thing as per the traditional way of working. It has opened up so many possibilities from the interface design point of view that AJAX became a hot topic, one that still receives much attention today. One bugbear is efficiency, because I have seen an AJAX application lock up a PC with a little help from IE6. There will always remain times when server-side processing is the best route, even if that needs to be balanced against the client-side approach.
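In code, the essential idea looks something like the following sketch, written in the style of the period; update.php and output are hypothetical names:

var xhr = new XMLHttpRequest();
xhr.onreadystatechange = function () {
    if (xhr.readyState === 4 && xhr.status === 200) {
        // update part of the page in situ; no full reload required
        document.getElementById('output').innerHTML = xhr.responseText;
    }
};
xhr.open('GET', 'update.php', true);
xhr.send(null);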

Like its forebear DHTML, AJAX is really a development approach using a number of different technologies in combination. The DHTML elements such as (X)HTML, CSS, the DOM and JavaScript are very much part of the AJAX world, but server-side elements such as HTTP, PHP, MySQL and XML are also very much part of the fabric of the landscape. In fact, while AJAX can use plain text as the transfer format, XML is the one implied by the acronym, and XSLT can be used to transform the XML into HTML. However, AJAX is not limited to the aforementioned technologies; for instance, I cannot see why Perl cannot play a role in place of PHP or ASP, both of which can be used for the same things.

Even in these standards-compliant days, browser support for AJAX remains diverse, to say the least, and it is akin to having MSIE in one corner and the rest in the other. Mind you, Microsoft did introduce the tools in the first place, only to use ActiveX for them, while Mozilla created a new object type rather than continuing with that method of operation. Given that ActiveX is a Windows-only technology, I can see why Mozilla did what it did, and it was a sensible decision. In fact, IE7 appears to have picked up the Mozilla way of doing things; the sketch below shows how code of the era catered for both camps.
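A short sketch of that era-appropriate test for both object types:

var xhr;
if (window.XMLHttpRequest) {
    xhr = new XMLHttpRequest();                    // Mozilla, Safari, Opera and, latterly, IE7
} else if (window.ActiveXObject) {
    xhr = new ActiveXObject('Microsoft.XMLHTTP');  // IE6 and earlier
}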

Even with the apparent convergence, there will continue to be a need for the AJAX JavaScript libraries that are currently out there. Incidentally, Adobe has included one called Spry with Dreamweaver CS3. Nevertheless, I still like to find out how things work at the basic level and feel somewhat obstructed when I cannot do this. I remember perusing Wrox's Professional AJAX and finding the constant references to the associated function library rather grating; the writing style didn't help either.

My taking a more granular approach has got me reading SAMS Teach Yourself AJAX in 10 Minutes as a means for getting my foot in the door. As with their Teach Yourself … in 24 Hours series, the title is a little misleading since there are 22 lessons of 10 minutes in duration (the 24 Hours moniker refers to there being 24 lessons, each of one hour in length). Anything composed of 10-minute lessons, even 22 of them, is never going to be comprehensive but, as a means for getting started, I have to say that the approach seems effective based on this volume. It has certainly whetted my appetite for giving AJAX a go, and it’ll be interesting to see how things progress from here.
