Technology Tales

Adventures in consumer and enterprise technology

TOPIC: JAVASCRIPT

Resolving a glitch in the ChatGPT interface on Firefox using Tampermonkey

19th July 2025

It may have been caused by a new version of Firefox or by an update on the OpenAI side, but the ChatGPT prompt box lost its ability to show a cursor while I was entering text. The Ask anything placeholder text also disappeared. In Brave, all looked well, yet the problem persisted in clean Firefox sessions with no extensions loaded. Thus, it was a case of either moving browser or getting a fix in Firefox.

The former has not been needed because I found a fix of sorts. For that, I needed to install the Tampermonkey extension. Then, I could add a new script to override the behaviour that I was seeing:

// ==UserScript==
// @name         ChatGPT Prompt Box Fix (Firefox)
// @namespace    http://tampermonkey.net/
// @version      1.0
// @description  Forces the ChatGPT prompt box textarea to remain visible in Firefox
// @author       You
// @match        https://chatgpt.com/*
// @grant        none
// ==/UserScript==

(function () {
    'use strict';

    const waitForTextarea = () => {
        const textarea = document.querySelector('textarea');
        if (textarea) {
            textarea.style.display = 'block';

            const observer = new MutationObserver(() => {
                if (textarea.style.display === 'none') {
                    console.log('Textarea display:none overridden');
                    textarea.style.display = 'block';
                }
            });

            observer.observe(textarea, { attributes: true, attributeFilter: ['style'] });
        } else {
            setTimeout(waitForTextarea, 300); // Keep retrying until textarea appears
        }
    };

    waitForTextarea();
})();

In short, this deals with a rogue display: none; declaration in the CSS, which could equally well have been inserted by JavaScript from somewhere that I cannot track down. The extra code is executed within a self-contained function to prevent interference with other elements and is restricted to the ChatGPT domain, which avoids unwanted impacts on the display of other websites.

The first step is to search for the relevant element on the page, retrying at intervals if necessary. Once located, the element's visibility is ensured by explicitly setting its display property to block. Continued monitoring of the element thwarts any dynamic attempts to hide it by changing its style. When such a change is detected, the script overrides it to maintain visibility, thereby ensuring consistent accessibility.

However, challenges with finding the affected element mean that I get the advisory text duplicated, so I see two instances of Ask anything. Even so, that is a small price to pay for having a flashing cursor telling me where I am in the interface. Such is the nature of modern web coding that its complexity hinders debugging, which poses the question of why we are making such things so complicated in the first place.

Redirecting a WordPress site to its home page when its loop finds no posts

5th November 2022

Since I created a bespoke theme for this site, I have been tweaking things as I go. The basis came from the WordPress Theme Developer Handbook, which gave me a simpler starting point shorn of all sorts of complexity that is encountered with other themes. Naturally, this means that there are a few rough edges that need tidying over time.

One of these is dealing with errors on the site, like when content is not found. This could be a wrong address or a search query that finds no matching posts. When that happens, there is a redirection to the home page using some simple JavaScript within the loop's fallback code, enclosed within script start and end tags (including the complete snippet would trigger the redirect from this very post, so it cannot be shown here):

location.href="[blog home page ]";

The bloginfo function can be used with the url keyword to find the home page, avoiding hard-coding. For now, this works so long as JavaScript is enabled, but a more robust approach may come in time. A PHP redirect is ruled out by the nature of HTTP: once headers have been sent, the server can no longer issue one. At that stage, things are client side, so using JavaScript is one way to go instead.
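
For reference, the shape of that fallback branch in a theme template is roughly as follows; this is a sketch rather than my theme's exact markup, shown here with the script element intact, which is what cannot appear verbatim in the post itself:

<?php if ( have_posts() ) : ?>
    <?php /* the normal loop outputs posts here */ ?>
<?php else : ?>
    <script>
        // No matching posts: bounce the visitor back to the home page.
        // bloginfo('url') prints the site address, so nothing is hard-coded.
        location.href = "<?php bloginfo( 'url' ); ?>";
    </script>
<?php endif; ?>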

Out of memory at line: 56

28th May 2009

This is an error that I have started to see a lot in the last few weeks. First, it was with Piwik and latterly with WordPress.com Stats. For the record, I have never seen it on up-to-date systems but always with IE6 and at page unloading time. The CPU usage hits 100% before the error is produced, which had me blaming JavaScript in error. However, it isn't the cause of all ills.

In fact, the cause seems to be a bug in a certain release of Adobe Flash 9, though I reckon that the inclusion of certain features in a Flash movie is needed to trigger it too. While I don't have the exact details of this, WordPress.com Stats worked without fault until a recent update, which is what leads me to that conclusion. That observation also makes me wonder whether we are coming to a point where Flash compatibility is something that needs to be factored into the use of the said technology in a website or web application. Updating Flash will solve the problem on the client, but it might be better if the problem weren't being triggered from the server side in the first place.

Self-hosted web analytics tracking

24th April 2009

It amazes me now to think how little tracking I used to do on my various web "experiments" only a few short years ago. However, there was a time when a mere web counter, perhaps displayed on web pages themselves, was enough to yield some level of satisfaction, or dissatisfaction in many a case. Things have come a long way since then, and we now seem to have analytics packages all around us. In fact, we don't even have to dig into our pockets to get our hands on the means to peruse this sort of information, either.

At this point, I need to admit that I am known to use a few of them simultaneously, though thoughts about reducing their number are coming to mind; there'll be more on that later. Given that this site is hosted using WordPress software, it should come as no surprise that Automattic's own plugin has been set into action to see how things are going. The main focus is on the total number of visits by day, week and month, with a breakdown showing what pages are doing well, together with an indication of how people came to the site and what links they followed while there. Don't go expecting details of your visitors, like the software that they are using or the country from which they are accessing the site, with this minimalist option, and satisfaction should head your way.

There is next to no way of discussing the subject of website analytics without mentioning Google's offering in the area. You have to admit that it's comprehensive, with perhaps the only bugbear being the lack of live tracking. That need has been addressed very effectively by Woopra, even if its WordPress plugin will not work with IE6. Otherwise, you need the desktop application (being written in Java, it's a cross-platform affair, and I have had it going in both Windows and Linux), but that works well too. Apart maybe from the lack of campaigns, Woopra supplies as good as all the information that its main competitor provides. It certainly does what I would need from it.

However, while they can be free as in beer, there are some costs associated with using external services like Google Analytics and Woopra. Their means of tracking your web pages for you is by executing a piece of JavaScript that needs to be added to every page. If you have everything set to use a common header or footer page, that shouldn't be too laborious, especially when there are plugins for publishing platforms like WordPress too. This way of working means that if anyone has JavaScript disabled, or decides not to enable JavaScript for the requisite hosts while using the NoScript extension with Firefox, then your numbers are scuppered. That said, the same concern probably applies to any JavaScript code that you may want to execute, but there's another cost again: the calls to external websites can, even with the best attention in the world, slow down the loading of your own pages. When you add in the latency caused by servers having to communicate across the web, it is not all about executing JavaScript code.
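
For illustration, the tag that such a service asks you to paste into every page looks something like the following; the host, script name and function here are hypothetical stand-ins rather than any vendor's actual code:

<script type="text/javascript" src="http://stats.example.com/tracker.js"></script>
<script type="text/javascript">
    // Hypothetical call recording a page view against a made-up account ID.
    // Because every page carries this, blocking stats.example.com with NoScript,
    // or disabling JavaScript altogether, stops the counting dead.
    recordPageView('ACCOUNT-12345');
</script>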

A self-hosted analytics package would avoid the latter, and I found one recently through Lifehacker: Piwik, formerly known as PHPMyVisites. Usefully, it turns out that it does next to everything that Google Analytics does. While I'd prefer that it used PHP for this, JavaScript is its means of tracking web pages too. Nevertheless, page loading is still faster than with Google Analytics and/or Woopra, and Firefox/NoScript users would only have to allow JavaScript for one site too. If you have had experience with installing PHP/MySQL-powered publishing platforms like WordPress, Textpattern and suchlike, then putting Piwik in place is no ordeal: changing folder permissions, uploading the required files, specifying database credentials and adding an administration user is all fairly standard stuff. After all that, I have the thing tracking this edifice as well as my outdoor activities (hillwalking/cycling/photography) web presence, and I cannot say that I have any complaints, so we'll see how it goes from here.

JavaScript: write it yourself or use a library?

3rd July 2008

I must admit that I have never been a great fan of JavaScript. For one thing, its need to interact with browser objects places you at the mercy of the purveyors of such pieces of software. Debugging is another fine art that can seem opaque to the uninitiated, since the amount and quality of the logging is determined by an interpreter not provided by the language's overseers. All in all, it seems to present a steep and obstacle-strewn learning curve to newcomers. As it happens, I have always found server side scripting languages like PHP and Perl to be more to my taste, and I have no aversion at all to writing SQL.

In the late 1990s, when I was still using free web hosting, JavaScript probably was the best option for my then new online photo gallery. Whatever the truth of that, it certainly was the way that I went. While learning Java or Flash might have been useful, I never managed to devote sufficient time to the task, so JavaScript turned out to be the way forward until I got a taste of server-side scripting. Moving to paid hosting allowed that to develop, and the JavaScript option took a back seat.

Based on my experience of the browser wars and working with JavaScript throughout their existence, I was more than a little surprised at the buzz surrounding AJAX. Ploughing part of the way through WROX's Beginning AJAX did nothing to sell the technology to me; it came across as a very dry, jargon-blighted read. Nevertheless, I do see the advantages of web applications being as responsive as their desktop equivalents, but AJAX doesn't always guarantee this; as someone who has seen such applications crawling on IE6, I can certainly vouch for this. In fact, I suspect that may be behind the appearance of technologies such as AIR and Silverlight, so JavaScript may get usurped yet again, just like my move to a photo gallery powered on the server side.

Even with these concerns, using JavaScript to add a spot more interactivity is never a bad thing even if it can be overdone, hence the speed problems that I have witnessed. In fact, I have been known to use DOM scripting, but I need to have the use in mind before I can experiment with a technology; I cannot do it the other way around. Nevertheless, I am keen to see what JavaScript libraries such as jQuery and Prototype might have to offer (both have been used in WordPress). Since I have happened on their respective websites, they might make good places to start, and who knows where my curiosity might take me?

An inappropriate use of JavaScript

3rd July 2007

I have seen a web application that displays thousands of records in a scrollable table (please bear with me, there is a decent reason for this). From the appearance of the table, it would be reasonable to assume that the table is generated by the server and output directly to the screen, but this isn't the case. What actually happens is that the server more or less outputs JavaScript code that is then executed. This takes the form of large arrays that are slotted into the DOM as the contents of the required table by a JavaScript function. With the large amounts of data involved, this means that the browser fully loads the client CPU while the JavaScript processing takes place, something that takes up to a minute to complete. Admittedly, the browser is IE6, but this was all on a PC with a 2.53 GHz Pentium 4 and 512 MB of memory. Getting the server to deliver standards-compliant (X)HTML for what is needed in the first place seems a much, much better approach to me.
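
To illustrate the pattern being criticised (this is a hedged sketch, not the application's actual code), the server emits something along these lines and leaves the browser to do all the assembly:

// The generated script carries the data as one huge literal array...
var rows = [
    ["0001", "Smith", "Approved"],
    ["0002", "Jones", "Pending"]
    // ...and thousands more rows follow in the real output
];

// ...and a client-side function then builds the table cell by cell,
// which is what pins the CPU while IE6 churns through it.
function buildTable(rows) {
    var tbody = document.getElementById('results'); // assumes a tbody with this id
    for (var i = 0; i < rows.length; i++) {
        var tr = document.createElement('tr');
        for (var j = 0; j < rows[i].length; j++) {
            var td = document.createElement('td');
            td.appendChild(document.createTextNode(rows[i][j]));
            tr.appendChild(td);
        }
        tbody.appendChild(tr);
    }
}
buildTable(rows);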

IE6 and JavaScript performance

22nd June 2007

Having been exposed to an application at work that uses a lot of JavaScript, I fully appreciate what some mean when they discuss IE6's inefficient handling of JavaScript. After seeing a web page taking an age to reload and your CPU taking a hammering because of JavaScript processing, the penny does tend to drop...

Needless to say, this very much impacts the world of AJAX-driven web applications with their heavy dependence on client-side JavaScript. While IE7 does come to the rescue, there remain plenty of IE6 users out there, and this is reflected in website statistics. This demonstrates a certain level of inertia in the browser market that not only afflicts the uptake of IE7 but also the likes of Mozilla, Opera and Safari. It also means that anyone developing AJAX applications very much needs to continue testing in IE6, especially if the product of their labours is for wider public use.

An example of such an application is Zimbra, an open-source web application for messaging and collaboration, and the people behind it have generously shared the results of their browser performance benchmarking. They did comparisons of IE6 vs. IE7 and Firefox 2 vs. IE7. IE6 easily came out as the worst of these, while Firefox 2 was the best.

The next question to be asked could centre around the type of code that is processed inefficiently by IE6. While I wouldn't be at all surprised if a list emerged, here's one: using Microsoft's proprietary innerHTML property to update the DOM of a web page. A quick trawl on Google turned it up as a cause of memory leaks. It is also a Microsoft innovation that never got taken up by those overseeing web standards, hardly a surprise since a spot of DOM scripting achieves the same end. It may be faster to code than any alternatives, and it does have some support from other browsers, but it does seem to have got a bad name, so it should be avoided if possible. That said, it would be interesting to see a performance comparison between innerHTML and DOM methods in IE6.
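
A crude version of that comparison only needs a couple of timed functions; the element IDs and the row count below are arbitrary choices for illustration:

// Time building the same 1,000-item list with innerHTML and with DOM methods.
function timeInnerHtml() {
    var start = new Date().getTime();
    var html = '';
    for (var i = 0; i < 1000; i++) {
        html += '<li>Item ' + i + '</li>';
    }
    document.getElementById('list1').innerHTML = html; // one big write
    return new Date().getTime() - start;
}

function timeDomMethods() {
    var start = new Date().getTime();
    var list = document.getElementById('list2');
    for (var i = 0; i < 1000; i++) {
        var item = document.createElement('li');
        item.appendChild(document.createTextNode('Item ' + i));
        list.appendChild(item); // node-by-node insertion
    }
    return new Date().getTime() - start;
}

alert('innerHTML: ' + timeInnerHtml() + ' ms; DOM methods: ' + timeDomMethods() + ' ms');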

Perl vs. PHP: A Personal Experience

11th June 2007

Ever since I converted it from a client-side JavaScript-powered affair, my online photo gallery has been written in Perl. There have been some challenges along the way, figuring out how to use hash tables being one, but everything has worked as expected. However, I am now wondering if it is better to write things in PHP for the sake of consistency with the rest of the website. I had a go at rewriting the random photo page and, unless I have been missing something in the Perl world, things do seem more succinct with PHP. For instance, actions that formerly involved several lines of code can now be achieved in one. Reading the contents of a file into an array and stripping HTML/XML tags from a string fall into this category, and seeing the number of lines of code halve is a striking observation. I am not going to abandon Perl completely, as it's a very nice language, but I do rather suspect that there is now an increased chance of my having a website whose server-side processing needs are served entirely by PHP.

Exploring AJAX

7th June 2007

When I started it, my online photo gallery was simply a set of interlinked HTML pages. Over time, I discovered frames (yes, them!) and started to make use of JavaScript to make the slideshows slicker. In those days, I was working off free webspace provided by my ISP, and client-side scripting was the only tool that I had for enhancing functionality. Having tired of the vagaries of client-side scripting while the browser wars were in full swing and incompatibilities reigned supreme, I went with paid hosting to get access to tools like Perl and PHP for server-side processing. Because their flexibility compared to JavaScript was a breath of fresh air to me, I am still a fan of the server-side approach.

The journey that I have just described is one that I now know was followed by many website builders around the same time. Nevertheless, I have still held on to JavaScript for some things, particularly for updating the DOM as part of making the pages more responsive to user interaction. In the last few years, a hybrid approach has been gaining currency: AJAX. This offers the ability to modify parts of a page without needing to reload the whole thing, generating a considerable amount of interest among web application developers.

The world of AJAX is evidently a complex one, though the underlying principle can be explained in simple terms. The essential idea is that you use JavaScript to call a server-side script (PHP is as good an example as any) that returns either text or XML, which can then be used to update part of a web page in situ without the need to reload it as per the traditional way of working. It has opened up so many possibilities from the interface design point of view that AJAX became a hot topic that still receives much attention today. One bugbear is efficiency, because I have seen an AJAX application lock up a PC with a little help from IE6. There will always remain times when server-side processing is the best route, even if that needs to be balanced against the client-side approach.
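
Stripped to its essentials (and leaving the older IE object creation to one side for a moment), the round trip can be sketched like this; the script name and the element ID are placeholders:

// Ask a server-side script for a fragment and slot the response into the page.
var request = new XMLHttpRequest();
request.onreadystatechange = function () {
    if (request.readyState === 4 && request.status === 200) {
        // responseText holds plain text or markup; responseXML would hold XML instead.
        document.getElementById('panel').innerHTML = request.responseText;
    }
};
request.open('GET', 'update.php', true); // true makes the call asynchronous
request.send(null);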

Like its forebear DHTML, AJAX is really a development approach using a number of different technologies in combination. The DHTML elements such as (X)HTML, CSS, DOM and JavaScript are very much part of the AJAX world, but server-side elements such as HTTP, PHP, MySQL and XML are also very much part of the fabric of the landscape. In fact, while AJAX can use plain text as the transfer format, XML is the one implied by the AJAX acronym, and XSLT can be used to transform XML into HTML. However, AJAX is not limited to the aforementioned technologies; for instance, I cannot see why Perl cannot play a role in place of PHP and ASP, both of which can be used for the same things.

Even in these standards-compliant days, browser support for AJAX remains diverse, to say the least, and it is akin to having MSIE in one corner and the rest in the other. Mind you, Microsoft did introduce the tools in the first place, only to implement them with ActiveX, while Mozilla created a new object type rather than continue that method of operation. Given that ActiveX is a Windows-only technology, I can see why Mozilla did what they did, and it was a sensible decision. In fact, IE7 appears to have picked up the Mozilla way of doing things.
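
The divergence shows up at the very point of creating the request object, so scripts of the era carry a fudge along these lines (a sketch rather than any library's exact code):

// Use the native object where it exists (Mozilla, Safari, Opera and now IE7),
// falling back to the ActiveX control that IE5 and IE6 require.
function createRequest() {
    if (window.XMLHttpRequest) {
        return new XMLHttpRequest();
    } else if (window.ActiveXObject) {
        return new ActiveXObject('Microsoft.XMLHTTP');
    }
    return null; // no AJAX support at all
}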

Even with the apparent convergence, there will continue to be a need for the AJAX JavaScript libraries that are currently out there. Incidentally, Adobe has included one called Spry with Dreamweaver CS3. Nevertheless, I still like to find out how things work at the basic level and feel somewhat obstructed when I cannot do this. I remember perusing Wrox’s Professional AJAX and found the constant references to the associated function library rather grating; the writing style didn’t help either.

My taking a more granular approach has got me reading SAMS Teach Yourself AJAX in 10 Minutes as a means for getting my foot in the door. As with their Teach Yourself … in 24 Hours series, the title is a little misleading since there are 22 lessons of 10 minutes in duration (the 24 Hours moniker refers to there being 24 lessons, each of one hour in length). Anything composed of 10-minute lessons, even 22 of them, is never going to be comprehensive but, as a means for getting started, I have to say that the approach seems effective based on this volume. It has certainly whetted my appetite for giving AJAX a go, and it’ll be interesting to see how things progress from here.

A web development toolbox

23rd March 2007

Having been on a web-building journey from Geocities to having a website with my own domain hosted by Fasthosts, it should come as no surprise that I have encountered a number of tools and technologies over this time and that my choices and knowledge have evolved too. I’ll muse over the technologies first before going on to the tools that I use.

Technologies

XHTML

When I started building websites, it was not long after HTML 4 got released, and I devoured most, if not all, of Elizabeth Castro's Peachpit Visual QuickStart guide to the language within a weekend. Having previously used fairly primitive WYSIWYG tools like Netscape Composer and Claris Home Page, it was an empowering experience, and the first edition (it is now on its third) of Jennifer Niederst Robbins' Web Design in a Nutshell took things much further, becoming something of a bible for a number of years.

When it first appeared, XHTML 1.0 wasn't a major change from HTML 4, but its stricter, more XML-compliant syntax was meant to point the way to the future, and semantic markup was at its heart at least as much as it was for HTML 4. XHTML 2.0 is on the horizon, and after the modular approach of XHTML 1.1 (which I have never used), it will be interesting to see how it develops. Nevertheless, there is a surprising development in that some people are musing over the idea of having an HTML 5. Let's hope that the (X)HTML apple cart doesn't get completely overturned after some years of relative stability. I still bear scars from the browser wars raging in the 1990s and don't want to see standards wars supplanting the relative peace that we have now. That said, I don't mind peaceful progression.

CSS

CSS only seems to be coming into its own in the last few years, and it is truly a remarkable technology despite the hobbles that MSIE places on our ambitions. CSS Zen Garden has been a major source of ideas; I wouldn't have been able to customise this blog as much as I have without them. I was an early adopter of the technology and got burnt by inconsistent browser support; Netscape 4 was the proverbial bête noire back then, fulfilling the role that MSIE plays today. In those days, it was the idea of controlling text display and element backgrounds from a single place that appealed. Since then, I have progressed to using CSS to replace table-based layouts and to control element positioning. It can do more…

JavaScript

Having had a JavaScript-powered photo gallery before my current Perl-driven one, I can say that I have definitely sampled this ever-pervasive scripting language. Being a client-side language rather than a server-side one, it does place you rather at the mercy of the browser purveyors, and it never ceases to amaze me that there is a buzz around AJAX because of this. In fact, the abundance of AJAX cross-browser function libraries is testimony to the need for browser-specific code. Despite my preferences for server-side scripting, I still find a use for JavaScript, and its main use for me these days is to dynamically control CSS elements to do such things as control the height of a page element or whether it is shown or not. Apparently, CSS may get some dynamic capabilities in the future and reduce my dependence on JavaScript. Meanwhile, Jeremy Keith’s DOM Scripting (Friends of Ed) will prove as much of an asset as it has done.
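
As a small example of the sort of thing I mean (the element id and the height are arbitrary), showing, hiding and resizing an element only takes a few lines of DOM scripting:

// Toggle an element's visibility and set its height through its style object.
function togglePanel(id) {
    var el = document.getElementById(id);
    if (el.style.display === 'none') {
        el.style.display = 'block';
        el.style.height = '200px'; // arbitrary height for illustration
    } else {
        el.style.display = 'none';
    }
}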

XML

These days, a lot of the raw data underlying my personal website is stored in XML. I did try to dynamically transform the display of the XML into something meaningful with CSS and XSLT when I first scaled its dizzy heights, but I soon resorted to other techniques. Browser support and the complexity of what I required were the major contributors to this. The new strategy involved two different approaches. The first was to create PHP/XHTML pages from the precursor XML offline, and this is how I generate the website’s directory pages. The other one is to process the XML as text to dynamically supply an XHTML page as the user visits it; this is the way that the photo gallery works.

Perl

This still powers all of my photo gallery. While thoughts of changing it all to PHP linger, there is a certain something about the Perl language that keeps it there. I suppose it is that PHP is entangled in the HTML while Perl encases the whole business, and I am reasonably familiar with its syntax these days, which is why it still does a lot of the data processing grunt work that I need.

PHP

PHP is everywhere these days, though it doesn't attract quite the level of hype that used to be the case. It still appears with its sidekick MySQL in many website applications. Blogging software such as WordPress and content management systems like Drupal, Mambo and Joomla! wouldn't exist without the pair. It appears on my website as the glue that holds my visitor directories together and is the processing engine of my WordPress blog. And if I ever add a Drupal element to the site (by no means a foregone conclusion, though I am spending a lot of time with Drupal at the moment), PHP will continue its presence in my website scripting, as it powers that too.

Applications

Macromedia HomeSite

I have a liking for hand coding, so this does most of what I need. When Macromedia (itself since taken over by Adobe, of course) took over Allaire, HomeSite sadly lost its WYSIWYG capability, but the application still soldiers on, even though Dreamweaver offers a lot to code cutters these days. Nevertheless, it does have certain advantages over Dreamweaver: it is a fleeter beast to start up, and it colour-codes Perl syntax.

Macromedia Dreamweaver

There was a time when Dreamweaver was solely a tool for visual web page development, but the advent of Dreamweaver UltraDev added server-side development capabilities to the Dreamweaver family. These days, there is only one Dreamweaver version, but UltraDev’s capabilities still live on in the latest version and I would not be surprised if they were taken further in these database-driven times.

Nowadays, Dreamweaver isn’t an application where I spend a great deal of time. In former times, when my site was made up of static HTML pages, I used Dreamweaver a lot, even if its rendering capabilities were a step behind the then-current browser versions. I suppose that it didn’t fit the way in which I worked, but its template-driven workflow would have been a boon back then.

However, my move from a static site to a dynamic one, starting with my photo gallery, has meant that I haven't used it as much since then. That said, with my use of PHP/MySQL components on the site growing, its server-side abilities could yet get the level of investigation that they deserve.

Altova XMLSpy Professional

Adding MySQL databases to my web hosting costs money, not a lot, but it could be spent on other (more important?) things. Hence, I use XML as the data store for my photo gallery and XML files are pre-processed into XHTML/PHP pages for my visitor directories before uploading onto the server.

I use XMLSpy to edit and manage the XML files that I use: its ability to view XML in grid format is a killer feature as far as I am concerned, and XML validation also proves very useful, particularly when it comes to ensuring that DTDs and XML files are in step and that XSLT files are coded correctly. There are other features that I need to explore and that would also take my knowledge of XML further to boot, not at all a bad thing.

Saxon

For processing XML into another file format such as XHTML, you need an XSLT processor, and I use the free version of Saxon to do the needful (Saxonica offers commercial versions of it). There is, I believe, a processor built into XMLSpy, but I don't use it because Saxon's command-line interface fits better into my workflow. This is a Perl-driven process where XML files are read and XSLT files, one per XML file, are built before both are fed to Saxon for transforming into XHTML/PHP files. It all works smoothly, and updating the XML inputs is all that is required.

AceFTP

If I were looking for an FTP client now, it would be FileZilla, but AceFTP has served me well over the last few years, and it looks as if that will continue. It does have some extra features over FileZilla: transfers between remote sites, and scheduling, for example. I have yet to use either, but they look valuable.

Hutmil

In bygone days when I had loads of static HTML files, making changes was a bit of a chore if they affected every single file. An example is changing the year on the copyright message on the page footers. Hutmil, which I found on a magazine cover-mounted disc, was a great time saver in those days. Today, I achieve this by putting this information into a single file and getting Perl or PHP to import that when building the page. The same “define once, use anywhere” approach underlies CSS as well, and scripting very usefully allows you to take that into the XHTML domain.

Apache

Apache is ubiquitous these days, and both the online and offline versions of my site are powered by it. It does require some configuration, but it is a powerful piece of kit. The introduction of 2.2.x meant a big change in the way that configuration files are modularised: while most things were contained in a single file for 2.0.x, the settings are broken up into different files in 2.2.x, and it can take a while to find things again. Without having it on my home PC, I would not be able to use Perl, PHP or MySQL. Apart from this, I especially like its virtual site capability; it is very useful for offline development.

WordPress

My hosting supplier offers blogs on Blogware, but that didn't offer the level of configuration that I would have liked, something that is probably true of any hosted blogging service; I can't speak for Blogger, but WordPress.com does have its restrictions too. To make my hillwalking blog fit in with the appearance of my photo gallery, I popped over to WordPress.org to download WordPress so that I could host a blog myself and have maximum control over its appearance. WordPress supports themes, so I created my own and got my blog pages looking as if they are part of my website, rather than looking like something that was bolted on. Now that I think of it, what about WordPress.com supporting user-created themes? I suppose that there is the worry of insecure PHP code, but what about it?

MySQL

I am in two minds on whether this is a technology or a tool. SQL certainly would be a technology standard, but I am not so clear on what MySQL would be. In any case, I have classed it as a tool, and a very useful one at that. It is the linchpin for my WordPress blogs and, if I went for a content management system like Drupal, its role would surely grow. While I do have a lot of experience with using SAS SQL, and this helps me to deal with other varieties, there is still a learning curve with MySQL that gets me heading for a good book, and Kofler's The Definitive Guide to MySQL 5 (Apress) seems to perform more than adequately in this endeavour.

Paint Shop Pro

As someone who hosts an online photo gallery, it won't come as a surprise that I have had exposure to image editors. Despite various other flirtations, Paint Shop Pro has been my tool of choice over the years, but it is now set to be usurped by a member of Adobe's Photoshop family. Paint Shop Pro does have books devoted to it, but it appears that Photoshop gets better coverage, and I feel that my image processing needs to be taken up a gear, hence the potential move to Photoshop.
