Technology Tales

Adventures in consumer and enterprise technology

From boardroom to code: More options for AI and Data Science education

27th July 2025

The artificial intelligence revolution has created an unprecedented demand for education that spans from executive strategy to technical implementation. Modern professionals face the challenge of navigating a landscape where understanding AI's business implications proves as crucial as mastering its technical foundations. This comprehensive examination explores five distinguished programmes that collectively address this spectrum, offering pathways for business professionals, aspiring data scientists and technical specialists seeking advanced expertise.

Strategic Business Implementation Through Practical AI Tools

LinkedIn Learning's Applying Generative AI as a Business Professional programme represents the entry point for professionals seeking immediate workplace impact. This focused five-hour curriculum across six courses addresses the practical reality that most business professionals need functional AI literacy rather than technical mastery. The programme emphasises hands-on application of contemporary tools including ChatGPT, Claude and Microsoft Copilot, recognising that these platforms have become integral to modern professional workflows.

The curriculum's strength lies in its emphasis on prompt engineering techniques that yield immediate productivity gains. Participants learn to craft effective queries that consistently produce useful outputs, a skill that has rapidly evolved from novelty to necessity across industries. The programme extends beyond basic tool usage to include strategies for creating custom GPTs without programming knowledge, enabling professionals to develop solutions that address specific organisational challenges.

Communication enhancement represents another critical component, as the programme teaches participants to leverage AI for improving written correspondence, presentations and strategic communications. This practical focus acknowledges that AI's greatest business value often emerges through augmenting existing capabilities rather than replacing human expertise. The inclusion of critical thinking frameworks for AI-assisted decision-making ensures that participants develop sophisticated approaches to integrating artificial intelligence into complex business processes.

Academic Rigour Meets Strategic AI Governance

The University of Pennsylvania's AI for Business Specialisation on Coursera elevates business AI education to an academic level whilst maintaining practical relevance. This four-course programme, completed over approximately four weeks, addresses the strategic implementation challenges that organisations face when deploying AI technologies at scale. The curriculum's foundation in Big Data fundamentals provides essential context for understanding the data requirements that underpin successful AI initiatives.

The programme's exploration of machine learning applications in marketing and finance demonstrates how AI transforms traditional business functions. Participants examine customer journey optimisation techniques, fraud prevention methodologies and personalisation technologies that have become competitive necessities rather than optional enhancements. These applications receive thorough treatment that balances technical understanding with strategic implications, enabling participants to make informed decisions about AI investments and implementations.

Particularly valuable is the programme's emphasis on AI-driven people management practices, addressing how artificial intelligence reshapes human resources, talent development and organisational dynamics. This focus acknowledges that successful AI implementation requires more than technological competence; it demands sophisticated understanding of how these tools affect workplace relationships and employee development.

The specialisation's coverage of strategic AI governance frameworks proves especially relevant as organisations grapple with ethical deployment challenges. Participants develop comprehensive approaches to responsible AI implementation that address regulatory compliance, bias mitigation and stakeholder concerns. This academic treatment of AI ethics provides the foundational knowledge necessary for creating sustainable AI programmes that serve both business objectives and societal responsibilities.

Industry-Standard Professional Development

IBM's Data Science Professional Certificate represents a bridge between business understanding and technical proficiency, offering a comprehensive twelve-course programme designed for career transition. This four-month pathway requires no prior experience whilst building industry-ready capabilities that align with contemporary data science roles. The programme's strength lies in its integration of technical skill development with practical application, ensuring graduates possess both theoretical knowledge and hands-on competency.

The curriculum's progression from Python programming fundamentals through advanced machine learning techniques mirrors the learning journey that working data scientists experience. Participants gain proficiency with industry-standard tools including Jupyter notebooks, GitHub and Watson Studio, ensuring familiarity with the collaborative development environments that characterise modern data science practice. This tool proficiency proves essential for workplace integration, as contemporary data science roles require seamless collaboration across technical teams.

The programme's inclusion of generative AI applications reflects IBM's recognition that artificial intelligence has become integral to data science practice rather than a separate discipline. Participants learn to leverage AI tools for data analysis, visualisation and insight generation, developing capabilities that enhance productivity whilst maintaining analytical rigour. This integration prepares trainees for data science roles that increasingly incorporate AI-assisted workflows.

Real-world project development represents a crucial component, as participants build comprehensive portfolios that demonstrate practical proficiency to potential employers. These projects address authentic business challenges using genuine datasets, ensuring that participants can articulate their capabilities through concrete examples.

Advanced Technical Mastery Through Academic Excellence

Andrew Ng's Machine Learning Specialisation on Coursera establishes the technical foundation for advanced AI practice. This three-course programme, completed over approximately two months, provides comprehensive coverage of core machine learning concepts whilst emphasising practical implementation skills. Andrew Ng's reputation as an AI pioneer lends exceptional credibility to this curriculum, ensuring that participants receive instruction that reflects both academic rigour and industry best practices.

The specialisation's treatment of supervised learning encompasses linear and logistic regression, neural networks and decision trees, providing thorough grounding in the algorithms that underpin contemporary machine learning applications. Participants develop practical proficiency with Python, NumPy and scikit-learn, gaining hands-on experience with the tools that professional machine learning practitioners use daily. This implementation focus ensures that theoretical understanding translates into practical capability.
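As an illustration only (not material from the course itself), the kind of supervised learning exercise involved can be sketched in a few lines of Python with scikit-learn, here using a synthetic dataset:

# Fit and evaluate a logistic regression classifier on synthetic data
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

# Create a small binary classification problem
X, y = make_classification(n_samples=500, n_features=10, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)

# Train the model and check accuracy on held-out data
model = LogisticRegression(max_iter=1000)
model.fit(X_train, y_train)
print(f"Test accuracy: {accuracy_score(y_test, model.predict(X_test)):.3f}")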

The unsupervised learning material covers clustering algorithms, anomaly detection techniques and approaches to recommender systems, all of which power modern digital experiences. The programme's exploration of reinforcement learning provides exposure to the techniques driving advances in autonomous systems and game-playing AI. This breadth ensures that participants understand the full spectrum of machine learning approaches, rather than developing narrow expertise in specific techniques.

Cutting-Edge Deep Learning Applications

Again available through Coursera, Andrew Ng's Deep Learning Specialisation extends technical education into the neural network architectures that drive contemporary AI. This five-course programme, spanning approximately three months, addresses the advanced techniques that enable computer vision, natural language processing and complex pattern recognition applications. The intermediate-level curriculum assumes foundational machine learning knowledge whilst building expertise in cutting-edge methodologies.

Convolutional neural network coverage provides comprehensive understanding of computer vision applications, from image classification through object detection and facial recognition. Participants develop practical skills with CNN architectures that power visual AI applications across industries. The programme's treatment of recurrent neural networks and LSTMs addresses sequence processing challenges in speech recognition, machine translation and time series analysis.
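For a flavour of what such architectures look like in code (a minimal sketch, not course material), here is a small convolutional classifier defined with TensorFlow's Keras API, the framework used in the programme's projects:

# Define and compile a minimal CNN for 28x28 greyscale images
import tensorflow as tf

model = tf.keras.Sequential([
    tf.keras.layers.Conv2D(32, (3, 3), activation="relu", input_shape=(28, 28, 1)),
    tf.keras.layers.MaxPooling2D((2, 2)),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(10, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.summary()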

The specialisation's exploration of transformer architectures proves particularly relevant given their central role in large language models and natural language processing breakthroughs. Participants gain understanding of attention mechanisms, transfer learning techniques and the architectural innovations that enable modern AI capabilities. This coverage ensures they understand the technical foundations underlying contemporary AI advances.

Real-world application development represents a crucial component, as participants work on speech recognition systems, machine translation applications, image recognition tools and chatbot implementations. These projects utilise TensorFlow, a dominant framework for deep learning development, ensuring that graduates possess practical experience with production-ready tools.

Strategic Integration and Future Pathways

These five programmes collectively address the comprehensive skill requirements of the modern AI landscape, from strategic business implementation through advanced technical development. The progression from practical tool usage through academic business strategy to technical mastery reflects the reality that successful AI adoption requires capabilities across multiple domains. Organisations benefit most when business leaders understand AI's strategic implications, whilst technical teams possess sophisticated implementation capabilities.

The integration of business strategy with technical education acknowledges that artificial intelligence's transformative potential emerges through thoughtful application rather than technological sophistication alone. These programmes prepare professionals to contribute meaningfully to AI initiatives regardless of their specific role or technical background, ensuring that organisations can build comprehensive AI capabilities that serve both immediate needs and long-term strategic objectives.

SAS Packages: Revolutionising code sharing in the SAS ecosystem

26th July 2025

In the world of statistical programming, SAS has long been the backbone of data analysis for countless organisations worldwide. Yet, for decades, one of the most significant challenges facing SAS practitioners has been the efficient sharing and reuse of code. Knowledge and expertise have often remained siloed within individual developers or teams, creating inefficiencies and missed opportunities for collaboration. Enter the SAS Packages Framework (SPF), a solution that changes how SAS professionals share, distribute and utilise code across their organisations and the broader community.

The Problem: Fragmented Knowledge and Complex Dependencies

Anyone who has worked extensively with SAS knows the frustration of trying to share complex macros or functions with colleagues. Traditional code sharing in SAS has been plagued by several issues:

  • Dependency nightmares: A single macro often relies on dozens of utility macros working behind the scenes, making it nearly impossible to share everything needed for the code to function properly
  • Version control chaos: Keeping track of which version of which macro works with which other components becomes an administrative burden
  • Platform compatibility issues: Code that works on Windows might fail on Linux systems and vice versa
  • Lack of documentation: Without proper documentation and help systems, even the most elegant code becomes unusable to others
  • Knowledge concentration: Valuable SAS expertise remains trapped within individuals rather than being shared with the broader community

These challenges have historically meant that SAS developers spend countless hours reinventing the wheel, recreating functionality that already exists elsewhere in their organisation or the wider SAS community.

The Solution: SAS Packages Framework

The SAS Packages Framework, developed by Bartosz Jabłoński, represents a paradigm shift in how SAS code is organised, shared and deployed. At its core, a SAS package is an automatically generated, single, standalone zip file containing organised and ordered code structures, extended with additional metadata and utility files. This solution addresses the fundamental challenges of SAS code sharing by providing:

  • Functionality over complexity: Instead of worrying about 73 utility macros working in the background, you simply share one file and tell your colleagues about the main functionality they need to use.
  • Complete self-containment: Everything needed for the code to function is bundled into one file, eliminating the "did I remember to include everything?" problem that has plagued SAS developers for years.
  • Automatic dependency management: The framework handles the loading order of code components and automatically updates system options like cmplib= and fmtsearch= for functions and formats.
  • Cross-platform compatibility: Packages work seamlessly across different operating systems, from Windows to Linux and UNIX environments.

Beyond Macros: A Spectrum of SAS Functionality

One of the most compelling aspects of the SAS Packages Framework is its versatility. While many code-sharing solutions focus solely on macros, SAS packages support a wide range of SAS functionality:

  • User-defined functions (both FCMP and CASL)
  • IML modules for matrix programming
  • PROC PROTO C routines for high-performance computing
  • Custom formats and informats
  • Libraries and datasets
  • PROC DS2 threads and packages
  • Data generation code
  • Additional content such as documentation PDFs

This comprehensive approach means that virtually any SAS functionality can be packaged and shared, making the framework suitable for everything from simple utility macros to complex analytical frameworks.

Real-World Applications: From Pharmaceutical Research to General Analytics

The adoption of SAS packages has been particularly notable in the pharmaceutical industry, where code quality, validation and sharing are critical concerns. The PharmaForest initiative, led by PHUSE Japan's Open-Source Technology Working Group, exemplifies how the framework is being used to revolutionise pharmaceutical SAS programming. PharmaForest offers a collaborative repository of SAS packages specifically designed for pharmaceutical applications, including:

  • OncoPlotter: A comprehensive package for creating figures commonly used in oncology studies
  • SAS FAKER: Tools for generating realistic test data while maintaining privacy
  • SASLogChecker: Automated log review and validation tools
  • rtfCreator: Streamlined RTF output generation

The initiative's philosophy perfectly captures the spirit of the SAS Packages Framework: "Through SAS packages, we want to actively encourage sharing of SAS know-how that has often stayed within individuals. By doing this, we aim to build up collective knowledge, boost productivity, ensure quality through standardisation and energise our community".

The SASPAC Archive: A Growing Ecosystem

The establishment of SASPAC (SAS Packages Archive) represents the maturation of the SAS packages ecosystem. This dedicated repository serves as the official home for SAS packages, with each package maintained as a separate repository complete with version history and documentation. Some notable packages available through SASPAC include:

  • BasePlus: Extends BASE SAS with functionality that many developers find themselves wishing was built into SAS itself. With 12 stars on GitHub, it's become one of the most popular packages in the archive.
  • MacroArray: Provides macro array functionality that simplifies complex macro programming tasks, addressing a long-standing gap in SAS's macro language capabilities.
  • SQLinDS: Enables SQL queries within data steps, bridging the gap between SAS's powerful data step processing and SQL's intuitive query syntax.
  • DFA (Dynamic Function Arrays): Offers advanced data structures that extend SAS's analytical capabilities.
  • GSM (Generate Secure Macros): Provides tools for protecting proprietary code while still enabling sharing and collaboration.

Getting Started: Surprisingly Simple

Despite its capabilities, getting started with SAS packages is fairly straightforward. The framework can be deployed in multiple ways, depending on your needs. For a quick test or one-time use, you can enable the framework directly from the web:

filename packages "%sysfunc(pathname(work))";
filename SPFinit url "https://raw.githubusercontent.com/yabwon/SAS_PACKAGES/main/SPF/SPFinit.sas";
%include SPFinit;

For permanent installation, you create a directory for your packages, point the packages fileref at it and, with the framework enabled from the web as above, install it there:

filename packages "C:\SAS_PACKAGES";
%installPackage(SPFinit)

Once installed, using packages becomes as simple as:

%installPackage(packageName)
%helpPackage(packageName)
%loadPackage(packageName)
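For example, taking BasePlus from the archive described above, fetching and loading it needs just two statements once the framework is enabled:

%installPackage(BasePlus)
%loadPackage(BasePlus)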

Developer Benefits: Quality and Efficiency

For SAS developers, the framework offers numerous advantages that go beyond simple code sharing:

  • Enforced organisation: The package development process naturally encourages better code organisation and documentation practices.
  • Built-in testing: The framework includes testing capabilities that help ensure code quality and reliability.
  • Version management: Packages include metadata such as version numbers and generation timestamps, supporting modern DevOps practices.
  • Integrity verification: The framework provides tools to verify package authenticity and integrity, addressing security concerns in enterprise environments.
  • Cherry-picking: Users can load only specific components from a package, reducing memory usage and namespace pollution.

The Future of SAS Code Sharing

The growing adoption of SAS packages represents more than just a new tool; it signals a fundamental shift towards a more collaborative and efficient SAS ecosystem. The framework's MIT licensing and 100% open-source nature ensure that it remains accessible to all SAS users, from individual practitioners to large enterprise installations. This democratisation of advanced code-sharing capabilities levels the playing field and enables even small teams to benefit from enterprise-grade development practices.

As the ecosystem continues to grow, with contributions from pharmaceutical companies, academic institutions and individual developers worldwide, the SAS Packages Framework is proving that the future of SAS programming lies not in isolated development, but in collaborative, community-driven innovation.

For SAS practitioners looking to modernise their development practices, improve code quality and tap into the collective knowledge of the global SAS community, exploring SAS packages isn't just an option; it is becoming an essential step towards more efficient and effective statistical programming.

Advance your Data Science, AI and Computer Science skills using these online learning opportunities

25th July 2025

The landscape of online education has transformed dramatically over the past decade, creating unprecedented access to high-quality learning resources across multiple disciplines. This comprehensive examination explores the diverse array of courses available for aspiring data scientists, analysts, and computer science professionals, spanning from foundational programming concepts to cutting-edge artificial intelligence applications.

Data Analysis with R Programming

R programming has established itself as a cornerstone language for statistical analysis and data visualisation, making it an essential skill for modern data professionals. DataCamp's Data Analyst with R programme represents a comprehensive 77-hour journey through the fundamentals of data analysis, encompassing 21 distinct courses that progressively build expertise. Students begin with core programming concepts including data structures, conditional statements, and loops before advancing to sophisticated data manipulation techniques using tools such as dplyr and ggplot2. The curriculum extends beyond basic programming to include R Markdown for reproducible research, data manipulation with data.table, and essential database skills through SQL integration.

For those seeking more advanced statistical expertise, DataCamp's Statistician with R career track provides an extensive 108-hour programme spanning 27 courses. This comprehensive pathway develops essential skills for professional statistician roles, progressing from fundamental concepts of data collection and analysis to advanced statistical methodology. Students explore random variables, distributions, and conditioning through practical examples before advancing to linear and logistic regression techniques. The curriculum encompasses sophisticated topics including binomial and Poisson regression models, sampling methodologies, hypothesis testing, experimental design, and A/B testing frameworks. Advanced modules cover missing data handling, survey design principles, survival analysis, Bayesian data analysis, and factor analysis, making this track particularly suitable for those with existing R programming knowledge who seek to specialise in statistical practice.

The Google Data Analytics Professional Certificate programme, developed by Google and hosted on Coursera with US and UK versions, offers a structured six-month pathway for those seeking industry-recognised credentials. Students progress through eight carefully designed courses, beginning with foundational concepts in "Foundations: Data, Data, Everywhere" and culminating in a practical capstone project. The curriculum emphasises real-world applications, teaching students to formulate data-driven questions, prepare datasets for analysis, and communicate findings effectively to stakeholders.

Udacity's Data Analysis with R course presents a unique proposition as a completely free resource spanning two months of study. This programme focuses intensively on exploratory data analysis techniques, providing students with hands-on experience using RStudio and essential R packages. The course structure emphasises practical application through projects, including an in-depth exploration of diamond pricing data that demonstrates predictive modelling techniques.

Advanced Statistical Learning and Specialised Applications

Duke University's Statistics with R Specialisation elevates statistical understanding through a comprehensive seven-month programme that has earned a 4.6-star rating from participants. This five-course sequence delves deep into statistical theory and application, beginning with probability and data fundamentals before progressing through inferential statistics, linear regression, and Bayesian analysis. The programme distinguishes itself by emphasising both theoretical understanding and practical implementation, making it particularly valuable for those seeking to master statistical concepts rather than merely apply them.

The R Programming: Advanced Analytics course on Udemy, led by instructor Kirill, provides focused training in advanced R techniques within a compact six-hour format. This course addresses specific challenges that working analysts face, including data preparation workflows, handling missing data through median imputation, and working with complex date-time formats. The curriculum emphasises efficiency techniques such as using apply functions instead of traditional loops, making it particularly valuable for professionals seeking to optimise their analytical workflows.

Complementing this practical approach, the Applied Statistical Modelling for Data Analysis in R course on Udemy offers a more comprehensive 9.5-hour exploration of statistical methodology. The curriculum covers linear modelling implementation, advanced regression analysis techniques, and multivariate analysis methods. With its emphasis on statistical theory and application, this course serves those who already possess foundational R and RStudio knowledge but seek to deepen their understanding of statistical modelling approaches.

Imperial College London's Statistical Analysis with R for Public Health Specialisation brings academic rigour to practical health applications through a four-month programme. This specialisation addresses real-world public health challenges, using datasets that examine fruit and vegetable consumption patterns, diabetes risk factors, and cardiac outcomes. Students develop expertise in linear and logistic regression while gaining exposure to survival analysis techniques, making this programme particularly relevant for those interested in healthcare analytics.

Visualisation and Data Communication

Johns Hopkins University's Data Visualisation & Dashboarding with R Specialisation represents the pinnacle of visual analytics education, achieving an exceptional 4.9-star rating across its four-month curriculum. This five-course programme begins with fundamental visualisation principles before progressing through advanced ggplot2 techniques and interactive dashboard development. Students learn to create compelling visual narratives using Shiny applications and flexdashboard frameworks, skills that are increasingly essential in today's data-driven business environment.

The programme's emphasis on publication-ready visualisations and interactive dashboards addresses the growing demand for data professionals who can not only analyse data but also communicate insights effectively to diverse audiences. The curriculum balances technical skill development with design principles, ensuring graduates can create both statistically accurate and visually compelling presentations.

Professional Certification Pathways

DataCamp's certification programmes offer accelerated pathways to professional recognition, with each certification designed to be completed within 30 days. The Data Analyst Certification combines timed examinations with practical assessments to evaluate real-world competency. Candidates must demonstrate proficiency in data extraction, quality assessment, cleaning procedures, and metric calculation, reflecting the core responsibilities of working data analysts.

The Data Scientist Certification expands these requirements to include machine learning and artificial intelligence applications, requiring candidates to collect and interpret large datasets whilst effectively communicating results to business stakeholders. Similarly, the Data Engineer Certification focuses on data infrastructure and preprocessing capabilities, essential skills as organisations increasingly rely on automated data pipelines and real-time analytics.

The SQL Associate Certification addresses the universal need for database querying skills across all data roles. This certification validates both theoretical knowledge through timed examinations and practical application through hands-on database challenges, ensuring graduates can confidently extract and manipulate data from various database systems.

Emerging Technologies and Artificial Intelligence

The rapid advancement of artificial intelligence has created new educational opportunities that bridge traditional data science with cutting-edge generative technologies. DataCamp's Understanding Artificial Intelligence course provides a foundation for those new to AI concepts, requiring no programming background whilst covering machine learning, deep learning, and generative model fundamentals. This accessibility makes it valuable for business professionals seeking to understand AI's implications without becoming technical practitioners.

The Generative AI Concepts course builds upon this foundation to explore the specific technologies driving current AI innovation. Students examine how large language models function, consider ethical implications of AI deployment, and learn to maximise the effectiveness of AI tools in professional contexts. This programme addresses the growing need for AI literacy across various industries and roles.

DataCamp's Large Language Model Concepts course provides intermediate-level exploration of the technologies underlying systems like ChatGPT. The curriculum covers natural language processing fundamentals, fine-tuning techniques, and various learning approaches including zero-shot and few-shot learning. This technical depth makes it particularly valuable for professionals seeking to implement or customise language models within their organisations.

The ChatGPT Prompt Engineering for Developers course addresses the developing field of prompt engineering, a skill that has gained significant commercial value. Students learn to craft effective prompts that consistently produce desired outputs from language models, a capability that combines technical understanding with creative problem-solving. This expertise has become increasingly valuable as organisations integrate AI tools into their workflows.

Working with OpenAI API provides practical implementation skills for those seeking to build AI-powered applications. The course covers text generation, sentiment analysis, and chatbot development, giving students hands-on experience with the tools that are reshaping how businesses interact with customers and process information.

Computer Science Foundations

Stanford University's Computer Science 101 offers an accessible introduction to computing concepts without requiring prior programming experience. This course addresses fundamental questions about computational capabilities and limitations whilst exploring hardware architecture, software development, and internet infrastructure. The curriculum includes essential topics such as computer security, making it valuable for anyone seeking to understand the digital systems that underpin modern society.

The University of Leeds' Introduction to Logic for Computer Science provides focused training in logical reasoning, a skill that underlies algorithm design and problem-solving approaches. This compact course covers propositional logic and logical modelling techniques that form the foundation for more advanced computer science concepts.

Harvard's CS50 course, taught by Professor David Malan, has gained worldwide recognition for its engaging approach to computer science education. The programme combines theoretical concepts with practical projects, teaching algorithmic thinking alongside multiple programming languages including Python, SQL, HTML, CSS, and JavaScript. This breadth of coverage makes it particularly valuable for those seeking a comprehensive introduction to software development.

MIT's Introduction to Computer Science and Programming Using Python focuses specifically on computational thinking and Python programming. The curriculum emphasises problem-solving methodologies, testing and debugging strategies, and algorithmic complexity analysis. This foundation proves essential for those planning to specialise in data science or software development.

MIT's The Missing Semester course addresses practical tools that traditional computer science curricula often overlook. Students learn command-line environments, version control with Git, debugging techniques, and security practices. These skills prove essential for professional software development but are rarely taught systematically in traditional academic settings.

Accessible Learning Resources and Community Support

The democratisation of education extends beyond formal courses to include diverse learning resources that support different learning styles and schedules. YouTube channels such as Programming with Mosh, freeCodeCamp, Alex the Analyst, Tina Huang, and Ken Lee provide free, high-quality content that complements formal education programmes. These resources offer everything from comprehensive programming tutorials to career guidance and project-based learning opportunities.

The 365 Data Science platform contributes to this ecosystem through flashcard decks that reinforce learning of essential terminology and concepts across Excel, SQL, Python, and emerging technologies like ChatGPT. Their statistics calculators provide interactive tools that help students understand the mechanics behind statistical calculations, bridging the gap between theoretical knowledge and practical application.

Udemy's marketplace model supports this diversity by hosting over 100,000 courses, including many free options that allow instructors to share expertise with global audiences. The platform's filtering capabilities enable learners to identify resources that match their specific needs and learning preferences.

Industry Integration and Career Development

Major technology companies have recognised the value of contributing to global education initiatives, with Google, Microsoft and Amazon offering professional-grade courses at no cost. Google's Data Analytics Professional Certificate exemplifies this trend, providing industry-recognised credentials that directly align with employment requirements at leading technology firms.

These industry partnerships ensure that course content remains current with rapidly evolving technological landscapes, whilst providing students with credentials that carry weight in hiring decisions. The integration of real-world projects and case studies helps bridge the gap between academic learning and professional application.

The comprehensive nature of these educational opportunities reflects the complex requirements of modern data and technology roles. Successful professionals must combine technical proficiency with communication skills, statistical understanding with programming capability, and theoretical knowledge with practical application. The diversity of available courses enables learners to develop these multifaceted skill sets according to their career goals and learning preferences.

As technology continues to reshape industries and create new professional opportunities, access to high-quality education becomes increasingly critical. These courses represent more than mere skill development; they provide pathways for career transformation and professional advancement that transcend traditional educational barriers. Whether pursuing data analysis, software development, or artificial intelligence applications, learners can now access world-class education that was previously available only through expensive university programmes or exclusive corporate training initiatives.

The future of professional development lies in this combination of accessibility, quality, and relevance that characterises the modern online education landscape. These resources enable individuals to build expertise that matches industry demands, while maintaining the flexibility to learn at their own pace and according to their specific circumstances and goals.

Resolving a glitch in the ChatGPT interface on Firefox using Tampermonkey

19th July 2025

It may be caused either by a new version of Firefox or an update on the OpenAI side, but the ChatGPT prompt box lost its ability to show a cursor while I am entering text. The Ask anything placeholder text also disappeared. In Brave, all looked well, yet the problem persisted in clean Firefox sessions with no extensions loaded. Thus, it was a case of moving browser or getting a fix in Firefox.

Moving browser proved unnecessary, because I found a fix of sorts. For that, I needed to install the Tampermonkey extension. Then, I could add a new script to override the behaviour that I was seeing:

// ==UserScript==
// @name         ChatGPT Prompt Box Fix (Firefox)
// @namespace    http://tampermonkey.net/
// @version      1.0
// @description  Forces the ChatGPT prompt box textarea to remain visible in Firefox
// @author       You
// @match        https://chatgpt.com/*
// @grant        none
// ==/UserScript==

(function () {
    'use strict';

    const waitForTextarea = () => {
        const textarea = document.querySelector('textarea');
        if (textarea) {
            textarea.style.display = 'block';

            const observer = new MutationObserver(() => {
                if (textarea.style.display === 'none') {
                    console.log('Textarea display:none overridden');
                    textarea.style.display = 'block';
                }
            });

            observer.observe(textarea, { attributes: true, attributeFilter: ['style'] });
        } else {
            setTimeout(waitForTextarea, 300); // Keep retrying until textarea appears
        }
    };

    waitForTextarea();
})();

In short, this deals with a rogue display: none; line in the CSS, which equally well could have been inserted by JavaScript from somewhere that I cannot track down. The extra code is executed within a self-contained function to prevent interference with other elements and is restricted to the ChatGPT domain, which avoids unwanted impacts on the display of other websites.

The first step is to search for the relevant element on the page, retrying at intervals if necessary. Once located, the element's visibility is ensured by explicitly setting its display property to block. Continued monitoring of the element thwarts any dynamic attempts to hide it by changing its style. When such an action is detected, the script automatically overrides such changes to maintain its visibility, thereby ensuring consistent accessibility.

However, challenges with finding the affected element mean that I get the advisory text duplicated, so I see two instances of Ask anything. Still, that is a small price to pay for having a flashing cursor telling me where I am in the interface. Such is the nature of modern web coding that its complexity hinders debugging, which poses the question as to why we are making such things so complicated in the first place.

Synthetic Data: The key to unlocking AI's potential in healthcare

18th July 2025

The integration of artificial intelligence into healthcare is being hindered by challenges such as data scarcity, privacy concerns and regulatory constraints. Healthcare organisations face difficulties in obtaining sufficient volumes of high-quality, real-world data to train AI models that can accurately predict outcomes or assist in decision-making.

Synthetic data, defined as algorithmically generated data that mimics real-world data, is emerging as a solution to these challenges. This artificially generated data mirrors the statistical properties of real-world data without containing any sensitive or identifiable information, allowing organisations to sidestep privacy issues and adhere to regulatory requirements.

By generating datasets that preserve statistical relationships and distributions found in real data, synthetic data enables healthcare organisations to train AI models with rich datasets while ensuring sensitive information remains secure. The use of synthetic data can also help address bias and ensure fairness in AI systems by enabling the creation of balanced training sets and allowing for the evaluation of model outputs across different demographic groups.
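As a toy illustration of the principle (not a production technique), the following Python snippet fits the mean and covariance of a notional clinical dataset and samples entirely new records that preserve those statistics:

# Generate synthetic rows preserving mean and covariance, copying no real records
import numpy as np

rng = np.random.default_rng(42)

# Stand-in for real data: correlated age and systolic blood pressure values
real = rng.multivariate_normal(mean=[55, 130], cov=[[120, 45], [45, 160]], size=1000)

# Estimate the statistical properties of the real data...
mean, cov = real.mean(axis=0), np.cov(real, rowvar=False)

# ...and sample fresh, synthetic records from them
synthetic = rng.multivariate_normal(mean, cov, size=1000)

print("Real means:     ", real.mean(axis=0).round(1))
print("Synthetic means:", synthetic.mean(axis=0).round(1))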

Furthermore, synthetic data can be generated programmatically, reducing the time spent on data collection and processing and enabling organisations to scale their AI initiatives more efficiently. Ultimately, synthetic data is becoming a critical asset in the development of AI in healthcare, enabling faster development cycles, improving outcomes and driving innovation while maintaining trust and security.

Adding a new domain or subdomain to an SSL certificate using Certbot

11th June 2025

On checking the Site Health page of a WordPress blog, I saw errors that pointed to a problem with its SSL setup. The www subdomain was not included in the site's certificate and was causing PHP errors as a result, though they had no major effect on what visitors saw. Still, it was best to get rid of them, so the certificate needed updating. Execution of a command like the following did the job:

sudo certbot --expand -d example.com -d www.example.com

Using a Let's Encrypt certificate meant that I could use the certbot command, since that was already installed on the server. The --expand and -d switches ensured that the listed domains were added to the certificate, sorting out the observed problem. In the above, dummy domain names are used; these were replaced by the real ones to produce the desired effect and make things as they should have been.
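As a check, the domains covered by each certificate on the server can be listed before and after the change; certbot provides a subcommand for this:

sudo certbot certificates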

From mathematical insights to practical applications: Two perspectives on AI

19th April 2025

As AI continues to transform our technological landscape, two recent books offer distinct yet complementary perspectives on understanding and working with these powerful tools. Stephen Wolfram's technical deep dive and Ethan Mollick's practical guide approach the subject from different angles, but both provide valuable insights for navigating our AI-integrated future.

What is ChatGPT Doing?: Wolfram's Technical Lens

Stephen Wolfram's exploration of large language models is characteristically thorough and mathematically oriented. While dense in parts, his analysis reveals fascinating insights about both AI and human cognition.

Perhaps most intriguing is Wolfram's observation that generative AI unexpectedly teaches us about human language production. These systems, in modelling our linguistic patterns with such accuracy, hold up a mirror to our own cognitive processes, perhaps revealing structures and patterns we had not fully appreciated before.

Wolfram does not shy away from highlighting limitations, particularly regarding computational capabilities. As sophisticated as next-word prediction has become through multi-billion parameter neural networks, these systems fundamentally lack true mathematical reasoning. However, his proposal of integrating language models with computational tools like WolframAlpha presents an elegant solution, combining the conversational fluency of AI with precise computational power.

Co-intelligence: Mollick's Practical Framework

Ethan Mollick takes a decidedly more accessible approach in "Co-intelligence", offering workable strategies for effective human-AI collaboration across various contexts. His framework includes several practical principles:

  • Invite AI to the table as a collaborator rather than merely a tool
  • Maintain human oversight and decision-making authority
  • Communicate with AI systems as if they were people with specific roles
  • Assume current AI represents the lowest capability level you will work with going forward

What makes Mollick's work particularly valuable is its contextual applications. Drawing from his background as a business professor, he methodically examines how these principles apply across different collaborative scenarios: from personal assistant to creative partner, coworker, tutor, coach, and beyond. With a technology that, even now, retains some of the quality of a solution looking for a problem, these grounded suggestions act as a counterpoint to the torrent of hype that deluges our working lives, especially on LinkedIn, which I am frequenting a lot at this time while searching for new freelance work.

Complementary Perspectives

Though differing significantly in their technical depth and intended audience, both books contribute meaningfully to our understanding of AI. Wolfram's mathematical rigour provides theoretical grounding, while Mollick's practical frameworks offer immediate actionable insights. For general readers looking to productively integrate AI into their work and life, Mollick's accessible approach serves as an excellent entry point. Those seeking deeper technical understanding will find Wolfram's analysis challenging but rewarding.

As we navigate this rapidly evolving landscape, perspectives from both technical innovators and practical implementers will be essential in helping us maximise the benefits of AI while mitigating potential drawbacks. As ever, the hype outpaces the practical experiences, leaving us to endure the marketing output while awaiting real experiences to be shared. It is the latter that is more tangible and that will allow us to make use of game-changing technical advances.

Removing query strings from any URL on an Nginx-powered website

12th April 2025

My public transport website is produced using Hugo and is hosted on a web server running Nginx. Usually, I use Apache, so this is an exception. When Google highlighted some duplication caused by unneeded query strings, I set to work. However, Nginx has no .htaccess files or mod_rewrite, so anything like redirection has to go elsewhere in the configuration and take a different form.

In my case, the configuration file to edit is /etc/nginx/sites-available/default because that was what was enabled. Once I had that open, I needed to find the following block:

location / {
        # First attempt to serve request as file, then
        # as directory, then fall back to displaying a 404.
        try_files $uri $uri/ =404;
}

Because I have one section for port 80 and another for port 443, there were two such location blocks that I needed to update, though I may have got away without altering the second of them. After adding the redirection clause, the block became:

location / {
        # First attempt to serve request as file, then
        # as directory, then fall back to displaying a 404.
        try_files $uri $uri/ =404;

        # Remove query strings only when necessary
        if ($args) {
                rewrite ^(.*)$ $1? permanent;
        }
}

The result of the addition is a permanent (301) redirection whenever arguments are passed in a query string. The initial ^(.*)$ portion captures the requested path, while $1? rewrites it without the query string; the trailing question mark tells Nginx not to append the original arguments. In other words, the redirect takes a request from the original address to one containing only the part preceding the question mark.

Handily, Nginx allows you to test your updated configuration using the following command:

sudo nginx -t

That helped me with some debugging. Once all was in order, I needed to reload the service by issuing this command:

sudo systemctl reload nginx
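A quick request with a query string then confirms the behaviour; the response should be a 301 whose Location header omits the arguments (again with a dummy domain standing in for the real one):

curl -I "https://example.com/page/?utm_source=test"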

With Apache, there is no need to restart the service after updating the .htaccess file, which adds some convenience. The different locations also mean taking some care with backups when upgrading the operating system or moving from one server to another. Apart from that, all works well, proving that there can be different ways to complete the same task.

The critical differences between Generative AI, AI Agents, and Agentic Systems

9th April 2025

The distinction between three key artificial intelligence concepts can be explained without technical jargon. Here then are the descriptions:

  • Generative AI functions as a responsive assistant that creates content when prompted but lacks initiative, memory or goals. Examples include ChatGPT, Claude and GitHub Copilot.
  • AI Agents represent a step forward, actively completing tasks by planning, using tools, interacting with APIs and working through processes independently with minimal supervision, similar to a junior colleague.
  • Agentic AI represents the most sophisticated approach, possessing goals and memory while adapting to changing circumstances; it operates as a thinking system rather than a simple chatbot, capable of collaboration, self-improvement and autonomous operation.

This evolution marks a significant shift from building applications to designing autonomous workflows, with various frameworks currently being developed in this rapidly advancing field.

Add Canonical Tags to WordPress without plugins

31st March 2025

Search engines cannot tell which copy is the real content when there is any duplication, unless you tell them, and that is where canonical tags come in handy. By default, WordPress appears to add these for posts and pages, which makes sense. However, you can add them in other places too. While a plugin can do this for you, adding some code to your theme's functions.php file also does the job. This is how it could look:

function add_canonical_link() {
    global $post;

    // Check if we're on a single post/page
    if (is_singular()) {
        $canonical_url = get_permalink($post->ID);
    } 
    // For the homepage
    elseif (is_home() || is_front_page()) {
        $canonical_url = home_url('/');
    }
    // For category archives
    elseif (is_category()) {
        $canonical_url = get_category_link(get_query_var('cat'));
    }
    // For tag archives
    elseif (is_tag()) {
        $canonical_url = get_tag_link(get_query_var('tag_id'));
    }
    // For other archive pages
    elseif (is_archive()) {
        $canonical_url = get_permalink();
    }
    // Fallback for other pages
    else {
        $canonical_url = get_permalink();
    }

    // Output the canonical link
    echo '<link rel="canonical" href="' . esc_url($canonical_url) . '" />' . "\n";
}

// Hook the function to wp_head
add_action('wp_head', 'add_canonical_link');

// Remove default canonical link
remove_action('wp_head', 'rel_canonical');

The first part defines a function that works out the canonical URL and creates the tag to be added. With that completed, the penultimate piece of code hooks it into the wp_head part of the web page, while the last line removes the default canonical link to avoid any duplication of output.
