TOPIC: MICROSOFT
24th March 2026
Here are some commonly used R packages and other tools that have become pervasive, along with others that I encountered while getting started with the language, itself gaining ground in my line of business. The collection grew organically as my explorations proceeded, and it reflects what I was trying out during my acclimatisation.
General
Here are two general packages to get things started, with one of them being unavoidable in the R world. The other is more advanced, possibly offering more to package developers.
{tidyverse}
You cannot use R without knowing about this collection of packages. In many ways, they form a mini-language of their own, drawing some criticism from those who reckon that base R functionality covers a sufficient gamut anyway. Nevertheless, there is so much here that will get you going with data wrangling and visualisation that it is worth knowing what is possible. Indeed, the complaints may stem from the fact that you scarcely need anything else for these purposes.
{plumber}
This R package enables developers to convert existing R functions into web API endpoints by adding roxygen2-like comment annotations to their code. Once annotated, functions can handle HTTP GET and POST requests, accept query string or JSON parameters and return outputs such as plain values or rendered plots. The package is available on CRAN as a stable release, with a development version hosted on GitHub. For deployment, it integrates with DigitalOcean through a companion package called {plumberDeploy}, and also supports Posit Connect, PM2 and Docker as hosting options. Related projects in the same space include OpenCPU, which is designed for hosting R APIs in scientific research contexts, and the now-discontinued jug package, which took a more programmatic approach to API construction.
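To give a flavour of the annotation style, here is a minimal sketch; the file name plumber.R, the endpoint paths and the port number are illustrative choices of mine rather than anything mandated by the package:

# plumber.R: annotate ordinary R functions to expose them as endpoints
#* Echo back a message supplied as a query string parameter
#* @param msg The message to echo
#* @get /echo
function(msg = "") {
  list(message = paste("The message is:", msg))
}

#* Return a histogram rendered as a PNG image
#* @serializer png
#* @get /plot
function() {
  hist(rnorm(100))
}

# In a separate script or session, build and launch the API
library(plumber)
pr("plumber.R") |> pr_run(port = 8000)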
Data Preparation
You simply cannot avoid working with data during any analysis or reporting work. While there is a learning curve if you are used to other languages, there is little doubt that R is well-endowed when it comes to performing these tasks. Here are some packages that extend base R capabilities and might even add some extra user-friendliness along the way.
{forcats}
The {forcats} package in R provides functions to manage categorical variables by reordering factor levels, collapsing infrequent values and adjusting their sequence based on frequency or other variables. It includes tools such as reordering by another variable, grouping rare categories into 'other' and modifying level order manually, which are useful for data analysis and visualisation workflows. Designed as part of the tidyverse, it integrates with other packages to streamline tasks like counting and plotting categorical data, enhancing clarity and efficiency in handling factors within R.
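As a small illustration of those tools, assuming a simple fruit-name factor of my own invention:

library(forcats)
x <- factor(c("apple", "banana", "apple", "cherry", "banana", "apple"))
fct_infreq(x)             # reorder levels so the most frequent comes first
fct_lump_n(x, n = 1)      # keep the single most common level, lump the rest into "Other"
fct_relevel(x, "cherry")  # move a chosen level to the front manually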
{tidyr}
Around this time last year, I remember completing a LinkedIn course on a set of good practices known as tidy data, where each variable occupies a column, each observation a row and each value a single cell. This package is designed to help users restructure data so it follows those rules. It provides tools for reshaping data between long and wide formats, handling nested lists, splitting or combining columns, managing missing values and layering or flattening grouped data.
Installation options include the {tidyverse} collection, standalone installation, or the development version from GitHub. The package succeeds earlier reshaping tools like {reshape2} and {reshape}, offering a focused approach to tidying data rather than general reshaping or aggregation.
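Here is a brief sketch of the long/wide reshaping, using a small invented sales table:

library(tidyr)
sales <- tibble::tibble(product = c("A", "B"), `2023` = c(10, 20), `2024` = c(12, 18))
long <- pivot_longer(sales, cols = -product, names_to = "year", values_to = "units")  # wide to long
wide <- pivot_wider(long, names_from = "year", values_from = "units")                 # and back again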
{haven}
Having a long track record of working with SAS, I find that {haven}, with its ability to read and write data files from statistical software such as SAS, SPSS and Stata by leveraging the ReadStat library, arouses my interest. Handily, it supports a range of file formats, including SAS transport and data files, SPSS system and older portable files and Stata data files up to version 15, converting these into tibbles with enhanced printing capabilities. Value labels are preserved as a labelled class, allowing conversion to factors, while dates and times are transformed into standard R classes.
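A minimal sketch, assuming a SAS data set called trial.sas7bdat with a labelled sex variable (both names are hypothetical):

library(haven)
dat <- read_sas("trial.sas7bdat")  # labelled values arrive as the labelled class
dat$sex <- as_factor(dat$sex)      # convert value labels into an R factor when needed
write_xpt(dat, "trial.xpt")        # write a SAS transport file back out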
{RMariaDB}
While there are other approaches to working with databases using R, {RMariaDB} provides a database interface and driver for MariaDB, designed to fully comply with the DBI specification and serve as a replacement for the older {RMySQL} package. It supports connecting to databases using configuration files, executing queries, reading and writing data tables and managing results in chunks. Installation options include binary packages from CRAN or development versions from GitHub, with additional dependencies such as MariaDB Connector/C or libmysqlclient required for Linux and macOS systems. Configuration is typically handled through a MariaDB-specific file, and the package includes acknowledgments for contributions from various developers and organisations.
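A short sketch of the basic workflow, assuming connection details held in a configuration-file group that I have called my-db:

library(DBI)
con <- dbConnect(RMariaDB::MariaDB(), group = "my-db")  # read settings from the MariaDB configuration file
dbWriteTable(con, "mtcars", mtcars, overwrite = TRUE)   # push a table up to the database
res <- dbGetQuery(con, "SELECT cyl, COUNT(*) AS n FROM mtcars GROUP BY cyl")
dbDisconnect(con)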
COVID-19 Data Hub
For many people, the pandemic may be a fading memory, yet it offered its chances for learning R, not least because there was a use case with more than a hint of personal interest about it. Here is a library making it easier to get hold of the data, with some added pre-processing too. Memories of how I needed to wrangle what was published by various sources make me appreciate just how vital it is to have harmonised data for analysis work.
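For anyone curious, the hub's R package is {COVID19}, and fetching harmonised data is close to a one-liner; the country selection below is merely an example:

library(COVID19)
x <- covid19(country = c("United Kingdom", "Ireland"), level = 1)  # level 1 returns national-level data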
Table Production
While many prefer graphical presentation of results to their tabular display, R does have its options here too. In recent times, the options have improved, particularly because of the pharmaverse initiative. Here is a selection of what I found during my explorations.
{officer}
Part of the {officeverse} along with {officedown}, {flextable}, {rvg} and {mschart}, the {officer} R package enables users to create and modify Word and PowerPoint documents directly from R, allowing the insertion of images, tables and formatted content, as well as the import of document content into data frames. It supports the generation of RTF files and integrates with other packages for advanced features such as vector graphics and native office charts. Installation options include CRAN and GitHub, with community resources available for assistance and contributions. The package facilitates the manipulation of document elements like paragraphs, tables and section breaks and provides tools for exporting and importing content between R and office formats, alongside functions for managing slide layouts and embedded objects in presentations.
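A compact sketch of building a Word document, with the output file name being my own choice:

library(officer)
doc <- read_docx()                                                  # start from the default template
doc <- body_add_par(doc, "Quarterly Summary", style = "heading 1")  # add a styled heading
doc <- body_add_table(doc, head(mtcars), style = "table_template")  # drop in a data frame as a table
print(doc, target = "summary.docx")                                 # write the document to disk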
{pharmaRTF}
If you work in clinical research like I do, the need to produce data tabulations is a non-negotiable requirement. That is how this package came to be developed, and the pharmaverse of which it is part has numerous other options, should you need to look at using one of those. The flavour of RTF produced here is the Microsoft Word variety, which did not look as good in LibreOffice Writer when I last examined the results with that open-source alternative. Otherwise, the output looks fine to many eyes.
{formattable}
Here is a package that enhances data presentation by applying customisable formatting to vectors and data frames, supporting formats such as percentages, currency and accounting. Available on GitHub and CRAN, it integrates with dynamic document tools like {knitr} and {rmarkdown} to produce visually distinct tables, with features including gradient colour scales, conditional styling and icon-based representations. It automatically converts to {htmlwidgets} in interactive environments and is licensed under MIT, enabling flexible use in both static and interactive data displays.
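Here is a small example of the conditional styling, with invented data; I believe supplying a formatter per column like this matches the package's documented usage:

library(formattable)
df <- data.frame(id = 1:3, share = c(0.12, 0.47, 0.41), revenue = c(1200, 4700, 4100))
formattable(df, list(
  share = percent,                            # display proportions as percentages
  revenue = color_tile("white", "lightblue")  # gradient background scaled by value
))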
{reactable}
The {reactable} package for R provides interactive data tables built on the React Table library, offering features such as sorting, filtering, pagination, grouping with aggregation, virtual scrolling for large datasets and support for custom rendering through R or JavaScript. It integrates seamlessly into R Markdown documents and Shiny applications, enabling the use of HTML widgets and conditional styling. Installation options include CRAN and GitHub, with examples demonstrating its application across various datasets and scenarios. The package supports major web browsers and is licensed under MIT, designed for developers seeking dynamic data presentation tools within the R ecosystem.
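A brief sketch using a built-in dataset:

library(reactable)
reactable(iris, filterable = TRUE, searchable = TRUE, groupBy = "Species",
          columns = list(Sepal.Length = colDef(aggregate = "mean")))  # grouped rows show the mean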
{DT}
Particularly useful in dynamic web applications like Shiny, the {DT} package in R provides a means of rendering interactive HTML tables by building on the DataTables JavaScript library. It supports features including sorting, searching, pagination and advanced filtering, with numeric, date and time columns using range-based sliders whilst factor and character columns rely on search boxes or dropdowns. Filtering operates on the client side by default, though server-side processing is also available. JavaScript callbacks can be injected after initialisation to manipulate table behaviour, such as enabling automatic page navigation or adding child rows to display additional detail. HTML content is escaped by default as a safeguard against cross-site scripting attacks, with the option to adjust this on a per-column basis. Whilst the package integrates with Shiny applications, attention is needed around scrolling and slider positioning to prevent layout problems. Overall, the package is well suited to exploratory data analysis and the building of interactive dashboards.
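A minimal example with the column filters enabled:

library(DT)
datatable(iris, filter = "top", options = list(pageLength = 5))  # client-side filtering by default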
{gt}
The {gt} package in R enables users to create well-structured tables with a variety of formatting options, starting from data frames or tibbles and incorporating elements such as headers, footers and customised column labels. It supports output in HTML, LaTeX and RTF formats and includes example datasets for experimentation. The package prioritises simplicity for common tasks while offering advanced functions for detailed customisation, with installation available via CRAN or GitHub. Users can access resources like documentation, community forums and example projects to explore its capabilities, and it is supported by a range of related packages that extend its functionality.
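Here is a small sketch using exibble, one of those bundled example datasets:

library(gt)
gt(exibble) |>
  tab_header(title = "A Little gt Table") |>  # add a table header
  fmt_number(columns = num, decimals = 2)     # format a numeric column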
{gtsummary}
Enabling users to produce publication-ready outputs with minimal code, the {gtsummary} package offers a streamlined approach to generating analytical and summary tables in R. It automates the summarisation of data frames, regression models and other datasets, identifying variable types and calculating relevant statistics, including measures of data incompleteness. Customisation options allow for formatting, merging and styling tables to suit specific needs, while integration with packages such as {broom} and {gt} facilitates seamless incorporation into R Markdown workflows. The package supports the creation of side-by-side regression tables and provides tools for exporting results as images, HTML, Word, or LaTeX files, enhancing flexibility for reporting and sharing findings.
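A minimal example using trial, the package's bundled dataset:

library(gtsummary)
trial |>
  tbl_summary(by = trt) |>  # summarise variables split by treatment arm
  add_p()                   # append p-values comparing the groups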
{huxtable}
Here is an R package designed to generate LaTeX and HTML tables with a modern, user-friendly interface, offering extensive control over styling, formatting, alignment and layout. It supports features such as custom borders, padding, background colours and cell spanning across rows or columns, with tables modifiable using standard R subsetting or dplyr functions. Examples demonstrate its use for creating simple tables, applying conditional formatting and producing regression output with statistical details. The package also facilitates quick export to formats like PDF, DOCX, HTML and XLSX. Installation options include CRAN, R-Universe and GitHub, while the name reflects its origins as an enhanced version of the {xtable} package. The logo was generated using the package itself, and the background design draws inspiration from Piet Mondrian’s artwork.
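A short sketch of the styling interface, with the output file name chosen by me:

library(huxtable)
ht <- as_hux(head(mtcars[, 1:3]), add_colnames = TRUE)
ht <- set_bold(ht, row = 1, col = everywhere)           # embolden the header row
ht <- set_bottom_border(ht, row = 1, col = everywhere)  # rule under the header
quick_html(ht, file = "cars.html")                      # quick export to HTML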
Figure Generation
R has such a reputation for graphical presentations that it is cited as a strong reason to explore what the ecosystem has to offer. While base R itself is not shabby when it comes to creating graphs and charts, these packages will extend things by quite a way. In fact, the first on this list is near enough pervasive.
{ggplot2}
Though its default formatting does not appeal to me, the myriad of options makes this a very flexible tool, albeit at the expense of some code verbosity. Multi-panel plots are not among its strengths, which may send you elsewhere for that need.
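For anyone new to it, here is the basic grammar in action, including a switch away from the default theme:

library(ggplot2)
ggplot(mtcars, aes(x = wt, y = mpg, colour = factor(cyl))) +
  geom_point() +
  labs(x = "Weight (1000 lbs)", y = "Miles per gallon", colour = "Cylinders") +
  theme_minimal()  # one of many alternatives to the default look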
{ggforce}
Focusing on features not included in the core library, the {ggforce} package extends {ggplot2} by offering additional tools to enhance data visualisation. Designed to complement the primary role of {ggplot2} in exploratory data analysis, it provides a range of geoms, stats and other components that are well-documented and implemented, aiming to support more complex and custom plot compositions. Available for installation via CRAN or GitHub, the package includes a variety of functionalities described in detail on its associated website.
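One example conveys the idea: facet_zoom() places a zoomed panel alongside the full plot (the code below is a standard illustration rather than anything exhaustive):

library(ggplot2)
library(ggforce)
ggplot(iris, aes(Petal.Length, Petal.Width, colour = Species)) +
  geom_point() +
  facet_zoom(x = Species == "versicolor")  # zoomed panel beside the original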
{cowplot}
Developed by Claus O. Wilke for internal use in his lab, {cowplot} is an R package designed to help with the creation of publication-quality figures built on top of {ggplot2}. It provides a set of themes, tools for aligning and arranging plots into compound figures and functions for annotating plots or combining them with images. The package can be installed directly from CRAN or as a development version via GitHub, and it has seen widespread use in the book Fundamentals of Data Visualisation.
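A minimal sketch of composing a compound figure:

library(ggplot2)
library(cowplot)
p1 <- ggplot(mtcars, aes(wt, mpg)) + geom_point()
p2 <- ggplot(mtcars, aes(factor(cyl))) + geom_bar()
plot_grid(p1, p2, labels = c("A", "B"))  # align the two panels and label them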
{sjPlot}
The {sjPlot} package provides a range of tools for visualising data and statistical results commonly used in social science research, including frequency tables, histograms, box plots, regression models, mixed effects models, PCA, correlation matrices and cluster analyses. It supports installation via CRAN for stable releases or through GitHub for development versions, with documentation and examples available online. The package is licensed under GPL-3 and developed by Daniel Lüdecke, offering functions to create visualisations such as scatter plots, Likert scales and interaction effect plots, along with tools for constructing index variables and presenting statistical outputs in tabular formats.
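As a taste of the regression side, here is a sketch assuming a simple linear model:

library(sjPlot)
m <- lm(mpg ~ wt + factor(cyl), data = mtcars)
plot_model(m)  # forest plot of the model estimates
tab_model(m)   # the same model as an HTML summary table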
{thematic}
By offering a centralised approach to theming and enabling automatic adaptation of plot styles within Shiny applications, the {thematic} package simplifies the styling of R graphics, including {ggplot2}, {lattice} and base R plots, R Markdown documents and RStudio. It allows users to apply consistent visual themes across different plotting systems, with auto-theming in Shiny and R Markdown relying on CSS and {bslib} themes, respectively. Installation requires specific versions of dependent packages such as {shiny} and {rmarkdown}, while custom fonts benefit from {showtext} or {ragg}. Users can set global defaults for background, foreground and accent colours, as well as fonts, which can be overridden with plot-specific theme adjustments. The package also defines default colour scales for qualitative and sequential data and integrates with tools like bslib to import Google Fonts, enhancing visual consistency across different environments and user interfaces.
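A minimal sketch of setting global defaults; the colours are arbitrary choices of mine:

library(thematic)
thematic_on(bg = "#222222", fg = "white", accent = "#0CE3AC")
plot(pressure)  # base graphics now pick up the theme automatically
thematic_off()  # restore normal styling afterwards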
Publishing Tools
The R ecosystem goes beyond mere graphical and tabular display production to offer means for taking things much further, often offering platforms for publishing your work. These can be used locally too, so there is no need to entrust everything to a third-party provider. There are endless uses for what is available, and it appears that Posit has used these tools to help with building documentation and training too.
R Markdown
What you have here is one of those distinguishing facilities of the R ecosystem, particularly for those wanting to share their analysis work with more than a hint of reproducibility. The tool combines narrative text and code to generate various outputs, supporting multiple programming languages and formats such as HTML, PDF and dashboards. It enables users to produce reports, presentations and interactive applications, with options for publishing and scheduling through platforms like RStudio Connect, facilitating collaboration and distribution of results in professional settings.
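The skeleton of a document is short; knitting the file below would produce an HTML report with the narrative, the code and its output interleaved:

---
title: "Monthly Report"
output: html_document
---

Some narrative text introducing the analysis.

```{r}
summary(cars)
plot(cars)
```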
Distill for R Markdown
Distill for R Markdown is a tool designed to streamline the creation of technical documents, offering features such as code folding, syntax highlighting and theming. It builds on existing frameworks like Pandoc, MathJax and D3, enabling the production of dynamic, interactive content. Users can customise the appearance with CSS and incorporate appendices for supplementary information. The tool acknowledges the contributions of developers who created foundational libraries, ensuring accessibility and functionality for a wide audience. Its design prioritises clarity, allowing authors to focus on presenting results rather than underlying code, while maintaining flexibility for those who wish to include detailed explanations.
{shiny}
For a while, this was one of R's unique selling points, and it remains a compelling reason to use the language even now that Python has a version of the package. By enabling the creation of interactive web applications for data analysis without requiring web development expertise, it allows users to build interfaces that let others explore data through dynamic visualisations and filters. Here is a simple example: an app that generates scatter plots with adjustable variables, species filters and marginal plots, hosted either on personal servers or through a dedicated hosting service.
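In that spirit, a stripped-down sketch with adjustable variables might look like this:

library(shiny)
ui <- fluidPage(
  selectInput("x", "X variable:", names(mtcars), selected = "wt"),
  selectInput("y", "Y variable:", names(mtcars), selected = "mpg"),
  plotOutput("scatter")
)
server <- function(input, output) {
  output$scatter <- renderPlot(
    plot(mtcars[[input$x]], mtcars[[input$y]], xlab = input$x, ylab = input$y)
  )
}
shinyApp(ui, server)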
{bslib}
The {bslib} R package offers a modern user interface toolkit for Shiny and R Markdown applications, leveraging Bootstrap to enable the creation of customisable dashboards and interactive theming. It supports the use of updated Bootstrap and Bootswatch versions while maintaining compatibility with existing defaults, and provides tools for real-time visual adjustments. Installation is available through CRAN, with example previews demonstrating its capabilities.
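A small sketch of applying a Bootswatch theme to a Shiny app; the minty theme is simply an example:

library(shiny)
library(bslib)
ui <- fluidPage(
  theme = bs_theme(version = 5, bootswatch = "minty"),  # Bootstrap 5 with a Bootswatch preset
  titlePanel("Themed app"),
  sliderInput("n", "Observations:", min = 10, max = 100, value = 50),
  plotOutput("hist")
)
server <- function(input, output) {
  output$hist <- renderPlot(hist(rnorm(input$n)))
}
shinyApp(ui, server)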
{rhandsontable}
Enabling users to manipulate and validate data within a spreadsheet-like interface, the {rhandsontable} package introduces an interactive data grid for R. It supports features such as custom cell rendering, validation rules and integration with Shiny applications. When used in Shiny, the widget requires explicit conversion of data using the hot_to_r function, as updates may not be immediately reflected in reactive contexts. Examples demonstrate its application in various scenarios, including date editing, financial calculations and dynamic visualisations linked to charts. The package also accommodates bookmarks in Shiny apps with specific handling. Users are encouraged to report issues, contribute improvements or suggest new functionality, and the development team welcomes such feedback.
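A minimal sketch of the Shiny round trip, including that hot_to_r conversion:

library(shiny)
library(rhandsontable)
ui <- fluidPage(rHandsontableOutput("grid"), verbatimTextOutput("sums"))
server <- function(input, output) {
  output$grid <- renderRHandsontable(rhandsontable(head(mtcars[, 1:3])))
  output$sums <- renderPrint({
    req(input$grid)
    colSums(hot_to_r(input$grid))  # convert widget state back to a data frame
  })
}
shinyApp(ui, server)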
{xaringanExtra}
{xaringanExtra} offers a range of enhancements and extensions for creating and presenting slides with xaringan, enabling features such as adding an overview tile view, making slides editable, broadcasting in real time, incorporating animations, embedding live video feeds and applying custom styles. It allows users to selectively activate individual tools or load multiple features simultaneously through a single function call, supporting tasks like adding banners, enabling code copying, fitting slides to screen dimensions and integrating utility toolkits. The package is available for installation via CRAN or GitHub, providing flexibility for developers and presenters seeking to expand the functionality of their slides.
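Activation is typically a line or two in the setup chunk of a xaringan deck; the three features below are just examples:

```{r setup, include=FALSE}
xaringanExtra::use_tile_view()  # press O for an overview of all slides
xaringanExtra::use_editable()   # make marked slide elements editable live
xaringanExtra::use_clipboard()  # add copy buttons to code chunks
```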
15th March 2026
Moving from 32-bit SAS on Microsoft Windows to a 64-bit environment can look deceptively straightforward from the outside. The operating system is still Windows, programmes often run without alteration, and many data sets open just as expected. Beneath that continuity, however, sit several technical differences that matter considerably in practice, especially for organisations with long-lived code, established format libraries and regular exchanges with Microsoft Office files.
What makes this transition particularly awkward is that SAS treats some of these changes as more than a simple in-place upgrade. As Jacques Thibault notes in his PharmaSUG 2012 paper, a new operating system will often be accompanied by a new version of surrounding applications, and what matters most is ensuring sufficient time and resources to fully test existing programmes under the new environment before committing to the change. SAS file types are not uniformly portable across the 32-bit to 64-bit boundary, and support behaviour also differs by SAS release, with SAS 9.3 marking the point at which some earlier friction was meaningfully reduced. As of 2025, the current release of the SAS 9 line is SAS 9.4 Maintenance 9 (M9), and organisations running any SAS 9.4 release benefit from the data-set interoperability improvements first introduced in SAS 9.3, whilst the catalog and Office-integration issues described in this article remain relevant across all SAS 9.x environments.
Data Sets and Catalogs: A Fundamental Distinction
The broadest distinction is between SAS data sets and SAS catalogs. Data sets are generally more forgiving, while catalogs are not. SAS Usage Note 38339 explains that when upgrading from 32-bit to 64-bit Windows SAS in releases earlier than SAS 9.3, Cross-Environment Data Access (CEDA) is invoked to access 32-bit SAS data sets. CEDA allows the file to be read without immediate conversion, though it can impose restrictions and may reduce performance. The same note states directly that 64-bit SAS provides no access to 32-bit catalogs at all.
That distinction sits at the centre of most migration problems, and it is the reason a move that feels routine can catch teams off guard when they first encounter the ERROR: CATALOG was created for a different operating system message. As Chris Hemedinger explains in a post on The SAS Dummy, the move from 32-bit SAS for Windows to 64-bit SAS for Windows is, for all intents and purposes, a platform change from SAS's perspective, even though only the bit architecture has changed, and SAS catalogs are not portable across platforms.
How SAS Handles Data Sets Across the Boundary
For data sets, the picture is comparatively manageable. If a 32-bit SAS data set is opened in a 64-bit SAS session in releases before SAS 9.3, SAS writes a note to the log stating that the file is native to another host or that its encoding differs from the current session encoding, and that Cross-Environment Data Access will be used, which might require additional CPU resources and might reduce performance. This is SAS performing translation work in the background, and whilst useful for continued access, it is not always ideal for regular production use.
There is an important nuance that changes things significantly with SAS 9.3. In 32-bit SAS on Windows, the data representation is WINDOWS_32, whilst in 64-bit SAS on Windows it is WINDOWS_64. Hemedinger notes that in SAS 9.3 the developers taught SAS for Windows to bypass the CEDA layer when the only encoding difference is WINDOWS_32 versus WINDOWS_64. SAS Knowledge Base article 38379 confirms this, stating that from SAS 9.3 onwards, Windows 32-bit data sets can be read, written and updated in Windows 64-bit SAS, and vice versa, as a result of a change in how SAS determines file compatibility at open time. Users on SAS 9.3 and later, including all SAS 9.4 maintenance releases, may therefore see fewer warnings and less friction with ordinary data sets originating in 32-bit Windows SAS.
Converting Data Sets to Native 64-Bit Format
Even with those SAS 9.3 improvements, many organisations prefer to convert files into the native 64-bit format rather than rely indefinitely on cross-environment access. For entire libraries, PROC MIGRATE is the recommended mechanism. SAS Usage Note 38339 notes that for releases preceding SAS 9.3, PROC MIGRATE can migrate 32-bit SAS data sets to 64-bit, changing their format so that CEDA is no longer required.
The advantages of PROC MIGRATE over the older conversion procedures are set out in detail by Diane Olson and David Wiehle of SAS Institute in their paper hosted by the University of Delaware. Unlike PROC COPY, PROC MIGRATE retains deleted observations, migrates audit trails, preserves all integrity constraints and automatically retains created and last-modified date/times, compression, encryption, indexes and passwords from the source library. It is designed to produce members in the target library that differ from the source only in being in the new SAS format.
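A minimal sketch of a whole-library migration, with the directory paths being placeholders:

libname old32 'C:\data\lib32';   /* 32-bit source library */
libname new64 'D:\data\lib64';   /* 64-bit target library */
proc migrate in=old32 out=new64;
run;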
When the task concerns individual SAS data files rather than a whole library, SAS Usage Note 38339 points to PROC COPY with the NOCLONE option. Used in a 64-bit SAS session, this copies a 32-bit Windows data set into a new file that is native to the 64-bit environment. The NOCLONE option prevents SAS from cloning the original data representation during the copy, so that the resulting file is written in the target environment's native format and CEDA is no longer needed to process it. Thibault's PharmaSUG paper illustrates this with an example using PROC COPY with the NOCLONE option together with an OUTREP setting on the target LIBNAME statement to force creation in the desired representation.
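Along those lines, a sketch for a single data set might look like this, with library paths and the member name invented for illustration:

libname src 'C:\data\lib32';
libname tgt 'D:\data\lib64' outrep=windows_64;  /* force the native 64-bit representation */
proc copy in=src out=tgt noclone memtype=data;
  select mydata;
run;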
Catalogs: The Hard Problem
Catalogs are a different matter entirely. If a user running 64-bit SAS attempts to open a catalog created in a 32-bit SAS session, the familiar error appears: ERROR: CATALOG was created for a different operating system. In the case of format catalogs, a related message often reads ERROR: File LIBRARY.FORMATS.CATALOG was created for a different operating system, and this is frequently followed by failures to use user-defined formats attached to variables. As the SSCC guidance from the University of Wisconsin-Madison notes, this can prevent 64-bit SAS from reading the data set at all, with the error about formats recorded only in the log whilst the visible symptom is simply that the table did not open.
This matters because catalogs are machine-dependent. User-defined formats created by PROC FORMAT are usually stored in catalogs, often in a member named FORMATS. If those formats were built in 32-bit SAS, 64-bit SAS cannot use the catalog directly, and this affects not only explicit formatting in code but also routine data viewing because a data set linked to permanent user-defined formats may fail to display properly unless the associated format catalog is converted.
Options for Migrating Format Catalogs
There are several ways to address catalog incompatibility. If the original PROC FORMAT source code still exists, the cleanest option is simply to rerun it under 64-bit SAS, producing a fresh native catalog. The SSCC guidance treats this as the easiest solution that preserves the formats themselves, and it also describes a short-term workaround: adding a bare FORMAT statement (for example, format female;) to the DATA or PROC step, which detaches the custom format from that variable so that the problem catalog need not be read at all.
When source code is not available, transport-based conversion is the answer. In a 32-bit SAS session, PROC CPORT creates a transport file from the catalog library, and in a 64-bit SAS session, PROC CIMPORT recreates the catalog in the new environment. SAS Knowledge Base article KB0041614 provides sample code that creates a transport file in 32-bit SAS using proc cport lib=my32 file=trans memtype=catalog; select formats; and then unloads it in 64-bit SAS using PROC CIMPORT, after which a new Formats.sas7bcat file should be present in the target library. The same article notes that if access to a 32-bit SAS session is simply not available, the system option NOFMTERR can be submitted as a last resort: this allows the underlying data values to be displayed whilst user-defined formats are ignored, avoiding the error without converting the catalog.
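Putting both halves together, an end-to-end sketch looks like this, with paths chosen purely for illustration:

/* in the 32-bit SAS session */
libname my32 'C:\formats32';
filename trans 'C:\temp\formats.xpt';
proc cport lib=my32 file=trans memtype=catalog;
  select formats;
run;

/* in the 64-bit SAS session */
libname my64 'D:\formats64';
filename trans 'C:\temp\formats.xpt';
proc cimport lib=my64 file=trans;
run;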
A more robust route for user-defined formats is to avoid moving the catalog as a catalog at all. PROC FORMAT can write format definitions to a standard SAS data set using CNTLOUT, and later rebuild them from that data set using CNTLIN. Because SAS data sets are generally portable across the 32-bit to 64-bit boundary, this method sidesteps the catalog incompatibility directly. KB0041614 describes CNTLOUT/CNTLIN as the most robust method available for migrating user-defined format libraries. Karin LaPann, writing in a poster presented at a meeting of the Philadelphia Area SAS Users Group, reaches the same conclusion and recommends always creating data sets from format catalogs and storing them alongside the data in the same library as a matter of good practice.
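The round trip is pleasingly short; the library references here are illustrative, and the control data set should live in a permanent library reachable from both sessions:

/* 32-bit session: write the format definitions to a portable data set */
proc format lib=library cntlout=shared.fmtdata;
run;

/* 64-bit session: rebuild the catalog from that data set */
proc format lib=library cntlin=shared.fmtdata;
run;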
Caveats: Item Stores, Compiled Macros and the PIPE Engine
SAS Usage Note 38339 explicitly states that stored compiled macro catalogs are not supported by PROC CPORT and must be recompiled in the new operating environment, with SAS Note 46846 covering compatibility guidance for those files specifically. The note also warns that the 32-bit version of SAS should not be removed until it can be verified that all 32-bit catalogs have been successfully migrated.
Thibault's PharmaSUG paper identifies two further file types that require attention. SAS Item Store files (.sas7bitm), which organisations may use to store standard PROC TEMPLATE output templates, are not compatible across 32-bit and 64-bit environments, and the practical solution is to recreate them under the new environment using the same programme that created them originally, targeting a different output directory to avoid a mixed 32-bit and 64-bit directory. Thibault also notes that programmes using the PIPE engine may produce errors on Windows 64-bit environments, and recommends replacing such code with newer SAS functions such as filename, dopen and dread to avoid the issue altogether. These are not universal blockers, but they underline why testing is essential rather than assumed.
Microsoft Office Integration After the Move
Another area where 64-bit moves catch users out is access to Microsoft Excel and Access files. The issue is not SAS data compatibility but the bit-ness of the Microsoft data providers. In 64-bit SAS for Windows, attempts to use PROC IMPORT with DBMS=EXCEL, PROC EXPORT with Excel or Access options, or LIBNAME EXCEL can fail with errors such as ERROR: Connect: Class not registered or Connection Failed. As Hemedinger explains, the cause is that the 64-bit SAS process cannot use the built-in data providers for Microsoft Excel or Microsoft Access, which are usually 32-bit modules. Thibault's paper confirms that installation of the PC Files Server on the same machine will be required, since the required 32-bit ODBC drivers are incompatible with 64-bit SAS on Windows.
The workarounds depend on the file type and local setup. SAS/ACCESS to PC Files provides methods such as DBMS=EXCELCS, DBMS=ACCESSCS and LIBNAME PCFILES, all of which use the PC Files Server as an intermediary, with an autostart feature that minimises configuration changes to existing SAS programmes. For .xlsx files, DBMS=XLSX removes the Microsoft data providers from the equation entirely and requires no additional setup from SAS 9.3 Maintenance 1 onwards. Installing 64-bit Microsoft Office may appear to solve the bit-ness mismatch by supplying 64-bit providers, but as Hemedinger cautions, Microsoft recommends the 64-bit version of Office in only a few circumstances, and that route can introduce other incompatibilities with how Office applications are used.
Identifying 32-Bit Catalogs in a Mixed Environment
In mixed environments, a practical challenge is identifying which catalogs are still 32-bit and which are already 64-bit. This was precisely the problem Michael Raithel posed on LinkedIn in March 2015, after finding that no SAS facility, whether PROC CATALOG, PROC CONTENTS, PROC DATASETS or the Dictionary Tables, provided a direct way to distinguish them. His solution treats the .sas7bcat file as a flat file rather than a catalog, reading the first record and searching for the character strings W32_7PRO (identifying a 32-bit catalog) and X64_7PRO (identifying a 64-bit catalog). The macro he developed can be run against any number of catalogs and builds a SAS data set recording the bit-ness and full path of each file, making large-scale inventory automation entirely practical during a phased transition.
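The idea can be sketched briefly; this is my own minimal rendering of the approach rather than Raithel's macro, and the path is illustrative:

data catbits;
  infile "C:\mylib\formats.sas7bcat" lrecl=32767 truncover;  /* treat the catalog as a flat file */
  input line $char1024.;
  if _n_ = 1;                                                /* the header sits in the first record */
  if index(line, 'W32_7PRO') then bitness = '32-bit';
  else if index(line, 'X64_7PRO') then bitness = '64-bit';
run;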
For broader validation work, the Olson and Wiehle paper pairs PROC MIGRATE with macros based on PROC CONTENTS, PROC DATASETS and PROC COMPARE, documenting what existed in the source library before migration and verifying what exists in the target library afterwards. For highly regulated or large-scale environments, that kind of structured checking is not optional.
Navigating the Transition Without Unnecessary Disruption
The main lesson from all of this is that moving from 32-bit to 64-bit SAS on Windows is not simply a matter of reinstalling software and carrying on unchanged. Much will work as before, particularly with ordinary data sets and particularly in SAS 9.3 and later. Catalogs, format libraries, item stores and Microsoft Office integration, however, require deliberate attention.
The transition is not so much problematic as predictable. Keeping 32-bit SAS available until catalog migration is confirmed, using PROC MIGRATE for full libraries, using PROC COPY with NOCLONE for individual data sets, converting format catalogs via CPORT/CIMPORT or CNTLOUT/CNTLIN, recreating item stores and compiled macros in the new environment and testing Office-related workflows and PIPE-based code before deployment together form a sound path through the process. With that preparation in place, the advantages of a 64-bit environment can be gained without avoidable disruption.
12th March 2026
Outlook continues to evolve across Windows, with a mixture of everyday personalisation options for users and deployment controls for administrators. Recent guidance from Microsoft brings together practical steps for composing messages in a preferred typeface, approaches for reading messages more comfortably, and a set of administrative measures to manage when and how the new Outlook appears in an organisation. Alongside this are reminders about where Outlook stores data on different account types and how that affects moving between computers, as well as pointers for finding POP, IMAP and SMTP settings for Outlook.com when manual configuration is needed. What follows draws these threads together so that individual users and IT teams can navigate the changes with clarity.
Changing the Default Font for New Messages and Replies
For those composing email, Outlook starts with a familiar default: new messages use Calibri in black. This is only a starting point because the application allows the font, its colour, size and style to be changed, and it treats new messages separately from replies and forwards so that different choices can be set for each if desired.
In new Outlook for Windows, the path goes like this: View > View Settings > Email > Compose and Reply. Under Message Format, the preferred font, size and style can be chosen before saving, and these settings then apply whenever a message is written or a reply is sent. Note that in new Outlook the font setting applies to both new messages and replies and forwards from a single control, so a separate choice for each is not available in this version.
In classic Outlook for Windows, the approach is different and more granular. Navigating to File > Options > Mail reveals a Stationery and Fonts button. On the Personal Stationery tab, there are separate Font buttons for new mail messages and for replying or forwarding messages, which allows a distinct typeface, size and colour to be set for each scenario independently. This separation can be useful for distinguishing composed messages from replied ones at a glance. If similar changes are needed for the message list rather than the compose window, there is a separate set of options for changing the font or font size in the message list.
Adjusting the Zoom Level in the Reading Pane
Comfort when reading is equally important, particularly with longer emails. Both new and classic Outlook offer ways to adjust zoom in the Reading Pane without touching system-wide display settings, though the controls differ between the two versions. In new Outlook, selecting a message in the inbox opens it in the Reading Pane, after which the View tab's Zoom control can be used. Zooming in and out is done with plus and minus buttons, and there is a Reset option that returns the view to its default level. In classic Outlook, the same result can be achieved either by dragging the zoom bar at the bottom right of the window or by going to View and then Zoom, where a specific percentage between 50% and 200% can be chosen. Classic Outlook also offers a "Remember my preference" checkbox in the Zoom dialogue, which locks the chosen level so it persists across sessions without needing to be reset each time. In both versions, these adjustments affect only how messages appear on the screen and have no bearing on how they are composed or how recipients will see them.
Confirming Which Version of Outlook Is in Use
Not every copy of Outlook presents the same options at the same time. If steps that are described as applying to new Outlook do not appear, the device may still be running classic Outlook for Windows. That is not uncommon in environments where administrators are controlling the transition or where devices have not yet received the relevant updates, so checking the version in use is a sensible first step before assuming that something has gone wrong.
Hiding the New Outlook Toggle in Classic Outlook
For administrators, a recurring question is how to prevent users from switching to new Outlook until the organisation is ready. Microsoft provides a cloud policy in the Microsoft 365 Apps admin centre that hides the Try the new Outlook toggle in classic Outlook for Windows. After signing in to the admin centre, the policy can be created by going to Customisation, selecting Policy Management and enabling the policy named Hide the "Try the new Outlook" toggle in Outlook. There is also a registry-based method for controlling the same setting: the key is HKEY_CURRENT_USER\Software\Microsoft\Office\16.0\Outlook\Options\General and the value is named HideNewOutlookToggle, with a DWORD value of 1 hiding the toggle and 0 showing it again. As with any registry change, this approach is best handled with care and in line with internal change management practices.
Removing the New Outlook App After Preinstallation on Windows 11
Preinstallation of the new Outlook on Windows 11 is another area where planning matters. On Windows 11 builds later than version 23H2, the app is preinstalled for all users, and there is currently no way to block that preinstallation. If devices should not surface the new Outlook, it can be removed after installation using the following Windows PowerShell command:
Remove-AppxProvisionedPackage -AllUsers -Online -PackageName (Get-AppxPackage Microsoft.OutlookForWindows).PackageFullName
After deprovisioning, Windows updates will not reinstall the app. Administrators can also remove an additional Windows orchestrator registry value at HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\WindowsUpdate\Orchestrator\UScheduler_Oobe\OutlookUpdate where applicable. Devices that have installed the March 2024 Non-Security Preview release, or a later cumulative update for Windows 11 version 23H2, respect the deprovisioning command and do not require removal of that registry value.
Handling User-Installed Instances and Start Menu Placeholders
Users may also install the app themselves, for example by selecting a toggle. In that case, the management approach shifts from provisioned packages to installed packages, and the following PowerShell command removes the app for all users:
Remove-AppxPackage -AllUsers -Package (Get-AppxPackage Microsoft.OutlookForWindows).PackageFullName
It is worth verifying whether the app is actually installed or whether only a Start menu placeholder is visible because a pinned icon may appear even when the underlying app is not yet present. A quick check of the folder at %localappdata%\Microsoft\Olk\logs can confirm whether the app has produced logs, and Start layout policies can be used to manage pins, so users are not inadvertently prompted to install by selecting a placeholder. On consumer devices, a Recommended section in the Windows 11 Start menu can also surface the app, which may need consideration in user communications.
Migrating Users Away from Windows Mail and Calendar
The end of support for Windows Mail and Calendar on the 31st of December 2024 introduced another migration pathway. Active users of those apps are being switched automatically to the new Outlook app, so organisations that wish to block that route can remove the Mail and Calendar apps from devices using the following command:
Get-AppxProvisionedPackage -Online | Where-Object {$_.DisplayName -match "microsoft.windowscommunicationsapps"} | Remove-AppxProvisionedPackage -Online
For current users, the installed package can be removed with Remove-AppxPackage -AllUsers -Package (Get-AppxPackage microsoft.windowscommunicationsapps).PackageFullName. Alternatives exist through Microsoft Intune or Configuration Manager, which may be preferable in environments that already use those tools for application lifecycle management.
Blocking Acquisition via the Microsoft Store
Preventing acquisition from the Microsoft Store is more straightforward. Because the new Outlook for Windows is available there as well, blocking access to the Microsoft Store app prevents users from downloading it through that channel. Microsoft provides configuration options for controlling Microsoft Store access, and administrators can align those with broader device management policies that may already limit consumer app installs on corporate devices.
Opting Out of Automatic Migration
Some organisations will want to opt out of new Outlook migration entirely for a period. Starting in January 2025, users with Microsoft 365 Business Standard and Premium licences are automatically migrated from classic Outlook to new Outlook, with in-app notifications sent before the switch and the option to toggle back afterwards. Microsoft exposes a policy named Manage user setting for new Outlook automatic migration that controls whether users are switched automatically. If the policy is not set, the user setting remains uncontrolled and users can manage it themselves, with the default being enabled. Enabling the policy enforces automatic migration and prevents users from changing the setting, while disabling it turns off automatic migration and also prevents user changes. The equivalent registry setting sits under HKEY_CURRENT_USER\Software\Policies\Microsoft\office\16.0\outlook\preferences with a DWORD named NewOutlookMigrationUserSetting set to 0 to disable or 1 to enable. The same controls can be managed via Group Policy Administrative Templates and through the Cloud Policy service from the Microsoft 365 Apps admin centre, and because the setting is defined in ADMX templates it can also be surfaced in Intune using Administrative Templates.
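For scripted deployments, the same value can be set with PowerShell; the path mirrors the one above and this example disables automatic migration:

New-ItemProperty -Path "HKCU:\Software\Policies\Microsoft\office\16.0\outlook\preferences" -Name "NewOutlookMigrationUserSetting" -Value 0 -PropertyType DWord -Force
# if the key does not yet exist, create it first with New-Item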
Applying Conditional Access and Mailbox Policies
Beyond installation state and migration timing, access policies are a decisive layer of control. Conditional Access policies can require multifactor authentication, restrict access by location, block risky sign-in behaviours or insist on organisation-managed devices. For additional nuance, Outlook on the web (OWA) mailbox policies used together with the ConditionalAccessPolicy parameter can limit capabilities for users on non-compliant devices, for instance by restricting attachments. This approach allows a more graduated user experience that reduces risk without completely blocking access, and it can be combined with broader Conditional Access baseline requirements.
There are cases where a firmer control is required. To prevent mailbox access from the new Outlook regardless of how users acquired the app, administrators can use an Exchange mailbox policy that blocks organisation mailboxes from being added. This acts as a final block so that work or school accounts cannot be used in the app, even if an individual user has installed it or found it preinstalled. Because mailbox policies are applied to the account rather than to a device or a specific app, it is prudent to consider them alongside the earlier measures that block acquisition or control installation, so that personal accounts are not used in ways that bypass organisational safeguards.
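In Exchange Online PowerShell, those two controls look like the following sketch; the policy name, mailbox identity and chosen values are illustrative:

Set-OwaMailboxPolicy -Identity "OwaMailboxPolicy-Default" -ConditionalAccessPolicy ReadOnly
Set-CASMailbox -Identity user@contoso.com -OneWinNativeOutlookEnabled $false  # block the new Outlook app for this mailbox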
Understanding How Outlook Stores Data and What Moves to a New Computer
While deployment and access are important, day-to-day continuity often depends on understanding how Outlook stores data and how that affects moving to a new computer. Outlook saves backup information in a variety of locations depending on the account type involved. For users of Microsoft 365, Exchange, Outlook.com, Hotmail.com or Live.com accounts not accessed by POP or IMAP, email is backed up on the server and there is no Personal Folders file with a .pst extension. An Offline Folders file with an .ost extension may be present, but Outlook automatically recreates this when a new email account is added, and it cannot be moved between computers. Other elements such as navigation pane settings, print styles, signatures and stationery can be transferred, and their locations vary with version and configuration.
Users of POP accounts encounter a different arrangement. All email, calendar, contact and task information is stored in a .pst file, and moving this file to a new computer preserves that information. It does not carry over the account settings themselves, so Outlook needs to be set up on the new computer before opening the .pst file that was copied from the old one. On Windows 11, navigation pane settings are found at drive:\Users\<username>\AppData\Roaming\Microsoft\Outlook and signatures at drive:\Users\<username>\AppData\Roaming\Microsoft\Signatures. Knowing these paths saves time during a migration and reduces the risk of overlooking important data.
Avoiding OneDrive Synchronisation Problems with PST Files
Large .pst files can slow down OneDrive synchronisation if they are stored in folders that OneDrive is backing up. Symptoms include messages such as "Processing changes" or "A file is in use" that persist for longer than expected. Microsoft provides guidance on removing an Outlook PST data file from OneDrive if that becomes necessary, and doing so can restore normal synchronisation behaviour while keeping Outlook functional on the local machine.
Showing Hidden Files and Extensions on Windows
Locating Outlook data sometimes means revealing folders and file name extensions that Windows hides by default. This is especially true when navigating to AppData or similar directories, or when differentiating between PST and OST files. In Windows 11 File Explorer, go to View > Show, where both "File name extensions" and "Hidden items" can be toggled on. Doing so makes the AppData folder and the distinction between these file types visible without needing to navigate through the Control Panel.
Configuring POP, IMAP and SMTP Settings for Outlook.com
Configuration of Outlook.com accounts brings its own questions when used in the Outlook desktop app or other mail applications. Outlook and Outlook.com can often detect the correct mailbox settings automatically, which simplifies setup for many users. When that is not the case, or when using a third-party app, the POP, IMAP and SMTP settings can be viewed within Outlook.com settings and used for manual configuration. For Outlook.com accounts, both the IMAP and POP server name is outlook.office365.com, with IMAP using port 993 and POP using port 995, both with SSL/TLS encryption and OAuth2 authentication, while outgoing mail goes through the SMTP server smtp-mail.outlook.com on port 587 with STARTTLS. It is worth noting that POP and IMAP access is disabled by default in Outlook.com and must be enabled in account settings before either protocol can be used. For other non-Microsoft accounts, the safest course is to obtain settings directly from the relevant email provider rather than guessing values, since incorrect entries can lead to connection issues that are not always obvious at first glance.
Getting Support for Outlook.com
Support remains close at hand for Outlook.com users who need it. The Help option on the menu bar in Outlook.com opens self-help resources where queries can be entered and common issues surfaced. If those do not resolve the problem, there is a path to contact support, which requires signing in to the account so that assistance can be tailored. If signing in is not possible, Microsoft directs users to a separate route to begin recovery or get help, and the Outlook.com Community provides an additional place to search for answers or ask questions from other users.
Keeping Users and IT Teams Informed During Outlook's Transition
Together, these user-facing features and administrative controls reflect a period of transition for Outlook on Windows. Individuals can shape the way they write and read messages, adjusting fonts to suit their preferences and using zoom where needed, without altering system-wide settings. Administrators can pace the adoption of the new Outlook with policies that hide toggles, prevent or reverse preinstallation, opt out of automatic migration and apply Conditional Access or mailbox policies that enforce organisational requirements. Underneath these changes, the fundamentals of data storage and account setup remain steady, with server-backed accounts recreating their local caches on-demand and POP accounts relying on .pst files that can be moved with care. By keeping these points in mind, users and IT teams alike can make informed decisions that avoid surprises and maintain a smooth email experience.
21st June 2024
SAS Innovate 2024 provided insight into evolving approaches to analytics modernisation, platform development and applied data science across multiple industries. This document captures observations from sessions addressing strategic platform migrations, unified analytics environments, enterprise integration patterns and practical applications in regulated sectors. The content reflects a discipline transitioning from experimental implementations to production-grade, business-critical infrastructure.

Strategic Platform Modernisation
A presentation from DNB Bank detailed the organisation's migration from SAS 9.4 to SAS Viya on Microsoft Azure. The strategic approach proved counter-intuitive: whilst SAS Viya supports both SAS and Python code seamlessly, DNB deliberately chose to rewrite their legacy SAS code library into Python. The rationale combined two business objectives. First, expanding the addressable talent market by tapping into the global Python developer pool. Second, creating a viable exit strategy from their primary analytics vendor, ensuring compliance with financial regulatory requirements to demonstrate realistic vendor transition options within 30 to 90 days.
This decision represents a fundamental shift in enterprise software value propositions. Competitive advantage no longer derives from creating vendor lock-in, but from providing powerful, stable and governed environments that fully embrace open-source tools. The winning strategy involves convincing customers to remain because the platform delivers undeniable value, not because departure presents insurmountable difficulty. This is a sign of a maturing market, where value flows through partnership rather than proprietary constraints.
Unified Analytics Environments
A healthcare analytics presentation addressed the persistent debate between low-code/no-code interfaces for business users and professional coding environments for data scientists. Two analysts tackled identical problems (predicting diabetes risk factors using a public CDC dataset) using different approaches within the same platform.
The low-code user employed SAS Viya's Model Studio, a visual interface. This analyst assessed the model for statistical bias against variables such as age and gender by selecting a configuration option, whereupon the platform automatically generated fairness statistics and visualisations.
The professional coder used SAS Viya Workbench, a code-first environment similar to Visual Studio Code. This analyst manually wrote code to perform identical bias assessments. However, direct code access enabled fine-tuning of variable interactions (such as age and cholesterol), ultimately producing a logistic regression model with marginally superior performance compared to the low-code approach.
The demonstration illustrated that the debate presents a false dichotomy. The actual value resides in unified platforms, enabling both personas to achieve exceptional productivity. Citizen data scientists can rapidly build and validate baseline models, whilst expert coders can refine those same models with advanced techniques and deploy them, all within a single ecosystem. This unified approach reflects a maturing discipline, where focus shifts from tribal tool debates to collective problem-solving.
Analytics as Enterprise Infrastructure
Multiple architectural demonstrations illustrated analytics platforms evolving beyond sophisticated workbenches for specialists into the central nervous system of enterprise operations. Three distinct patterns emerged:
The AI Assistant Architecture: A demonstration featured a customer-facing AI assistant built with Azure OpenAI. When users interacted with the chatbot regarding credit risk, requests routed through Azure Logic App not to the large language model for decisions but to a SAS Intelligent Decisioning engine. The SAS engine functioned as the trusted decision core, executing business rules and models to generate real-time risk assessments, which returned to the chatbot for customer delivery. SAS provided not the interface but the automated decision engine.
The Digital Twin Pattern: A pharmaceutical use case described using historical data from penicillin manufacturing batches to train machine learning models. These models became digital twins of physical bioreactors. Rather than conducting costly and time-consuming physical experiments, researchers executed thousands of in silico simulated experiments, adjusting parameters in the model to discover the optimal recipe for maximising yield (the "Golden Batch").
The Microsoft 365 Automation Hub: A workflow demonstration showed SAS programmes functioning as critical nodes in Microsoft 365 ecosystems. The automated process involved SAS code accessing SharePoint folders, retrieving Excel files, executing analyses, generating new reports as Excel files and delivering those reports directly into Microsoft Teams channels for business users.
These patterns mark a profound evolution. Analytics platforms are moving beyond being sophisticated calculators for experts, becoming foundational infrastructure: the connective tissue enabling intelligent automation and integrating disparate systems such as cloud office suites, AI interfaces and industrial hardware into cohesive business processes. This evolution from specialised tool to core infrastructure clearly indicates analytics' growing maturity within enterprise contexts.
Applied Data Science in High-Stakes Environments
Whilst much data science narrative focuses on e-commerce recommendations or marketing optimisation, compelling applications tackle intensely human, high-stakes operational challenges. Heather Hallett, a former ICU nurse and healthcare industry consultant at SAS, presented on improving hospital efficiency.
She described the challenge of staffing intensive care units, where having appropriate nurse numbers with correct skills proves critical. Staffing decisions constitute "life and death decisions". Her team uses forecasting models (such as ARIMA) to predict patient demand and optimisation algorithms (including mixed-integer programming) to create optimal nurse schedules. The optimisation addresses more than headcount; it matches nurses' specific skills, such as certifications for complex assistive devices like intra-aortic balloon pumps, to forecasted needs of the sickest patients.
A second use case applied identical operational rigour to community care. Using the classic "travelling salesman solver" from optimisation theory, the team planned efficient daily routes for mobile care vans serving maximum numbers of patients in their homes, delivering essential services to those unable to reach hospitals easily.
These applications ground abstract concepts of forecasting and optimisation in deeply tangible human contexts. They demonstrate that beyond driving revenue or reducing costs, the ultimate purpose of data science and operational analytics can be directly improving and even saving human lives. This application of sophisticated mathematics to life preservation marks data science's evolution from commercial tool to critical component of human-centred operations.
Transparency as Competitive Advantage
In highly regulated industries such as pharmaceuticals, generating trustworthy research proves paramount. A presentation from Japanese pharmaceutical company Shionogi detailed how they transform the transparency challenge in Real-World Evidence (RWE) into competitive advantage.
The core problem with RWE studies, which analyse data from sources such as electronic health records and insurance claims, involves their historical lack of standardisation and transparency compared to randomised clinical trials, leading regulators and peers to question validity. Shionogi's solution is an internal system called "AI SAS for RWE", addressing the challenge through two approaches:
Standardisation: The system transforms disparate Real-World Data from various vendors into a Shionogi-defined common data model based on OMOP principles, ensuring consistency where direct conversion of Japanese RWD proves challenging.
Semi-Automation: It semi-automates the entire analysis workflow, from defining research concepts to generating final tables, figures and reports.
The most innovative aspect involves its foundation in radical transparency. The system automatically records every research process step: from the initial concept suite where analysis is defined, through specification documents, final analysis programmes and resulting reports, directly into Git. This creates a complete, immutable and auditable history of exactly how evidence was generated.
This represents more than a clever technical solution; it constitutes profound strategic positioning. By building transparent, reproducible and efficient systems for generating RWE, Shionogi directly addresses core industry challenges. They work to increase research quality and trustworthiness, effectively transforming regulatory burden into competitive edge built on integrity. This move toward provable, auditable results is the hallmark of a discipline transitioning from experimental art to industrial-grade science.
User Experience as Productivity Multiplier
In complex data tool contexts, user experience (UX) has evolved beyond "nice-to-have" aesthetic features into a central product strategy pillar, directly tied to user productivity and talent acquisition. A detailed examination of the upcoming complete rewrite of SAS Studio illustrated this point.
The motivation for the massive undertaking proved straightforward: the old architecture was slow and becoming a drag on user productivity. The primary goal for the new version involved making a web-based application "feel like a desktop application" regarding speed and responsiveness. To achieve this, the team focused on improvements directly boosting productivity for coders and analysts:
A Modern Editor: Integrating the Monaco editor used in the widely popular Visual Studio Code, providing familiar and powerful coding experiences.
Smarter Assistance: Improving code completion and syntax help to reduce errors and time spent consulting documentation.
Better Navigation: Adding features such as code "mini-maps" enabling programmers to navigate thousands of lines of code instantly.
For modern technical software, UX has become a fundamental competitive differentiator. Faster, more intuitive and less frustrating tools do not merely improve existing user satisfaction; they enhance productivity. In competitive markets for top data science and engineering talent, providing a best-in-class user experience represents a key strategy for attracting and retaining exceptional people. The next leap in team productivity might derive not from new algorithms but from superior interfaces.
Conclusion
These observations from SAS Innovate 2024 illustrate a discipline maturing. Data science is moving beyond isolated experiments and "science projects", becoming pragmatic, integrated, transparent and deeply human business functionality. Focus shifts from algorithmic novelty to real-world application value (whether enabling better user experiences, building regulatory trust or making life-or-death decisions on ICU floors).
As analytics becomes more integrated and accessible, the challenge involves identifying where it might unexpectedly transform core processes within organisations, moving from specialist concern to foundational infrastructure enabling intelligent, automated and human-centred operations.
6th September 2016
There remain people who advise those on Windows 7 or 8.x to hold fire on upgrading to Windows 10. Now that the free upgrade is no longer available, that advice may hold more weight than it did. Even so, there are those among us who have jumped ship and are open to having the latest versions of things at no monetary cost to see what is available, and I must admit to being one of those.
After all, I do have a virtual machine with a pre-release version of the next update to Windows 10 installed on there to see what might be coming our way and to get a sense of what changes it may bring so that I am ready for them. Otherwise, I am usually happy to wait, but I noticed that the Windows 10 Anniversary Update only came to my HP Pavilion dm4 laptop and not to other machines with Windows 10 installed, so I started to wonder why there was a lag when it came to automatic upgrades.
So that these things do not arrive when it is least convenient, I took advantage of a manual method to choose my timing. This did not involve installation from a disk image, but was in-situ. The first part of the process is standard enough in that the Settings app was started and the Update & security item chosen. That dropped me onto the Windows Update section, and I first clicked on the Check for updates button to see what would happen. When nothing came of that, the Learn more link was clicked to bring me to part of the Microsoft support website, where I found that the Windows 10 Anniversary Update installer could be downloaded, so I duly did just that.
Running it produced a screen asking whether I wanted to proceed. Since I wanted to go ahead, the appropriate button was clicked and the machine left alone until the process completed. Because the installer is purely a facilitator, the first stage is to download the rest of the files needed, and that will take a while on any connection. Once downloading was completed, the actual process of installation commenced, with several restarts before a log-in screen was again on offer. On logging in to the machine, the last part of the process started.
Though the process took quite a while, it seemingly worked without a hitch. If there was anything that I needed to do, it was the re-installation of VirtualBox Guest Additions to restore access to shared folders, as well as dealing with a self-inflicted irritation. Otherwise, I have found that previously installed software worked as expected and no file has been missed. Waiting a while may have had its advantages too because initial issues with the Anniversary Update will have been addressed, though it is best not to leave it too long, or you could have the feeling of being forgotten. A happy balance needs striking.
15th November 2013
With the release of Windows 8 around this time last year, I thought that the full retail version that some of us got for fresh installations on PCs, real or virtual, had become a thing of the past. In fact, it did seem that every self-respecting technology news website and magazine was saying just that. The release that you would buy from Microsoft or from mainstream computer stores was labelled as an upgrade. That made it look as if you needed the OEM or System Builder edition for those PCs that needed a new Windows installation, and that the licence that you bought was then attached to the machine from the time of installation.
As is usual with Microsoft, the situation is less clear-cut than that. For instance, there was some back-pedalling to allow OEM editions of Windows to be licensed for personal use on real or virtual PCs. With Windows 7 and its predecessors, it even was possible to install afresh on a PC without Windows by first installing an inactivated copy on there and then upgrading that as if it were a previous version of Windows. Of course, an actual licence for the previous version of Windows was needed for full compliance, if not for the actual installation. At times, Microsoft muddies waters to keep its support costs down.
Even with Microsoft's track record in mind, it still surprised me when I noticed that Amazon was selling what appeared to be full versions of both Windows 8.1 and Windows 8.1 Pro. Having set up a 64-bit VirtualBox virtual machine for Windows 8.1, I got to discover the same for software purchased from the Microsoft website. However, unlike the DVD versions, you do need an active Windows installation if you fancy a same-day installation of the downloaded software. For those without Windows on a machine, this can be as simple as downloading either the 32-bit or the 64-bit 90-day evaluation edition of Windows 8.1 Enterprise and using that as a springboard for the next steps. Though the default is an in-situ installation, there are options to create an ISO or USB image of the installation disk for later use.
In my case, I created a 64-bit ISO image and used that to reboot the virtual machine that had Windows 8.1 Enterprise on there before continuing with the installation. By all appearances, there seemed to be little need for a pre-existing Windows instance for it to work, so it looks as if upgrades have fallen by the wayside and only full editions of Windows 8.1 are available now. The OEM version saves money so long as you are happy to stick with just one machine, and most users probably will do that. As for the portability of the full retail version, that is not something that I have tested, so I am unsure that I will go beyond what I have done already.
My main machine has seen a change of motherboard, CPU and memory, so it could have deactivated a pre-existing Windows licence. However, I run Linux as my main operating system and, apart possibly from one surmountable hiccup, this proves surprisingly resilient in the face of such major system changes. For running Windows, I turn to virtual machines, and there were no messages about licence activation during the changeover either. Microsoft is anything but confiding when it comes to declaring what hardware changes inactivate a licence. Changing a virtual machine from VirtualBox to VMware or vice versa definitely does it, so I tend to avoid doing that. One item that is fundamental to either a virtual or a real PC is the motherboard, and I have seen suggestions that this is the critical component for Windows licence activation, which would make sense.
However, this rule is not hard and fast either, since there appears to be room for manoeuvre should your PC break. It might be worth calling Microsoft after a motherboard replacement to see if they can help you; from what I have noticed, they do. All in all, Microsoft often makes what appear to be simple rules, only to override them when faced with what happens in the real world. Is that why they can be unclear about some matters at times? Do they still hanker after how they want things to be, even when it is impossible to keep them like that?
6th July 2013
While I have a previous posting from 2009 that discusses adding Microsoft's Core Fonts to the then current version of Fedora, it did strike me that I hadn't laid out the series of commands that were used. Instead, I referred to an external and unofficial Fedora FAQ. That's still there, yet I also felt that I was leaving things a little to chance, given how websites can disappear quite suddenly.
Even after nearly four years, it still amazes me that you cannot install Microsoft's Core Fonts in Fedora as you would on Ubuntu, Linux Mint or even Debian. Therefore, the following series of steps is as necessary now as it was then.
The first step is to add in a number of precursor applications such as wget for command line file downloading from websites, cabextract for extracting the contents of Windows CAB files, rpmbuild for creating RPM installers, ttmkfdir for generating TrueType font metadata and the X font server (xfs) that chkfontpath needs:
sudo yum -y install rpm-build cabextract ttmkfdir wget xfs
Here, I have gone with terminal commands that use sudo, but you could become the superuser (root) for all of this, and there are those who believe you should. The -y switch tells yum to go ahead without prompting you for permission before it does any installations. The next step is to download the spec file for the Microsoft fonts package with wget:
sudo wget http://corefonts.sourceforge.net/msttcorefonts-2.0-1.spec
Once that is done, you need to install the chkfontpath package because the RPM for the fonts cannot be built without it:
sudo rpm -ivh http://dl.atrpms.net/all/chkfontpath
Once that is in place, you are ready to create the RPM file using this command:
sudo rpmbuild -ba msttcorefonts-2.0-1.spec
After the RPM has been created, it is time to install it:
sudo yum install --nogpgcheck ~/rpmbuild/RPMS/noarch/msttcorefonts-2.0-1.noarch.rpm
When installation has completed, the process is done. Because I used sudo, all of this happened in my own home area, so there was a need for some housekeeping afterwards. If you did it by becoming the root user, then the files would be there instead, and that's the scenario in the online FAQ.
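Should you wish to confirm that the fonts are now visible to the system, the fc-list utility from fontconfig (present on any Fedora desktop, though not part of the original FAQ's instructions) offers a quick check:
fc-list | grep -i arial
If Arial appears in the output, the installation has done its job.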
16th March 2013
One of the disadvantages of my Google/Asus Nexus 7 is that it needs a Wi-Fi connection to be of much use. Most of the time, this is not a problem, since I also have a Huawei mobile Wi-Fi hub from T-Mobile and this seems to work just about anywhere in the U.K. Away from the U.K. though, it won't work because roaming is not switched on for it, and that may be no bad thing given the fees that roaming could introduce. While my HTC Desire S could deputise, I need to watch costs with that too.
There's also the factor of download caps, and those apply both to the Huawei and to the HTC. Recently, I added Anquet's Outdoor Map Navigator (OMN) to my Nexus 7 through the Google Play Store for a fee of £7 and that allows access to any walking maps that I have bought from Anquet. However, those are large downloads, so the caps start to come into play. Frugality would help, but I began to look at other possibilities that make use of a laptop's Wi-Fi functionality.
Looking on the web, I found two options for this that work on Windows 7 (8 should be OK too): Connectify Hotspot and Virtual Router Manager. The first of these is commercial software, yet there is a Lite edition for those wanting to try it out; I cannot confirm that it is not a time-limited demo, though that did not seem to be the case, since it looked as if it merely lacked features that you would get by paying for the Pro variant. The second option is an open source one that is free of charge, apart from an invitation to donate to the project.
Though online tutorials show the usage of either of these to be straightforward, my experiences were not all that positive at the outset. In fact, there was something that I needed to do, which is how this post has come to exist at all. That happened even after the restart that Connectify Hotspot needed as part of its installation; it runs as a system service, which is why the restart was needed. In fact, it was Virtual Router Manager that told me what the issue was, and it needed no reboot. Neither did it disconnect my laptop from the network like the Connectify offering did, which was the cause of the latter's ejection from the system; limitations in favour of its paid edition aside, it may have the snazzier interface, but I'll take effective simplicity any day.
Using Virtual Router Manager turns out to be simple enough. It needs a network name (also known as an SSID), a password to restrict who accesses the network and the internet connection to be shared. In my case, that was Local Area Connection on the dropdown list. With all the required information entered, I was ready to start the router using the Start Network Router button. The text on this button changes to Stop Network Router when the hub is operational, or at least it should have done for me the first time that I ran it. What I got instead was the following message:
The group or resource is not in the correct state to perform the requested operation.
While the above may not say all that much, it becomes more than ample information if you enter it into the likes of Google. Behind the scenes, Virtual Router Manager uses native Windows functionality to create a Wi-Fi hub from a PC, and it appears to be the Microsoft Virtual Wi-Fi Miniport Adapter from what I have seen. When I tried setting up an ad hoc Wi-Fi network from a laptop to the Nexus 7 using Windows' own network set-up capability via its Control Panel, it didn't do what I needed, so there might be something extra that third-party software is doing. So, the interesting thing about the solution to my Virtual Router Manager problem was that it needed me to delve into the innards of Windows a little.
Firstly, there's running Command Prompt (All Programs > Accessories) from the Start Menu with Administrator privileges. It helps here if the account with which you log into Windows is in the Administrator group, since all you have to do then is right-click on the Start Menu entry and choose the Run as administrator entry in the pop-up context menu. With a command line window now open, you then need to issue the following command:
netsh wlan set hostednetwork mode=allow ssid=[network name] key=[password] keyUsage=persistent
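As an aside, the same hosted network machinery can be started and stopped entirely from the command line should you ever wish to dispense with third-party tools; these are standard netsh commands rather than anything that Virtual Router Manager itself requires:
netsh wlan start hostednetwork
netsh wlan stop hostednetwork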
When the set command had done its thing, Virtual Router Manager worked without a hitch, though it did turn itself off after a while, and that may be no bad thing from the security standpoint. On the Android side, it was a matter of going into Settings > Wi-Fi and choosing the new network that was created on the laptop. This sort of thing may apply to other types of tablet (dare I mention iPads?), so you could connect anything to the hub without needing to do any more on the Windows side.
For those wanting to know what's going on behind the scenes on Windows, there's a useful tutorial on Instructables that shows what third party software is saving you from having to do. Even if I never go down the more DIY route, I probably have saved myself having to buy a mobile Wi-Fi hub for any trips to Éire. For now, the Irish 3G dongle that I already have should be enough.
1st November 2012
Though my main home PC runs Linux Mint, I do like to have the facility to use Windows software occasionally, and virtualisation has allowed me to continue doing that. For a good while, it was a Windows 7 guest within a VirtualBox virtual machine and, before that, one running Windows XP fulfilled the same role. However, it did feel as if things were running slower in VirtualBox than once might have been the case, so I jumped ship to VMware Player. While it may be proprietary and closed source, it is free of charge and has been doing what was needed. A subsequent recent upgrade of a video driver on the host operating system allowed the enabling of a better graphical environment in the Windows 7 guest.
Instability
However, there were issues with stability, and I lost the ability to flit from the VM window to the Linux desktop at will, with the system freezing on me and needing a reboot. Working in Windows 7 using full-screen mode avoided this, yet it did feel as if I was constrained to working on a Windows-only machine whenever I did so. The graphics performance was imperfect too, with screen refreshes being very blocky and some momentary scrambling whenever I opened the Start menu. Others would not have been as patient with that as I was, though there was the matter of an expensive Photoshop licence to be guarded too.
In hindsight, a bit of pruning could have helped. An example would have been driver housekeeping in the form of removing VirtualBox Guest Additions because they could have been conflicting with their VMware counterparts. For some reason, those thoughts entered my mind to make me consider another, more expensive option instead.
Considering NAS & Windows/Linux Networking
That would have taken the form of setting aside a PC for running Windows 7 and having a NAS for sharing files between it and my Linux system. In fact, I did get to exploring what a four-bay QNAP TS-412 would offer me and realised that you cannot put normal desktop hard drives into devices like that. For a while, it looked as if it would be a matter of getting drives bundled with the device or acquiring enterprise-grade disks to maintain the required continuity of operation. The final edition of PC Plus highlighted another option, though: the Western Digital Red Pro range. These are part way between desktop and enterprise classifications and have been developed in association with NAS makers too.
While looking at the NAS option certainly became an education, it has exited any sort of wish list that I have. After all, it is the cost of such a setup that gets me asking if I really need such a thing. While the purchase of a Netgear FS 605 Ethernet switch would have helped incorporate it, there has been no trouble sorting alternative uses for that device since it bumps up the number of networked devices that I can have, never a bad capability to have. As I was to find, there was a less expensive alternative that would become sufficient for my needs.
In-situ Windows 8 Upgrade
Microsoft has been making available evaluation copies of Windows 8 Enterprise that last for 90 days before expiring. One in my hands has been running faultlessly in a VMware virtual machine for the past few weeks. That made me wonder if upgrading from Windows 7 to Windows 8 would help with my main Windows VM problems. Being a curious, risk-taking type, I decided to answer the question for myself using the £24.99 Windows 8 Pro upgrade offer that Microsoft have been running for those not needing a disk up front; those who want a disk from the outset pay £49.99, while you can order one afterwards for an extra £12.99 plus £3.49 postage if you wish, which works out slightly cheaper. Though there also was a time cost in that it occupied a lot of a weekend for me, it seems to have done what was needed, so it was worth the outlay.
Given the element of risk, Photoshop was deactivated to be on the safe side. That wasn't the only pre-upgrade action that was needed, because the Windows 8 Pro 32-bit upgrade needs at least 16 GB of free disk space before it will proceed. Of course, there was the matter of downloading the installer from the Microsoft website too. This took care of system evaluation and paying for the software, as well as the actual upgrade itself.
The installation took a few hours, with virtual machine reboots along the way. Naturally, the licence key was needed too, as well as the selection of a few options, though there weren't many of these. Being able to carry over settings from the pre-existing Windows 7 instance certainly helped with this and with making the process smoother too. No software needed reinstatement, and it doesn't feel as if the system has forgotten very much at all, a successful outcome.
Post-upgrade Actions
Just because I had a working Windows 8 instance didn't mean that there wasn't more to be done. In fact, it was the post-upgrade sorting that took up more time than the actual installation. For one thing, my digital mapping software wouldn't work without .Net Framework 3.5 and turning on the operating system feature from the Control Panel fell over at the point where it was being downloaded from the Microsoft Update website. Even removing Avira Internet Security after updating it to the latest version had no effect, and that was a finding during the Windows 8 system evaluation process. The solution was to mount the Windows 8 Enterprise ISO installation image that I had and issue the following command from a command prompt running with administrative privileges:
dism.exe /online /enable-feature /featurename:NetFX3 /Source:d:\sources\sxs /LimitAccess
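Should you want reassurance that the feature really has been enabled, dism can report on it too; this is standard usage of the tool rather than a step the fix itself requires:
dism.exe /online /get-features /format:table | find "NetFX3"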
For the sake of assurance regarding compatibility, Avira has been replaced with Trend Micro Titanium Internet Security. The Avira licence won't go to waste, since I have another home in mind for it. Removing Avira without crashing Windows 8 proved impossible, though, necessitating booting Windows 8 into Safe Mode. Because of much faster startup times, that cannot be achieved with a key press at the appropriate moment, because the time window is too short now. One solution is to set the Safe Boot tick box in the Boot tab of MSCONFIG (or System Configuration, as it otherwise calls itself) before the machine is restarted. While there may be others, this was the one that I used. With Avira removed, clearing the same setting and rebooting restored normal service.
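For what it's worth, the same Safe Boot flag can be set and cleared from an administrative command prompt using bcdedit; I used MSCONFIG myself, so treat these as the textbook equivalents rather than my own steps:
bcdedit /set {current} safeboot minimal
bcdedit /deletevalue {current} safeboot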
Dealing with a Dual Personality
One observer has stated that Windows 8 gives you two operating systems for the price of one: the one on the Start screen and the one on the desktop. Having got to wanting to work with one at a time, I decided to make some adjustments. Adding Classic Shell got me back a Start menu, and I omitted its Windows Explorer (or File Explorer, as it is known in Windows 8) and Internet Explorer components. Though Classic Shell will present a desktop like the one we have been getting from Windows 7 by sweeping the Start screen out of the way for you, I found that this wasn't quick enough for my liking, so I added Skip Metro Suite to speed things up. Though the tool does more than sweeping the Start screen out of the way, I have switched off those extra functions. Classic Shell has been configured too, so the Start screen can be accessed with a press of the Windows key. It has been updated as well, so that booting into the desktop should be faster now. As for me, I'll leave things as they are for now. Even the possibility of using Windows' own functionality to go directly to the traditional desktop will be left untested while things are left to settle. Tinkering can need a break.
Outcome
After all that effort, I now have a seemingly more stable Windows virtual machine running Windows 8. Flitting between it and other Linux desktop applications has not caused a system freeze so far, and that was the result that I wanted. There now is no need to consider having separate Windows and Linux PC's with a NAS for sharing files between them, so that option is well off my wish-list. There are better uses for my money.
Not everyone has had my experience, though, because I saw a report that one user failed to update a physical machine to Windows 8 and installed Ubuntu instead; they were a Linux user anyway, even if they used Fedora more than Ubuntu. It is possible to roll back from Windows 8 to the previous version of Windows because there is a windows.old directory left primarily for that purpose. However, that may not help you if you have a partially operating system that doesn't allow you to do just that. In time, I'll remove it using the Disk Clean-up utility by asking it to remove previous Windows installations, or by running File Explorer with administrator privileges. Somehow, the former approach sounds safer.
What About Installing Afresh?
While there was a time when I went solely for upgrades when moving from one version of Windows to the next, the annoyance of the process got to me. If I had known that installing the upgrade twice onto a computer with a clean disk would suffice, it would have saved me a lot. Starting from Windows 95 (from the days when you got a full installation disk with a PC and not the rescue media that we get now) and moving through a sequence of successors not only was time-consuming, but it also revealed the limitations of the first in the series when it came to supporting more recent hardware. It was enough to have me buying the full retail editions of Windows XP and Windows 7 when they were released; the latter got downloaded directly from Microsoft. While these were retail versions that you could move from one computer to another, Windows 8 will not be like that. In fact, you will need to get its System Builder edition from a reseller, and that can only be used on one machine. It is the merging of the former retail and OEM product offerings.
What I have been reading is that the market for full retail versions of Windows was not a big one anyway. However, it was how I used to work, as you have read above, and it does give you a fresh system. Most probably get Windows with a new PC and don't go building them from scratch like I have done for more than a decade. Maybe the System Builder version would apply to me anyway, and it appears to be intended for virtual machine use as well as for physical ones. More care will be needed with those licences by the looks of things, and I wonder what must remain unchanged so as not to invalidate a licence. After all, making a mistake might cost between £75 and £120, depending on the edition.
Final Thoughts
So far, Windows 8 is treating me well, and I have managed to bend it to my will too, always a good thing to be able to say. In time, it might be that a System Builder copy could need buying yet, but I'll leave well alone for now. Though I needed new security software, the upgrade still saved me money over a hardware solution to my home computing needs, and I have a backup disk on order from Microsoft too. That I have had to spend some time settling things in was a means of learning for me, but others may not be so patient and, with Windows 7 working well enough for most, you have to ask if it's only curious folk like me who are taking the plunge. Still, the dramatic change has re-energised the PC world in an era when smartphones and tablets have made so much of the running recently. That too is no bad thing, because an unchanging technology is one that dies, and there are times when significant changes are needed, as much as they upset some folk. For Microsoft, this looks like one of them, and it'll be interesting to see where things go from here for PC technology.
15th February 2007
There are things in the Vista EULA that gave me a shock when I first saw them. In fact, one provision set off something of a storm across the web in the latter part of 2006. Microsoft in its wisdom went and made everything more explicit, and raised Cain in doing so. It was their clarification of the one machine, one licence understanding that was at the heart of the whole furore. The new wording made it crystal clear that you were only allowed to move your licence between machines once and once only. After howls of protest, the XP wording reappeared and things calmed down again.
Around the same time, Paul Thurrott published his take on the Vista EULA on his Windows SuperSite. He takes the view that the new EULA only clarified what was in the XP one, and that enthusiast PC builders are but a small proportion of the software market. Another interesting point that he makes is that there is no need to license the home user editions of Vista for use in virtual machines, because those users would not be doing that kind of thing. The logical conclusion of this argument is that only technical business users and enthusiasts would ever want to do such a thing; I am both. On the same site, Koroush Ghazi of TweakGuides.com offers an alternative view, at Thurrott's invitation, from the enthusiast's side. That view takes note of the restrictions of both the licensing and all the DRM technology that Microsoft has piled into Vista. Another point made is that enthusiasts add a lot to the coffers of both hardware and software producers.
Bit-tech.net got the Microsoft view on the number of activations possible with a copy of retail Vista before further action is required. The number comes in at 10, and it seems a little low. However, Vista will differ from XP in that it thankfully will not need reactivation as often. In fact, it will take changing a hard drive and one other component to trigger it. That's less stringent than needing reactivation after changing three components from a wider list within a set period, as it is in XP. While I cannot remember the exact duration of the period in question, 60 days seems to ring a bell.
OEM Vista is more restrictive than this: one reactivation and no more. I learned that from the current issue of PC Plus, the trigger of my concern regarding Windows licensing. Nevertheless, so long as no hard drive changes go on, you should be fine. That said, I do wonder what happens if you add or remove an external hard drive. On this basis at least, it seems OEM is not such a bargain then and Microsoft will not support you anyway.
However, there are cracks appearing in the whole licensing edifice, and the whole thing is beginning to look a bit of a mess. Brian Livingston of Windows Secrets has pointed out that you could do a clean installation using only the upgrade edition(s) of Vista by installing it twice. The Vista upgrade will upgrade over itself, allowing you access to the activation process. Of course, he recommends that you only do this when you are already in possession of an XP licence, and it does mean that your XP licence isn't put out of its misery, apparently a surprising consequence of the upgrade process if I have understood it correctly.
However, this is not all. Jeff Atwood has shared on his blog Coding Horror that the 30-day activation grace period can be extended in three increments to 120 days. Another revelation was that all Windows editions are on the DVD, and it is only the licence key in your possession that determines the version that you install. In fact, you can install any version for 30 days without entering a licence key at all. Therefore, you can experience 32-bit or 64-bit versions and any edition from Home Basic, Home Premium, Business or Ultimate. The only catch is that once the grace period is up, you have to license the version that is installed at that time.
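For the record, the mechanism reported for the extension is the Software Licensing Management Tool that comes with Vista; running the following from an administrative command prompt is said to reset the grace period, with up to three such resets allowed (I have not tried it myself):
slmgr.vbs -rearm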
There is no cracking required for any of the above (though a quick Google search digs up loads of references to cracking of the Windows activation process). Though it sounds surprising, it is none other than Microsoft itself that has made these possibilities available, albeit in an undocumented fashion. And the reason is not commercial benevolence but the need to keep their technical support costs under control, apparently.
That said, an unintended consequence of the activation period extensibility is that PC hardware enthusiasts, the types who rebuild their machines every few months (in contrast, I regard my main PC as a workhorse and have no wish to cause undue disruption to my life with this sort of behaviour, but each to their own; anyway, it's not as if they are doing anyone else any harm), would not ever have to activate their copies of Vista, thus avoiding any issues with the activation limit of 1 or 10: an interesting workaround for the limitations in the first place. And all of this is available without (illegally, no doubt) using a fake Windows activation server, as has been reported.
With all of these back doors inserted into the activation process by Microsoft itself, some of the scarier provisions look not only over the top but also plain silly: a bit like using a sledgehammer to crack a nut. For instance, there is a provision that Microsoft could kill your Windows licence if it deems that you breached the terms of that licence. It looks as if it's meant to cover the loss in functionality at the end of the activation grace period, but it does rather give the appearance that your £370 Vista Ultimate is as ephemeral as a puff of smoke: overdoing that reminder is an almost guaranteed way of encouraging power users to jump ship to Linux or another UNIX. And the idea of Windows Genuine Advantage continually phoning home doesn't provide any great reassurance either. However, it does seem that Microsoft has reactivated XP licences over the phone when reasonable grounds are given: irredeemable loss of a system, for example. That ease and cost of technical support returns again. There is a corollary to this: make life easy for Microsoft, and they won't bother you very much, if at all. Incidentally, if they ever did do a remote kill of your system, the whole action would be akin to skating on legal thin ice. And I suspect that they may not like making trouble for themselves.
I think I’ll let the dust settle and stay on my XP planet while in a Vista universe. As it happens, Paul Thurrott has a good article on that subject too.