TOPIC: SHORTCUT
Some PowerShell fundamentals for practical automation
27th October 2025

In the last few months, I have taken to using PowerShell for automating tasks while working on a new contract. There has been an element of vibe programming in some of the scripts, which is why I wished to collate a reference guide that anyone can have to hand. While working with PowerShell every day does help to reinforce the learning, it also helps to look up granular concepts on a more bite-sized level. This especially matters given PowerShell's object-oriented approach. After all, many of us build things up iteratively from little steps, which also allows for more flexibility. Using an AI is all very well, yet the fastest recall is always from your own head.
1. Variables and Basic Data Types
Variables start with a dollar sign and hold values you intend to reuse, so names like $date, $outDir and $finalDir become anchors for later operations. Dates are a frequent companion in filenames and logs, and PowerShell's Get-Date makes this straightforward. A format string such as Get-Date -Format "yyyy-MM-dd" yields values like 2025-10-27, while Get-Date -Format "yyyy-MM-dd HH:mm:ss" adds a precise timestamp that helps when tracing the order of events. Because these commands return text when a format is specified, you can stitch the results into other strings without fuss.
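As a minimal sketch, the following pulls these pieces together; the variable names are illustrative rather than taken from any particular script:

$date = Get-Date -Format "yyyy-MM-dd"                 # e.g. 2025-10-27
$timestamp = Get-Date -Format "yyyy-MM-dd HH:mm:ss"   # e.g. 2025-10-27 09:15:42
$outDir = "psoutput_$date"                            # a date-stamped folder name
Write-Output "Run started at $timestamp, output goes to $outDir"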
2. File System Operations
As soon as you start handling files, you will meet a cluster of commands that make navigation robust rather than fragile. Join-Path assembles folder segments without worrying about stray slashes, Test-Path checks for the existence of a target, and New-Item creates folders when needed. Moving items with Move-Item keeps the momentum going once the structure exists.
Environment variables give cross-machine resilience; reading $env:TEMP finds the system's temporary area, and [Environment]::GetFolderPath("MyDocuments") retrieves a well-known Windows location without hard-coding. Setting context helps too, so Set-Location acts much like cd to make a directory the default focus for subsequent file operations. You can combine these approaches, as in cd ([Environment]::GetFolderPath("MyDocuments")), which navigates directly to the My Documents folder without hard-coded paths.
Scripts are often paired with nearby files, and Split-Path $ScriptPath -Parent extracts a parent folder from a full path so you can create companions in the same place. Network locations behave like local ones, with Universal Naming Convention paths beginning \\ supported throughout, and Windows paths do not require careful case matching because the file systems are generally case-insensitive, which differs from many Unix-based systems. Even simple details matter, so constructing strings such as "$Folder\*" is enough for wildcard searches, with backslashes treated correctly and whitespace handled sensibly.
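A short sketch shows how these commands tend to combine; the folder and file names here are hypothetical:

$outDir = Join-Path $env:TEMP "psoutput"        # assemble a path without stray slashes
if (-not (Test-Path $outDir)) {
    New-Item -ItemType Directory -Path $outDir | Out-Null
}
Move-Item -Path "C:\work\report.txt" -Destination $outDir
$scriptHome = Split-Path "C:\scripts\run.ps1" -Parent   # yields C:\scripts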
3. Arrays and Collections
Arrays are created with @() and make it easy to keep related items together, whether those are folders in a $locs array or filenames gathered into $progs1, $progs2 and others. Indexing retrieves specific positions with square brackets, so $locs[0] returns the first entry, and a variable index like $outFiles[$i] supports loop counters.
A single value can still sit in an array using syntax such as @("bm_rc_report.sas"), which keeps your code consistent when functions always expect sequences. Any collection advertises how many items it holds using the Count property, so checking $files.Count equals zero tells you whether there is anything to process.
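To make this concrete, here is a small sketch using names in the spirit of the text; the paths are placeholders:

$locs = @("\\server\share\study1", "\\server\share\study2")   # two UNC locations
$progs1 = @("bm_rc_report.sas")          # a single filename, still held in an array
$firstLoc = $locs[0]                     # indexing with square brackets
if ($progs1.Count -eq 0) { Write-Output "Nothing to process" }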
4. Hash Tables
When you need fast lookups, a hash table works as a dictionary that associates keys and values. Creating one with @{$locs[0] = $progs1} ties the first location to its corresponding programme list and then $locsProgs[$loc] retrieves the associated filenames for whichever folder you are considering. This is a neat stepping stone to loops and conditionals because it organises data around meaningful relationships rather than leaving you to juggle parallel arrays.
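Building on the array sketch above, a hash table might be assembled and queried like this (assuming $progs2 is a second filename array):

$locsProgs = @{ $locs[0] = $progs1; $locs[1] = $progs2 }   # key each folder to its files
foreach ($loc in $locs) {
    $files = $locsProgs[$loc]            # look up the programme list for this folder
    Write-Output "$loc has $($files.Count) file(s)"
}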
5. Control Flow
Control flow is where scripts begin to feel purposeful. A foreach loop steps through the items in a collection and is comfortable with nested passes, so you might iterate through folders, then the files inside each folder, and then a set of search patterns for those files. A for loop offers a counting pattern with initialisation, a condition and an increment written as for ($i = 0; $i -lt 5; $i++). It differs from foreach by focusing on the numeric progression rather than the items themselves.
Counters are introduced with $i = 0 and advanced with $i++, which in turn blends well with array indexing. Conditions gate work to what needs doing. Patterns such as if (-not (Test-Path ...)) reduce needless operations by creating folders only when they do not exist, and an else branch can note alternative outcomes, such as a message that a search pattern was not found.
Sometimes there is nothing to gain from proceeding, and break exits the current loop immediately, which is an efficient way to stop retrying once a log write succeeds. At other times it is better to skip just the current iteration, and continue moves directly to the next pass, which proves useful when a file list turns out to be empty for a given pattern.
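A hedged sketch of these constructs working together, continuing with the hypothetical $locs and $locsProgs from earlier:

for ($i = 0; $i -lt $locs.Count; $i++) {
    $files = $locsProgs[$locs[$i]]
    if ($files.Count -eq 0) { continue }          # nothing here, move to the next folder
    foreach ($file in $files) {
        $path = Join-Path $locs[$i] $file
        if (-not (Test-Path $path)) {
            Write-Warning "$file not found in $($locs[$i])"
        }
        else {
            # process the file here
        }
    }
}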
6. String Operations
Strings support much of this work, so several operations are worth learning well. String interpolation allows variables to be embedded inside text using "$variable" or by wrapping expressions as "$($expression)", which becomes handy when constructing paths like "psoutput$($date)".
Splitting text is as simple as -split, and a statement such as $stub, $type = $File -split "\." divides a filename around its dot, assigning the parts to two variables in one step. This demonstrates multiple variable assignment, where the first part goes to $stub and the second to $type, allowing you to decompose strings efficiently.
When transforming text, the -Replace operator substitutes all occurrences of one pattern with another, and you can chain replacements, as in -replace $Match, $Replace -replace $Match2, $Replace2, so each change applies to the modified output of the previous one.
Building new names clearly is easier with braces, as in "${stub}_${date}.txt", which prevents ambiguity when variable names abut other characters. Escaping characters is sometimes needed, so writing "\." treats a dot as a literal in a split operation. The backtick character ` serves as PowerShell's escape character and introduces special sequences like a newline written as `n, a tab as `t and a carriage return as `r. When you need to preserve formatting across lines without worrying about escapes, here-strings created with @" ... "@ keep indentation and line breaks intact.
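The string operations above can be sketched in a few lines; the filename is an invented example:

$File = "bm_rc_report.sas"
$stub, $type = $File -split "\."          # $stub = bm_rc_report, $type = sas
$newName = "${stub}_${date}.txt"          # braces keep the names unambiguous
# A here-string preserves indentation and line breaks exactly as written
$note = @"
Report: $stub
Generated: $date
"@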
7. Pipeline Operations
PowerShell's pipeline threads operations together so that the output of one command flows to the next. The pipe character | links each stage, and commands such as ForEach-Object (which processes each item), Where-Object (which filters items based on conditions) and Sort-Object -Unique (which removes duplicates) become building blocks that shape data progressively.
Within these blocks, the current item appears as $_, and properties exposed by commands can be read with syntax like $_.InputObject or $_.SideIndicator, the latter being especially relevant when handling comparison results. With pipeline formatting, you can emit compact summaries, as in ForEach-Object { "$($_.SideIndicator) $($_.InputObject)" }, which brings together multiple properties into a single line of output.
Pipeline filtering often proceeds in three stages: Select-String finds matches, ForEach-Object extracts only the values you need, and Where-Object discards anything that fails your criteria. This progressive refinement lets you start broad and narrow results step by step. There is no compulsion to over-engineer, though; a simpler pipeline might omit filtering stages if the initial search is already precise enough to return only what you need.
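A representative pipeline, with a hypothetical log path and patterns, might look like this:

Select-String -Path "C:\logs\*.log" -Pattern "ERROR" |
    ForEach-Object { $_.Line } |                      # keep just the text of each match
    Where-Object { $_ -notmatch "known issue" } |     # drop lines we expect to see
    Sort-Object -Unique                               # remove duplicates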
8. Comparison and Matching
Behind many of these steps sit comparison and matching operators that extend beyond simple equality. Pattern matching appears through -notmatch, which uses regular expressions to decide whether a value does not fit a given pattern, and it sits alongside -eq, -ne and -lt for equality, inequality and numeric comparison.
Complex conditions chain with -and, so an expression such as $_ -notmatch '^%macro$' -and $_ -notmatch '^%mend$' ensures both constraints are satisfied before an item passes through. Negative matching in particular helps exclude unwanted lines while leaving the rest untouched.
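As a small illustration, assuming $lines holds text read from a SAS program:

$kept = $lines | Where-Object {
    $_ -notmatch '^%macro$' -and $_ -notmatch '^%mend$'   # keep lines that are neither
}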
9. Regular Expressions
Regular expressions define patterns that match or search for text, often surfacing through operators such as -match and -replace. Simple patterns like \.log$ identify strings ending with .log, while more elaborate ones capture groups using parentheses, as in (sdtm\.|adam\.), which matches either of two alternative prefixes.
Anchors matter, so ^ pins a match to the start of a line and $ pins it to the end, which is why ^%macro$ means an entire line consists of nothing but %macro. Character classes provide shortcuts such as \w for word characters (letters, digits or underscores) and \s for whitespace. The pattern "GRC\w*" matches "GRC" followed by zero or more word characters, demonstrating how * controls repetition. Other quantifiers like + (one or more) and ? (zero or one) offer further control.
Escaping special characters with a backslash turns them into literals, so \. matches a dot rather than any character. More complex patterns like '%(m|v).*?(?=[,(;\s])' combine alternation with non-greedy matching and a lookahead to define precise search criteria.
When working with matches in pipelines, $_.Matches.Value extracts the actual text that matched the pattern, rather than returning the entire line where the match was found. This proves essential when you need just the matching portion for further processing. The syntax can appear dense at first, but PowerShell's integration means you can test patterns quickly within a pipeline or conditional, refining as you go.
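Pulling these ideas together, a sketch along the following lines extracts only the matching text; the $Folder variable and the exact pattern are assumptions modelled on the examples above:

Select-String -Path "$Folder\*.sas" -Pattern '(sdtm\.|adam\.)\w+' |
    ForEach-Object { $_.Matches.Value } |   # just the matched text, not the whole line
    Sort-Object -Unique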
10. File Content Operations
Searching file content with Select-String applies regular expressions to lines and returns match objects, while Out-File writes text to files with options such as -Append and -Encoding UTF8 to control how content is persisted.
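For instance, a simple search-and-save might look like this, with invented file names:

Select-String -Path "C:\logs\run1.log" -Pattern "ERROR|WARNING" |
    ForEach-Object { $_.Line } |
    Out-File -FilePath "C:\logs\issues.txt" -Append -Encoding UTF8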
11. File and Directory Searching
Commands for locating files typically combine path operations with filters. Get-ChildItem retrieves items from a folder, and parameters like -Filter or -Include narrow results by pattern. Wildcards such as * are often enough, but regular expressions provide finer control when integrated with pipeline operations. Recursion through subdirectories is available with -Recurse, and combining these techniques allows you to find specific files scattered across a directory tree. Once items are located, properties like FullName, Name and LastWriteTime let you decide what to do next.
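A typical search, again with placeholder names, combines these parameters:

Get-ChildItem -Path "C:\projects" -Filter "*.sas" -Recurse |
    Sort-Object LastWriteTime |
    ForEach-Object { $_.FullName }          # emit the absolute path of each hit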
12. Object Properties
Objects exposed by commands carry properties that you can access directly. $File.FullName retrieves an absolute path from a file object, while names, sizes and modification timestamps are all available as well. Subexpressions introduced with $() evaluate an inner expression within a larger string or command, which is why $($File.FullName) is often seen when embedding property values in strings. However, subexpressions are not always required; direct property access works cleanly in many contexts. For instance, $File.FullName -Replace ... reads naturally and works as you would expect because the property access is unambiguous when used as a command argument rather than embedded within a string.
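A brief sketch of both styles, assuming $File came from Get-ChildItem; the replacement pattern is an invented example:

Write-Output "Found $($File.FullName)"                  # subexpression inside a string
$backupName = $File.FullName -replace "\.sas$", ".bak"  # direct property access as an argument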
13. Output and Logging
Producing output that can be read later is easier if you apply a few conventions. Write-Output sends structured lines to the console or pipeline, while Write-Warning signals notable conditions without halting execution, a helpful way to flag missing files. There are times when command output is unnecessary, and piping to Out-Null discards it quietly, for example when creating directories. Larger scripts benefit from consistency and a short custom function such as Write-Log establishes a uniform format for messages, optionally pairing console output with a line written to a file.
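One possible shape for such a function, offered as a sketch rather than the author's actual implementation:

function Write-Log {
    param([string]$Message, [string]$LogFile)
    $stamp = Get-Date -Format "yyyy-MM-dd HH:mm:ss"
    $line = "$stamp  $Message"
    Write-Output $line                                    # echo to the console/pipeline
    $line | Out-File -FilePath $LogFile -Append -Encoding UTF8
}
Write-Log -Message "Processing started" -LogFile "C:\logs\run.log"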
14. Functions
Functions tie these pieces together as reusable blocks with a clear interface. Defining one with function Get-UniquePatternMatches { } sets the structure, and a param() block declares the inputs. Strongly typed parameters like [string[]] make it clear that a function accepts an array of strings, and naming parameters $Folder and $Pattern describes their roles without additional comments.
Functions are called using named parameters in the format Get-UniquePatternMatches -Folder $loc -Pattern '(sdtm\.|adam\.)', which makes the intent explicit. It is common to pass several arrays into similar functions, so a function might have many parameters of the same type. Using clear, descriptive names such as $Match, $Replace, $Match2 and $Replace2 leaves little doubt about intent, even if an array of replacement rules would sometimes reduce repetition.
Positional parameters are also available; when calling Do-Compare you can omit parameter names and rely on the order defined in param(). PowerShell follows verb-noun naming conventions for functions, with common verbs including Get, Set, New, Remove, Copy, Move and Test. Following the verb-noun pattern, as a name like Multiply-Files does, places your code in the mainstream of PowerShell conventions, even though Multiply is not among the approved verbs.
It is worth avoiding a common pitfall where a function declares param([string[]]$Files) but inadvertently reads a variable like $progs from outside the function. PowerShell allows this via scope inheritance, where functions can access variables from parent scopes, but it makes maintenance harder and disguises the function's true dependencies. Being explicit about parameters creates more maintainable code.
Simple functions can still do useful work. A minimal implementation with basic looping and conditional logic goes a long way, and a recurring structure can be reused with minor revisions, swapping one regular expression for another while leaving the looping and logging intact. Replacement chains are flexible; add as many -replace steps as are needed, and no more.
Parameters can be reused meaningfully too: a $Match variable can serve double duty, first as a filename filter in -Include, then as a text pattern for -replace. Nested function calls tie output and input together, as when piping a here-string to Out-File (Join-Path ...) to construct a file path at the moment of writing.
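Putting the pieces of this section together, a sketch of such a function might read as follows; the body is an assumption built from the commands discussed earlier, not the author's actual code:

function Get-UniquePatternMatches {
    param(
        [string[]]$Folder,       # one or more folders to search
        [string]$Pattern         # a regular expression to look for
    )
    foreach ($f in $Folder) {
        Get-ChildItem -Path $f -Filter "*.sas" |
            Select-String -Pattern $Pattern |
            ForEach-Object { $_.Matches.Value } |
            Sort-Object -Unique
    }
}
Get-UniquePatternMatches -Folder $loc -Pattern '(sdtm\.|adam\.)\w+'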
15. Comments
Comments play a quiet but essential role. A line starting with # explains why something is the way it is or temporarily disables a line without deleting it, which is invaluable when testing and refining.
16. File Comparison
Comparison across datasets rounds out common tasks, and Compare-Object identifies differences between two sets, telling you which items are unique to each side or shared by both. Side indicators in the output are compact: <= marks an item unique to the first set, => one unique to the second, and == an item present in both.
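A compact example, assuming $listA and $listB are arrays to compare; note that -IncludeEqual is needed for the == indicator to appear:

Compare-Object -ReferenceObject $listA -DifferenceObject $listB -IncludeEqual |
    ForEach-Object { "$($_.SideIndicator) $($_.InputObject)" }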
17. Common Parameters
Across many commands, common parameters behave consistently. -Force allows operations that would otherwise be blocked and overwrites existing items without prompting in contexts that support it, -LiteralPath treats a path exactly as written without interpreting wildcards, and -Append adds content to existing files rather than overwriting them. These options smooth edges when you know what you want a command to do and prefer to avoid interactive questions or unintended pattern expansion.
18. Advanced Scripting Features
A number of advanced features make scripts sturdier. Automatic variables such as $MyInvocation.MyCommand.Path provide information about the running script, including its full path, which is practical for locating resources relative to the script file. Set-StrictMode -Version Latest enforces stricter rules that turn common mistakes into immediate errors, such as using an uninitialised variable or referencing a property that does not exist. Clearing the console at the outset with Clear-Host gives a clean slate for output when a script begins.
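A script preamble using these features might look like this (note that $MyInvocation.MyCommand.Path is populated when the code runs as a saved script):

Set-StrictMode -Version Latest                 # turn silent mistakes into errors
Clear-Host                                     # start with a clean console
$ScriptPath = $MyInvocation.MyCommand.Path     # full path of the running script
$scriptHome = Split-Path $ScriptPath -Parent   # folder for locating companion files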
19. .NET Framework Integration
Integration with the .NET Framework extends PowerShell's reach. For instance, calling [System.IO.Path]::GetFileNameWithoutExtension() extracts a base filename using a tested library method. To gain more control over file I/O, [System.IO.File]::Open() and [System.IO.StreamWriter] expose lower-level handles that specify sharing and access modes, which can help when you need to coordinate writing without blocking other readers. File-sharing options like [System.IO.FileShare]::Read allow other processes to read a log file while the script writes to it, reducing contention and surprises.
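A sketch of the shared-log idea with an invented path; the mode and access values are the standard .NET ones for appending:

$base = [System.IO.Path]::GetFileNameWithoutExtension("C:\logs\run1.log")   # run1
# Open for appending while still letting other processes read the file
$stream = [System.IO.File]::Open("C:\logs\run.log",
    [System.IO.FileMode]::Append,
    [System.IO.FileAccess]::Write,
    [System.IO.FileShare]::Read)
$writer = New-Object System.IO.StreamWriter($stream)
$writer.WriteLine("A line written without locking out readers")
$writer.Close()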
20. Error Handling
Error handling deserves a clear pattern. Wrapping risky operations in try { } catch { } blocks captures exceptions, so a script can respond gracefully, perhaps by writing a warning and moving on. A finally block can be added for clean-up operations that must run regardless of success or failure.
When transient conditions are expected, a retry logic pattern is often enough, pairing a counter with Start-Sleep to attempt an operation several times before giving up. Waiting for a brief period such as Start-Sleep -Milliseconds 200 gives other processes time to release locks or for temporary conditions to clear.
Alongside this, checking for null values keeps assumptions in check, so conditions like if ($null -ne $process) ensure that you only read properties when an object was created successfully. This defensive approach prevents cascading errors when operations fail to return expected objects.
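These three ideas combine naturally into a retry pattern; this is a sketch with an assumed $LogFile variable, and -ErrorAction Stop makes the cmdlet's failure catchable:

$attempts = 0
$written = $false
while (-not $written -and $attempts -lt 5) {
    try {
        "log line" | Out-File -FilePath $LogFile -Append -Encoding UTF8 -ErrorAction Stop
        $written = $true                    # success, so the loop ends
    }
    catch {
        $attempts++
        Start-Sleep -Milliseconds 200       # give another process time to release the file
    }
}
if (-not $written) { Write-Warning "Could not write to $LogFile" }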
21. External Process Management
Managing external programmes is a common requirement, and PowerShell's Start-Process offers a controlled route. Several parameters control its behaviour precisely:
The -Wait parameter makes PowerShell pause until the external process completes, essential for sequential processing where later steps depend on earlier ones. The -PassThru parameter returns a process object, allowing you to inspect properties like exit codes after execution completes. The -NoNewWindow parameter runs the external process in the current console rather than opening a new window, keeping output consolidated. If a command expects the Command Prompt environment, calling it via cmd.exe /c $cmd integrates cleanly, ensuring compatibility with programmes designed for the CMD shell.
Exit codes reported with $process.ExitCode indicate success with zero and errors with non-zero values in most tools, so checking these numbers preserves confidence in the sequence of steps. This approach gives synchronous execution, processing files one at a time rather than in parallel, which can be an advantage when dependencies exist between stages or when you need to ensure ordered completion.
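A sketch of this pattern, with a hypothetical command line:

$cmd = "mytool.exe input.sas"               # an invented external command
$process = Start-Process -FilePath "cmd.exe" -ArgumentList "/c $cmd" `
    -Wait -PassThru -NoNewWindow
if ($null -ne $process -and $process.ExitCode -ne 0) {
    Write-Warning "Step failed with exit code $($process.ExitCode)"
}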
22. Script Termination
Scripts need to finish in a way that other tools understand. Exiting with Exit 0 signals success to schedulers and orchestrators that depend on numeric codes, while non-zero values indicate error conditions that trigger alerts or retries.
Bringing It All Together
Because this is a granular selection, it leaves it to us to piece everything together to accomplish the tasks that we have to complete. In that way, we can embed the knowledge so that we are not vibe coding all the time, ensuring that a more deterministic path is followed.
Stop Microsoft Edge warning you before quitting on macOS
15th August 2025

My new client only supports Microsoft Edge for logging onto their systems. Thus, I needed to install it on my iMac and Mac Mini for those occasions when I am not using a Windows device (as it happens, I have yet to try that with Linux). However, Edge issues a warning on exiting with the CMD+Q shortcut, the quickest way to quit and safer than clicking on the red close button at the top left of the window, which, as I have found in other situations on macOS, closes just the window much like the CMD+W shortcut. To get rid of the warning, I needed to go to Settings > Appearance > Browser behaviour and features > Warn before quitting with ⌘Q. Once there, it was a matter of toggling the setting to the off position and I was done. However, placing that under Appearance remains an odd decision to me.
What to do when Tuta Mail issues this message when logging into an account on macOS: Could not access Secret Storage
24th September 2024

Two things changed before Tuta Mail stopped working as before: modifying Keychain Access settings and upgrading macOS from Sonoma to Sequoia. Either could have been the cause, or neither; the first was the more likely culprit.
The result was the same: logging into Tuta Mail yielded an error like this: Could not access Secret Storage. The solution essentially is a two-step process: remove the app and delete its settings folder. Reinstallation then happens after these.
In Finder, go to Applications and move Tuta Mail to the Bin before clearing it from there. That uninstalls the app.
The next step needs you to show hidden files and folders using the Command + Shift + . shortcut. Then, go to your home folder (this may need the Command + Shift + H shortcut). Open the Library folder and find the folder called Application Support. Enter that and find the subfolder named tutanota-desktop. That needs to go to the Bin too before expunging it from there. Doing so provides the clean slate for reinstallation to commence. After this, using the Command + Shift + . shortcut again hides the normally hidden files and folders once more.
Nothing is resolved without the removal of /Users/[username]/Library/Application Support/tutanota-desktop. Using the rm command from the command line interface will remove it faster than Finder, though the Finder route may be easier for many users.
Getting rsync to resolve symbolic links
11th September 2024

In 2019, Dropbox changed its handling of symbolic links such that internal links within a Dropbox file hierarchy continued to work while links leading outside the Dropbox area no longer did. Thankfully, the rsync utility found in many Linux and UNIX settings does not behave like that, as long as you have called it correctly.
Ordinarily, symbolic links are synchronised as links rather than being followed, which is what Dropbox now does. To get rsync to resolve a link as a shortcut to a single file or, more likely, a folder containing more than one file, it needs the -L switch or option in the command. When that is present, the linked file or files get synchronised, which honours the point of having these links in the first place: allowing more flexibility with folder structures and avoiding any duplication of files and folders.
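For instance, a call of this shape resolves the links while copying; the folder names are placeholders, and -a is the usual archive option:

rsync -aL ~/source_folder/ ~/destination_folder/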
Opening up Kindle for PC in a maximised window on Windows 11
18th August 2024

One irritation with the Windows app for Amazon Kindle is that it does not open in a maximised window, and it scarcely remembers your size settings from session to session. Finding a ready-made solution to this sizing issue is no easy task, so I fell back on one of my own that I previously used with File Explorer folder shortcuts.
The first step is to find the actual location of the Start Menu shortcut. Trying C:\Users\[User Name]\AppData\Roaming\Microsoft\Windows\Start Menu\Programs\Amazon\Amazon Kindle should do that.
Next, right-click on the Kindle icon and choose Properties from the context menu that appears. In the dialogue box that this brings up, look for the "Run:" setting. By default, this is set to "Normal Window", but you can change it to "Maximised", which is what I did before clicking on Apply and then OK to dismiss the dialogue box.
If you have pinned the shortcut to the taskbar or elsewhere, you may need to unpin it and pin it again to carry over the change. After that, I found that the Kindle app opened up in a maximised window as I wanted.
Creating soft and hard symbolic links using the Windows command line
19th August 2015

In the world of UNIX and Linux, symbolic links are shortcuts, but they do not work like normal Windows shortcuts: you do not jump from one location to another with the file manager's address bar changing what it shows. Instead, it is as if the contents of a directory appear at another, more convenient location in the file system, and the same thinking applies to files too. In some ways, it is like giving files and directories alternative aliases. There are soft links, which point to the name of a given directory or file, and hard links, which point to the actual file or directory.
For a long time, I was under the mistaken impression that such things did not exist on Windows until I came across the mklink command, which came with the launch of Windows Vista at the start of 2007. While this feature might not be widely known, it demonstrates that Windows did adopt some UNIX and Linux capability long before other UNIX-like features, such as virtual desktops, were introduced in Windows 10.
By default, the aforementioned command sets up symbolic links to files and the /D switch allows the same to be done for directories too. The /H switch makes a hard link instead of a soft link, so we get much of the functionality of the ln command in UNIX and Linux. Here is an example that creates a soft symbolic link for a directory:
mklink /D shortcut target_directory
Above, shortcut is the name of the symbolic link file and target_directory is the destination to which it links. In my experience, it works best for destinations beyond your home folder and, from what I have read, hard links may not be possible across different disks either.
Smoother use of more than one SAS DMS session at a time
11th March 2012

Unless you have access to SAS Enterprise Guide, only being able to work on one project at a time can be a little inconvenient. It is possible to open more than one Display Manager System (DMS, the traditional SAS programming interface) session at a time, only to get a pop-up window about SAS documentation for the second and subsequent sessions. You do not get your settings shared across them either, while also losing any changes to session options after shutdown.
The cause of both of the above is the locking of the SASUSER directory files by the first SAS session. However, it is possible to set up a number of directories and set the -sasuser option to point at different ones for different sessions.
On Windows, the command in the SAS shortcut becomes:
"C:\Program Files\SAS\SAS 9.1\sas.exe" -sasuser "c:\sasuser\session1\"
On UNIX or Linux, it would look similar to this:
sas -sasuser "~/sasuser/session1/"
Since the "session1" in the folder paths above can be replaced with whatever you need, you can have as many sessions as you want. It might not seem like much, but synchronising the SASUSER folders every now and again gives you a more consistent set of settings across each session, all without intrusive pop-up boxes or extra messages in the log.
Useful keyboard shortcuts for managing the window sizes of Windows applications
9th June 2008Maximising and minimising windows is all part and parcel of using window-based user interfaces, so it's nice to know that there are keyboard shortcuts that reduce the need to use your mouse all the time. Here are a few that work on Windows:
Alt+Space+N Minimise
Alt+Space+X Maximise
Alt+Space+R Restore (set to default)
Uses for symbolic links
24th April 2007

UNIX (and Linux) does a wonderful trick with its file and folder shortcuts; it effectively treats them as transporters that associate a file or folder in one folder hierarchy with another, where it is treated as if it exists in that hierarchy too. For example, the folder named images under /www/htdocs/blog can have a link under /www/htdocs/ that makes it appear that its contents exist in both places without any file duplication. Indeed, the pwd command cannot tell a folder from a folder shortcut. To achieve this, I use what are called symbolic links, and the following command achieves the outcome in the example:
ln -s /www/htdocs/blog/images /www/htdocs/images
The first file path is the destination for the link, while the second one is that for the link itself. Once, I had a problem with Google Reader not showing up images in its feed displays, so symbolic links rode to the rescue as they did for resolving a similar conundrum that I was encountering when editing posts in my hillwalking blog.
Adding a new hard drive
21st January 2007
Having obtained a new 320 GB hard drive during the week, today I am adding it to my system after yesterday's scare with a PSU. As with any such item, you need to format and configure it to work with your operating system, be it Windows, Linux or whatever. Good old Partition Magic can help with this (I have version 7 from the PowerQuest days), but Windows XP (Professional, anyway) does offer its own tool for the job: the Disk Management console. Unfortunately, it is a bit difficult to find. The easiest way to get to it is to type diskmgmt.msc into the Run command box. Otherwise, it is a matter of setting your Start Menu to show the Administrative Tools group (Taskbar and Start Menu properties > Start Menu tab > Customise > Advanced tab) and accessing it through the Computer Management console, for which there is a shortcut in this group. Of course, you need administrator access to your PC to do any of this.