Technology Tales

Notes drawn from experiences in consumer and enterprise technology

TOPIC: UNIX

Silencing MLX warnings when running Ollama via Homebrew on macOS

15th February 2026

While there is an Ollama app on macOS, I chose to install it using Homebrew instead. That worked well enough, even if I kept seeing a warning message like this on macOS Tahoe:

WARN MLX dynamic library not available error="failed to load MLX dynamic library (searched: [/opt/homebrew/Cellar/ollama/0.16.2/bin

For some reason, the MLX integration in ollama is not all that it should be, even if everything else runs without issue as things stand. While the native app does not have this problem, and warnings like these are harmless enough to ignore, my chosen solution was to define this alias in the .zshrc file:

alias ollama='command ollama "$@" 2> >(grep -v "MLX dynamic library not available" >&2)'

Executing the following command to reload the configuration file left the stderr output from ollama much cleaner:

source ~/.zshrc

However, the alias itself still needs some unpacking to explain what is happening. Let us proceed piece by piece, focussing on the less obvious parts of the text within the quotes in the alias definition.

command: Starting the whole aliased statement with this builtin makes the shell run the real ollama executable rather than resolving the name through aliases or functions again, which makes everything safer, even if I have got away without it when aliasing the ls command elsewhere (shells do not expand an alias within its own definition). If same-named functions were involved, the situation could be different, producing an infinite loop in the process.

"$@": Without this, the arguments to the ollama command would not be passed into the alias.

2>: This is the overall redirection to stderr and >(grep -v "MLX dynamic library not available" >&2) where the text removal takes place.

grep -v: Within the filtering statement, this command prints everything that does not match the search string, in this case "MLX dynamic library not available".

>&2: Here is where the output is sent back to stderr for the messages that appear in the console.

In all of this, it is important to distinguish between stderr (standard error output) and stdout (standard output). For ollama, the latter is how you receive a response from the LLM, while the former is used for application feedback. This surprised me when I first learned of it, yet it is common behaviour in the world of Linux and UNIX, which includes macOS.
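
One way to see the split for yourself is to redirect stdout to a file: the model's reply lands in the file, while the status feedback still appears on screen. The model name here is only an example:

ollama run llama3 "Say hello" > reply.txt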

This matters here because suppressing stderr entirely would give you no idea of how an LLM download is proceeding: that progress output also goes to stderr, rather than to stdout as I might have expected before knowing better. Hence, the optimal approach is to filter the stderr output rather than discard it.

Streamlining text case conversion across multiple apps with a reusable macOS terminal command

10th January 2026

Changing text from mixed case to lower case is something that I often do. Until recently, much of this was accomplished manually, but I started to wonder whether a quicker way could be found. Here is one that I use when working in macOS: a command that works in the terminal, for which I have added a short alias to my .zshrc file (a sketch of such an alias follows below). Here is the full pipeline:

pbpaste | tr '[:upper:]' '[:lower:]' | pbcopy

In the above, pbpaste reads from the paste buffer (or clipboard), while pbcopy writes the final output to the clipboard, replacing what was there before. In between those, tr '[:upper:]' '[:lower:]' changes any upper case letters to lower case ones.
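
For reference, such an alias could look like the following in .zshrc; the name lcase is only an illustration rather than necessarily what I chose:

alias lcase="pbpaste | tr '[:upper:]' '[:lower:]' | pbcopy"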

With that, the process becomes this: copy the text into the paste buffer, run the command, paste the output where it is wanted. While there may be a few steps, it is quicker than doing everything manually or opening another app to do the job. This suffices for now, and I may get to look at something similar for Linux in time; a first sketch of that follows.
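
As a starting point, a Linux equivalent might look like this, assuming an X11 session with the xclip tool installed:

xclip -selection clipboard -o | tr '[:upper:]' '[:lower:]' | xclip -selection clipboard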

File comparison using PowerShell

16th August 2025

In the past, I have compared files on the Linux/UNIX command line as well as the legacy Windows command line. Recently, I decided to try it using PowerShell. Here is the command structure:

Compare-Object (Get-Content ".\[name of one text file]") (Get-Content ".\[name of another text file]") > [path and name of output file]

Admittedly, this is more verbose than the others that I have mentioned above. Nevertheless, it does the job and sends everything to a text file for review. The Compare-Object piece does the comparison once the Get-Content portions have read in the content.
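
For comparison, the Linux/UNIX equivalent alluded to above is terser; the file names here are placeholders of my own:

diff one.txt another.txt > output.txt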

Unzipping more than one file at a time in Linux and macOS

10th September 2024

I wanted to extract three zip archives in one go, and that sounded to me like a task for shell scripting. They had come from Google Drive and contained different splits of the files that I needed: raw images from a camera. However, I found a more succinct method than the line of code that you see below (it is intended for the BASH shell):

for z in *.zip; do unzip "$z"; done

That loops through each file that matches a glob string. All I needed was something like this:

unzip '*.zip'

Before embarking on a search, I had got close myself, yet I had not quoted the wildcard string; without the quoting, the shell expands the glob before unzip sees it, so the command was not working for me. Quoting passes the pattern through for unzip to match against the archive names itself. To be sure that I was not extracting more than I needed, I made the wildcard string more specific for my case.
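
For instance, a narrower pattern along these lines (the naming being hypothetical) keeps the extraction to the intended archives:

unzip 'photos-part*.zip'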

Once the extraction was complete, I moved the files into a Lightroom Classic repository for working on them later. All this happened on an iMac, but the extraction itself should work on any UNIX-based operating system, given that it is unzip rather than the shell doing the pattern matching.

Another way to supply the terminal output of one BASH command to another

26th April 2023

My usual way of sending the output of one command to another is to place one command after the other, separated by the pipe (|) operator, adjusting the second command as needed. However, I recently found that this approach does not work well for docker pull commands, which led me to uncover another option.

The solution is to enclose the input command in $( ) within the output command. Within the parentheses, any kind of command can be used, including pipelines. As long as text is being printed to the terminal, it can be fed to the second command and used as required. Thus, you can have something like the following:

docker pull $([command outputting name of image to download])
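
As a hypothetical illustration, the inner command here asks Docker for the image behind an existing container, whose name is made up, and pulls the latest copy of that image:

docker pull $(docker inspect --format '{{.Config.Image}}' my-container)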

This approach has helped with automating other aspects of docker image and container use and deployment, because it is so general. There may yet be other uses found for it.

Killing Windows processes from the command line

26th September 2015

During my days at work, I often hear about the need to restart a server because something has gone awry with it. This made me wonder whether you can kill processes from the Windows command line, as you do in Linux and UNIX. A recent need to reset Windows Update on a Windows 10 machine gave me enough reason to answer the question.

Because I already knew the names of the services, I had no need to look at the Services tab in the Task Manager like you otherwise would. Then, it was a matter of opening up a command line session with Administrator privileges and issuing a command like the following (replacing [service name] with the name of the service):

sc queryex [service name]

From the output of the above command, you can find the process identifier, or PID. With that information, you can execute a command like the following in the same command line session (replacing [PID] with the actual numeric value of the PID):

taskkill /f /pid [PID]
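
In the Windows Update case, the relevant service name is wuauserv, so the pair of commands looks like this, with the PID below being purely illustrative:

sc queryex wuauserv

taskkill /f /pid 2884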

After the above, the process no longer exists and the service can be restarted. With any system, you need to find the stuck service before you can kill it, but that would be the subject of another posting. What I have not got around to testing is whether these commands work in PowerShell, since I used them in the legacy command line instead. Along with processes belonging to software applications (think Word, Excel, Firefox, etc.), that may be something else to try should the occasion arise.

Copying only updated and new files

20th October 2008

With Linux/UNIX, the command line remains ever useful and allows you to do all manner of things, including file copying that only transfers new or updated files to a destination. Here's a command that accomplishes this on Linux:

cp -urv [source] [destination]

The u switch copies a file only when the source is newer than the destination or the destination copy is missing, r ensures recursion (by default, cp only copies files from a source directory and not anything sitting in subfolders), and v tells the command to report what it is doing.

Though buried and hardly promoted, Windows also has its command line and here's what accomplishes a similar result:

xcopy [source] [destination] /d /e /y

Here, /d (used without a date) copies only those source files newer than their destination counterparts, along with anything not yet present at the destination, /e recurses into subdirectories, empty ones included, and /y suppresses the overwrite confirmation prompts.

Anything's better than having to approve or reject every instance where source and destination files are the same or, even worse, to overwrite a file when it is not wanted.

How much space is that folder taking up on your disk?

23rd July 2008

On Windows, it's a matter of right-clicking on the folder and looking in its properties. I am sure that there is a better way of doing it in that ever pervasive operating system but, in the worlds of Linux and UNIX, the command line comes to the rescue as it is wont to do. What follows is the command that I use:

du -sh foldername

The s option makes it present the total space taken up; leaving it out gets you a breakdown of how much space the subfolders are taking up as well. The h makes the size output friendlier to human eyes, with things like 10K, 79M and 51G littering what you get. The command name itself is just a short way of saying "disk usage". It's all quick and easy when you know it, and very useful in this age of ever-increasing data volumes.
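
A related sketch, assuming a sort command that understands human-readable sizes (the -h option found in GNU coreutils and recent BSD releases), ranks the subfolders by how much they consume:

du -sh */ | sort -h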

A quick way to create a blank text file

27th June 2008

The primary job done by the touch command in UNIX or Linux is to update the time stamps on files. However, it also has another function: creating an empty text file when you "touch" one that does not exist. This has its uses, particularly when you want to reduce the amount of pointing and clicking that you need to do, or when you want to generate a series of empty files in a shell script; there is a quick sketch of the latter below. Whatever you do with it is up to you.
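
As a minimal sketch of that last use, BASH brace expansion can generate a run of empty files in one line; the file names here are arbitrary:

touch file{1..12}.txt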

Putting it all on one line

9th March 2008

One of the nice things about the Linux/UNIX command line is that you get the option of stringing together a number of commands on one line, submitting all of them for processing in one go. Separating them with && does the trick, though I noticed that semicolon delimitation works as well; the difference is that && only runs the next command when the previous one has succeeded, whereas a semicolon runs it regardless (there is a short illustration of this after the next example). Here's a line that will install VMware for you in one fell swoop:

sudo apt-get install linux-headers-$(uname -r) build-essential gcc-3.4 && tar xzf VMware-workstation-6.0.2-59824.i386.tar.gz && export CC=/usr/bin/gcc-3.4 && cd vmware-distrib && sudo ./vmware-install.pl
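
To see why the choice of separator matters, consider this pair of lines, with a made-up path: the first only runs make if the cd succeeded, while the second would run make in whatever directory you started from should the cd fail.

cd /tmp/build && make

cd /tmp/build; make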

Another trick is to direct the output of one command into another, like the following, which subsets a process listing:

ps aux | grep "wine"

It's all good stuff and is the sort of thing that shows why so many Linux/UNIX types love their command line so much.
