Technology Tales

Adventures in consumer and enterprise technology

TOPIC: GREP

Keeping a graphical eye on CPU temperature and power consumption on the Linux command line

20th March 2025

Following my main workstation upgrade in January, some extra monitoring has been needed. This follows on from the experience with building its predecessor more than three years ago.

Being able to do this in a terminal session keeps things lightweight, and I have done that with text displays like the one below, using a combination of sensors and nvidia-smi in the following command:

watch -n 2 "sensors | grep -i 'k10'; sensors | grep -i 'tdie'; sensors | grep -i 'tctl'; echo '' | tee /dev/fd/2; nvidia-smi"

Everything is done within a watch command that refreshes the display every two seconds. Then, the panels are built up by a succession of commands separated by semicolons, one for each portion of the display. The grep command picks out the desired lines from the sensors output that is piped to it; doing that for each pattern gives us one line apiece. The next command, echo '' | tee /dev/fd/2, adds a blank line by sending an empty string to STDERR before the output of nvidia-smi is displayed. The result can be seen in the screenshot below.
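As an aside, the three separate sensors calls could be collapsed into a single one by handing grep an alternation pattern; this is merely a tidier sketch of the same display, with the matched lines appearing in whatever order sensors emits them:

watch -n 2 "sensors | grep -iE 'k10|tdie|tctl'; echo '' | tee /dev/fd/2; nvidia-smi"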

However, I also came across a more graphical way to do things using commands like turbostat or sensors along with AWK programming and ttyplot. Converting the temperature output from the above into a rolling plot needs the following:

while true; do sensors | grep -i 'tctl' | awk '{ printf("%.2f\n", $2); fflush(); }'; sleep 2; done | ttyplot -s 100 -t "CPU Temperature (Tctl)" -u "°C"

This is done in an infinite while loop to keep things refreshing; the watch command cannot be used here because the output from sensors has to be piped through awk and on to ttyplot on a repeating, periodic basis. The awk command takes the second field from the input text, formats it to two decimal places and prints it, flushing the output buffer each time. The ttyplot command then plots those numbers as seen in the screenshot below, with a y-axis scaled to a maximum of 100 (-s), units of °C (-u) and a title of CPU Temperature (Tctl) (-t).
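The same approach extends beyond the CPU too. As a sketch, and assuming that the query options below are supported by your version of nvidia-smi, the GPU temperature can be plotted in just the same way, since the command emits a bare number that ttyplot can consume directly:

while true; do nvidia-smi --query-gpu=temperature.gpu --format=csv,noheader; sleep 2; done | ttyplot -s 100 -t "GPU Temperature" -u "°C"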

A similar thing can be done for the CPU wattage, which is how I learned of the graphical display possibilities in the first place. The command follows:

sudo turbostat --Summary --quiet --show PkgWatt --interval 1 | sudo awk '{ printf("%.2f\n", $1); fflush(); }' | sudo ttyplot -s 200 -t "Turbostat - CPU Power (watts)" -u "watts"

Handily, turbostat can be made to update at a set interval (every second in the command above), avoiding the need for any infinite while loop. Since only a summary is needed for the wattage, all other output can be suppressed, though everything needs to run with superuser privileges, unlike the sensors command earlier. Then, awk is used as before to process the wattage for plotting; the first field is what is being picked out here. After that, ttyplot displays the plot seen in the screenshot below with the appropriate title, units and scaling. It all works with output from one command acting as input to another through pipes.
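One caveat, offered as an assumption rather than something seen on every system: turbostat repeats its column header in the output stream from time to time, and awk renders that non-numeric line as 0.00, which can appear as a spurious dip in the plot. Restricting the printf to lines starting with a digit guards against that:

sudo turbostat --Summary --quiet --show PkgWatt --interval 1 | sudo awk '/^[0-9]/ { printf("%.2f\n", $1); fflush(); }' | sudo ttyplot -s 200 -t "Turbostat - CPU Power (watts)" -u "watts"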

All of this offers a lightweight way to keep an eye on system load, with the top command showing the impact of different processes if required. While there are graphical tools for some things, command line possibilities cannot be overlooked either.

When a hard drive is unrecognised by the Linux hddtemp command

15th August 2021

You should not do a new PC build in the middle of a heatwave if you do not want to be concerned about how fast fans are spinning and how hot things are getting. Yet, that is what I did last month after delaying the act for numerous months.

My efforts mean that I have a system built around an AMD Ryzen 9 5950X CPU and a Gigabyte X570 Aorus Pro motherboard with 64 GB of memory, and things are settling down after the initial upheaval. That also meant some adjustments to the CPU fan profile in the BIOS for quieter running, while the use of a Be Quiet! Dark Rock 4 cooler also helps, as does a Be Quiet! Silent Wings 3 case fan. All are components from trusted brands, though I wonder how much abuse they got during their installation and subsequent running in.

Fan noise, like touch, is only a qualitative indicator of heat levels, so more quantitative means are in order. Aside from using a thermocouple device, there are in-built sensors too. My using Linux Mint means that I have the sensors command from the lm-sensors package for checking on CPU and other temperatures, though hddtemp is what you need for checking on the same for hard drives. The latter can be used as follows:

sudo hddtemp /dev/sda /dev/sdb

This has to be run with administrator access, and a list of drives needs to be provided because the command cannot find them by itself. In my case, I have no mechanical hard drives installed in non-NAS systems, and I even got to replace a 6 TB Western Digital Green disk with an 8 TB SSD, but I got the following when I tried checking on things with hddtemp:

WARNING: Drive /dev/sda doesn't seem to have a temperature sensor.
WARNING: This doesn't mean it hasn't got one.
WARNING: If you are sure it has one, please contact me (hddtemp@guzu.net).
WARNING: See --help, --debug and --drivebase options.
/dev/sda: Samsung SSD 870 QVO 8TB: no sensor

The cause of the message for me was that there was no entry for the Samsung SSD 870 QVO 8TB in /etc/hddtemp.db, so one needed to be added there. Before that could be rectified, I had to get some additional information using smartmontools, which had to be installed using the following command:

sudo apt-get install smartmontools

What I had to do was check the drive's SMART data output for extra information, and that was achieved using the following command:

sudo smartctl /dev/sda -a | grep -i Temp

What this does is to look for the temperature information from smartctl output using the grep command, with output from the first being passed to the second through a pipe. This yielded the following:

190 Airflow_Temperature_Cel 0x0032 072 050 000 Old_age Always - 28

The first number in the above (190) is the thermal sensor's attribute identifier, and that was needed in what got added to /etc/hddtemp.db. The following command added the necessary data to the aforementioned file:

echo \"Samsung SSD 870 QVO 8TB\" 190 C \"Samsung SSD 870 QVO 8TB\" | sudo tee -a /etc/hddtemp.db

Here, the output of the echo command was passed to the tee command for adding to the end of the file. In the echo command output, the first part is the name of the drive, the second is the heat sensor identifier, the third is the temperature scale (C for Celsius or F for Fahrenheit) and the last part is the label (it can be anything that you like, but I kept it the same as the name). On re-running the hddtemp command, I got output like the following, so all was as I needed it to be.

/dev/sda: Samsung SSD 870 QVO 8TB: 28°C
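Should the entry ever need to be checked or amended later, a quick grep against the database file confirms what was added:

grep "870 QVO" /etc/hddtemp.db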

Since then, temperatures may have cooled and the weather become more like what we usually get, yet I am still keeping an eye on things, especially when the system is put under load using Perl, R, Python or SAS. There may be further modifications such as changing the case or even adding water cooling, not least to have a cooler power supply unit, but nothing is being rushed as I monitor things to my satisfaction.

Searching file contents using PowerShell

25th October 2018

Having made plenty of use of grep on the Linux/UNIX command line and findstr on the legacy Windows command line, I wondered if PowerShell could be used to search the contents of files for a text string. Usefully, this turns out to be the case, but I found that the native functionality does not use either of the commands that I had used before. The form of the command is given below:

Select-String -Path <filename search expression> -Pattern "<search expression>" > <output file>

While you can have the output appear on the screen, it always seems easier to send it to a file for subsequent use, and that is what I am doing above. The input to the -Path switch can be a filename or a wildcard expression, while that to the -Pattern can be a text string enclosed in quotes or a regular expression. Given that it works well once you know what to do, here is an example:

Select-String -Path *.sas -Pattern "proc report" > c:\temp\search.txt

The search.txt file then includes both the file information and the text that has been found for the sake of checking that you have what you want. What you do next is up to you.

Batch conversion of DNG files to other file types with the Linux command line

8th June 2016

At the time of writing, Google Drive is unable to accept DNG files, the Adobe file type for RAW images from digital cameras. While the uploads themselves work fine, the additional processing at the end that, I believe, is needed for Google Photos appears to be failing. Because of this, I thought of other possibilities like uploading them to Dropbox or enclosing them in ZIP archives instead; of these, it is the first that I have been doing and with nothing but success so far. Another idea is to convert the files into an image format that Google Drive can handle, and TIFF came to mind because it keeps all the detail from the original image. In contrast, JPEG files lose some information because of the nature of the compression.

Handily, a one-line command does the conversion for all files in a directory once you have all the required software installed:

find -type f | grep -i "DNG" | parallel mogrify -format tiff {}

The find and grep commands are standard, with the first getting you a list of all the files in the current directory and sending (piping) these to the grep command, so the list only retains the names of all DNG files. The last part uses two commands for which I found installation was needed on my Linux Mint machine. The parallel package is the first of these and distributes the heavy workload across all the cores in your processor, and this command will add it to your system:

sudo apt-get install parallel

The mogrify command is part of the ImageMagick suite along with others like convert and this is how you add that to your system:

sudo apt-get install imagemagick

In the command at the top, the parallel command works through all the files in the list provided to it and feeds them to mogrify for conversion. Without the use of parallel, the basic command is like this:

mogrify -format tiff *.DNG

In both cases, the -format switch specifies the output file type, with the tiff portion triggering the creation of TIFF files. The *.DNG portion captures all DNG files in a directory, a role played by {} in the main command at the top of this post. If you wanted JPEG ones, you would replace tiff with jpg. Should you ever need it, a full list of the supported file types is produced using the identify command (also part of ImageMagick) as follows:

identify -list format
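One caveat with the pipeline at the top of this post: file names containing spaces can trip up the combination of grep and parallel. Assuming a reasonably recent version of GNU Parallel, a variant using find's own case-insensitive matching and null-delimited output sidesteps that:

find . -type f -iname "*.dng" -print0 | parallel -0 mogrify -format tiff {}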

Command line file comparison in Windows

20th August 2012

While UNIX and Linux both have the diff command for comparing the contents of text files, the Windows counterpart was unknown to me until recently. Its name is fc, and it looks as if the f is for file and c is for comparison, though I cannot confirm that as of now. The usage of that command is not dissimilar to the way that things work with diff. Here is an example command:

fc file1.txt file2.txt > file3.txt

This compares file1.txt with file2.txt and sends the output to file3.txt. Any differences between the two files being compared appear to be more clearly labelled than with the < and > markers in diff output. That verbosity could have its uses, but the existence of the fc command is stopping envious glances at the diff one for now, just as findstr is doing the same in comparison with grep.

Harnessing the power of ImageMagick

26th October 2008

Using the command line to process images might sound senseless, but the tools offered by ImageMagick certainly prove that it has its place. I have always been wary of using bulk processing for my digital photo files (some digitised from film prints with a scanner), but I do agree that some of it is needed to free up time for other, more necessary things. With this in mind, it is encouraging to see the results from ImageMagick, and I can see it making a major difference to how I maintain my online photo gallery.

For instance, making thumbnail images for the gallery certainly seems to be one of those operations where command line bulk processing comes into its own, and ImageMagick's own convert command is heaven-sent for this one. For resizing images, all that's needed is the following:

convert -resize 40% input.jpg output.jpg
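To apply that across a whole folder of images, a simple shell loop is one sketch of the bulk approach; the thumb_ prefix for the output files is only an illustration:

for f in *.jpg; do convert -resize 40% "$f" "thumb_$f"; done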

Add a spot of further shell scripting and even a dash of Perl and the possibilities for this sort of thing become clearer, and this is but the tip of the proverbial iceberg. The -rotate switch will do what the name suggests, while there is a whole host of other options on tap. So long as you have Ghostscript on your system, conversion of graphics to PostScript (and Encapsulated PostScript too) and PDF files is possible, with the -page option controlling the margin around the image itself in the resulting output. Unfortunately, portrait is the sole orientation on offer, yet a bit of judicious post-processing will turn things around. Here's a command that'll do the trick:

convert -page 792x612+72+72 input.png ps2:output.ps
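As for the -rotate switch mentioned above, its use is equally straightforward; the angle of 90 here is only an example:

convert -rotate 90 input.png output.png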

For retrieving image metadata like its resolution and size, the identify command comes into play. The -verbose option produces all manner of image metadata, so using grep or egrep is perhaps advisable, especially for bulk processing with the likes of Perl. Having the ability to stream image metadata makes loading databases like MySQL less of a chore than the manual data entry that has been my way of doing things until now.
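As a sketch of that, and assuming the field names in your version of ImageMagick match those here, grep can pull out just the geometry and file size from the verbose listing:

identify -verbose input.jpg | grep -Ei 'geometry:|filesize:'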

The power of pipes

12th July 2007

One of the great features of the UNIX shell is that you can send the output from one command to another for further processing. Take the following example for instance:

ls -l | grep "Jul 12"

This takes the long directory listing output and sends it to grep for subsetting (all files modified today in this example) before the result is returned to the screen. The | character is the pipe operator, and you can have as many pipes in your command as you want, though readability may dictate how far you want to go.
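As an example of chaining further, the subset from grep can be passed along again, this time to count how many files match:

ls -l | grep "Jul 12" | wc -l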
