Technology Tales

Adventures in consumer and enterprise technology

TOPIC: SUPERUSER

Fixing Crontab editor permissions for www-data

3rd September 2025

There are times when I set jobs to run using the web server account www-data for various website maintenance tasks. To ensure that I am doing this for the right account, I issue the following command:

sudo -u www-data crontab -e

However, doing so on this server yielded the following:

touch: cannot touch '/var/www/.selected_editor': Permission denied
Unable to create directory
/var/www/.local/share/nano/: No such file or directory
It is required for saving/loading search history or cursor positions.
No modification made

While things otherwise worked as they should with nano as the editor, I felt it best to avoid such output if I could. Thus, I modified the command like this:

sudo -u www-data HOME=/tmp crontab -e

This sets the home directory for www-data to /tmp, allowing an editor preference to be stored, if only on an ephemeral basis. The root cause of the messages is that www-data is not a user account like the others and does not get a home area. The workaround above sidesteps that, without the artificiality of creating a www-data folder in the /home directory. Some might avoid the whole business by using the Vi editor, but nano suits me better.
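
Since I find myself doing this repeatedly, a shell alias saves retyping the override; the alias name here is only a suggestion of mine, not anything standard:

alias wwwcron='sudo -u www-data HOME=/tmp crontab -e'

Placing that in ~/.bashrc or ~/.bash_aliases makes it available in every new session.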

Upgrading from OpenMediaVault 6.x to OpenMediaVault 7.x

29th May 2024

Having an older PC to upgrade, I decided to install OpenMediaVault on it a few years ago after adding 6 TB and 4 TB hard drives for storage, a Gigabit network card to speed up backups and a new BeQuiet! power supply to make it quieter. It has been working smoothly since then, and the release of OpenMediaVault 7.x had me wondering how to move to it.

Usefully, I had enabled an SSH service for remote logins and set up an account for anything that I needed to do. This includes upgrades, taking backups of what is on the NAS drives, and even shutting down the machine when I am done with it.

Using an SSH session, the first step was to switch to the administrator account and issue the following command to ensure that my OpenMediaVault 6.x installation was as up-to-date as it could be:

omv-update

Once that had completed what it needed to do, the next step was to do the upgrade itself with the following command:

omv-release-upgrade

With that complete, it was time to reboot the system. Afterwards, I fired up the web administration interface and spotted a kernel update, which I applied. The system was restarted once more, and further updates that appeared were applied through the web interface as well. The whole thing is based on Debian 12.x, but I am not complaining as long as it quietly does exactly what I need of it. There was one slight glitch when doing an update after the changeover, and that was quickly sorted.

Later on, I ran into trouble after changing my broadband. Because the router address had changed, the system lost its access to the rest of the internet. The web interface was also disabled and was issuing 502 Bad Gateway errors. The solution was to execute the following command with superuser privileges:

omv-salt stage run deploy

That took quite a while to run, though. After it completed, I needed to work out what the administrator credentials were. With that done, I could log in and update the network details as needed to restore external internet access. Since then, all has been well.
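
For anyone else caught out like this, OpenMediaVault also ships a console recovery utility that can reset the web administrator password and reconfigure the network interface from an SSH session, which would have shortened my search for the credentials:

sudo omv-firstaid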

Running cron jobs using the www-data system account

22nd December 2018

When you set up your own web server or use a private server (virtual or physical), you will find that web servers run using the www-data account. That means that website files need to be accessible to that system account, if not owned by it. The latter is mandatory if you want WordPress to be able to update itself without needing FTP details.

It also means that you probably need scheduled jobs to be executed using the privileges possessed by the www-data account. For instance, I use WP-CLI to automate spam removal and updates to plugins, themes and WordPress itself. Spam removal can be done without the www-data account, but the updates need file access and cannot be completed without it. Therefore, I got interested in setting up cron jobs to run under that account, and the following command helps to address this:

sudo -u www-data crontab -e

For that to work, your own account needs to be listed in /etc/sudoers or be assigned to the sudo group in /etc/group. With either of those in place, entering your own password will open the cron file for www-data, and it can be edited as for any other account. Saving and closing the session will update cron with the new job details.

In fact, the same approach can be taken for a variety of commands where files can only be accessed as www-data. This includes copying, moving and deleting files, as well as executing WP-CLI commands. The latter issues a striking message if you run a command using the root account, a pervasive temptation given what it allows. Any alternative to that has to be better from a security standpoint.
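
To give a flavour, entries like the following in the www-data crontab would run nightly plugin and theme updates with WP-CLI; the path and schedule are only examples of mine:

30 2 * * * cd /var/www/html && wp plugin update --all
45 2 * * * cd /var/www/html && wp theme update --all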

Halting constant disk activity on a WD My Cloud NAS

6th June 2018

Recently, I noticed that the disk in my WD My Cloud NAS was active all the time, which reminded me of another time when this happened. Then, I needed to activate the SSH service on the device and log in as root with the default password welc0me. That password was changed before doing anything else; since the device runs on Debian Linux, that was a simple case of using the passwd command and following the prompts. One word of caution is in order: only root can be used for SSH connections to a WD My Cloud NAS, and any other user that you set up will not have these privileges.

The cause of all the activity was two services: wdmcserverd and wdphotodbmergerd. One way to halt their actions is to stop the services using these commands:

/etc/init.d/wdmcserverd stop
/etc/init.d/wdphotodbmergerd stop

The above only works until the next system restart, so these commands should make for a more persistent disabling of the culprits:

update-rc.d -f wdmcserverd remove
update-rc.d -f wdphotodbmergerd remove
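
To confirm that the start-up links really are gone before rebooting, a quick check like this should return nothing; the pattern is simply based on the two service names:

ls /etc/rc?.d/ | grep -E 'wdmc|wdphoto'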

If all else fails, removing executable privileges from the normally executable files that the services need will work, and it is a solution that I have tried successfully between system updates:

cd /etc/init.d
chmod 644 wdmcserverd
chmod 644 wdphotodbmergerd
reboot

Between all of these, it should be possible to have your WD My Cloud NAS go into power-saving mode as it should, even if turning off additional services such as DLNA may be what some need to do. Having turned those off already, I only needed to disable the photo thumbnail services that were the cause of my machine's troubles.

Killing Windows processes from the command line

26th September 2015

During my days at work, I often hear about the need to restart a server because something has gone awry with it. This makes me wonder whether you can kill processes from the command line, as you would in Linux and UNIX. A recent need to reset Windows Update on a Windows 10 machine gave me enough reason to answer the question.

Because I already knew the names of the services, I had no need to look at the Services tab in the Task Manager as you otherwise would. It then was a matter of opening a command line session with Administrator privileges and issuing a command like the following (replacing [service name] with the name of the service):

sc queryex [service name]

From the output of the above command, you can find the process identifier, or PID. With that information, you can execute a command like the following in the same command line session (replacing [PID] with the actual numeric value of the PID):

taskkill /f /pid [PID]

After the above, the process no longer exists, and the service can be restarted. On any system, you need to work out which service is stuck before you can kill it, but that would be the subject of another posting. What I have not got around to testing is whether these commands work in PowerShell, since I used them in the legacy command line instead. Along with processes belonging to software applications (think Word, Excel, Firefox and so on), that may be something else to try should the occasion arise.
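
Should anyone want to try the PowerShell route, something like the following ought to do the same job, using the Windows Update service (wuauserv) as an example; I have not verified this myself and offer it only as a sketch:

Get-CimInstance Win32_Service -Filter "Name='wuauserv'" | Select-Object Name, State, ProcessId
Stop-Process -Id [PID] -Force
Start-Service wuauserv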

Restoring GRUB for dual booting of Linux and Windows

11th April 2015

When Windows overwrites your master boot record (MBR), you lose the ability to use GRUB, so it needs to be restored before you can start Linux again. Though the loss of GRUB from the MBR was a deliberate act of mine, I knew that I would have to restore it to get Linux working again. So, I have been addressing the situation with a Live DVD for the likes of Ubuntu or Linux Mint. Once one of those has loaded its copy of the distribution, issuing the following command in a terminal session gets things back again:

sudo grub-install --root-directory=/media/0d104aff-ec8c-44c8-b811-92b993823444 /dev/sda

When there were error messages, I tried this one to see if I could get additional information:

sudo grub-install --root-directory=/media/0d104aff-ec8c-44c8-b811-92b993823444 /dev/sda --recheck

Also, it is possible to mount a partition on the boot drive and use that in the command to restore GRUB. Here is the required combination:

sudo mount /dev/sda1 /mnt
sudo grub-install --root-directory=/mnt /dev/sda

Either of these will get GRUB working without a hitch, and they are far snappier than downloading Boot-Repair and using that; I did the latter for a while until a feature on triple booting in an issue of Linux User & Developer reminded me of the more readily available option. At one point, there was a need to add an entry for Windows 7 to the GRUB menu manually and, with that in place, I was able to dual-boot Ubuntu and Windows, using GRUB to select which one was to start. Since then, I have been able to dual-boot Linux Mint and Windows 8.1, with GRUB finding the latter all by itself. Your experience may show the same variation, so it is worth bearing in mind.
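
As an aside, if a Windows entry fails to appear by itself, regenerating the menu from the restored Linux system usually finds it, so long as the os-prober package is present:

sudo update-grub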

Adding Microsoft Core Fonts to Fedora 19

6th July 2013

While I have a previous posting from 2009 that discusses adding Microsoft's Core Fonts to the then current version of Fedora, it did strike me that I hadn't laid out the series of commands that were used. Instead, I referred to an external and unofficial Fedora FAQ. That's still there, yet I also felt that I was leaving things a little to chance, given how websites can disappear quite suddenly.

Even after nearly four years, it still amazes me that you cannot install Microsoft's Core Fonts in Fedora as you would on Ubuntu, Linux Mint or even Debian. Therefore, the following series of steps is as necessary now as it was then.

The first step is to add a number of prerequisite tools: wget for command-line downloading from websites, cabextract for extracting the contents of Windows CAB files, rpm-build for creating RPM packages, ttmkfdir for generating font directory files, and xfs, the X font server that chkfontpath needs:

sudo yum -y install rpm-build cabextract ttmkfdir wget xfs

Here, I have gone with terminal commands that use sudo, but you could become the superuser (root) for all of this, and there are those who believe you should. The -y switch tells yum to go ahead without prompting you for permission before it does any installations. The next step is to download the spec file for the Microsoft fonts package with wget:

sudo wget http://corefonts.sourceforge.net/msttcorefonts-2.0-1.spec

Once that is done, you need to install the chkfontpath package because the RPM for the fonts cannot be built without it:

sudo rpm -ivh http://dl.atrpms.net/all/chkfontpath

Once that is in place, you are ready to create the RPM file using this command:

sudo rpmbuild -ba msttcorefonts-2.0-1.spec

After the RPM has been created, it is time to install it:

sudo yum install --nogpgcheck ~/rpmbuild/RPMS/noarch/msttcorefonts-2.0-1.noarch.rpm

Once installation has completed, the process is done. Because I used sudo rather than becoming root, all of this happened in my own home area, so some housekeeping was needed afterwards. If you did it by becoming the root user, the files would be found in root's home area instead, and that is the scenario described in the online FAQ.
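
As a quick check that everything worked, fontconfig can list what it sees; searching for one of the new fonts should produce paths to its TTF files:

fc-cache -f
fc-list | grep -i arial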

Setting up MySQL on Sabayon Linux

27th September 2012

For quite a while now, I have had offline web servers for doing a spot of tweaking and tinkering away from the gaze of the web users who visit what I have online. Therefore, one of the tests that I apply to any prospective main Linux distro is the ability to set up a web server on it. This is more straightforward for some than for others. For Ubuntu and Linux Mint, it is a matter of installing the required software and doing a small bit of configuration. My experience with Sabayon is that it needs a little more effort than that, so I am sharing the steps here for the installation of MySQL.

The first step is to install the software using the commands that you find below. The first pops the software onto the system, while the second completes the set-up. The --basedir option is needed with the latter because it will not find things without it; it specifies the base location on the system, which is /usr in my case.

sudo equo install dev-db/mysql
sudo /usr/bin/mysql_install_db --basedir=/usr

With the above complete, it is time to start the database server and set the password for the root user; that is what the two following commands achieve. Once your root password is set, you can go about creating databases and adding other users using the MySQL command line.

sudo /etc/init.d/mysql start
mysqladmin -u root password 'password'

The last step is to set the database server to start every time you start your Sabayon system. The first command adds an entry for MySQL to the default run level so that this happens, while the purpose of the second is to check that the entry was added before restarting your computer to confirm that it really works. This procedure is also necessary for having an Apache web server behave in the same way, so the commands are worth having and may even have a use for other services on your system; ProFTP is another that comes to mind, for instance.

sudo rc-update add mysql default
sudo rc-update show | grep mysql
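
As an example of the next steps, the following statements, entered at the MySQL command line after connecting as root, create a database and a user with rights to it; the names and password are placeholders of my own:

mysql -u root -p
CREATE DATABASE myblog;
CREATE USER 'myuser'@'localhost' IDENTIFIED BY 'example-password';
GRANT ALL PRIVILEGES ON myblog.* TO 'myuser'@'localhost';
FLUSH PRIVILEGES;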

Listing hardware information for Linux systems

3rd August 2012

Curiosity about the graphics card on my backup PC caused me to look for ways of getting this information without opening up the machine or searching for a manual. In the end, a solitary command did the job:

sudo lshw

If you are running it as root, the sudo piece can be dropped, but the result is the same. As it happened, it provided me with the information that I needed.
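
If the full listing is more than you need, lshw can be narrowed down; the first command below restricts output to display hardware, which would have answered my graphics card question directly, while the second gives a condensed overview of everything:

sudo lshw -C display
sudo lshw -short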

Taking the sudo command beyond Ubuntu

27th October 2010

Though some may call it introducing a security risk, being able to execute administrator commands on Ubuntu using sudo and gksu by default is handy. It is not the only Linux distribution with the facility, though, since the /etc/sudoers file is also found in Debian, and I plan to have a look into Fedora. What needs to be done is to add the following line to that file (you will need to do this as root):

[your user name] ALL=(ALL) ALL

Once that is done, you are all set. Just make sure that you are using a secure password; removing the sudo/gksu permissions is as simple as reversing the change.

Update on 2011-12-03: The very same can be done for both Arch Linux and Fedora; the same file location applies too.
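
A safer way to make the edit is with visudo, which checks the syntax of /etc/sudoers before saving. Alternatively, adding your account to the appropriate administrative group achieves the same end; the group is sudo on Debian and its derivatives, and wheel on Fedora and Arch Linux (where the %wheel line in sudoers may need to be uncommented first):

visudo
usermod -aG sudo [your user name]
usermod -aG wheel [your user name]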
