Keeping a file or directory out of a Git or GitHub repository
26th August 2024
Recently, I have begun to do more version control of files with Git and GitHub. However, GitHub is not a place to keep files with login credentials. Thus, I wanted to keep these locally but avoid having them tracked in either Git or GitHub.
Adding the names to a .gitignore file will avoid their inclusion prospectively, but what can you do if they get added in error before you do? The answer that I found is to execute a command like the following:
git rm -r --cached [path to file or directory with its name]
That takes it out of the staging area and allows the .gitignore functionality to do its job. The -r switch makes the command recursive, should you be working with the contents of a directory. Then, the --cached flag is what does the removal from the staging area.
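For illustration, a minimal sketch of the whole sequence might look like the following, with credentials.json being a made-up name for whichever file was tracked in error:
echo "credentials.json" >> .gitignore
git rm --cached credentials.json
git commit -m "Stop tracking credentials.json"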
While the aforementioned worked for me when I had an oversight, the following is also suggested:
git update-index --assume-unchanged [path to file or directory with its name]
That may work without a .gitignore file, which was not how I was doing things. Nevertheless, it may have its uses for someone else, which is why I include it above.
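Should that flag need reversing later on, it can be undone with the counterpart option:
git update-index --no-assume-unchanged [path to file or directory with its name]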
Using a BASH command to count the files in a directory
12th March 2024
As part of my backup workflow, I maintain a machine running OpenMediaVault that I only power up when backups are to be performed. Typically, this happens when I have new photography images to load, and I have a NAS that acts as an online backup system. The OpenMediaVault machine is a near-offline counterpart to the NAS for added safety.
Recently, I needed to check on the number of image files in a directory from an SSH session because of a need to create a new repository for 2024. Some files from this year had ended up in the 2023 one, and I needed to be sure that nothing from last year ended up in the 2024 folder, or vice versa. Getting a file count from a trusted source was a quick way of doing exactly this.
Due to clumsiness with the NAS, I had to do this using the OpenMediaVault machine. While I could go mounting drives on an interim basis, it was quicker to work from a BASH session. The trick was to use the wc command for counting the lines output by an invocation of the ls command. An example follows:
ls -l | wc -l
The -l (as in l for Lima) switch forces wc to count lines, while the counterpart (same letter) for ls forces it to list the contents in long form, one item per line. Thus, counting the number of lines gets you the number of files. The call to the ls command can be customised to add other things, like the number of dot files, but the above was enough for my purposes. When the file counts for both 2023 directories matched, I was satisfied that all was in order.
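One refinement worth noting is that ls -l also emits a total header line, so the figure comes out one higher than the number of entries; that did not matter when comparing two directories against each other, but a sketch of a count without that extra line could be either of the following:
ls -1 | wc -l
find . -maxdepth 1 -type f | wc -l
The find version counts only regular files in the top level of the directory, dot files included.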
Changing the number of lines produced by the tail command in a Linux or UNIX session
25th April 2023
Since I often use the tail command to look at the end of a log file and occasionally in combination with the watch command for constant updates, I got to wondering if the number of lines issued by the tail command could be changed. That is a simple thing to do with the -n switch. All you need is something like the following:
tail -n 20 logfile.log
Here, the value of 20 is the number of lines produced in place of the default of 10, and logfile.log gets replaced by the path and name of the file that you are examining. One thing to watch is whether your terminal emulator can show all the lines being displayed. If you find that you are not seeing all the lines that you expect, then that might be the cause.
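Since I mentioned pairing tail with watch, a sketch of that combination might look like the one below, with the two-second refresh interval and the log file path being arbitrary choices of mine:
watch -n 2 "tail -n 20 /var/log/syslog"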
While you could find this by looking through the documentation, things do not always register with you during dry reading of something laden with lists of parameters or switches. That is an affliction with tools that do a lot and/or allow a lot of customisation.
Avoiding permissions, times or ownership failure messages when using rsync
22nd April 2023
The rsync command is one that I use heavily for doing backups and web publishing. The latter means that it is part of how I update websites built using Hugo because new and/or updated files need uploading. The command also sees usage when uploading files onto other websites. During one of these operations, and I am unsure now as to which type is relevant, I encountered errors about being unable to set permissions.
The cause was the encompassing -a option. This is a shorthand for -rltpgoD, and the individual options perform the following:
-r: recursive transfer, copying all contents within a directory hierarchy
-l: symbolic links copied as symbolic links
-t: preserve times
-p: preserve permissions
-g: preserve groups
-o: preserve owners
-D: preserve device and special files
The solution is to drop some of the options if they are inappropriate. The minimum is to omit the option for permissions preservation, but others may not apply between different servers either, especially when operating systems differ. Removing the options for preserving permissions, groups and owners results in something like this:
rsync -rltD [rest of command]
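For illustration, a fuller web publishing command might look something like the one below, with the bracketed parts standing in for real locations; -v adds verbose output, -z compresses data in transit and --delete removes anything at the destination that no longer exists at the source, so that last one needs care:
rsync -rltDvz --delete [local directory]/ [user]@[server]:[remote directory]/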
While it can be good to have a more powerful command with the setting of a single option, it can mean trying to do too much. Another way to avoid permissions and similar errors is to have consistency between source and destination file systems, but that is not always possible.
Removing duplicate characters from strings using BASH scripting
30th March 2023
Recently, I wanted to extract some text from the output of a Linux command by word number, only for multiple spaces to make things less predictable. The solution was to remove the duplicate spaces. This can be done using sed, but you add the complexity of regular expressions if you opt for that solution. Instead, the tr command offers a neater approach. For removing duplicate spaces, the command takes the following form:
echo "test test" | tr -s " "
Since I was piping some text to the command, that is what I have above. The tr command is intended to replace or delete characters, and the -s switch is a shorthand for --squeeze-repeats. The actual character to be deduplicated is passed in quotes at the end; here, it is a space, but it could be anything that is duplicated. The resulting text in this example becomes:
test test
After the processing, there is now only one space separating the two words, which is the solution that I sought. It certainly cut out any variability that I was encountering in my usage.
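Relating this back to the word extraction that prompted it all, a sketch of squeezing the spaces before picking out a word by its number might be the following, which outputs the second word:
echo "some   spaced   text" | tr -s " " | cut -d " " -f 2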
Generating PNG files in SAS using ODS Graphics
21st December 2019
Recently, I had someone ask me how to create PNG files in SAS using ODS Graphics, so I sought out the answer for them. Normally, the suggestion would have been to create RTF or PDF files instead, but there was a specific need that called for a different approach. Adding something like the following lines before an SGPLOT, SGPANEL or SGRENDER procedure should do the needful:
ods listing gpath='E:\';
ods graphics / imagename="test" imagefmt=png;
Here, the ODS LISTING statement declares the destination for the desired graphics file, while the ODS GRAPHICS statement defines the file name and type. In the above example, the file test.png would be created in the root of the E drive of a Windows machine. However, this also works with Linux or UNIX directory paths.
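To put the statements in context, a minimal sketch using one of the SASHELP datasets might look like the lines below, with the Linux path being a placeholder for wherever the image should go:
ods listing gpath='/home/user/graphs';   /* where the PNG file gets written */
ods graphics / imagename="test" imagefmt=png;   /* file name and format */
proc sgplot data=sashelp.class;
  scatter x=height y=weight;
run;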
Copying a directory tree on a Windows system using XCOPY and ROBOCOPY
17th September 2016
My usual method for copying a directory tree without any of the files in there involves the use of the Windows command line tool XCOPY, and the command takes the following form:
xcopy /t /e <source> <destination>
The /t switch tells XCOPY to copy only the directory structure, while the /e one tells it to include empty directories too. Substituting /s for /e would ensure that only non-empty directories are copied. <source> and <destination> are the directory paths that you want to use and need to be enclosed in quotes if you have a space in a directory name.
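As an example with made-up paths, copying the folder structure of a project directory to a backup drive might look like this, with the quotes being needed because of the spaces in the names:
xcopy /t /e "C:\Users\Me\My Projects" "D:\Backup\My Projects"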
There is one drawback to this approach that I have discovered. When you have long directory paths, messages about there being insufficient memory are issued and the command fails. The limitation has nothing to do with the machine that you are using, but is a limitation of XCOPY itself.
After discovering that, I went to check if ROBOCOPY could do the same thing without the same file path length limitation, because I did not have the liberty of shortening folder names to get the whole path within the length expected by XCOPY. The following is the form of the command that I found did what I needed:
robocopy <source1> <destination1> /e /xf *.* /r:0 /w:0 /fft
Here, <source1> and <destination1> are the directory paths that you want to use and need to be enclosed in quotes if you have a space in a directory name. The /e switch copies all subdirectories and not just non-empty ones. Then, the /xf *.* portion excludes all files from the copying process. The remaining options are added to help with getting around access issues and to try to copy only those directories that do not exist in the destination location. The /fft switch was added to address the latter by causing ROBOCOPY to assume FAT file times. To get around the folder permission delays, the /r:0 switch was added to stop any operation being retried, with /w:0 setting wait times to 0 seconds. All this was enough to achieve what I wanted, and I am keeping it on file for my future reference, as well as sharing it with you.
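Using the same made-up paths as in the XCOPY example, the ROBOCOPY version becomes the following:
robocopy "C:\Users\Me\My Projects" "D:\Backup\My Projects" /e /xf *.* /r:0 /w:0 /fft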
Creating soft and hard symbolic links using the Windows command line
19th August 2015
In the world of UNIX and Linux, symbolic links are shortcuts, but they do not work like normal Windows shortcuts because you do not jump from one location to another with the file manager's address bar changing what it shows. Instead, it is as if you see the contents of the directory at another, quicker-to-access location in the file system, and the same sort of thinking applies to files too. In some ways, it is like giving files and directories alternative aliases. There are soft links that point to the name of a given directory or file, and hard links that point to actual files or directories.
For a long time, I was under the mistaken impression that such things did not exist on Windows until I came across the mklink command, which came with the launch of Windows Vista at the start of 2007. While this feature might not be widely known, it demonstrates that Windows did adopt some UNIX and Linux capability long before other UNIX-like features, such as virtual desktops, were introduced in Windows 10.
By default, the aforementioned command sets up symbolic links to files and the /D switch allows the same to be done for directories too. The /H switch makes a hard link instead of a soft link, so we get much of the functionality of the ln command in UNIX and Linux. Here is an example that creates a soft symbolic link for a directory:
mklink /D shortcut target_directory
Above, shortcut is the name of the symbolic link file and target_directory is the destination to which it links. In my experience, it works best for destinations beyond your home folder and, from what I have read, hard links may not be possible across different disks either.
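For completeness, a sketch of making a hard link to a file could look like the following, with both names being hypothetical and the link needing to sit on the same drive as its target:
mklink /H notes-link.txt C:\Data\notes.txt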
Sorting out a system update failure for FreeBSD
3rd April 2014
With my tendency to apply Linux updates using the command line, I was happy to see that something similar was possible in FreeBSD too. The first step is to fire up a terminal session and drop into root using the su command. That needs the root password to continue, and the next step is to update the local repositories using the following command:
pkg update
After that, it is time to download updated packages and install these by issuing this command:
pkg upgrade
Most of the time, that is sufficient, but I discovered that there are times when the above fails and additional interventions are needed. What I had uncovered were dependency error messages, and I set to looking around the web for remedies to this. One forum question that was similar to mine was met with the suggestion of consulting the file called UPDATING in /usr/ports/. An answer like that looks unhelpful, but for the inclusion of advice on where extra actions were needed. Also, there is a useful article on updating FreeBSD ports that gives more in the way of background knowledge, so you understand more about what needs doing.
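Consulting the file itself is just a matter of paging through it, for instance like this:
less /usr/ports/UPDATING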
Following both that and the UPDATING file resulted in my taking the following sequence of steps. The first act was to download and initialise the Ports Collection, a set of build instructions.
portsnap fetch extract
The above is a one-time action, so future updates are done as follows:
portsnap fetch update
With an up-to-date Ports Collection in place, it was time to install portmaster:
pkg install portmaster
A look through /usr/ports/UPDATING revealed the commands I needed for updating Python and Perl to address the dependency problem that I was having:
portmaster -o devel/py-setuptools27 devel/py-setuptools
portmaster -r py\*setuptools
With those completed, I re-ran pkg update and all was well. The extra actions needed to get that result will not get forgotten, and I am sharing them here so I know where they are. If anyone else has use for them, that would be even better.
Creating empty text files and changing file timestamps using Windows Command Prompt & PowerShell
17th May 2013
Linux and UNIX have the touch command for changing the access and modification times of files. However, it also creates empty text files for you. In fact, there are times when I feel the need to do this sort of thing on Windows too, and the following command accomplishes the deed when run in a Command Prompt window:
type nul > command.bat
Essentially, null output is sent to a file that is created anew, command.bat in this case. Then, you can edit it in Notepad (or whatever is your choice of text editor) and add in what you need. This will not work in PowerShell, so you need another command for that:
New-Item command.bat -type file
This uses the New-Item command, which can also be used to create folders if you so desire. Then, the command becomes the following:
New-Item c:\commands -type directory
Note that file in the previous example has become directory, and there is the -force option should you need to overwrite what already exists for some reason...
That other use of the UNIX/Linux touch command can be performed from the Command Prompt too, and here is an example command:
copy /b file.txt +,,
The /b switch switches on binary behaviour for the copy command, though that appears to be the default action anyway. The + operator triggers concatenation and ,, gets around not having a defined destination because you cannot copy a file over itself. If that were possible, then there would be no need for special syntax for changing the date and time for a file.
For doing the same thing with PowerShell, try the following:
(Get-ChildItem test.txt).LastWriteTime = Get-Date
The Get-ChildItem command has aliases of gci, dir and ls, and the last two of these give away its essential purpose. Here, it is used to pick out the test.txt file so that its timestamp can be replaced with the current date and time returned by the Get-Date command. The syntax looks a little more complex, even if it achieves the same end. Somehow, that touch command is easier to explain. Are Linux and UNIX that complicated, after all?
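For what it is worth, a rough PowerShell stand-in for touch that creates the file when it is missing and updates its timestamp otherwise might look like the pair of lines below, with test.txt being the example file again:
# create the file only when it does not exist yet
if (-not (Test-Path test.txt)) { New-Item test.txt -ItemType File | Out-Null }
# set the last write time to now, much as touch would
(Get-ChildItem test.txt).LastWriteTime = Get-Date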