Technology Tales

Adventures & experiences in contemporary technology

Creating empty text files and changing file timestamps using Windows Command Prompt & PowerShell

17th May 2013

Linux and UNIX have the touch command for changing the access and modification times of files. However, it will also create empty text files for you. In fact, there are times when I feel the need to do this sort of thing on Windows too and the following command accomplishes the deed when run in a Command Prompt window:

type nul > command.bat

Essentially, null output is sent to a newly created file, command.bat in this case. Then, you can edit it in Notepad (or whichever text editor you prefer) and add what you need. This will not work in PowerShell, so you need another command for that:

New-Item command.bat -type file

This uses the New-Item command, which can also be used to create folders if you so desire. Then, the command becomes the following:

New-Item c:\commands -type directory

Note that file in the previous example has become directory and there is the -force option should you need to overwrite what already exists for some reason…
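
As a quick illustration of that option, the command below (with a made up path) would recreate command.bat even if a file of that name already existed:

New-Item c:\commands\command.bat -type file -force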

That other use of the UNIX/Linux touch command can be performed from the Command Prompt too and here is an example command:

copy /b file.txt +,,

The /b switch turns on binary behaviour for the copy command, though that appears to be the default action anyway. The + operator triggers concatenation and ,, gets around not having a defined destination, because you cannot copy a file over itself. If that were possible, then there would be no need for special syntax for changing the date and time of a file.

For doing the same thing with PowerShell, try the following:

(Get-ChildItem test.txt).LastWriteTime = Get-Date

The Get-ChildItem command has aliases of gci, dir and ls, and the last two of these give away its essential purpose. Here, it is used to pick out the test.txt file so that its timestamp can be replaced with the current date and time returned by the Get-Date command. The syntax looks a little more complex, even if it achieves the same end. Somehow, that touch command is easier to explain. Are Linux and UNIX that complicated after all?
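
Should you want something other than the current date and time, Get-Date will accept a date string as well, so something like the following sketch (using an arbitrary date of my own choosing) ought to work too:

(Get-ChildItem test.txt).LastWriteTime = Get-Date "2013-05-17 09:00"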

Saving Windows Command Prompt & PowerShell command history to a file for later usage

15th May 2013

It’s amazing what ideas Linux gives you that you wouldn’t encounter so clearly in the world of Windows. One of these is sending command line history to an output file so that a script can be created; in the Windows world, this would be called a batch file. Linux usefully has the history command and it does the needful for taking a snapshot like so:

history > ~/commands.sh

All of the commands stored in a terminal’s command history get saved to the commands.sh file in the user’s home area. The command for doing the same thing from the Windows command line is not as obvious because it uses the doskey command, which is intended for command line macro writing and execution. Usefully, it has a history option that tells it to output all the commands issued in a command line session. Unless you create a file with them in there, there seems to be no way to store all those commands across sessions, unlike UNIX and Linux. Therefore, a command like the following is a partial solution that is more permanent than using the F7 key on your keyboard:

doskey /history > c:\commands.bat

Windows PowerShell has something similar too, and it even has the aliases history and h. All PowerShell scripts have file extensions of ps1, and the example below follows that scheme:

get-history > c:\commands.ps1

However, I believe that even PowerShell doesn’t carry over command history between sessions, though Microsoft are working on adding this useful functionality. They could co-opt Cygwin of course, but that doesn’t seem to be their way of going about things.
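
In the meantime, one workaround that I have seen suggested is to export the history at the end of a session and read it back in at the start of the next one; the pair of commands below is a sketch of that idea (the file location is only an example) rather than anything built in:

Get-History | Export-Clixml c:\history.xml
Import-Clixml c:\history.xml | Add-History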

Command line file comparison in Windows

20th August 2012

While UNIX and Linux both have the diff command for comparing the contents of text files, the Windows counterpart was unknown to me until recently. Its name is fc, and it looks as if the f is for file and the c is for comparison, though I cannot confirm that as of now. The command and its usage are not dissimilar to the way that things work with diff. Here is an example command:

fc file1.txt file2.txt > file3.txt

This compares file1.txt with file2.txt and sends the output to file3.txt. Any differences between the two files being compared seem to be more clearly labelled than they are by the < and > markers in diff output. That verbosity could have its uses, but the existence of the fc command is stopping envious glances at the diff one for now, just as findstr is doing the same in comparison with grep.
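
The command has a few switches of its own too; for instance, adding /n prefixes the output of a text comparison with line numbers, which can make the labelling clearer still:

fc /n file1.txt file2.txt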

Synchronising package selections between Linux Mint and Linux Mint Debian Edition

18th April 2012

To generate the package list on the GNOME version of Linux Mint, I used the Backup Tool. It simply was a matter of using the Backup Software Selection button and telling it where to put the file that it generates. Alternatively, dpkg can be used from the command line like this:

sudo dpkg --get-selections > /backup/installed-software.txt

After transferring the file to the machine with Linux Mint Debian Edition, I tried using the Backup Tool on there too. However, using the Restore Software Selection button and loading the required file only produced an irrecoverable error. Therefore, I set to looking around the web and found a command line approach that did the job for me.

The first step is to load the software selection using dpkg by issuing this command (it didn’t matter that the file wasn’t made using the dpkg command, though I suspect that’s what the Linux Mint Backup Tool was doing behind the scenes):

sudo dpkg --set-selections < /backup/installed-software.txt

Then, I started dselect and chose the installation option from the menu that appeared. First time around, it fell over but trying again was enough to complete the job. Packages available to the vanilla variant of Linux Mint but not found in the LMDE repositories were overlooked as I had hoped and installation of the extra packages had no impact on system stability either.

sudo dselect

Apparently, there is an alternative to using dselect that is based on the much used apt-get command but I didn’t make use of it so cannot say more:

sudo apt-get dselect-upgrade

All that I can say is that the dpkg/dselect combination did what I wanted, so I’ll keep them in mind if I ever need to synchronise software selections between two Debian-based distributions again. The standard edition of Linux Mint may be based on Ubuntu rather than Debian, but Ubuntu is itself based on Debian, so the description holds here.

Command line setting of Windows file attributes

11th February 2012

Aside from the permissions that can be set using the cacls command, Windows files have attributes like read only, archive and hidden. Of course, these are not the same as, or as robust as, access permissions, but they may have a use in stopping accidental updates to files when you don’t have access to the cacls command. While you could set these attributes using the properties page of any file, executing the attrib command on the Windows command line is more convenient. Here are some possible usage options:

Set the read only flag on a file:

attrib +r test.txt

Remove the read only flag from a file (found a use for this one recently):

attrib -r test.txt

Set the archive flag on a file:

attrib +a test.txt

Remove the archive flag from a file:

attrib -a test.txt

Set the hidden flag on a file:

attrib +h test.txt

Remove the hidden flag from a file:

attrib -h test.txt

Using the /s option and wildcards processes a number of files at a time, while /d applies the command to directories as well. They could come in handy when removing read only attributes (also called bits in places) from files copied from read only optical media such as CDs and DVDs.
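
As an example of those, something like the following (run from the top of the folder tree in question) should strip the read only flag from matching files in the current folder and all of those below it, with /d extending the treatment to the folders themselves; checking attrib /? first is never a bad idea:

attrib -r *.* /s /d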

Shell swapping in Windows

28th April 2010

Until the advent of PowerShell, Windows had been the poor relation when it came to working from the command line, compared with UNIX, Linux and so on. A recent bit of fiddling had me trying to run FTP from the legacy command prompt when I ran into problems with UNC address resolution (it’s unsupported by the old technology) and the mapping of network drives. It turned out that my error 85 was being caused by an unavailable drive letter that the net use command didn’t reveal as being in use. Reassuringly, this wasn’t a Vista issue that I couldn’t circumvent.

During this spot of debugging, I tried running batch files in PowerShell and discovered that you cannot run them there like you would from the old command prompt. In fact, you need a line like the following:

cmd /c script.bat

In other words, you have to call cmd.exe, just as you would perl.exe, wscript.exe or cscript.exe, for batch files to execute. If I had the time, I might have got to exploring the use of ps1 files for setting up PowerShell cmdlets, but that is something that needs to wait until another time. What I discovered, though, is that UNC addressing can be used with PowerShell without the need for drive letter mappings, which is not a bad development at all. While on the subject of discoveries, I also found that the following command opens up a command prompt shell from PowerShell without any need to resort to the Start Menu:

cmd /k

Entering the exit command returns you to the PowerShell command line again and entering cmd /? reveals the available options for the command so you need never be constrained by your own knowledge or its limitations.
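
On the UNC point above, commands like these (with made up server and share names) work directly in PowerShell without any drive letter mapping at all:

Set-Location \\server\share
Get-ChildItem \\server\share\folder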

Using the Windows Command Line for Security Administration

24th July 2009

While there are point and click tools for the job, being able to set up new user groups, attach them to folders and assign users to them using the command line has major advantages when there are a number to be set up, and logs of execution can be retained too. In light of this, it seems a shame that terse documentation, along with the difficulty of tracking down answers to any questions using Google (or whatever happens to be your search engine of choice), makes it less easy to discern what commands need to be run. This is where a book would help, but the whole experience is in direct contrast to the community of information providers that is the Linux user community, with Ubuntu being a particular shining example. Saying that, the Windows help system is not so bad once you can track down what you need. For instance, once you know that you need commands like CACLS and NET LOCALGROUP, the ones that have been doing the back work for me, it offers useful information quickly enough. To illustrate the usefulness of the aforementioned commands, here are a few scenarios.

Creating a new group:

net localgroup [name of new group] /comment:"[more verbose description of new group]" /add

Add a group to a folder:

cacls [folder address] /t /e /p [name of group]

The /t switch gets cacls to apply changes to the ACL for the specified folder and all its subfolders, recursive action in other words, while the /e specifies ACL editing rather than its replacement and /p induces replacement of permissions for a given user or group. Using :n, :f, :c or :r directly after the name of a specified user or group assigns no, full, change (write) or read access, respectively. Replacing /p with /r revokes access and leaving off the :n/:f/:c/:r will remove the group or user from the folder.
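
Putting that together, a command along these lines (the folder and group names are made up for illustration) would grant read access to a group on a folder and everything beneath it:

cacls c:\shared /t /e /p ReadOnlyUsers:r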

Add a user to a group:

net localgroup [name of group] [user name (with domain name if on a network)] /add

In addition to NET LOCALGROUP, there is also NET GROUP for wider network operations, something that I don’t have cause to do. Casting the net even wider, I suspect that VBScript and its ability to tweak Windows Management Instrumentation might offer more functionality than the above (PowerShell also comes to mind while we are on the subject), but I am sharing what has been helping me, and it can be hard to find if you don’t know where to look.

How much space is that folder taking up on your disk?

23rd July 2008

In Windows, it’s a matter of right-clicking on the folder and looking in its properties. I am sure that there is a better way of doing it in that ever pervasive operating system but, in the worlds of Linux and UNIX, the command line comes to the rescue as it is wont to do. What follows is the command that I use:

du -sh foldername

The s option makes it present only the total space taken up, while leaving it out gets you a breakdown of how much space the subfolders are taking up as well. The h makes the sizes in the output friendlier to human eyes, with things like 10K, 79M and 51G littering what you get. The command name itself is a much shorter way of saying “print disk usage”. It’s all quick and easy when you know it, and very useful in this age of ever increasing data volumes.
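
If you want that breakdown limited to the immediate subfolders, GNU versions of du have a --max-depth option, so something like the following should do the deed (the sort at the end merely orders things by size):

du -h --max-depth=1 foldername | sort -h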

Automating FTP II: Windows

15th April 2008

Having thought about automating command line FTP on UNIX/Linux, the same idea came to me for Windows too and you can achieve much the same results, even if the way of getting there is slightly different. The first route to consider is running a script file with the ftp command at the command prompt (you may need %windir%\system32\ftp.exe to call the right FTP program in some cases):

ftp -s:script.txt

The contents of script.txt are something like the following:

open ftp.server.host
user
password
lcd destination_directory
cd source_directory
prompt
get filename
bye

It doesn’t take much to turn your script into a batch file that takes the user name as its first input and your password as its second for the sake of enhanced security, and deletes any record thereof for the same reason:

echo open ftp.server.host > script.txt
echo %1 >> script.txt
echo %2 >> script.txt
echo cd htdocs >> script.txt
echo prompt >> script.txt
echo mget * >> script.txt
echo bye >> script.txt
%windir%\system32\ftp.exe -s:script.txt
del script.txt
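
Assuming that the above were saved as something like getfiles.bat (the name is only an example), calling it would look like this, with the user name and password supplied as the two arguments:

getfiles.bat myuser mypassword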

The feel of the Windows command line (in Windows 2000, it feels very primitive but Windows XP is better and there’s PowerShell now too) can leave a lot to be desired by someone accustomed to its UNIX/Linux counterpart but there’s still a lot of tweaking that you can do to the above, given a bit of knowledge of the Windows batch scripting language. Any escape from a total dependence on pointing and clicking can only be an advance.

Automating FTP I: UNIX and Linux

11th April 2008

Having got tired of repeatedly typing in everything at the prompt of an interactive command line FTP session, and doing similar things via the GUI route, I started to wonder if there was a scripting alternative and, lo and behold, I found it after a spot of googling. There are various opportunities for its extension, such as prompting for the username and password instead of the risky approach of including them in a script, or cycling through a directory structure, but here’s the foundation stone for such tinkering anyway:

HOSTNAME='ftp.server.host'
USER='user'
PSSWD='password'
REP_SRC='source_directory'
REP_DEST='destination_directory'
FILENAME='*'

rm -rf log_file.tmp

cd "${REP_DEST}"

ftp -i -n -v <<EndFTP >>log_file.tmp 2>>log_file.tmp
open ${HOSTNAME}
user ${USER} ${PSSWD}
prompt
cd "${REP_SRC}"
mget "${FILENAME}"
EndFTP

cd ~
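
As for the idea of prompting for the username and password rather than hard-coding them, something along these lines (a sketch using bash’s read command) could stand in for the USER and PSSWD assignments near the top of the script:

read -p "User name: " USER
read -s -p "Password: " PSSWD
echo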
