Technology Tales

Adventures & experiences in contemporary technology

Adding Microsoft Core Fonts to Fedora 19

6th July 2013

While I have a previous posting from 2009 that discusses adding Microsoft’s Core Fonts to the then current version of Fedora, it struck me that I never laid out the series of commands that were used. Instead, I referred to an external and unofficial Fedora FAQ. That’s still there, but I also felt that I was leaving things a little to chance, given how websites can disappear quite suddenly.

Even after nearly four years, it still amazes me that you cannot install Microsoft’s Core Fonts in Fedora as easily as you would in Ubuntu, Linux Mint or even Debian. Therefore, the following series of steps is as necessary now as it was then.

The first step is to install a number of prerequisite tools: wget for command line file downloading from websites, cabextract for extracting the contents of Windows CAB files, rpm-build for creating RPM installers, ttmkfdir for generating font directory files and the X font server (xfs) that chkfontpath needs:

sudo yum -y install rpm-build cabextract ttmkfdir wget xfs

Here, I have gone with terminal commands that use sudo, but you could become the superuser (root) for all of this and there are those who believe you should. The -y switch tells yum to go ahead with the installations without prompting you for permission. The next step is to download the spec file for the Microsoft fonts package with wget:

sudo wget http://corefonts.sourceforge.net/msttcorefonts-2.0-1.spec

Once that is done, you need to install the chkfontpath package because the RPM for the fonts cannot be built without it:

sudo rpm -ivh http://dl.atrpms.net/all/chkfontpath

Once that is in place, you are ready to create the RPM file using this command:

sudo rpmbuild -ba msttcorefonts-2.0-1.spec

After the RPM has been created, it is time to install it:

sudo yum install --nogpgcheck ~/rpmbuild/RPMS/noarch/msttcorefonts-2.0-1.noarch.rpm

Once the installation has completed, the process is done. Because I used sudo, all of the intermediate files ended up in my own home area, so there was a need for some housekeeping afterwards. If you did it by becoming the root user, the files would be in root’s home area instead, and that is the scenario described in the online FAQ.
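
As a quick check that the fonts became visible to the system, something like the following can be used; fc-list comes with fontconfig and Arial is one of the fonts in the package:

fc-list | grep -i arial

As for the housekeeping itself, here is a sketch of what I mean, assuming the default rpmbuild layout and that the spec file was downloaded into your home area; since wget was run with sudo, root ownership may mean that sudo is needed for the second deletion too:

rm -rf ~/rpmbuild

sudo rm -f ~/msttcorefonts-2.0-1.spec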

Piggybacking an Android Wi-Fi device off your Windows PC’s internet connection

16th March 2013

One of the disadvantages of my Google/Asus Nexus 7 is that it needs a Wi-Fi connection in order to be used. Most of the time, this is not a problem, since I also have a Huawei mobile Wi-Fi hub from T-Mobile and this seems to work just about anywhere in the U.K. Away from the U.K., though, it won’t work because roaming is not switched on for it, and that may be no bad thing given the fees that roaming could introduce. My HTC Desire S could deputise, but I need to watch costs with that too.

There’s also the factor of download caps and those apply both to the Huawei and to the HTC. Recently, I added Anquet‘s Outdoor Map Navigator (OMN) to my Nexus 7 through the Google Play store for a fee of £7 and that allows access to any walking maps that I have bought from Anquet. However, those are large downloads so the caps start to come into play. Frugality would help but I began to look at other possibilities that make use of a laptop’s Wi-Fi functionality.

Looking on the web, I found two options that work on Windows 7 (8 should be OK too): Connectify Hotspot and Virtual Router Manager. The first of these is commercial software, though there is a Lite edition for those wanting to try it out; I cannot confirm that this is not a time-limited demo, but it did not seem to be, since it looked as if it merely lacked features that you would get by paying for the Pro variant. The second option is open source and free of charge, apart from an invitation to donate to the project.

Though online tutorials show the usage of either of these to be straightforward, my experiences were not all that positive at the outset. In fact, there was something extra that I needed to do, and that is why this post exists at all. The problem appeared even after the restart that Connectify Hotspot needed as part of its installation; it runs as a system service, which is why the restart was required. As it turned out, it was Virtual Router Manager that told me what the issue was, and it needed no reboot. Neither did it disconnect the laptop from the network as the Connectify offering did to me, which was the cause of the latter’s ejection from that system; limitations in favour of its paid edition aside, it may have the snazzier interface, but I’ll take effective simplicity any day.

Using Virtual Router Manager turns out to be simple enough. It needs a network name (also known as an SSID), a password to restrict who accesses the network and the internet connection to be shared. In my case, that was Local Area Connection on the drop-down list. With all the required information entered, I was ready to start the router using the Start Network Router button. The text on this changes to Stop Network Router when the hub is operational, or at least it should have done for me the first time that I ran it. What I got instead was the following message:

The group or resource is not in the correct state to perform the requested operation.

The above may not say all that much, but it becomes more than ample information if you enter it into the likes of Google. Behind the scenes, Virtual Router Manager uses native Windows functionality to create a Wi-Fi hub from a PC, apparently by way of the Microsoft Virtual WiFi Miniport Adapter from what I have seen. When I tried setting up an ad hoc Wi-Fi network from the laptop to the Nexus 7 using Windows’ own network set-up capability in its Control Panel, it didn’t do what I needed, so there clearly is something that third party software adds. The interesting thing about the solution to my Virtual Router Manager problem was that it needed me to delve into the innards of Windows a little.

Firstly, there’s running Command Prompt (All Programs > Accessories) from the Start Menu with Administrator privileges. It helps here if the account with which you log into Windows is in the Administrators group, since all you have to do then is right-click on the Start Menu entry and choose Run as administrator from the pop-up context menu. With a command line window open, you then need to issue the following command, substituting your own network name and password:

netsh wlan set hostednetwork mode=allow ssid=[network name] key=[password] keyUsage=persistent

When that had done its thing, Virtual Router Manager worked without a hitch, though it did turn itself off after a while, and that may be no bad thing from the security standpoint. On the Android side, it was a matter of going into Settings > Wi-Fi and choosing the new network that had been created on the laptop. This sort of thing should apply to other types of tablet too (dare I mention iPads?), so you could connect almost anything to the hub without needing to do any more on the Windows side.

For those wanting to know what’s going on behind the scenes on Windows, there’s a useful tutorial on Instructables that shows what third party software is saving you from having to do. Even if I never go down the more DIY route, I probably have saved myself having to buy a mobile Wi-Fi hub for any trips to Éire. For now, the Irish 3G dongle that I already have should be enough.
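
For anyone wanting a flavour of that DIY route, the native commands are not complicated. The sketch below assumes that the hosted network already has been configured with the netsh command shown earlier and that internet connection sharing has been enabled for the relevant adapter in Control Panel; the first command starts the hub, the second reports its state together with the number of connected clients and the third shuts it down again:

netsh wlan start hostednetwork

netsh wlan show hostednetwork

netsh wlan stop hostednetwork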

Upgrading from Windows 7 to Windows 8 in a VMWare Virtual Machine

1st November 2012

Though my main home PC runs Linux Mint, I do like to have the facility to use Windows software from time to time, and virtualisation has allowed me to continue doing that. For a good while, it was a Windows 7 guest within a VirtualBox virtual machine and, before that, one running Windows XP fulfilled the same role. However, it did feel as if things were running more slowly in VirtualBox than once might have been the case, so I jumped ship to VMware Player. It may be proprietary and closed source, but it is free of charge and has been doing what was needed. A recent upgrade of the video driver on the host operating system allowed a better graphical environment to be enabled in the Windows 7 guest.

Instability

However, there were issues with stability and I lost the ability to flit from the VM window to the Linux desktop at will, with the system freezing on me and needing a reboot. Working in Windows 7 in full screen mode avoided this, but it did feel as if I was constrained to working on a Windows machine whenever I did so. The graphics performance was imperfect too, with screen refreshes being very blocky and some momentary scrambling whenever I opened the Start menu. Others would not have been as patient with that as I was, though there was the matter of an expensive Photoshop licence to be guarded too.

In hindsight, a bit of pruning could have helped. An example would have been driver housekeeping in the form of removing the VirtualBox Guest Additions, because they could have been conflicting with their VMware counterparts. For some reason, those thoughts never entered my mind at the time and I was pondering another, more expensive option instead.

Considering NAS & Windows/Linux Networking

That would have taken the form of setting aside a PC for running Windows 7 and having a NAS for sharing files between it and my Linux system. In fact, I did get to exploring what a four bay QNAP TS-412 would offer me and realised that you cannot put normal desktop hard drives into devices like that. For a while, it looked as if it would be a matter of getting drives bundled with the device or acquiring enterprise grade disks so as to maintain the required continuity of operation. The final edition of PC Plus highlighted another option though: the Western Digital Red range. These sit part way between the desktop and enterprise classifications and have been developed in association with NAS makers too.

Looking at the NAS option certainly became an education, but it has exited any sort of wish list that I have. After all, there is the cost of such a setup, and that is enough to get me asking if I really need such a thing. The purchase of a Netgear FS605 Ethernet switch would have helped incorporate it, but there has been no trouble finding alternative uses for that since it bumps up the number of networked devices that I can have, never a bad capability. As I was to find, there was a less expensive alternative that became sufficient for my needs.

In-situ Windows 8 Upgrade

Microsoft have been making available evaluation copies of Windows 8 Enterprise that last for 90 days before expiring. One of these is in my hands and has been running faultlessly in a VMware virtual machine for the past few weeks. That made me wonder if upgrading from Windows 7 to Windows 8 would help with my main Windows VM’s problems. Being a curious risk-taking type, I decided to answer the question for myself using the £24.99 Windows 8 Pro upgrade offer that Microsoft have been running for those not needing a disk up front; those who do pay £49.99, though you can order a disk afterwards for an extra £12.99 plus £3.49 postage if you wish, which works out slightly cheaper. There also was a time cost, in that the exercise occupied a lot of a weekend, but it seems to have done what was needed, so it was worth the outlay.

Given the element of risk, Photoshop was deactivated to be on the safe side. That wasn’t the only pre-upgrade action that was needed, because the Windows 8 Pro 32-bit upgrade needs at least 16 GB of free disk space before it will proceed. Of course, there was the matter of downloading the installer from the Microsoft website too. That took care of system evaluation and paying for the software, as well as the actual upgrade itself.

The installation took a few hours, with virtual machine reboots along the way. Naturally, the licence key was needed too, as well as the selection of a few options, though there weren’t many of these. Being able to carry over settings from the pre-existing Windows 7 instance certainly helped with this and made the process smoother too. No software needed reinstatement and it doesn’t feel as if the system has forgotten very much at all, a successful outcome.

Post-upgrade Actions

Just because I had a working Windows 8 instance didn’t mean that there wasn’t more to be done. In fact, the post-upgrade sorting took up more time than the actual installation. For one thing, my digital mapping software wouldn’t work without .NET Framework 3.5, and turning on that operating system feature from the Control Panel fell over at the point where it was being downloaded from the Microsoft Update website. Even removing Avira Internet Security after updating it to the latest version had no effect, something that I also found during the Windows 8 evaluation process. The solution was to mount the Windows 8 Enterprise ISO installation image that I had and issue the following command from a command prompt running with administrative privileges (it’s all one line, though it’s wrapped here):

dism.exe /online /enable-feature /featurename:NetFX3 /Source:d:\sources\sxs /LimitAccess

For the sake of assurance regarding compatibility, Avira has been replaced with Trend Micro Titanium Internet Security. The Avira licence won’t go to waste, since I have another home in mind for it. Removing Avira without crashing Windows 8 proved impossible though, necessitating booting Windows 8 into Safe Mode. Because of much faster startup times, that cannot be achieved with a key press at the appropriate moment; the time window is too short now. One solution is to tick the Safe boot box in the Boot tab of Msconfig (or System Configuration, as it otherwise calls itself) before the machine is restarted. There may be others, but this was the one that I used. With Avira removed, clearing the same setting and rebooting restored normal service.
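
An alternative that I did not try myself, but which should achieve the same end from an administrative command prompt, is to use bcdedit; the first command forces the next boot into Safe Mode and the second clears the setting again afterwards:

bcdedit /set {current} safeboot minimal

bcdedit /deletevalue {current} safeboot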

Dealing with a Dual Personality

One observer has stated that Windows 8 gives you two operating systems for the price of one: the one behind the Start screen and the one on the desktop. Having got to wanting to work with one at a time, I decided to make some adjustments. Adding Classic Shell got me back a Start menu, though I left out its Windows Explorer (or File Explorer, as it is known in Windows 8) and Internet Explorer components. Though Classic Shell will present a desktop like the one we have been getting from Windows 7 by sweeping the Start screen out of the way for you, I found that this wasn’t quick enough for my liking, so I added Skip Metro Suite, which seemed to do it a little faster. That tool does more than sweeping the Start screen out of the way, but I have switched off its other functions. Classic Shell also has been configured so that the Start screen can be accessed with a press of the Windows key, but you can have it as you wish. Skip Metro Suite has been updated too, so booting into the desktop should be faster now. As for me, I’ll leave things as they are for now; even the possibility of using Windows’ own functionality to go directly to the traditional desktop will be left untested while things are left to settle. Tinkering can need a break.

Outcome

After all that effort, I now have a seemingly more stable virtual machine running Windows 8. Flitting between it and other Linux desktop applications has not caused a system freeze so far, and that was the result that I wanted. There now is no need to consider having separate Windows and Linux PCs with a NAS for sharing files between them, so that option is well off my wish list. There are better uses for my money.

Not everyone has had my experience though, because I saw a report of one user who failed to update a physical machine to Windows 8 and installed Ubuntu instead; they were a Linux user anyway, even if they used Fedora more than Ubuntu. It is possible to roll back from Windows 8 to the previous version of Windows, because there is a windows.old directory left primarily for that purpose. However, that may not help you if you have a partially operating system that doesn’t allow you to do just that. In time, I’ll remove it, either by using the Disk Clean-up utility and asking it to remove previous Windows installations or by running File Explorer with administrator privileges. Somehow, the former approach sounds the safer.
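
For reference, the Disk Clean-up route means starting cleanmgr with administrator privileges, clicking Clean up system files and ticking Previous Windows installation(s). There is a command line alternative that I have seen suggested but have not tried myself; it needs care, because it takes ownership of the directory and then deletes it outright:

takeown /F C:\Windows.old /R /A /D Y

icacls C:\Windows.old /grant Administrators:F /T

rd /S /Q C:\Windows.old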

What About Installing Afresh?

While there was a time when I went solely for upgrades when moving from one version of Windows to the next, the annoyance of the process got to me. If I had known that installing the upgrade twice onto a computer with a clean disk would suffice, it would have saved me a lot. Starting from Windows 95 (from the days when you got a full installation disk with a PC and not the rescue media that we get now) and moving through a sequence of successors was not only time consuming, but also revealed the limitations of the first in the series when it came to supporting more recent hardware. It was enough to have me buying the full retail editions of Windows XP and Windows 7 when they were released; the latter got downloaded directly from Microsoft. These were retail versions that you could move from one computer to another, but Windows 8 will not be like that. In fact, you will need to get its System Builder edition from a reseller and that can only be used on one machine. It is the merging of the former retail and OEM product offerings.

What I have been reading is that the market for full retail versions of Windows was not a big one anyway. However, that was how I used to work, as you have read above, and it does give you a fresh system. Most people probably get Windows with a new PC and don’t go building machines from scratch as I have done for more than a decade. Maybe the System Builder version would apply to me anyway, and it appears to be intended for virtual machine use as well as for physical ones. More care will be needed with those licences by the looks of things, and I wonder what must not be changed so as not to invalidate one. After all, making a mistake might cost between £75 and £120, depending on the edition.

Final Thoughts

So far, Windows 8 is treating me well and I have managed to bend it to my will too, always a good thing to be able to say. In time, it might be that a System Builder copy needs buying yet, but I’ll leave well alone for now. Though I needed new security software, the upgrade still saved me money over a hardware solution to my home computing needs, and I have a backup disk on order from Microsoft too. That I have had to spend some time settling things was a means of learning new things for me, but others may not be so patient and, with Windows 7 working well enough for most, you have to ask if it’s only curious folk like me who are taking the plunge. Still, the dramatic change has re-energised the PC world in an era when smartphones and tablets have made so much of the running recently. That too is no bad thing, because an unchanging technology is one that dies, and there are times when big changes are needed, however much they upset some folk. For Microsoft, this looks like one of those times, and it will be interesting to see where things go from here for PC technology.

Google Reader

22nd April 2007

Going through the stats for my other blog, I noticed some activity from Google Reader and decided to investigate. What I discovered was a very capable feed reader, much better than Technorati’s equivalent. The interface feels a little like an email client, with a separate entry in the sidebar for each feed. It also gives you full text and pictures for each blog article that it picks up, though it messes with my hillwalking blog for some reason… As a feed aggregator, it performs very well and makes my blog surveying much more effortless. I know that Outlook 2007 has aggregation functionality too, but the portability of Google’s little online offering makes it worth taking further, especially as you wouldn’t need to pay for it anyway.

Update: Google Reader also allows you to share items from your feeds; you can find what I am sharing here.

Choices, choices…

10th November 2007

Choice is a very good thing, but too much of it can be confusing, and the world of Linux is one very full of decisions. The first of these centres around the distro to use when taking the plunge, and there can be quite a lot to it. In fact, it is a little like buying your first SLR/DSLR or your first car: you only really know what you are doing after your first one. Putting it another way, you only know how to get a house built after you have done it.

With that in mind, it is probably best to play a little on the fringes of the Linux world before committing yourself. It used to be that you had two main choices for your dabbling:

  • using a spare PC
  • dual booting with Windows by either partitioning a hard drive or dedicating one for your Linux needs.

In these times, innovations such as Live CD distributions and virtualisation technology keep you away from such measures. In fact, I would suggest starting with the former and progressing to the latter for more detailed perusal; it’s always easy to wipe and restore virtual machines anyway, and you can evaluate several distros at the same time if you have the hard drive space. It is also a great way to decide which desktop environment you like; otherwise, terms like KDE, GNOME, XFCE, etc. might not mean much.

The mention of desktop environments brings me to software choices because they do drive what software is available to you. For instance, the Outlook lookalike that is Evolution is more likely to appear where GNOME is installed than where you have KDE. The opposite applies to the music player Amarok. Nevertheless, you do find certain stalwarts making a regular appearance; Firefox, OpenOffice and the GIMP all fall into this category.

The nice thing about Linux is that distros more often than not contain all of the software that you are likely to need. However, that doesn’t mean that it’s all on the disk or that you have to select everything you need during the installation. There might have been a time when it felt like that, but my recent experience has been that a minimal installation is set in place that does all of the basics, and you easily can add the extras later on an as-needed basis. I have found that online updates are a strong feature too.

Picking up what you need when you need it has major advantages, the big one being that Linux grows with you. You can add items like Apache, PHP and MySQL when you know what they are and why you need them, as sketched below. It’s a long way from picking applications of which you know very little at installation time, with the suspicion that any future installation might land you in dependency hell while compiling application source code; the temptation to install everything that you saw was a strong one. The learn-before-you-use approach favoured by the way things are done nowadays is an excellent one.
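
To give a flavour of that on-demand approach, here is how adding a web development stack might look on a Debian-based distro like Ubuntu; the package names are the ones current at the time of writing and may differ between releases:

sudo apt-get install apache2 php5 libapache2-mod-php5 mysql-server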

Even if life is easier in the Linux camp these days, there is no harm in sketching out your software needs. Any distribution should be able to fulfil most, if not all, of them. As it happened, the only third party application that I have needed to install on Ubuntu without recourse to Synaptic was VMware Workstation, and that procedure thankfully turned out to be pretty painless.

HAL.DLL: a roadblock on the resurrection of a poorly PC

2nd August 2007

My PC is very poorly at the moment and Windows XP re-installation is the prescribed course of action. However, I have been getting errors reporting a missing or damaged HAL.DLL at the first reboot of the system during installation. I thought that there might be hard disk confusion, so I unplugged all but the Windows boot drive. That only gave me an error about hard drives not being set up properly. Thankfully, a quick outing on Google turned up a few ideas. I really should have started with Microsoft, since they have an article on the problem. About.com also has something to offer on the subject and seems to be a good resource on installing XP to boot: I had forgotten how to do a repair installation and couldn’t find the place in the installation menus. In any event, a complete refresh should be a good thing in the long run, even if it is going to be a very disruptive process. I did consider moving to Vista at this point, but getting XP back online seems the quickest route to getting things back together again. Strangely, I feel like a fish out of water right now, but that’ll soon change…

Update: It was in fact my boot.ini that was causing this and replacement of the existing contents with defaults resolved the problem…
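
For anyone hitting the same thing, a default boot.ini for a typical single-disk Windows XP installation looks something like what follows; the rdisk and partition values may need adjusting to match where Windows actually lives on your system:

[boot loader]
timeout=30
default=multi(0)disk(0)rdisk(0)partition(1)\WINDOWS
[operating systems]
multi(0)disk(0)rdisk(0)partition(1)\WINDOWS="Microsoft Windows XP Professional" /fastdetect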

A feast of plugins

6th July 2007

Themed Login

Here’s a useful idea: get your blog login page to look like it’s part of your blog. It works well on my hillwalking blog, but you do have to watch how it behaves with whatever theme you are using. Strangely, I couldn’t get it to work on my offline blog, the development mirror of what you see online. The ability to set what page is displayed after logging in or out is an especially useful inclusion.

My Category Order

My Link Order

My Page Order

These sound like really good ideas: being able to control the running order of things in your blog sidebar is a good thing. What scuppered my using them is that you need widgets turned on for the effect to work, and I have seen issues with how IDs get set when things are widgetised.

Trouble with my Canon CanoScan 5000F

26th June 2007

I have had my Canon CanoScan 5000F scanner for nearly four years now and, until yesterday, it performed faultlessly. However, it has now developed a fault that may hasten its replacement, and I have to say that my eye is on Epson’s Perfection V350 Photo. Looking on the web, I found scanners somewhat hidden away and the selection available wasn’t what I might have expected it to be. Maybe the digital photography revolution has made the humble scanner a less essential item. And the fault? Scan results feature an unacceptably strong magenta cast. In fact, the first scans result in nothing but pitch black, though allowing things to stay on for a while does improve matters. That suggests a hardware fault to me. I have raised the issue with Canon and will await their reply, even though the problem is stopping me from adding any new photos to my online photo gallery. If Canon comes back to me with the "uneconomical to repair" response, I will be ready to go out and buy the Epson. Time will tell with this one…

Update 1: A spot of further exploration has left me wondering if it is the lamp that’s on the way out. If that’s replaceable at a reasonable price, then the CanoScan might live on after a spot of repair.

Update 2: Canon’s advice included reinstalling the scanner driver and, surprisingly given the symptoms, that seems to have helped. I’ll continue to keep an eye on things but it looks like I’ll be hanging onto my money for now.

Perl vs. PHP: A Personal Experience

11th June 2007

Ever since I converted it from a client-side JavaScript-powered affair, my online photo gallery has been written in Perl. There have been some challenges along the way, figuring out how to use hash tables being one, but everything has worked as expected. However, I am now wondering if it would be better to write things in PHP for the sake of consistency with the rest of the website. I had a go at rewriting the random photo page and, unless I have been missing something in the Perl world, things do seem more succinct with PHP. For instance, actions that formerly involved several lines of code can now be achieved in one. Reading the contents of a file into an array and stripping HTML/XML tags from a string fall into this category, and seeing the number of lines of code halve is a striking observation. I am not going to abandon Perl completely, it’s a very nice language, but I do rather suspect that there is now an increased chance of my having a website whose server-side processing needs are served entirely by PHP.
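
To illustrate the sort of contrast that I mean, here is the file-reading example in both languages; these are sketches rather than complete scripts and the file name is only for demonstration:

# Perl: read a file into an array, one line per element
open(my $fh, '<', 'photos.txt') or die "Cannot open file: $!";
my @lines = <$fh>;
close($fh);

# PHP: the same thing in a single call
$lines = file('photos.txt');

Stripping tags shows a similar contrast, with PHP’s built-in strip_tags() taking the place of a hand-rolled Perl regular expression like s/<[^>]*>//g.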

Exploring AJAX

7th June 2007

When I first started it, my online photo gallery was simply a set of interlinked HTML pages. Over time, I discovered frames (yes, them!) and started to make use of JavaScript to make the slideshows slicker. In those days, I was working off free webspace provided by my ISP, and client-side scripting was the only tool that I had for enhancing functionality. Having tired of the vagaries of client-side scripting (the browser wars were in full swing and incompatibilities reigned supreme), I went with paid hosting in order to get access to tools like Perl and PHP for server-side processing; their flexibility compared to JavaScript was a breath of fresh air to me and I am still a fan of the server-side approach.

The journey that I have just described is one that I now know was followed by a lot of website builders around the same time. Nevertheless, I have still held onto JavaScript for some things, particularly for updating the DOM as part of making the pages more responsive to user interaction. In the last few years, a hybrid approach has been gaining currency: AJAX. This offers the ability to modify parts of a page without needing to reload the whole thing and that has generated a considerable amount of interest among web application developers.

The world of AJAX is evidently a complex one, though the underlying principle can be explained in simple terms. The essential idea is that you use JavaScript to call a server-side script (PHP is as good an example as any) that returns either text or XML which can be used to update part of a web page in situ, without the need to reload the whole page as per the traditional way of working. It has opened up so many possibilities from the interface design point of view that AJAX became a hot topic that still receives much attention today. One bugbear is efficiency, because I have seen an AJAX application lock up a PC with a little help from IE6. There will always remain times where server-side processing is the best route, and that needs to be balanced against the client-side and vice versa.
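
A minimal sketch of that principle might look like the following, with random.php standing in for whatever server-side script you care to use and photo being the id of the page element to be updated; both names are made up for the example and the XMLHttpRequest object is assumed to be available:

var xhr = new XMLHttpRequest();
xhr.onreadystatechange = function() {
    // readyState 4 means the response is complete; status 200 means success
    if (xhr.readyState == 4 && xhr.status == 200) {
        document.getElementById('photo').innerHTML = xhr.responseText;
    }
};
xhr.open('GET', 'random.php', true);
xhr.send(null);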

Like its forebear DHTML, AJAX is really a development approach that uses a number of different technologies in combination. The DHTML elements such as (X)HTML, CSS, the DOM and JavaScript are very much part of the AJAX world, but server-side elements such as HTTP, PHP, MySQL and XML are also very much part of the fabric of the landscape. In fact, while AJAX can use plain text as the transfer format, XML is the one implied by the AJAX acronym, and XSLT is used to transform XML into HTML. However, AJAX is not limited to the aforementioned technologies; for instance, I cannot see why Perl could not play a role in place of PHP, and ASP can be used for the same things.

Even in these standards-compliant days, browser support for AJAX remains diverse, to say the least, and it is akin to having MSIE in one corner and the rest in the other. Mind you, Microsoft did introduce the tools in the first place, but they used ActiveX, and Mozilla created a new object type rather than continuing with this method of operation. Given that ActiveX is a Windows-only technology, I can see why Mozilla did what they did, and it was a sensible decision. In fact, IE7 appears to have picked up the Mozilla way of doing things.
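
In practice, that split means testing for the two object types before making any requests; the XMLHttpRequest branch covers Mozilla, Safari and now IE7, while the ActiveX one caters for older versions of MSIE:

var xhr;
if (window.XMLHttpRequest) {
    // Mozilla, Safari and IE7
    xhr = new XMLHttpRequest();
} else if (window.ActiveXObject) {
    // IE6 and earlier
    xhr = new ActiveXObject('Microsoft.XMLHTTP');
}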

Even with the apparent convergence, there will continue to be a need for the AJAX JavaScript libraries that are currently out there; incidentally, Adobe has included one called Spry with Dreamweaver CS3. Nevertheless, I still like to find out how things work at the basic level and feel somewhat obstructed when I cannot do this. I remember perusing Wrox’s Professional AJAX and finding the constant references to the associated function library rather grating; the writing style didn’t help either.

Taking a more granular approach has got me reading SAMS Teach Yourself AJAX in 10 Minutes as a means of getting my foot in the door. As with their Teach Yourself … in 24 Hours series, the title is a little misleading, since there are 22 lessons of 10 minutes in duration (the 24 Hours moniker refers to there being 24 lessons, each an hour in length). Anything composed of 10 minute lessons, even 22 of them, is never going to be comprehensive but, as a means of getting started, I have to say that the approach seems effective on the basis of this volume. It has certainly whetted my appetite for giving AJAX a go, and it will be interesting to see how things progress from here.
