Technology Tales

Adventures & experiences in contemporary technology

Sorting a stalled Windows Update service

30th January 2015

Following a recent family death, I have ended up with the laptop belonging to the deceased and, since it has been offline for most of its life, I set about getting it updated. The McAfee security suite was straightforward enough, but trying Windows Update produced errors suggesting that it was not working and that a system restart was needed. Doing that did nothing, so a little further investigation was needed.

The solution turned out to be stopping the Windows Update service and clearing a certain folder before starting it again. To stop the service, I typed services.msc into the search box on the Start Menu and clicked on the Services entry that appeared. Then I sought out the Windows Update entry, selected it and clicked on the Stop link on the left-hand side. After that, I used Windows Explorer to navigate to C:\Windows\SoftwareDistribution and deleted everything in there. Then, I went back to the Services window and started Windows Update again. That sorted the problem and the system began to be updated as needed.
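
For anyone who prefers a command line, the same routine can be scripted. What follows is a minimal sketch rather than a transcript of what I did; it assumes a command prompt with administrative privileges and uses wuauserv, the service name behind the Windows Update entry:

net stop wuauserv
rem clear out the update download cache; Windows recreates the folder when needed
rd /s /q C:\Windows\SoftwareDistribution
net start wuauserv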

All of this was on Windows 7, hence the mention of the Start Menu, and the machine is a Toshiba Satellite C660 from 2011 with an AMD E-300 APU, 4 GB of RAM and a 320 GB hard drive. Those specs may not be the most impressive, but it feels sprightly enough and is far better than the lethargic Toshiba Equium A200-1VO that I acquired in 2008, though the HP Pavilion dm4 that I bought in November 2011 probably will travel more often than either of these, if truth be told. After all, it now has 8 GB of RAM and a 1 TB Samsung SSHD along with its Core i3 CPU, so it should last a while yet.

Faster IE7 phishing monitoring

28th October 2007

The standard phishing detection that comes with IE7 really does slow things down when it comes to navigating web pages. In contrast, the option offered as part of Norton 360 is much faster; so much so that you hardly notice that it's there at all. When I restored IE7 on my PC and ran it for the first time, Norton asked to become its default fraud detection tool and I was away. Norton 360 offers nothing for Firefox, my preferred and default option, but there may be plug-ins that address that need.

Best left until later in the year?

26th January 2010

In the middle of last year, my home computing experience was one of feeling displaced. A combination of a stupid accident and a power outage had rendered my main PC unusable. What followed was an enforced upgrade using a combination that was familiar to me: Gigabyte motherboard, AMD CPU and Crucial memory. However, assembling that lot and attaching components from the old system resulted in the sound of whirring fans but nothing appearing on-screen. Not having useful beeps to guide me meant that it was a case of undertaking educated guesswork until the motherboard was found to be at fault. In a situation like this, a deeper knowledge of electronics would have been handy and might have saved me money too. As for the motherboard, it is hard to say whether it was faulty from the outset or whether there was a mishap along the way, either due to ineptitude with static or incompatibility with a power supply. What really told the tale on the mainboard was the fact that all of the other components are working well in other circumstances, even that old power supply.

A few years back, I had another experience with a problematic motherboard, an Asus this time, that ate CPUs and damaged a hard drive before I stabilised things. That was another upgrade attempted in the first half of the year. My first round of PC building was in the third quarter of 1998 and that went smoothly once I realised that a new case was needed. Similarly, another PC rebuild around the same time of year in 2005 was equally painless. Based on these experiences, I should not be blamed for waiting until later in the year before doing another rebuild, preferably a planned one rather than an emergency.

Of course, there may be another factor involved too. The hint was a non-working Sony DVD writer that was acquired early last year when it really was obvious that we were in the middle of a downturn. Could older unsold inventory be a contributor? Well, it fits in with seeing poor results twice. In addition, it would certainly tally with a problematical PC rebuild in 2002 following the end of the Dot Com bubble and after the deadly Al Qaeda attack on New York's World Trade Centre. An IBM hard drive that was acquired then may not have been the best example of the bunch and the same comment could apply to the Asus motherboard. The resulting construction may have been limping, but it was working and I tolerated it.

In contrast, last year's episode had me launched into using a Toshiba laptop and a spare older PC for my needs, with an external hard drive enclosure used to extract my data onto other external hard drives to keep me going. It felt a precarious arrangement, but it was a useful experience in ways too. There was cause for making the acquaintance of nearby PC component stores that I hadn't visited before, and I got to learn about things that otherwise wouldn't have come my way. Using an external hard drive enclosure for accessing data on hard drives from a non-functioning PC is one of these. Discovering that it is possible to boot from external optical and hard disk drives came as a surprise too; that will work so long as there is motherboard support for it. Another experience came from a crisis of confidence that had me acquiring a bare-bones system from Novatech and populating it with optical and hard disk drives. Then, I discovered that I have no need for power supplies rated at more than 300 watts (around 200 W suffices). Turning my PC off more often became a habit friendly both to the planet and to household running costs too. Then, there's the beneficial practice of shopping locally: it can suffice even if what you find isn't what PC magazines stick on their hot lists, and shopping online for those pieces doesn't guarantee success either. All of these were useful lessons and, while I'd rather not throw away good money after bad, it goes to show that even unsuccessful acquisitions had something to offer in the form of learning opportunities. Whether you consider that worthwhile is up to you.

Pondering storage options

1st June 2011

The combination of curiosity and a little spare time had me browsing online computing technology stores recently. A spot of CD and DVD burning brought on by a flurry of Linux distribution testing reminded me of the possibility. Because I have built up a sizeable library of digital photos, ensuring that I have backups of them is something that needs doing. A 2 TB Samsung external hard drive is brought to life every now and again for that purpose, but the prospect of using Blu-Ray discs has appealed to me. After all, capacities of 25 GB for single-layer discs and 50 GB for dual-layer ones sound not inappropriate for my purposes. However, they aren't a cheap option at the time of writing, with each disc costing in the region of £3-4 at one place where I was looking. The cost of BD writers themselves seems not to be so bad though, with a few in the £60-100 bracket; any lower than this and you could end up with a combo drive that reads Blu-Ray discs but only writes to DVDs and CDs, so a modicum of concentration is needed. As attractive as the idea might be, the cost of BD media means that I'll wait a little while before deciding to take the plunge. The price premium at the moment is a reminder of the way that things used to be when CD and DVD writers first came on the market. It is very telling when discs come packaged in jewel cases, something that you won't see too often with CDs or DVDs.

Another piece of storage excitement that hasn't escaped me is the advent of SSDs. With no moving parts, unlike conventional hard drives, they bring a speed boost. Concerns about their lifetimes and the number of read/write events per drive would stall me when it comes to storing personal data on them, but using them for the likes of operating system files sounds attractive, especially with my partiality to Linux perhaps not hammering drives so much. As with any new technology, there is a price premium, though a drive big enough for hosting an operating system can be acquired for less than £100. As with many of my hardware purchase brainwaves, there's no rush, but this is an option that I'll keep at the back of my mind.

Another appealing notion is the idea of getting a NAS so that files can be shared between a few computers. While I have seen prices starting at just above £70 for single-disk enclosures, these generally are a more expensive option than external drives, and that's before you consider the cost of any hard drives. Nevertheless, a unit containing more than a single hard drive has its advantages, and many can operate as a print server for any compatible printer too. By the time you get to four or five hard drive trays, the cost has mounted, but that could be when they pay their way too. What reminded me of these was a bookazine on home networking that I recently found at a branch of WHSmith, and their attractions are subject to the networking side of things being made to work without a drama. Once that's out of the way, their usefulness really does appeal.

Mulling over all these brainwaves is one thing but it doesn’t mean that the purse strings will become too loose in this age of economic constraint. In fact, pondering them may serve to staunch any impulse purchases. Sometimes, a spot of virtual shopping serves to control things rather than losing the run of oneself.

JavaScript: write it yourself or use a library?

3rd July 2008

I must admit that I have never been a great fan of JavaScript. For one thing, its need to interact with browser objects places you at the mercy of the purveyors of such pieces of software. Debugging is another fine art that can seem opaque to the uninitiated, since the amount and quality of the logging is determined by an interpreter that isn't provided by the language's overseers. All in all, it seems to present a steep and obstacle-strewn learning curve to newcomers. As it happens, I have always found server-side scripting languages like PHP and Perl to be more to my taste and I have no aversion at all to writing SQL.

In the late 1990s, when I was still using free web hosting, JavaScript probably was the best option for my then new online photo gallery. Whatever the truth, it certainly was the way that I went. Learning Java or Flash might have been useful, but I never managed to devote sufficient time to the task, so JavaScript turned out to be the way forward until I got a taste of server-side scripting. Moving to paid hosting allowed that to develop and the JavaScript option took a back seat.

Based on my experience of the browser wars and working with JavaScript throughout their existence, I was more than a little surprised at the buzz surrounding AJAX. Ploughing part of the way through WROX's Beginning AJAX did nothing to sell the technology to me; it came across as a very dry, jargon-blighted read. Nevertheless, I do see the advantages of web applications being as responsive as their desktop equivalents, but AJAX doesn't always guarantee this; as someone who has seen such applications crawling on IE6, I can certainly vouch for that. In fact, I suspect that this may be behind the appearance of technologies such as AIR and Silverlight, so JavaScript may get usurped yet again, just like my move to a photo gallery powered on the server side.

Even with these concerns, using JavaScript to add a spot more interactivity is never a bad thing, even if it can be overdone, hence the speed problems that I have witnessed. In fact, I have been known to use DOM scripting, but I need to have a use in mind before I can experiment with a technology; I cannot do it the other way around. Nevertheless, I am keen to see what JavaScript libraries such as jQuery and Prototype might have to offer (both have been used in WordPress). I have happened upon their respective websites, so they might make good places to start, and who knows where my curiosity might take me?

Upgrading from Windows 7 to Windows 8 in a VMware Virtual Machine

1st November 2012

Though my main home PC runs Linux Mint, I do like to have the facility to use Windows software from time to time, and virtualisation has allowed me to continue doing that. For a good while, it was a Windows 7 guest within a VirtualBox virtual machine and, before that, one running Windows XP fulfilled the same role. However, it did feel as if things were running slower in VirtualBox than once might have been the case, so I jumped ship to VMware Player. It may be proprietary and closed-source, but it is free of charge and has been doing what was needed. A recent upgrade of the video driver on the host operating system allowed the enabling of a better graphical environment in the Windows 7 guest.

Instability

However, there were issues with stability and I lost the ability to flit from the VM window to the Linux desktop at will, with the system freezing on me and needing a reboot. Working in Windows 7 using full-screen mode avoided this, but it did feel as if I was constrained to working in a Windows machine whenever I did so. The graphics performance was imperfect too, with screen refreshes being very blocky and some momentary scrambling whenever I opened the Start menu. Others would not have been as patient with that as I was, though there was the matter of an expensive Photoshop licence to be guarded too.

In hindsight, a bit of pruning could have helped. An example would have been driver housekeeping in the form of removing the VirtualBox Guest Additions, because they could have been conflicting with their VMware counterparts. For some reason, those thoughts never entered my mind at the time and I was pondering another, more expensive option instead.

Considering NAS & Windows/Linux Networking

That would have taken the form of setting aside a PC for running Windows 7 and having a NAS for sharing files between it and my Linux system. In fact, I did get to exploring what a four-bay QNAP TS-412 would offer me and realised that you cannot put normal desktop hard drives into devices like that. For a while, it looked as if it would be a matter of getting drives bundled with the device or acquiring enterprise-grade disks so as to maintain the required continuity of operation. The final edition of PC Plus highlighted another option though: the Western Digital Red range. These sit partway between the desktop and enterprise classifications and have been developed in association with NAS makers too.

Looking at the NAS option certainly became an education, but it has exited any sort of wish list that I have. After all, there is the cost of such a setup and it's enough to get me asking if I really need such a thing. The purchase of a Netgear FS605 Ethernet switch would have helped incorporate it, but there has been no trouble finding alternative uses for that since it bumps up the number of networked devices that I can have, never a bad capability. As I was to find, there was a less expensive alternative that became sufficient for my needs.

In-situ Windows 8 Upgrade

Microsoft have been making available evaluation copies of Windows 8 Enterprise that last for 90 days before expiring. One in my hands has been running faultlessly in a VMware virtual machine for the past few weeks. That made me wonder if upgrading from Windows 7 to Windows 8 would help with my main Windows VM problems. Being a curious risk-taking type, I decided to answer the question for myself using the £24.99 Windows 8 Pro upgrade offer that Microsoft have been running for those not needing a disk up front; those who do need to pay £49.99, though you can order a disk afterwards for an extra £12.99 plus £3.49 postage if you wish, a slightly cheaper option. There also was a time cost in that the exercise occupied a lot of a weekend, but it seems to have done what was needed, so it was worth the outlay.

Given the element of risk, Photoshop was deactivated to be on the safe side. That wasn't the only pre-upgrade action that was needed, because the Windows 8 Pro 32-bit upgrade needs at least 16 GB of free disk space before it will proceed. Of course, there was the matter of downloading the installer from the Microsoft website too. That took care of system evaluation and paying for the software, as well as the actual upgrade itself.

The installation took a few hours, with virtual machine reboots along the way. Naturally, the licence key was needed too, as well as the selection of a few options, though there weren't many of these. Being able to carry over settings from the pre-existing Windows 7 instance certainly helped with this and with making the process smoother too. No software needed reinstatement and it doesn't feel as if the system has forgotten very much at all, a successful outcome.

Post-upgrade Actions

Just because I had a working Windows 8 instance didn't mean that there wasn't more to be done. In fact, it was the post-upgrade sorting that took up more time than the actual installation. For one thing, my digital mapping software wouldn't work without .NET Framework 3.5, and turning on the operating system feature from the Control Panel fell over at the point where it was being downloaded from the Microsoft Update website. Even removing Avira Internet Security after updating it to the latest version had no effect; the same problem had surfaced during the Windows 8 system evaluation process too. The solution was to mount the Windows 8 Enterprise ISO installation image that I had and issue the following command from a command prompt running with administrative privileges (it's all one line though it may be wrapped here):

dism.exe /online /enable-feature /featurename:NetFX3 /Source:d:\sources\sxs /LimitAccess

For the sake of assurance regarding compatibility, Avira has been replaced with Trend Micro Titanium Internet Security. The Avira licence won't go to waste since I have another home in mind for it. Removing Avira without crashing Windows 8 proved impossible though, necessitating booting Windows 8 into Safe Mode. Because of much faster startup times, that cannot be achieved with a key press at the appropriate moment; the time window is too short now. One solution is to set the Safe Boot tickbox in the Boot tab of Msconfig (or System Configuration as it otherwise calls itself) before the machine is restarted. There may be others, but this was the one that I used. With Avira removed, clearing the same setting and rebooting restored normal service.
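
Another route that should achieve the same end is the bcdedit command, though I did not use it myself, so treat this as a hedged sketch; it assumes a command prompt running with administrative privileges:

bcdedit /set {current} safeboot minimal
rem once the offending software is removed, delete the flag so the machine boots normally again
bcdedit /deletevalue {current} safeboot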

Dealing with a Dual Personality

One observer has stated that Windows 8 gives you two operating systems for the price of one: the one on the Start screen and the one on the desktop. Having got to wanting to work with one at a time, I decided to make some adjustments. Adding Classic Shell got me back a Start menu, and I left out its Windows Explorer (or File Explorer as it is known in Windows 8) and Internet Explorer components. Though Classic Shell will present a desktop like the one we have been getting from Windows 7 by sweeping the Start screen out of the way for you, I found that this wasn't quick enough for my liking, so I added Skip Metro Suite, which seemed to do it a little faster. That tool does more than sweeping the Start screen out of the way, but I have switched off the extra functions. Classic Shell also has been configured so the Start screen can be accessed with a press of the Windows key, but you can have it as you wish. It has been updated too, so booting into the desktop should be faster now. As for me, I'll leave things as they are for now. Even the possibility of using Windows' own functionality to go directly to the traditional desktop will be left untested while things are left to settle. Tinkering can need a break.

Outcome

After all that effort, I now have a seemingly more stable virtual machine running Windows 8. Flitting between it and other Linux desktop applications has not caused a system freeze so far, and that was the result that I wanted. There now is no need to consider having separate Windows and Linux PCs with a NAS for sharing files between them, so that option is well off my wish list. There are better uses for my money.

Not everyone has had my experience though, because I saw a report of one user who failed to upgrade a physical machine to Windows 8 and installed Ubuntu instead; they were a Linux user anyway, even if they used Fedora more than Ubuntu. It is possible to roll back from Windows 8 to the previous version of Windows because there is a windows.old directory left primarily for that purpose. However, that may not help you if you have a partially operating system that doesn't allow you to do just that. In time, I'll remove it using the Disk Clean-up utility, by asking it to remove previous Windows installations, or by running File Explorer with administrator privileges. Somehow, the former approach sounds the safer.

What About Installing Afresh?

While there was a time when I went solely for upgrades when moving from one version of Windows to the next, the annoyance of the process got to me. If I had known that installing the upgrade twice onto a computer with a clean disk would suffice, it would have saved me a lot. Starting from Windows 95 (from the days when you got a full installation disk with a PC and not the rescue media that we get now) and moving through a sequence of successors not only was time-consuming, but it also revealed the limitations of the first in the series when it came to supporting more recent hardware. It was enough to have me buying the full retail editions of Windows XP and Windows 7 when they were released; the latter got downloaded directly from Microsoft. These were retail versions that you could move from one computer to another, but Windows 8 will not be like that. In fact, you will need to get its System Builder edition from a reseller and that can only be used on one machine. It is the merging of the former retail and OEM product offerings.

What I have been reading is that the market for full retail versions of Windows was not a big one anyway. However, that was how I used to work, as you have read above, and it does give you a fresh system. Most people probably get Windows with a new PC and don't go building machines from scratch like I have done for more than a decade. Maybe the System Builder version would apply to me anyway; it appears to be intended for virtual machine use as well as for physical ones. More care will be needed with those licences by the looks of things, and I wonder what must not be changed so as not to invalidate a licence. After all, making a mistake might cost between £75 and £120 depending on the edition.

Final Thoughts

So far, Windows 8 is treating me well and I have managed to bend it to my will too, always a good thing to be able to say. In time, it might be that a System Builder copy needs buying yet, but I'll leave well alone for now. Though I needed new security software, the upgrade still saved me money over a hardware solution to my home computing needs, and I have a backup disk on order from Microsoft too. That I have had to spend some time settling things was a means of learning new things for me, but others may not be so patient and, with Windows 7 working well enough for most, you have to ask if it's only curious folk like me who are taking the plunge. Still, the dramatic change has re-energised the PC world in an era when smartphones and tablets have made so much of the running recently. That too is no bad thing, because an unchanging technology is one that dies and there are times when big changes are needed, as much as they upset some folk. For Microsoft, this looks like one of them and it'll be interesting to see where things go from here for PC technology.

Databases & Programming

29th September 2012

The world of UNIX appears to attract those interested in the more technical aspects of computing and, since Linux is cut from the same lineage, it is apt to include lists of computing languages here. Both scripting and programming languages appear despite the title, itself shortened for the sake of brevity. Since much code cutting involves working with databases, those appear here too.

In time, I plan to correct the imbalance between programming and scripting languages that currently exists. The original list was bare, so descriptions have been added; they will be needed all the more should there be any expansion of what you find here.

Programming and Scripting Languages

Apache Groovy

My first encounter with an implementation of this language was with that belonging to a statistical computing environment (SCE) and that remains an ongoing dalliance. It is easy to think of Groovy as a way of working with a Java-based API using a scripting language and it certainly feels like that. Saying that, it all works better if you know Java, though you do have to watch for the development of domain-specific language capability. That last comment probably applies to the aforementioned SCE in that it has its own object and method hierarchy that means that not all standard Groovy functionality is available.

Clojure

Clojure is a dynamic, functional programming language that runs on the Java Virtual Machine (JVM) and is designed for building robust and scalable software applications. It is characterised by its emphasis on immutability, persistent data structures, and seamless interoperability with Java. Clojure embraces the Lisp programming language’s principles, providing a concise syntax and powerful abstractions for managing state, concurrency, and functional programming paradigms. With its focus on simplicity, expressiveness, and the ability to leverage the vast Java ecosystem, Clojure enables developers to create efficient and maintainable code for a wide range of applications.

Erlang

This is a programming language designed for building highly concurrent, fault-tolerant, and scalable systems that was developed by Ericsson in the late 1980s for telecommunication systems, where reliability and performance are critical. Erlang incorporates features such as lightweight processes, message passing, and built-in support for fault tolerance, making it well-suited for developing distributed and real-time applications. Its unique concurrency model and emphasis on fault tolerance have led to its widespread use in industries such as telecommunications, banking, gaming, and web development, where systems need to handle high loads, be resilient to failures, and provide real-time responsiveness.

Elixir

Inspired by Erlang, Elixir is a functional, concurrent programming language designed for building scalable and fault-tolerant applications. It leverages the powerful concurrency model of the Erlang Virtual Machine (BEAM) while providing a more accessible and expressive syntax. It offers features such as lightweight processes, message passing, pattern matching, and a robust ecosystem of libraries and frameworks. With its focus on reliability, performance, and ease of development, Elixir is well-suited for developing highly concurrent and distributed systems, making it a popular choice for building web applications, real-time systems, and software that requires high availability.

Go

Computing languages often get strange names, like single letters or small words such as this one; that means that you need to look for "Golang" in any online search. In any case, Go originated at Google and numbered among its inventors is one of the creators of the C programming language. The intent here is massively multithreaded system programming using stand-alone executable components while retaining or enhancing code readability. Another facet is the ability to function efficiently in distributed computing environments like those at SoundCloud or Uber. A variety of tools have been written using the language, including the ever-pervasive Docker and Kubernetes.

Julia

It remains an odd decision to give a computing language a girl’s name, but the purpose is a serious one. Often, there is a trade-off between speed of code writing and speed of execution with the result being that data programming involves prototyping in one language and porting to another for production usage. The first group includes R and Python while the second includes C, C++, FORTRAN and even Java, so there is an element of translation involved that often means that different people are involved, which adds an element of error caused by misunderstandings. This gets described as the two language problem and Julia’s major raison d’être is the avoidance of that: its top-line description is that it is as quick to program as Python but runs as fast as C because of its just-in-time compilation, multiple dispatch and in-built multithreading. This also allows for extensive capabilities for scientific computing that go beyond machine learning and an example comes in the number of differential equation solvers that are available. It also helps that meta-programming makes everything more generalisable.

Perl

It has been around since the 1980s and still pervades, though it is not as dominant as it once was for creating dynamic websites or for system administration. PHP has taken on much of the former, while Python is making inroads into the latter. Still, no list would be complete without a mention of the once ubiquitous scripting language, and it once powered my online photo gallery. It may be an easier language, and there is plenty of documentation on the web, with Perldoc, Perl Maven and Perlmeister being some good places to look; Dan Massey has some interesting articles on his site too. Not only that, but it is extensible too, with plenty of extra modules to be found on CPAN.

PHP

This usurper has taken the place of Perl in powering many of the world's websites. That the language is less verbose probably helps its case, and many, if not most, CMS packages make use of its versatility.

Python

It may be Google's preferred scripting language for system administration, but it is its usefulness for Data Science that has really made it shine in the eyes of many. There are numerous packages for data wrangling, data visualisation and machine learning that make the language ever-present in any Data Scientist's toolbox, and looking in the PyPI archive will allow you to find what you need. It has its place in web scripting too, even if it is not as pervasive as PHP, though CMSs like Plone run on Python and there is the Django framework together with the Gunicorn web server.

OpenJDK

One of the acts of Jonathan Schwartz while he was head of Sun Microsystems was to make Java open source after more than a decade of its being largely proprietary, and this is the website for the project. Of course, his more notable act at Sun was to sell the company to Oracle, but that's another story altogether…

R

This is an open-source implementation of the S language that is much appreciated by statisticians and is much used in the teaching of the subject. The base language only has so much functionality, but there are many packages available that extend it, with plenty to find on repositories like CRAN; others can be found in various GitHub repositories, though these tend to be more experimental in nature. There are commonly used and well-supported mainstays that everyone uses, but there always is a need to verify that a particular package does what it claims to do. Given that, there are possibilities for data wrangling, data tabulation, data visualisation and data science. While quick to code, R is slow to execute compared with others; I have found that Python is faster, but R still has a use for smaller data sets. Both keep their temporary data sets in system memory, so that helps.

Rust

It came as a surprise that this Mozilla-originated language is gaining traction in scientific data analysis, possibly because it is a fast multithreaded counterpart to C and C++ with some added safety features (though these can be turned off if needed and extra care is taken). The downsizing of Mozilla led to a sharp reduction in its team of Rust developers, and the Rust Foundation has been set up to oversee the language instead. There are online books like The Rust Programming Language and the Rust Cookbook, with the first of these also having paper and e-book counterparts from No Starch Press. For those interested in a more interactive introduction, there also is the Tour of Rust.

Databases

MariaDB

This essentially is a fork of MySQL (see below) now that Oracle owns it. The originators of MySQL are the creators of MariaDB, so their claims of its being a drop-in replacement may have some traction. So far, I have seen no exodus from MySQL, though.

MySQL

After passing through the hands of a number of owners until it incongruously came into the custodianship of Oracle (who of course already had, and still have, a database system of their own), the database system that powers many dynamic websites remains almost a de facto standard and looks set to stay that way for now.

MongoDB

This may be a document-based database rather than a relational one as many of us understand them, but it still is being touted as an alternative to the more mainstream competition. Database technology isn't just about SQL, and MongoDB champions a NoSQL approach; it sounds as if the emergence of XML might be what's facilitating the NoSQL database technologies.

PostgreSQL

This project may have more open-source credibility than MySQL, but it seems to remain in its shadow, though that may be explained by its being a more complex piece of software to use (at least, that has been my experience, anyway). It so happens that this is what Debian installs if you specify the web server option at operating system installation time.

A collection of lessons learnt about web hosting

28th March 2008

Putting this blog back on its feet after a spot of web hosting bother caused me to learn a bit more about web hosting than I otherwise might have done. Here's a selection of lessons, in no particular order:

  1. Store your passwords securely and where you can find them, because you never know how a foul-up of your own making can strike. For example, a faux pas with a configuration file is all that's needed to cause havoc for a database-driven site such as a WordPress blog. After all, nobody's perfect and your hosting provider may not get you out of trouble as quickly as you might like.
  2. Get a MySQL database or equivalent as part of your package rather than buying one separately. If your provider allows a trial period, then changing from one package to another could be cheaper and easier than if you bought a separate database and needed to jettison it because you changed from, say, a Windows package to a Linux one or vice versa.
  3. It might be an idea to avoid a reseller unless the service being offered is something special. Going for the sake of lower cost can be a false economy and it might be better to cut out the middleman altogether and go direct to their provider. Being able to distinguish a reseller from a real web host would be nice but I don’t see that ever becoming a reality; it is hardly in resellers’ interests, after all.
  4. Should you stick with a provider that takes several days to resolve a serious outage? The previous host of this blog had a major MySQL server outage that lasted for up to three days and seeing that was one of the factors that made me turn tail to go to a more trusted provider that I have used for a number of years. The smoothness of the account creation process might be another point worthy of consideration.
  5. Sluggish system support really can frustrate, especially if there is no telephone support provided and the online ticketing system seems to take forever to deliver solutions. I would advise strongly that a host who offers a helpline is a much better option than one who doesn't. Saying all of that, I think that it's best to be patient and, when your website is offline, that might not be as easy as you'd hope it to be.
  6. Setting up hosting or changing from one provider to another can take a number of days because of all that needs doing. So, it’s best to allow for this and plan ahead. Account creation can be very quick but setting up the website can take time while domain name transfer can take up to 24 hours.
  7. It might not take the same amount of time to set up Windows hosting as its Linux equivalent. I don’t know if my experience was typical but I have found that the same provider set up Linux hosting far quicker (within 30 minutes) than it did for a Windows-based package (several hours).
  8. Be careful what package you select; it can be easy to pick the wrong one depending on how your host's site is laid out and what they are promoting at the time.
  9. You can have a Perl/PHP/MySQL site working on Windows, even with IIS being used in place of Apache. The Linux/Apache/Perl/PHP/MySQL approach might still be better, though.
  10. The Windows option allows for ASP, .NET and other such Microsoft technologies to be used. I have to say that my experience and preference is for open-source technologies, so Linux is my mainstay, but learning about the other side can never hurt from a career point of view. After all, I am writing this on a Windows Vista powered laptop to see how the other half lives as much as anything else.
  11. Domains serviced by hosting resellers can be visible to the systems of those from whom they buy their wholesale hosting. This frustrated my initial attempts to move this blog over because I couldn’t get an account set up for technologytales.com because a reseller had it already on the same system. It was only when I got the reseller to delete the account with them that things began to run more smoothly.
  12. If things are not going as you would like them, getting your account deleted might be easier than you think so don’t procrastinate because you think it a hard thing to do. Of course, it goes without saying that you should back things up beforehand.

Work locally, update remotely

4th December 2008

Here's a trick that might have its uses: using a local WordPress instance to update your online blog (yes, there are plenty of applications that promise to edit your online blog, but these need file permissions for the likes of xmlrpc.php to be opened up). Along with the right database access credentials and the ability to connect to the database remotely, adding the following two lines to wp-config.php does the trick:

define('WP_SITEURL', 'http://localhost/blog');

define('WP_HOME', 'http://localhost/blog');

These two constants override what is in the database and allow you to update the online database from your own PC using WordPress running on a local web server (Apache or otherwise). One thing to remember here is that the online and offline directory structures need to match. For example, if your online WordPress files are in blog in the root of the online web server file system (typically htdocs for Linux), then they need to be contained in the same directory in the root of the offline server too. Otherwise, things could get confusing and perhaps messy. Another thing to consider is that you are modifying your online blog, so the usual rules about care and attention apply, particularly with respect to using the same version of WordPress both locally and remotely. This is especially a concern if you, like me, run development versions of WordPress to see if there are any upheavals ahead of us, like the overhaul that is coming in with WordPress 2.7.

An alternative use of this same trick is to keep a local copy of your online database in case of any problems while using a local WordPress instance to work with it. I used to have to edit the database backup directly (on my main Ubuntu system), first with GEdit but then using a sed command like the following:

sed -e 's/www\.onlinewebsite\.com/localhost/g' backup.sql > backup_l.sql

The -e switch supplies the regular expression substitution that follows it, editing the input with the output being directed to a new file; the quotes stop the shell from stripping the backslashes before sed sees them. It's slicker than the interactive GEdit route, but it has been made redundant by defining constants for a local WordPress installation as described above.
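
For completeness, the backup file in that example can be produced with mysqldump; this is a minimal sketch with placeholder host, user and database names rather than a record of any real setup:

mysqldump -h www.onlinewebsite.com -u username -p databasename > backup.sql

The -p switch prompts for the password, which keeps it out of the shell history.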

Transferring data between SAS and R

5th June 2008

A question regarding the ability to transfer data between SAS and R set me off on a spot of investigation a while back, and I have always planned to share the results of my labours. Once I managed to locate the required documentation, things became clearer with further inspection. Functions from the foreign package seem to offer the most from the data import and export point of view, so they're what I'll be featuring in this posting.

I'll start with importing, and using the read.ssd function makes life so much easier when getting SAS data into R. I discovered that the foreign package may not be loaded by default, but you can determine this easily by issuing the following command:

search()

If “package:foreign” isn’t in the list, then you need to issue the following function call:

library(foreign)

Of course, if the foreign package isn't installed, none of this will work. It should live in the library sub-folder of the main R installation directory but, if it isn't there, then downloading the relevant binary package from CRAN is in order.
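
That download can be done from within R itself with the usual one-liner:

install.packages("foreign")

Assuming that all is installed, then a command like the following will perform the needful: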

data1 <- read.ssd("c:/data", "data1", sascmd="C:/Program Files/SAS Institute/SAS/V8/sas.exe")

This creates a temporary SAS program that converts the SAS data set into a transport file for reading by another R function that is called in the background, read.xport. From my experience, it all seems to work fairly seamlessly.

To get data out of R and into SAS is a multi-stage process, even with the foreign package. There are other ways, but using write.foreign seems more useful than most and here's an example function call:

write.foreign(data1,"C:/test.txt","C:/test.sas",package="SAS",dataname="data1",validvarname="V7")

No SAS data sets are created at this stage, but a text file is generated along with a SAS program for converting it into a data set. Running the SAS program is a separate step that follows the creation of the two files. Even if it is less streamlined than read.ssd, write.foreign does make it easier to transfer data into SAS than having to write a program from scratch to read in write.table output.
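
That final run can be done interactively or in batch mode. Here is a hedged sketch of the batch route on Windows, reusing the executable path from the read.ssd example with file locations that are only illustrative:

"C:\Program Files\SAS Institute\SAS\V8\sas.exe" -sysin C:\test.sas -log C:\test.log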

In summary, R can neither read nor write SAS data sets by itself, so you need SAS installed to really make things happen. SAS gets called by read.ssd and I feel that it would be better if it were called by write.foreign too, rather than a SAS program being generated for execution later on. Even so, it is good to see some custom functionality being provided that makes life easier. There's also the Hmisc package, but my experiences while working with that on S-Plus have been such that it compares less favourably with foreign on the reliability front. Saying that, things may have changed since I last tried it.
