Technology Tales

Adventures & experiences in contemporary technology

A collection of lessons learnt about web hosting

28th March 2008

Putting this blog back on its feet after a spot of web hosting bother caused me to learn a bit more about web hosting than I otherwise might have done. Here’s a selection of those lessons, in no particular order:

  1. Store your passwords securely and where you can find them, because you never know when a foul-up of your own making might strike. For example, a faux pas with a configuration file is all that’s needed to cause havoc for a database-driven site such as a WordPress blog. After all, nobody’s perfect and your hosting provider may not get you out of trouble as quickly as you might like.
  2. Get a MySQL database or equivalent as part of your package rather than buying one separately. If your provider allows a trial period, then changing from one package to another could be cheaper and easier than if you bought a separate database and needed to jettison it because you changed from, say, a Windows package to a Linux one or vice versa.
  3. It might be an idea to avoid a reseller unless the service being offered is something special. Going with one for the sake of lower cost can be a false economy, and it might be better to cut out the middleman altogether and go direct to their provider. Being able to distinguish a reseller from a real web host would be nice, but I don’t see that ever becoming a reality; it is hardly in resellers’ interests, after all.
  4. Should you stick with a provider that takes several days to resolve a serious outage? The previous host of this blog had a major MySQL server outage that lasted up to three days, and that was one of the factors that made me turn tail and go to a more trusted provider that I have used for a number of years. The smoothness of the account creation process might be another point worthy of consideration.
  5. Sluggish support really can frustrate, especially if there is no telephone support provided and the online ticketing system seems to take forever to deliver solutions. I would advise strongly that a host who offers a helpline is a much better option than one who doesn’t. Saying all of that, I think that it’s best to be patient and, when your website is offline, that might not be as easy as you’d hope it to be.
  6. Setting up hosting or changing from one provider to another can take a number of days because of all that needs doing. So, it’s best to allow for this and plan ahead. Account creation can be very quick but setting up the website can take time while domain name transfer can take up to 24 hours.
  7. It might not take the same amount of time to set up Windows hosting as its Linux equivalent. I don’t know if my experience was typical but I have found that the same provider set up Linux hosting far quicker (within 30 minutes) than it did for a Windows-based package (several hours).
  8. Be careful what package you select; it can be easy to pick the wrong one depending on how your host’s site is laid out and what they are promoting at the time.
  9. You can have a Perl/PHP/MySQL site working on Windows, even with IIS used in place of Apache. The Linux/Apache/Perl/PHP/MySQL approach might still be better, though.
  10. The Windows option allows for ASP, .Net and other such Microsoft technologies to be used. I have to say that my experience and preference is for open-source technologies, so Linux is my mainstay, but learning about the other side can never hurt from a career point of view. After all, I am writing this on a Windows Vista-powered laptop to see how the other half lives as much as anything else.
  11. Domains serviced by hosting resellers can be visible to the systems of those from whom they buy their wholesale hosting. This frustrated my initial attempts to move this blog over: I couldn’t get an account set up for technologytales.com because a reseller already had it on the same system. It was only when I got the reseller to delete the account with them that things began to run more smoothly.
  12. If things are not going as you would like, getting your account deleted might be easier than you think, so don’t procrastinate because you believe it to be a hard thing to do. Of course, it goes without saying that you should back things up beforehand (see the sketch after this list).
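
Since several of the points above come back to having a backup to hand, here is a minimal sketch (in Python, only because it strings the commands together neatly) of how a WordPress-style site might be backed up from the command line before any account changes. The database, user and directory names are placeholders rather than anything from a real setup, and mysqldump and tar are assumed to be available.

```python
# Illustrative backup sketch: dump the site's MySQL database with mysqldump
# and archive the web files with tar. Every name below is a placeholder.
import subprocess
from datetime import date

DB_NAME = "wordpress"                 # hypothetical database name
DB_USER = "wp_user"                   # hypothetical database user
SITE_DIR = "/home/user/public_html"   # hypothetical web root
stamp = date.today().isoformat()

# Dump the database; mysqldump prompts for the password rather than
# having it hard-coded here.
with open(f"{DB_NAME}-{stamp}.sql", "w") as dump:
    subprocess.run(["mysqldump", "-u", DB_USER, "-p", DB_NAME],
                   stdout=dump, check=True)

# Archive the site files alongside the database dump.
subprocess.run(["tar", "-czf", f"site-files-{stamp}.tar.gz", SITE_DIR],
               check=True)
```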

Databases & Programming

29th September 2012

The world of UNIX appears to attract those interested in the more technical aspects of computing. Since Linux is cut from the same lineage, it is apt to include lists of computing languages here. Both scripting and programming languages appear despite the title, which was itself shortened for the sake of brevity. Since much code cutting involves working with databases, these appear here too.

In time, I plan to correct the imbalance between programming and scripting languages that currently exists. The original list was bare, so descriptions have been added, and they will become all the more necessary should what you find here expand.

Programming and Scripting Languages

Apache Groovy

My first encounter with an implementation of this language was with the one belonging to a statistical computing environment (SCE), and that remains an ongoing dalliance. It is easy to think of Groovy as a way of working with a Java-based API using a scripting language, and it certainly feels like that. Saying that, it all works better if you know Java, though you do have to watch for the development of domain-specific language capability. That last comment probably applies to the aforementioned SCE in that it has its own object and method hierarchy, which means that not all standard Groovy functionality is available.

Clojure

Clojure is a dynamic, functional programming language that runs on the Java Virtual Machine (JVM) and is designed for building robust and scalable software applications. It is characterised by its emphasis on immutability, persistent data structures, and seamless interoperability with Java. Clojure embraces the Lisp programming language’s principles, providing a concise syntax and powerful abstractions for managing state, concurrency, and functional programming paradigms. With its focus on simplicity, expressiveness, and the ability to leverage the vast Java ecosystem, Clojure enables developers to create efficient and maintainable code for a wide range of applications.

Erlang

This is a programming language designed for building highly concurrent, fault-tolerant, and scalable systems that was developed by Ericsson in the late 1980s for telecommunication systems, where reliability and performance are critical. Erlang incorporates features such as lightweight processes, message passing, and built-in support for fault tolerance, making it well-suited for developing distributed and real-time applications. Its unique concurrency model and emphasis on fault tolerance have led to its widespread use in industries such as telecommunications, banking, gaming, and web development, where systems need to handle high loads, be resilient to failures, and provide real-time responsiveness.

Elixir

Inspired by Erlang, Elixir is a functional, concurrent programming language designed for building scalable and fault-tolerant applications. It leverages the powerful concurrency model of the Erlang Virtual Machine (BEAM) while providing a more accessible and expressive syntax. It offers features such as lightweight processes, message passing, pattern matching, and a robust ecosystem of libraries and frameworks. With its focus on reliability, performance, and ease of development, Elixir is well-suited for developing highly concurrent and distributed systems, making it a popular choice for building web applications, real-time systems, and software that requires high availability.

Go

Computing languages often get strange names, like single letters or small words such as this one; that means you need to look for “Golang” in any online search. In any case, Go originated at Google and numbered among its inventors is one of the creators of the C programming language. The intent here is massively multithreaded system programming using stand-alone executable components while retaining or enhancing code readability. Another facet is the ability to function efficiently in distributed computing environments like those at SoundCloud or Uber. A variety of tools have been written using the language, and these include the ever-pervasive Docker and Kubernetes.

Julia

It remains an odd decision to give a computing language a girl’s name, but the purpose is a serious one. Often, there is a trade-off between speed of code writing and speed of execution, with the result that data programming involves prototyping in one language and porting to another for production usage. The first group includes R and Python, while the second includes C, C++, FORTRAN and even Java, so there is an element of translation that often means different people are involved, adding scope for errors caused by misunderstandings. This gets described as the two-language problem, and Julia’s major raison d’être is the avoidance of that: its top-line description is that it is as quick to program as Python but runs as fast as C because of its just-in-time compilation, multiple dispatch and in-built multithreading. This also allows for extensive capabilities for scientific computing that go beyond machine learning; an example comes in the number of differential equation solvers that are available. It also helps that meta-programming makes everything more generalisable.

Perl

It has been around since the 1980s and still pervades, though it is not as dominant as it once was for creating dynamic websites or for system administration. PHP has taken on much of the former, while Python is making inroads into the latter. Still, no list would be complete without a mention of the once ubiquitous scripting language, and it once powered my online photo gallery. It may be an ageing language, but there is plenty of documentation on the web, with Perldoc, Perl Maven and Perlmeister being some good places to look, and Dan Massey has some interesting articles on his site too. Not only that, but it is extensible too, with plenty of extra modules to be found on CPAN.

PHP

This usurper has taken the place of Perl for powering many of the world’s websites. That the language is less verbose probably helps its case and many if not most CMS packages make use of its versatility.

Python

It may be Google’s preferred scripting language for system administration, but it is its usefulness for Data Science where it really has shone in the eyes of many. There are numerous packages for data wrangling, data visualisation and machine learning that make the language ever-present in any Data Scientist’s toolbox, and looking in the PyPI archive will allow you to find what you need. It has its place in web scripting too, even if it is not as pervasive as PHP, though CMSs like Plone run on Python and there is the Django framework together with the Gunicorn web server.
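
To give a flavour of the data wrangling side of things, here is a small sketch using pandas, one of the packages alluded to above; the data is invented purely for illustration.

```python
# A small data wrangling example using pandas; the figures are invented.
import pandas as pd

visits = pd.DataFrame({
    "page": ["home", "blog", "blog", "gallery", "home"],
    "seconds": [35, 120, 95, 60, 20],
})

# Summarise visits and average time spent per page, longest first.
summary = (visits.groupby("page")["seconds"]
           .agg(["count", "mean"])
           .sort_values("mean", ascending=False))
print(summary)
```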

OpenJDK

One of the acts of Jonathan Schwartz while he was head of Sun Microsystems was to make Java open source after more than a decade of its being largely proprietary, and OpenJDK is the project that resulted. Of course, his more notable act at Sun was to sell the company to Oracle, but that’s another story altogether…

R

This is an open-source implementation of the S language that is much appreciated by statisticians and much used in the teaching of the subject. The base language only has so much functionality, but there are many packages that extend it; plenty are to be found on repositories like CRAN, and others can be found on various GitHub repositories, though these tend to be more experimental in nature. There are commonly used and well-supported mainstays that everyone uses, but there always is a need to verify that a particular package does what it claims to do. Given that, there are possibilities for data wrangling, data tabulation, data visualisation and data science. While quick to code, R is slow to execute compared with others; I have found that Python is faster, but R still has its uses for smaller data sets. Both keep their working data in system memory, which helps.

Rust

It came as a surprise that this Mozilla-originated language is gaining traction in scientific data analysis, possibly because it is a fast multithreaded counterpart to C and C++ with some added safety features (though these can be turned off if needed, provided extra care is taken). The downsizing of Mozilla led to a sharp reduction in its team of Rust developers, and the Rust Foundation has been set up to oversee the language instead. There are online books like The Rust Programming Language and the Rust Cookbook, with the first of these also having paper and e-book counterparts from No Starch Press. For those interested in a more interactive introduction, there also is the Tour of Rust.

Databases

MariaDB

This essentially is a fork of MySQL (see below) now that Oracle owns the latter. The originators of MySQL are the creators of MariaDB, so their claim that it is a drop-in replacement may have some substance. So far, I have seen no exodus from MySQL, though.

MySQL

After passing through the hands of a number of owners, it incongruously came into the custodianship of Oracle (who, of course, already had and still have a database system of their own). Nevertheless, the system that powers many dynamic websites remains almost a de facto standard and looks set to stay that way for now.

MongoDB

This may be a document-based database rather than a relational one as many of us understand them, but it is still being touted as an alternative to the more mainstream competition. Database technology isn’t just about SQL, and MongoDB champions a NoSQL approach; it sounds as if the emergence of formats like XML and JSON might be what’s facilitating the NoSQL database technologies.
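
To give a flavour of the document model, here is a minimal sketch using the PyMongo driver; it assumes a MongoDB server running locally on the default port, and the database and collection names are made up for the purpose.

```python
# Minimal document-store sketch using PyMongo; it assumes a MongoDB server
# is listening on localhost:27017 and that the names below are free to use.
from pymongo import MongoClient

client = MongoClient("mongodb://localhost:27017/")
db = client["scratch"]      # hypothetical database
posts = db["posts"]         # hypothetical collection

# Documents need no fixed schema, unlike rows in a relational table.
posts.insert_one({"title": "Hello", "tags": ["first", "test"], "words": 120})

# Query by a field, much as a WHERE clause would in SQL.
for doc in posts.find({"tags": "test"}):
    print(doc["title"], doc["words"])
```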

PostgreSQL

This project may have more open-source credibility than MySQL, but it seems to remain in its shadow, though that may be explained by its being a more complex piece of software to use (at least, that has been my experience). It so happens that this is what Debian installs if you specify the web server option at operating system installation time.

Why all the commas?

4th December 2022

In recent times, I have been making use of Grammarly for proofreading what I write for online consumption. That has applied as much to what I write in Markdown form as to what is authored using content management systems like WordPress and Textpattern.

The free version does nag you to upgrade to a paid subscription, but that is not my main irritation. That would be its inflexibility, because you cannot turn off rules that you find intrusive, at least in the free version. This comment is particularly applicable to the unofficial plugin that you can install in Visual Studio Code. To me, the add-on for Firefox feels less scrupulous.

There are other options though, and one that I have encountered is LanguageTool. This also offers a Firefox add-on, but there are others, not only for other browsers but also for Microsoft Word. Recent versions of LibreOffice Writer can connect to a LanguageTool server using in-built functionality, too. There are also dedicated editors for iOS, macOS and Windows.

The one operating system that does not get specific add-on support is Linux, but there is another option there. That uses an embedded HTTP server that I installed using Homebrew and set to start automatically using cron. This really helps when using the LanguageTool Linter extension in Visual Studio Code because it can connect to that instead of the public API, which bans your IP address if you overuse it. The extension is also configurable, with the ability to add exceptions (both grammatical and spelling), though I appear to have enabled smart formatting only to have it mess up quotes in a Markdown file, which then caused Hugo rendering to fail.
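
For the curious, this is roughly how a script could talk to such a local LanguageTool server instead of the public API; the port number is an assumption (it depends on how the server was started) and the requests package is assumed to be installed.

```python
# Send a piece of text to a locally running LanguageTool HTTP server.
# The port is an assumption (8081 is a common default); adjust to suit.
import requests

text = "This are a sentence with a error."
response = requests.post(
    "http://localhost:8081/v2/check",
    data={"text": text, "language": "en-GB"},
)
response.raise_for_status()
for match in response.json().get("matches", []):
    print(match["message"])
    print("  context:", match["context"]["text"])
```

The LanguageTool Linter extension presumably does much the same thing behind the scenes once it has been pointed at the local server.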

Like Grammarly, LanguageTool has an online editor that offers more if you choose an annual subscription. That subscription is cheaper than Grammarly’s, which persuaded me to go for it instead and gain rephrasing suggestions both in the online editor and through a browser add-on. It is better not to get nagged all the time…

The title may surprise you, but I have been using co-ordinating conjunctions without commas for as long as I can remember. Both Grammarly and LanguageTool pick up on these, so I had to do some investigation to find a gap in my education, especially since LanguageTool is so good at finding them. What I also found is how repetitive my writing style can be, which also means that rephrasing has been needed. That, after all, is the point of a proofreading tool, and it can rankle if you have fixed opinions about grammar or enjoy creative writing.

Putting some off-copyright texts from other authors through these tools triggers all kinds of messages, but you just have to ignore them. Turning off checks needs care, even if turning them on again is easy to do. There is, however, the danger that artificial intelligence tools could make writing too uniform, since there is only so much that these technologies can do. They should make you look at your text more intently, though, which is never a bad thing because computers still struggle with meaning.

Battery life

2nd October 2011

In recent times, I have lugged my Toshiba Equium with me while working away from home; I need a full-screen laptop of my own for attending to various things after work hours, so it has to come along. It’s not the most portable of things given its weight and lack of battery life. Now that I think of it, I reckon that it’s more of a desktop PC replacement machine than a mobile workhorse. After all, it only lasts an hour on its own battery away from a power socket. Virgin Trains’ tightness with such things on their Pendolinos is another matter…

Unless my BlackBerry is discounted, battery life seems to be something with which I haven’t had much luck, because my Asus Eee PC isn’t too brilliant either. Without decent power management, two hours seems to be as good as I get from its battery. However, three to four hours become possible with better power management software on board. That makes the netbook even more usable, though there are others out there offering longer battery life. Still, I am not tempted by these because the gadget works well enough for me that I don’t need to wonder about how much money I am spending on building a mobile computing collection.

While I am not keen on spending too much cash or having a collection of computers, the battery life situation with my Toshiba is more than giving me pause for thought. The figures quoted for MacBooks had me looking at them, though they aren’t at all cheap. Curiosity about the world of the Mac might have made them attractive to me, but the prices forestalled that and the concept was left on the shelf.

Recently, PC Pro ran a remarkably well-timed review of laptops offering long battery life (in issue 205). The minimum lifetime in this collection was over five hours, so the list of reviewed devices is an interesting one for me. In fact, it may even become a shortlist should I decide to spend money on buying a more portable laptop than the Toshiba that I already have. The seventeen-hour battery life for a Sony VAIO SB series sounds intriguing, even if you need to buy an accessory to gain this. That it does over seven hours without the extra battery slice makes it more than attractive anyway. The review was food for thought and should come in handy if I decide that money needs spending.

Adobe CS3 Launch

28th March 2007

Last night, I sat through part of Adobe’s CS3 launch and must admit that I came away intrigued. Products from the Macromedia stable have been very much brought under the Adobe umbrella and progressed to boot. One of these that attracts my interest is Dreamweaver, and Adobe is promoting its AJAX capabilities (using the Spry library), its browser compatibility checking facility and its integration with Photoshop, among other things. Dreamweaver’s CSS support also gets taken forward. In addition, Dreamweaver can now integrate with Adobe Bridge and Adobe Device Central. The latter allows you to preview how your site might look on a plethora of WAP-enabled mobile phones, while the former, unless I have been missing something, seems to have become a media manager supporting all of CS3 and not just Photoshop.

Speaking of Photoshop, this now gets such new features as smart filters; I think of these as adjustment layers for things like sharpening, monochrome conversion and much more. Raw image processing now has a non-destructive element, and Photoshop Lightroom is being touted as a companion for the main Photoshop. Speaking of new additions to the Photoshop family, there is a new Extended edition for those working with digital imaging with a 3D aspect, and this is targeted at scientists, engineers, medical professionals and others. It seems that data analysis and interpretation are becoming part of the Photoshop remit now as well.

Dreamweaver and Photoshop are the components of the suite in which I have most interest, but I also note that Contribute now has blogging capabilities; it would be interesting to see how these work, especially given Word 2007’s support for blogging tools like WordPress and Blogger. Another member of note is Version Cue, adding version control to the mix and making CS3 more like a group of platforms than a collection of applications.

Unsurprisingly, the changes are rung across the rest of the suite, with integration being a major theme, and this very much encompasses Flash too. The sight of an image selection being copied straight into Dreamweaver was wondrous in its own way, and the rendering of Photoshop files into 3D images was also something to behold. The latter was used to demonstrate the optimisations that have been added for the Mac platform, a major selling point apparently.

I suppose that the outstanding question is this: do I buy into all of this? It’s a good question because the computer enthusiast seems to be getting something of a sidelining lately, and that seems to be the impression left by Windows Vista in its giving the appearance that Microsoft is trying to be system administrator to the world. There is no doubt that CS3 is very grown up now and centred on workflows and processes. These have always been professional tools, and the present level of sophistication and pricing* very much reflects this. That said, enthusiasts like me have been known to use them too, at least for learning purposes. The latter point may yet cause me to get my hands on Photoshop CS3 with its powerful tools for digital imaging, but Dreamweaver is another story. It doesn’t fit how I work now, so this is an upgrade that I may give a miss, as impressive as it looks. For a learning experience, I might download a demo, but that would be a separate matter from updating my web presence. This time next month may tell a tale…

*Pricing remains the bugbear for the U.K. market that it always has been. At the present exchange rates, we should be getting a much better deal on Adobe products than we do. For instance, Amazon.com has the Web Premium CS3 suite from Macromedia Studio 8 priced at $493.99, while it is £513.99 on Amazon.co.uk. Using the exchange rate current as I write this, £1 buying $1.96605, the U.K. price is a whopping $1010.53 in U.S. terms. To me, this looks like price gouging, and Microsoft has been slated for this too. I wonder what will be said to Adobe on this one.

Taking a camera on a walk…

24th July 2007

On Saturday, I happened to be in Jessops and overheard a salesman emphatically state that you don’t buy a camera for its specifications but for the photos that it produces. While his tone of voice was a touch condescending and he seemed to be putting down a DSLR, he was essentially right. Nevertheless, the specifications do help you get the images and they have to be seen in that light. For instance, having on-board sensor cleaning may save you from having either to clean the thing yourself or send the camera away for the professionals to do the needful, a much safer option in my view. And there may be occasions where image stabilisation is very useful, low light wildlife photography for instance. Yes, there are features that I consider surplus to requirements, like live viewing and movie capture and that is very much due to my buying cameras to make photos. The salesman in question would surely have agreed…

Sunday saw me head to the Lakeland Fells for some walking and a spot of testing of my new Pentax K10D. The details of the walk itself are not for here but for my hillwalking blog, and that is where you will find them. While making my way from Crewe to Windermere, I perused the manual, looking particularly for information pertaining to functions that I actually use. I should really have done this beforehand, but distractions meant that I hadn’t got around to it. I had to wade through something designed for a new SLR user before I got to what I consider the important stuff. Though this may be a bit irritating, I can understand and accept why they do it this way; we were all new users once, and beginners are hardly likely to want to know about things like aperture priority, raw file capture, ISO control and suchlike straight away.

What do I think of it then? Let’s start with first impressions. It is definitely smaller than the Canon EOS 10D it accompanies in my possession. That said, it is not too small and there is a decent grip hosting the shutter release button and the camera on/off switch. It also feels well-assembled and reassuringly weighty, an important consideration given that it will see the outdoors a lot. A discussion of the features most relevant to me follows.

On the subject of switching on and off, the camera is set to go into a sleep mode after a second of inactivity, but it reawakens quickly when needed, the trigger being half-depression of the shutter release button. In fact, the camera reawakens much faster than my Canon, where the delay is a constant source of some irritation. It might sound strange, but the on/off switch is also used to activate the depth of field preview, something that no SLR should be without. The location may be unusual, but maybe the designers thought that having the shutter release and depth of field preview next to each other was a logical way to do it. From a camera operation point of view, there is certainly something to that way of thinking. Behind the shutter release, you’ll find a screen that is a reminder of film SLRs, and it conveys information such as battery life, number of exposures remaining on the card and exposure details (aperture & shutter speed).

Staying on the subject of screens, the one on the back of the camera is larger than that on the Canon. As is customary, it allows replay of photos taken and access to the various menus required to control the camera’s operation. In comparison to the Canon, which is essentially a one-menu affair with a thumb wheel controlling scrolling and an OK button at its centre to perform operations, the Pentax has a more elaborate system of submenus: one each for recording, playback and set-up. The playback menu is where I found the setting that makes the camera highlight areas of underexposure and overexposure during image playback; this is something that I missed with respect to the Canon until I happened upon it. Camera cleaning is located on the set-up menu, and the camera is now set to clean the sensor every time that it is turned on. Why this is not enabled by default is a little beyond me, but the designers might have thought that a vibration from the camera on turning it on could have resulted in a load of support calls. The same submenu also hosts memory card formatting. The recording submenu is where I set the camera to deliver RAW DNG files, an Adobe innovation, rather than the default JPEGs. There are other options, like RAW PEF files, Pentax’s own format, or RAW and JPEG simultaneously, but my choice reflects my workflow in Photoshop Elements; I have yet to stop the said software editing the DNG files, however. With all these options, it is fortunate that there is a navigation control that uses arrow buttons to get about. While on the subject of the back screen, there are further settings that are accessed with the FN button rather than the Menu one. These include ISO, white balance, shooting mode (single, continuous, timed and so on) and flash. The only setting that I changed out of this lot was the ISO, which I set to 400; I prefer to feel that I am in control.

Returning to the camera’s top plate, the exposure mode dial is on the left-hand side, which is no hardship to me as this is in the same place as on the Canon. There are no scene modes, but the available exposure modes are more than sufficient: fully automatic, program, sensitivity priority, shutter priority, aperture priority, shutter and aperture priority, manual, bulb and external flash synchronisation. A few of these need a spot of explaining. Sensitivity priority is a new one on me, but it is a consequence of the ability of DSLRs to offer a range of ISO settings; the aperture and the shutter speed are varied according to the ISO setting. Shutter and aperture priority is like manual exposure and is the inverse of sensitivity priority: set both aperture and shutter speed, and the camera will vary the ISO setting. Both of the foregoing assume that you let the camera set the ISO, but my setting the thing myself may have put paid to these functions. Shutter priority and aperture priority are, as far as I can tell, their usual selves. For all exposure modes, the thumb wheels at the front and back of the shutter release handgrip set apertures and shutter speeds where appropriate, and this arrangement works well.

Mounted on the same column as the exposure mode dial is the metering mode selector, and here is where I see more options than on my Canon, which has only full and partial multi-segment metering. With the Pentax, you get spot and centre-weighted metering in addition to the default multi-segment variety. Spot metering is definitely very useful, but capturing the reading is a multitasking affair: half pressing the shutter button and fully pressing the AE lock one at the same time. In contrast, Canon’s partial metering is a more convenient single-button meter-and-retain facility. Pentax would do well to learn from this.

The focussing mode selector is found on the left of the body next to the lens coupling. I am used to having this on the lenses themselves, so this is a new arrangement for me and one to which I can easily become accustomed. In fact, it is easy to find it while composing a picture. The modes themselves are manual focus, one-time autofocus and continuous autofocus; the last of these is for focussing on moving objects.

I could go further, perhaps overboard, with a discussion of the features of this camera, but I draw a line at what’s here. Yes, it is useful to be able to set the focussing point and activate image stabilisation, but the above are what matter to me, and its performance in the photo-making department is the most important aspect. That neatly brings me to my appraisal of how it performs. On inspection of the first few images on the review screen, I was a little disappointed to see how dark the foreground was in comparison to the sky. When I brought everything home, as I always do, I found that things weren’t necessarily as they appeared in the field. The Pentax more usefully offers histogram review and highlighting of any areas that are either underexposed or overexposed. It is these functions that I will be using in reshooting decisions while out and about with the Pentax, and the same can be said for how I currently use the Canon. In fast-changing lighting, the AE lock technique was a bit irritating, but I am certain that I will get better at it. The autofocus doesn’t always lock onto the subject, especially in tricky lighting, so manual focussing is a definite necessity and is, in fact, more useful for landscape photography. Nevertheless, the autofocus did do well most of the time, and my Sigma lenses have done worse things on me. All in all, I am happy with the K10D and will continue to use it. I have got some decent photos from my excursion and that, as that Jessops salesman would agree, is the main point of a camera.

On keyboards

17th April 2009

There cannot be too many Linux users who go out and partner a Microsoft keyboard with their system, but my recent cable-induced mishap has resulted in exactly that outcome. Keyboards are such standard items that it is hardly possible to generate any excitement about them, apart from RSI-related concerns. While I wasn’t about to go for something cheap and nasty that would do me an injury, going for something too elaborate wasn’t part of the plan either, even if examples of that ilk from Microsoft and Logitech were sorely tempting.

Shopping in a bricks and mortar store like I was has its pluses and its minuses. The main plus points are that you see and feel what you are buying with the main drawback being that the selection on offer isn’t likely to be as extensive as you’d find on the web, even if I was in a superstore. Despite the latter, there was still a good deal available. There were PS/2 keyboards for anyone needing them but USB ones seemed to be the main offer with wireless examples showcased too. Strangely, the latter were only available as kits with mice included, further adding to the cost of an already none too cheap item. The result was that I wasn’t lured away from the wired option.

I didn’t emerge with what would have been my first choice because that was out of stock but that’s not to say that what I have doesn’t do the job for me. Key action is soft and cushioned rather than clicky like that to which I am accustomed; some keyboards feel like they belong on a laptop but not this one. There are other bells and whistles too with a surprising number of them working. The calculator and email buttons number among these along with the play/pause, back and forward ones for a media player; I am not so convinced about the volume controls though an on-screen indicator does pop up. You’d expect a Microsoft item to be more Windows specific than others but mine works as well as anything else in the Ubuntu world and I have no reason to suspect that other Linux distros would spurn it either. Keyboards are one of those “buy-it-and-forget-it” items and the new arrival should be no different.

A waiting game

20th August 2011

Having been away every weekend in July, I was looking forward to a quiet one at home to start August. However, there was a problem with one of my websites hosted by Fasthosts that was set to occupy me for the weekend and a few weekday evenings afterwards.

The issue appeared to be slow site response so I followed advice given to me by second line support when this website displayed the same type of behaviour: upgrade from Apache 1.3 to 2.2 using the control panel. Unfortunately for me, that didn’t work smoothly at all and there seemed to be serious file loss as a result. Raising a ticket with the support desk only got me the answer that I had to wait for completion and I now have come to the conclusion that the migration process may have got stuck somewhere along the way. Maybe another ticket is in order.

There were a number of causes of the waiting that gave rise to the title of this post. Firstly, support for low-cost hosting isn’t exactly timely, and I do wonder if it’s any better for more prominent websites. Restoration of websites by FTP is another activity that takes up plenty of time, as does rebuilding databases and populating them with data. Lastly, there’s changing the DNS details for a website. In hindsight, there may be ways of reducing the time demands of these. For instance, contacting a support team by telephone may be quicker unless there is a massive queue awaiting attention; there was a wait of several hours one night when a security changeover affected a multitude of Fasthosts users. Of course, it is not a panacea at the best of times, as we have known since all those stories began to do the rounds in the middle of the 1990s. Doing regular backups would help with the second, though the ones that I was using for the restoration weren’t too bad at all. Nevertheless, they weren’t complete, so there was unfinished business that required resolution later. The last of these is helped along by more regular PC restarts, so that unexpected discovery will remain a lesson for the future, though I don’t plan on moving websites around for a while. After all, getting DNS details propagated more quickly really is a big help.

While awaiting a response from Fasthosts, I began to ponder the idea of using an alternative provider. Perusal of the latest digital edition of .Net (I now subscribe to the non-paper edition so as to cut down on the clutter caused by having paper copies about the place) ensued before I decided to investigate the option of using Webfusion. Having decided to stick with shared hosting, I gave their Unlimited Linux option a go. For someone accustomed to monthly billing, it was unusual to see annual, biennial and triennial payment schemes too. The first of these appears to be the default option, so a little care and attention is needed if you want something else. To encourage you to stay with Webfusion longer, the price per month is on a sliding scale: the longer the period you buy, the lower the cost of a month’s hosting.

Once the account was set up, I added a database and set to the long process of uploading files from my local development site using FileZilla. Having got a MySQL backup from the Fasthosts site, I used the provided phpMyAdmin interface to upload the data in pieces not exceeding the 8 MB file size limitation. It isn’t possible to connect remotely to the MySQL server using the likes of MySQL Administrator, so I had to bear with this not-so-smooth process. SSH is another connection option that isn’t available, but I never used it much on Fasthosts sites anyway. There were some questions to the support people along the way; the first of these got a timely answer, though later ones took longer to be answered. Still, getting advice on the address of the test website was a big help while I was sorting out the DNS changeover.
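
As an aside, where a dump is too large for such a limit, splitting it is easy enough to script. The following is only a sketch, not what I did at the time: it assumes that each SQL statement in the dump ends with a semicolon at the end of a line, and the file names are illustrative.

```python
# Split a SQL dump into pieces that stay under phpMyAdmin's upload limit,
# breaking only after lines that end a statement so each piece loads cleanly.
# The threshold sits a little below 8 MB so that no piece overshoots it.
LIMIT = 7 * 1024 * 1024

def flush(lines, number):
    with open(f"dump-part-{number:02d}.sql", "w", encoding="utf-8") as out:
        out.writelines(lines)

part, size, buffer = 1, 0, []
with open("backup.sql", encoding="utf-8") as dump:   # illustrative file name
    for line in dump:
        buffer.append(line)
        size += len(line.encode("utf-8"))
        # Once the threshold is passed, close the piece at the next
        # statement end (a line finishing with a semicolon).
        if size >= LIMIT and line.rstrip().endswith(";"):
            flush(buffer, part)
            part, size, buffer = part + 1, 0, []

if buffer:
    flush(buffer, part)
```

Each resulting piece can then be imported in turn through the phpMyAdmin interface.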

Speaking of the latter, it took a little doing and no little poking around Webfusion’s FAQs before I made it happen. First, I tried using name servers that I found listed in one of the articles, but this didn’t seem to achieve the end that I needed. Mind you, I would have seen the effects of this change a little earlier if I had rebooted my PC sooner than I did, but it didn’t occur to me at the time. In the end, I switched to using my domain provider’s name servers and added the required information to them to get things going. It was then that my website was back online in some fashion, so I could tie up any outstanding loose ends.

With the site essentially operating again, it was time to iron out the rough edges. The biggest of these was that mod_rewrite doesn’t seem to work the same on the Webfusion server as it does on the Fasthosts ones. This meant that I needed to use the SCRIPT_URI CGI variable instead of PATH_INFO in order to keep using clean URLs for a PHP-powered photo gallery that I have. It took me a while to figure that out, and I felt much better when I managed to get the results that I needed. However, I also took the chance to tidy up site addresses with redirections in my .htaccess file in an attempt to ensure that I lost no regular readers, something that I seem to have achieved with some success because one such visitor later commented on a new entry in the outdoors blog.

Once any remaining missing images were reinstated or references to them removed, it was time to do a full backup for the sake of safety. The first of these activities was yet another consumer of time, while the second didn’t take so long; I need to do this more often in case anything happens. Hopefully though, the relocated site’s performance continues to be as solid as it is now.

The question as to what to do with the Fasthosts webspace remains outstanding. Currently, they are offering free upgrades to existing hosting packages so long as you commit for a year. After my recent experience, I cannot say that I’m so sure about doing that kind of thing. In fact, the observation leaves me wondering if instating that very extension was what broke my site in the first place; the migration from Apache 1.3 to 2.2 appears to have got stuck for whatever reason. Maybe another ticket should be raised, but I am not decided on that yet. All in all, what happened to that Fasthosts website wasn’t the greatest of experiences, but the service offered by Webfusion is rock solid thus far. While wondering if the service from Fasthosts isn’t as good as it once was, I’ll keep an open mind and wait to see if my impressions change over time.

All that was needed was a trip to a local shop

5th March 2011

In the end, I did take the plunge and acquired a Sigma 50-200 mm f4-5.6 DC OS HSM lens to fit my ever-faithful Pentax K10D. After surveying a few online retailers, I plumped for Park Cameras, where the total cost, including delivery, came to around £125. This was around £50 less than what others were quoting for the same lens, with delivery costs yet to be added. Though the price was good at Park Cameras, I still wondered how they could manage to do that sort of deal when others don’t. Interestingly, it appears that the original price of the lens was around £300, but that may have been at launch, and prices do seem to tumble after that point in the life of many products of an electrical or electronic nature.

Unlike the last lens that I bought from them around two years ago, delivery of this item was a prompt affair, with dispatch coming the day after my order and delivery on the morning after that. All in all, that’s the kind of service that I like to get. On opening the box, I was surprised to find that the lens came with a hood but without a cap. However, that was dislodged slightly from my mind when I remembered that I had neglected to order a UV or skylight filter to screw into the 55 mm front of it. In the event, it was the lack of a lens cap that needed sorting more than the lack of a filter. The result was that I popped into the local branch of Wildings, where I found the requisite lens cap for £3.99 and asked about a filter while I was at it. Much to my satisfaction, there was a UV filter that matched my needs in stock, though it was cheap at £18.99 and made by a company of which I hadn’t heard before, Massa. This was another example of good service: the shop attendant juggled two customers, a gentleman looking at buying a DSLR and myself. While I would not have wanted to disturb another sales interaction, I suppose that my wanting to complete a relatively quick purchase was what got me the attention while the other customer was left to look over a camera, something that I am sure he would have wanted to do anyway. After all, who wouldn’t?

With the extras acquired, I attached them to the front of the lens and carried out a short test (with the cap removed, of course). When it was pointed at an easy subject, the autofocus worked quickly and quietly. A misty hillside had the lens hunting so much that turning to manual focussing was needed a few times to work around something understandable. Like the 18-125 mm Sigma lens that I already had, the manual focussing ring is generously proportioned, with a hyperfocal scale on it, though some might think the action a little loose. In my experience, though, it seems no worse than the 18-125 mm, so I can live with it. Both lenses have something else in common in the form of the zoom ring having a stiffer action than the focus ring. However, the zoom lock of the 18-125 mm is replaced by an OS (Optical Stabilisation) switch on the 50-200 mm, and the latter has no macro facility either, another feature of the shorter lens, though it remains one that I cannot ever remember using. In summary, first impressions are good, but I plan to continue appraising it. Maybe an outing somewhere tomorrow might offer a good opportunity for using it a little more to get more of a feeling for its performance.

Exploring the option of mobile broadband

20th September 2010

Last week, I decided to buy and experiment with a Vodafone PAYG mobile broadband dongle (the actual device is a ZTE K3570-Z), partly as a backup for my usual broadband (it has had its moments recently) and partly to allow me to stay more connected while on the move. Thoughts of blogging and checking up on email or the realtime web while travelling to and from different places must have swayed me.

Hearing that the device was meant for use with Windows or OS X had me attempting to hook it up to Windows 7 running within a VirtualBox virtual machine on my main home computer. When that proved too big a request of the software setup, I went googling out of curiosity and found that there was a way to get the thing going with Linux. While I am not so sure that it works with Ubuntu without any further changes, downloading a copy of the Sakis3G script was enough to do the needful, and I was online from my main OS after all. So much for what is said on the box…

More success was had with Windows 7 as loaded on my Toshiba Equium notebook, with setting up and connections being as near to effortless as these things can be. Ubuntu is available on there too, courtesy of Wubi, and the Sakis3G trick didn’t fail for that either.

That’s not to say that mobile broadband doesn’t have its limitations, as I found. For instance, Subversion protocols and Wubi installations aren’t supported, but that may be more a result of non-support of IPv6 than anything else. Nevertheless, connection speeds are good as far as I can see, though I have yet to test the persistence of Vodafone’s network while constantly on the move. Having seen how flaky T-Mobile’s network can be in the U.K. as I travel around using my BlackBerry, that is something that needs doing, but all seems painless enough so far. However, the fact that Vodafone uses the more usual mobile phone frequency may be a help.

Download Sakis3G

  • All the views that you find expressed on here in postings and articles are mine alone and not those of any organisation with which I have any association, through work or otherwise. As regards editorial policy, whatever appears here is entirely of my own choice and not that of any other person or organisation.

  • Please note that everything you find here is copyrighted material. The content may be available to read without charge and without advertising but it is not to be reproduced without attribution. As it happens, a number of the images are sourced from stock libraries like iStockPhoto so they certainly are not for abstraction.

  • With regards to any comments left on the site, I expect them to be civil in tone of voice and reserve the right to reject any that are either inappropriate or irrelevant. Comment review is subject to automated processing as well as manual inspection but whatever is said is the sole responsibility of the individual contributor.