Technology Tales

Adventures & experiences in contemporary technology

Useful Python packages for working with data

14th October 2021

My response to changes in the technology stack used in clinical research is to develop some familiarity with programming and scripting platforms that complement and compete with SAS, a system with which I have been programming since 2000. One of these has been R, but Python is another that has taken up my attention, and I now have Julia in my sights as well. There may be others to assess in the fullness of time.

While I first started to explore the Data Science world in the autumn of 2017, it was in the autumn of 2019 that I began to complete LinkedIn training courses on the subject. Good though they were, I find that I need to actually use a tool in order to understand it better. At that time, I did get to hear about Python packages like Pandas, NumPy, SciPy, Scikit-learn, Matplotlib, Seaborn and Beautiful Soup, though it took until the spring of this year for me to start gaining some hands-on experience with using any of these.

During the summer of 2020, I attended a BCS webinar on the CodeGrades initiative, a programming mentoring scheme inspired by the way classical musicianship is assessed. In fact, one of the main progenitors is a trained classical musician and teacher of classical music who turned to Python programming when starting a family so as to have a more stable income. The approach is that a student selects a project and works their way through it with mentoring and periodic assessments carried out in a gentle and discursive manner. Of course, the project has to be engaging for the learning experience to stay the course and that point came through in the webinar.

That is one lesson that resonates with me, with subjects as diverse as web server performance and the ongoing pandemic supplying data. There are other sources of public data to examine as well before looking through my own personal archive gathered over the decades. Some subjects are uplifting while others are more foreboding but the key thing is that they sustain interest and offer opportunities for new learning. Without being able to dream up new things to try, my knowledge of R and Python would not be as extensive as it is and I hope that it will help with learning Julia too.

In the main, my own learning has been a solo effort with consultation of documentation along with web searches that have brought me to the likes of Real Python, Stack Abuse, Data Viz with Python and R and others for longer tutorials, as well as threads on Stack Overflow. Usually, the web searching begins when I need a steer on a particular topic or a way to resolve a particular error or warning message, but books always are worth reading even if that is the slower route. Those from the Dummies series or from O’Reilly have proved most useful so far but I do need to read them more completely than I already have; it is all too tempting to go with the “program and search for solutions as you go” approach instead.

To get going, many choose the Anaconda distribution to get Jupyter notebook functionality, but I prefer a more traditional editor, so Spyder has been my tool of choice for Python programming, and there are others like PyCharm as well. Spyder itself is written in Python, so it can be installed using pip from PyPI like other Python packages. It has other dependencies like Pylint for code management activities, but these get installed behind the scenes.
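
For what it is worth, getting Spyder in place is a one-line affair; assuming that pip is tied to the Python installation that you intend to use, the following should be all that is needed:

pip install spyder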

The packages that I first met in 2019 may be the mainstays for doing data science but I have discovered others since then. It also seems that there is porosity between the worlds of R and Python, so you get some Python packages aping R packages and R has the Reticulate package for executing Python code. There are Python counterparts to such Tidyverse staples as dplyr and ggplot2 in the form of Siuba and Plotnine, respectively. The syntax of these packages is not a direct copy of what is executed in R, but it is close enough for the familiarity to add user friendliness compared to Pandas or Matplotlib. The interoperability does not stop there, for there is SQLAlchemy for connecting to MySQL and other databases (PyMySQL is needed as well) and there also is SASPy for interacting with SAS Viya.

Python may not have the speed of Julia, but there are plenty of packages for working with larger workloads. Of these, Dask, Modin and RAPIDS all have their uses for dealing with data volumes that make Pandas code crawl. As if to prove that there are plenty of libraries for various forms of data analytics, data science, artificial intelligence and machine learning, there also are the likes of Keras, TensorFlow and NetworkX. These are just a selection of what is available and there is no reason not to check out more. It may be tempting to stick with the most popular packages all the time, especially when they do so much, but it never hurts to keep an open mind either.

Moving a website from shared hosting to a virtual private server

24th November 2018

This year has seen some optimisation being applied to my web presences guided by the results of GTMetrix scans. It was then that I realised how slow things were, so server loads were reduced. Anything that slowed response times, such as WordPress plugins, got removed. Usage of Matomo also was curtailed in favour of Google Analytics while HTML, CSS and JS minification followed. What had yet to happen was a search for a faster server. Now, another website has been moved onto a virtual private server (VPS) to see how that would go.

Speed was not the only consideration since security was a factor too. After all, a VPS is more locked away from other users than a folder on a shared server. There also is the added sense of control, so Let’s Encrypt SSL certificates can be added using the Electronic Frontier Foundation’s Certbot. That avoids the expense of using an SSL certificate provided through my shared hosting provider and a successful transition for my travel website may mean that this one undergoes the same move.
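
To give a flavour of the Certbot step that comes later in this piece, the basic invocation on an Apache system looks something like the following, assuming that Certbot and its Apache plugin already are installed and with example.com standing in for the real domain:

sudo certbot --apache -d example.com -d www.example.com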

For the VPS, I chose Ubuntu 18.04 as its operating system and it came with the LAMP stack already in place. Having offline development websites, the mix of Apache, MySQL and PHP is more familiar to me than anything using Nginx or Python. It also means that .htaccess files become more useful than they were on my previous Nginx-based platform. Having full access to the operating system by means of SSH helps too and should mean that I have fewer calls on technical support since I can do more for myself. Any extra tinkering should not affect others either, since this type of setup is well known to me and having an offline counterpart means that anything riskier is tried there beforehand.

Naturally, there were niggles to overcome with the move. The first thing to fix was making the MySQL instance accept connections from outside the server so that I could migrate data there from elsewhere, and I even got my shared hosting setup to start using the new database to see what performance boost it might give. To make all this happen, I first found the location of the relevant my.cnf configuration file using the following command:

find / -name my.cnf

Once I had the right file, I commented out the following line that it contained and restarted the database service afterwards using another command to stop the appearance of any error 111 messages:

bind-address = 127.0.0.1
service mysql restart

After that, things worked as required and I moved on to another matter: uploading the requisite files. That meant installing an FTP server, so I chose proftpd since I knew it well from previous tinkering. Once that was in place, file transfer commenced.
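
For the record, putting proftpd in place needed nothing more than the usual package installation command; the package name below is the one that Ubuntu uses, so it may differ on other distributions:

sudo apt install proftpd-basic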

When that was done, I could do some testing to see if I had an active web server that loaded the website. Along the way, I also instated some Apache modules like mod-rewrite using the a2enmod command, restarting Apache each time I enabled another module.
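
Taking mod_rewrite as an example, the commands that I used were along the following lines:

sudo a2enmod rewrite
sudo systemctl restart apache2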

Then, I discovered that Textpattern needed php7.2-xml installed, so the following command was executed to do this:

apt install php7.2-xml

Next, the following line was uncommented in the correct php.ini configuration file, which I found using the same method as described already for the my.cnf configuration, and that was followed by yet another Apache restart:

extension=php_xmlrpc.dll

Addressing the above issues yielded enough success for me to change the IP address in my Cloudflare dashboard so it pointed at the VPS and not the shared server. The changeover happened seamlessly without having to await DNS updates as once would have been the case. It had the added advantage of making both WordPress and Textpattern work fully.

With everything working to my satisfaction, I then followed the instructions on Certbot to set up my new Let’s Encrypt SSL certificate. Aside from a tweak to a configuration file and another Apache restart, the process was more automated than I had expected so I was ready to embark on some fine-tuning to embed the new security arrangements. That meant updating .htaccess files and Textpattern has its own, so the following addition was needed there:

RewriteCond %{HTTPS} !=on
RewriteRule ^ https://%{HTTP_HOST}%{REQUEST_URI} [R=301,L]

This complemented what was already in the main .htaccess file and WordPress allows you to include http(s) in the address it uses, so that was another task completed. The general .htaccess only needed the following lines to be added:

RewriteCond %{SERVER_PORT} 80
RewriteRule ^(.*)$ https://www.assortedexplorations.com/$1 [R,L]

What all these achieve is to redirect insecure connections to secure ones for every visitor to the website. After that, internal hyperlinks without https needed updating along with any forms so that a padlock sign could be shown for all pages.

With the main work completed, it was time to sort out a lingering niggle regarding the appearance of an FTP login page every time a WordPress installation or update was requested. The main solution was to make the web server account the owner of the files and directories, but the following line was added to wp-config.php as part of the fix even if it probably is not necessary:

define('FS_METHOD', 'direct');
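
The ownership change that did the real work was nothing more exotic than the following, with www-data being the account under which Apache runs on Ubuntu and the path standing in for wherever the WordPress files actually reside:

sudo chown -R www-data:www-data /var/www/example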

There also was the non-operation of WP Cron and that was addressed using WP-CLI and a script from Bjorn Johansen. To make double sure of its effectiveness, the following was added to wp-config.php to turn off the usual WP-Cron behaviour:

define('DISABLE_WP_CRON', true);
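
With the built-in behaviour turned off, something else has to trigger the scheduled tasks, and a cron entry that calls WP-CLI every so often does the job. The following is only a sketch, with the path and the interval there to be adjusted as needed:

*/10 * * * * cd /var/www/example && wp cron event run --due-now > /dev/null 2>&1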

Intriguingly, WP-CLI offers a long list of possible commands that are worth investigating. A few have been examined but more await attention.

Before those, I still need to get my new VPS to send emails. So far, sendmail has been installed, the hostname changed from localhost and the server restarted. More investigations are needed, but what I have now is faster than what was there before, so the effort has been rewarded already.

A display of brand loyalty

12th July 2013

Since 2007, my main camera has been a Pentax K10D DSLR and it has gone on many journeys with me. In fact, more than 15,000 images have been captured with it and I have classed it as an unfailing servant. The autofocus may not be the fastest but my subjects tend to be stationary: landscapes, architecture, flora and transport. Even any bus and train photos have included parked vehicles rather than moving ones so there never have been issues. The hint of underexposure in any photos always can be sorted because DNG files are what I create, with all the raw capture information that is possible to retain. In fact, it has been hard to justify buying another SLR because the K10D has done so well for me.

In recent months, I have been looking at processed photos and asking myself if time has moved along for what is not far from being a six-year-old camera. At various times, I have been looking at higher members of the Pentax range while wondering if an upgrade would be a good idea. First, there was the K7 and then the K5 before the K5 II got launched. Even though its predecessor is still to be found on sale, it was the newer model that became my choice.

Pentax K5-II

My move to Pentax in 2007 was a case of brand disloyalty since I had been a Canon user from when I acquired my first SLR, an EOS 300. Even now, I still have a Powershot G11 that finds itself slipped into a pocket many a time. Nevertheless, I find that Canon images feel a little washed out prior to post-processing and that hasn’t been the case with the K10D. In fact, I have been hearing good things about Nikon cameras delivering punchy results, so one of them would be a contender were it not for how well the Pentax has performed.

So, what has my new K5 II body gained me that I didn’t have before? For one thing, the autofocus is a major improvement on that in the K10D. It may not stop me persevering with manual focusing for most of the time, but there are occasions when the option of solid autofocus is good to have. Other advances include a 16.3 megapixel sensor with a much larger ISO range. The advances in sensor technology since the K10D appeared may give me better quality photos, and noise is something that my eyes may have begun to detect in K10D photos even at my usual ISO of 400.

There have been innovations that I don’t need too. Live View is something that I use heavily with the Powershot G11 because it has such a pitiful optical viewfinder. The K5 II has a very bright and sharp one, so that function lies dormant, especially after I witnessed dodgy autofocus performance with it in use; manual focusing should be OK, I reckon. By default too, the screen stays on all the time and that’s a nuisance for an optical viewfinder user like me, so I looked through the manual and the menus to switch off the thing. My brief flirtation with the image level display met an end for much the same reason, though it’s good that it’s there. There is some horizon auto-correction available as a feature and this is left on to see what it offers, since there have been a multitude of times when I needed to sort out crooked horizons caused by my handholding the camera.

The K5 II may have a 3″ screen on its back, but it has done nothing to increase the size of the camera. If anything, it is smaller than the K10D, which usefully means that I am not on the lookout for a new camera holster. Not having a bigger body also means there is little change in how the camera feels in the hand compared with the older one.

In many ways, the K5 II works very like the K10D once I took control over settings that didn’t suit me. Both have Shake Reduction in their camera bodies, though the setting has been moved into the settings menu in the new camera, whereas the older one had a separate switch on its body. Since I’d be inclined to leave it on all the time and prefer not to have it knocked off accidentally, this is not an issue. Otherwise, many of the various switches are in the same places, so it’s not that hard to find my way around them.

That’s not to say that there aren’t other changes, like the addition of a lock to the mode dial, but I have used Canon EOS camera bodies with that feature, so I do not consider it a step backwards. The exposure compensation button has been moved to the top of the camera, where I found it very easily and have been using it perhaps more than on the K10D; it’s also something that I use on the G11, so the experimentation is being brought across to the K5 II now as well. Beside it, there’s a new ISO button, so further experimentation can be attempted with that to see how it does.

If I have any criticism, it’s about the clutter of the menus on the K5 II. The long lists through which you scrolled on the K10D have been replaced with a series of extra tabs so that on-screen scrolling is not needed as before. However, I reckon that this breaks up things too much and makes working through the settings look more forbidding to anyone who is not so technical in mindset. Nevertheless, settings such as the type of file to capture are there and I continue to use RAW DNG files as is usual for me, though JPEG and Pentax’s own RAW format also are there. For a while, I forgot to set the date, but I soon found out what I had done and the situation was remedied. The same sort of thing applied to storing files in different folders according to the capture date. For my own reasons, I turned this off to put everything into a single PENTX directory to suit my own workflow. My latest discovery among the menus was the ability to add photographer and copyright holder information to the EXIF metadata attached to the image files created by the camera. With legislative proposals that dilute the automatic rights of copyright holders going through the U.K. parliament, this seems a very timely inclusion, even if most would prefer that there was no change to copyright law.

Of course, the worth of any camera is in the images that it produces and I have been happy with what I have been getting so far. The bigger files mean that fewer images fit on a memory card than before. Thankfully, SDHC card capacities have grown, even if I don’t wish to machine gun my photography altogether. While out and about, I was surprised to see apertures like F/14 and F/18 when I was more accustomed to a progression like F/11, F/13, F/16, F/19, F/22, etc. Most of those older values still are there, though, so there hasn’t been a complete break with convention. The same comment applies to shutter speeds, where ones like 1/100 and 1/160 made their appearance where I might have expected just ones like 1/90, 1/125, 1/250 and so on. The extra possibilities, and that is what they are, do allow more flexibility, I suppose, and may even make it easier to make correct exposures, though any judgement of correctness has to be in the eye of a photographer and not what a computer algorithm in a camera determines. For much of the time until now, I have stuck with an ISO of 400, apart from a little testing in a woodland area of an evening soon after the camera arrived.

Since the K5 II came my way a few months ago, I have been meaning to collect my thoughts on here and there has been a delay while I brought my thinking to a sensible close. At one point, it felt like there was so much to say that the piece became larger in my mind than even what you have been reading now. After all, there are other things that I can adjust to see how the resulting images look and white balance is but one of these. The K10D isn’t beyond experimentation either, especially since I discovered that shake reduction was switched off, and it has me asking if that lack of quality that I mentioned earlier has another explanation. Of course, actually making use of my tripod would be another good suggestion, so it’s safe to say that yet more photographic explorations await.

Installing VMware Player 4.0.4 on Linux Mint 13

15th July 2012

Curiosity about the Release Preview of Windows 8 saw me running into bother when trying to see what it’s like in a VirtualBox VM. While doing some investigations on the web, I saw VMware Player being suggested as an alternative. Before discovering VirtualBox, I did have a licence for VMware Workstation and was interested in seeing what Player would have to offer. Then, it was limited to running virtual machines that were created using Workstation. Now, it can create and manage them itself, and without any need to pay for the tool either. Registration on VMware’s website is a must for downloading it, though, but that carries no monetary cost.

Once I had downloaded Player from the website, I needed to install it on my machine. There are Linux and Windows versions, and it was the former that I needed; there also are 32-bit and 64-bit variants, so you need to know what your system is running. With the file downloaded, you need to set it as executable, and the following command should do the trick once you are in the right directory:

chmod +x VMware-Player-4.0.4-744019.i386.bundle

Then, it needs execution as a superuser. With sudo access for my user account, it was a matter of issuing the following command and working through the installation screens to instate the Player software on the system:

sudo ./VMware-Player-4.0.4-744019.i386.bundle

Those screens proved easy for me to follow, so life would have been good if that were all that was needed to get Player working on my PC. Having Linux Mint 13 means that the kernel is of the 3.2 stock and that means using a patch to finish off the Player installation, because the required VMware kernel modules seem to silently fail to compile during the installation process. This only manifests itself when you attempt to start Player afterwards, to find a module installation screen appear. That wouldn’t be an issue of itself were it not for the compilation failure of the vmnet module and subsequent inability to start VMware services on the machine. There is a prompt to peer into the log file for the operation, and that is a little uninformative for the non-specialist.

Rummaging around the web brought me to the requisite patch, which works for Player 4.0.3 and Workstation 8.0.2 by default. Doing some tweaking allowed me to make it work for Player 4.0.4 too. My first step was to extract the contents of the tarball to /tmp, where I could edit patch-modules_3.2.0.sh. Line 8 was changed to the following:

plreqver=4.0.4

With the amendment saved, it was time to execute the shell script as a superuser, having made it executable beforehand. This can be accomplished using the following command:

chmod +x patch-modules_3.2.0.sh && sudo ./patch-modules_3.2.0.sh

With that completed successfully, VMware Player ran as it should. An installation of Windows 8 into a new VM ran very smoothly and I was impressed with performance and responsiveness of the operating system within a Player VM. There are a few caveats though. First, it doesn’t run at all well with VMware Tools so it’s best to leave them uninstalled and it doesn’t seem to need them either; it was possible to set the resolution to the same as my screen and use the CTRL+ALT+ENTER shortcut to drop in and out of full screen mode anyway. Second, the unattended Windows installation wasn’t the way forward for setting up the VM but it was no big deal to have that experiment thwarted. The feature remains an interesting one though.

With Windows 8 running so well in Player, I was reminded of the sluggish nature of my Windows 7 VM and an issue with a Fedora 17 one too. The result was that I migrated the Windows 7 VM from VirtualBox to VMware and all is so much more responsive. Getting it there took not a little tinkering so that’s a story for another entry. On the basis of my experiences so far, I reckon that VMware Player will remain useful to me for a little while yet. Resolving the installation difficulty was worth that extra effort.

A waiting game

20th August 2011

Having been away every weekend in July, I was looking forward to a quiet one at home to start August. However, there was a problem with one of my websites hosted by Fasthosts that was set to occupy me for the weekend and a few weekday evenings afterwards.

The issue appeared to be slow site response so I followed advice given to me by second line support when this website displayed the same type of behaviour: upgrade from Apache 1.3 to 2.2 using the control panel. Unfortunately for me, that didn’t work smoothly at all and there seemed to be serious file loss as a result. Raising a ticket with the support desk only got me the answer that I had to wait for completion and I now have come to the conclusion that the migration process may have got stuck somewhere along the way. Maybe another ticket is in order.

There were a number of causes of the waiting that gave rise to the title of this post. Firstly, support for low-cost hosting isn’t exactly timely and I do wonder if it’s any better for more prominent websites. Restoration of websites by FTP is another activity that takes up plenty of time, as does rebuilding databases and populating them with data. Lastly, there’s changing the DNS details for a website. In hindsight, there may be ways of reducing the time demands of these. For instance, contacting a support team by telephone may be quicker, unless there is a massive queue awaiting attention; there was a wait of several hours one night when a security changeover affected a multitude of Fasthosts users. Of course, it is not a panacea at the best of times, as we have known since all those stories began to do the rounds in the middle of the 1990s. Doing regular backups would help the second, though the ones that I was using for the restoration weren’t too bad at all. Nevertheless, they weren’t complete, so there was unfinished business that required resolution later. The last of these is helped along by more regular PC restarts, so that unexpected discovery will remain a lesson for the future, though I don’t plan on moving websites around for a while. After all, getting DNS details propagated more quickly really is a big help.

While awaiting a response from Fasthosts, I began to ponder the idea of using an alternative provider. Perusal of the latest digital edition of .Net (I now subscribe to the non-paper edition so as to cut down on the clutter caused by having paper copies about the place) ensued before I decided to investigate the option of using Webfusion. Having decided to stick with shared hosting, I gave their Unlimited Linux option a go. For someone accustomed to monthly billing, it was unusual to see annual, biannual and triannual payment schemes too. The first of these appears to be the default option, so a little care and attention is needed if you want something else. In order to encourage you to stay with Webfusion longer, the cost per month is on a sliding scale: the longer the period you buy, the lower the cost of a month’s hosting.

Once the account was set up, I added a database and set to the long process of uploading files from my local development site using FileZilla. Having got a MySQL backup from the Fasthosts site, I used the provided phpMyAdmin interface to upload the data in pieces not exceeding the 8 MB file size limitation. It isn’t possible to connect remotely to the MySQL server using the likes of MySQL Administrator, so I had to bear with this not so smooth process. SSH is another connection option that isn’t available, but I never use it much on Fasthosts sites anyway. There were some questions to the support people along the way; the first of these got a timely answer, though later ones took longer before I got a reply. Still, getting advice on the address of the test website was a big help while I was sorting out the DNS changeover.

Speaking of the latter, it took a little doing and not a little poking around Webfusion’s FAQs before I made it happen. First, I tried using name servers that I found listed in one of the articles, but this didn’t seem to achieve the end that I needed. Mind you, I would have seen the effects of this change a little earlier if I had rebooted my PC sooner than I did, but it didn’t occur to me at the time. In the end, I switched to using my domain provider’s name servers and added the required information to them to get things going. It was then that my website was back online in some fashion, so I could tie up any outstanding loose ends.

With the site essentially operating again, it was time to iron out the rough edges. The biggest of these was that mod_rewrite doesn’t seem to work the same on the Webfusion server as it does on the Fasthosts ones. This meant that I needed to use the SCRIPT_URI CGI variable instead of PATH_INFO in order to keep using clean URLs for a PHP-powered photo gallery that I have. It took me a while to figure that out and I felt much better when I managed to get the results that I needed. However, I also took the chance to tidy up site addresses with redirections in my .htaccess file in an attempt to ensure that I lost no regular readers, something that I seem to have achieved with some success because one such visitor later commented on a new entry in the outdoors blog.

Once any remaining missing images were instated or references to them removed, it was then time to do a full backup for the sake of safety. The first of these activities was yet another time consumer, while the second didn’t take so long; I need to do this more often in case anything happens. Hopefully though, the relocated site’s performance continues to be as solid as it is now.

The question as to what to do with the Fasthosts webspace remains outstanding. Currently, they are offering free upgrades to existing hosting packages so long as you commit for a year. After my recent experience, I cannot say that I’m so sure about doing that kind of thing. In fact, the observation leaves me wondering if instating that very extension was the cause of breaking my site; indeed, it appears that the migration from Apache 1.3 to 2.2 got stuck for whatever reason. Maybe another ticket should be raised, but I am not decided on that yet. All in all, what happened to that Fasthosts website wasn’t the greatest of experiences, but the service offered by Webfusion is rock solid thus far. While wondering if the service from Fasthosts isn’t as good as it once was, I’ll keep an open mind and wait to see if my impressions change over time.

Extending ASUS Eee PC Battery Life Without Changing From Ubuntu 11.04

25th May 2011

It might just be my experience of the things, but I do tend to take claims about laptop or netbook battery life with a pinch of salt. After all, I have a Toshiba laptop that only lasts an hour or two away from the mains and that runs Windows 7. For a long time, my ASUS Eee PC netbook was looking like that too, but a spot of investigation revealed that there is something that I could do to extend the length of time before the battery runs out of charge. For now, the solution would seem to be installing eee-control and here’s what I needed to do for Ubuntu 11.04, which has gained a reputation for being a bit of a power hog on netbooks if various tests are to be believed.

Because eee-control is not in the standard Ubuntu repositories, you need to add an extra one to install it in the usual way. To make this happen, launch Synaptic, find the Repositories entry on the Settings menu and click on it. If there’s no sign of it, then Software Sources (this was missing on my ASUS) needs to be installed using the following command:

sudo apt-get install software-properties-gtk

Once Software Sources opens up after you enter your password, go to the Other Software tab. The next step is to click on the Add button and enter the following into the APT Line box before clicking on the Add Source button:

ppa:eee-control/eee-control

With that done, all that’s needed is to issue the following command before rebooting the machine on completion of the installation:

sudo apt-get install eee-control

When you are logged back in to your desktop, you’ll notice a new icon in your top panel with the Eee logo; clicking on this reveals a menu with a number of useful options. Among these is the ability to turn off a number of devices such as the camera, WiFi or card reader. After that, there’s the Preferences entry in the Advanced submenu for turning on such things as setting performance to Powersave for battery-powered operation or smart fan control. The notifications issued to you can be controlled too, as can a number of customisable keyboard shortcuts useful for quickly starting a few applications.

So far, I have seen a largely untended machine last around four hours and that’s around double what I have been getting until now. Of course, what really is needed is a test with constant use to see how it gets on. Even if I see lifetimes of around 3 hours, this still will be an improvement. Nevertheless, being of a sceptical nature, I will not scotch the idea of getting a spare battery just yet.

All that was needed was a trip to a local shop

5th March 2011

In the end, I did take the plunge and acquired a Sigma 50-200 mm f4-5.6 DC OS HSM lens to fit my ever faithful Pentax K10D. After surveying a few online retailers, I plumped for Park Cameras, where the total cost, including delivery, came to around £125. This was around £50 less than what others were quoting for the same lens with delivery costs yet to be added. Though the price was good at Park Cameras, I still was wondering how they could manage to do that sort of deal when others don’t. Interestingly, it appears that the original price of the lens was around £300, but that may have been at launch and prices do seem to tumble after that point in the life of many products of an electrical or electronic nature.

Unlike the last lens that I bought from them around two years ago, delivery of this item was a prompt affair, with dispatch coming the day after my order and delivery on the morning after that. All in all, that’s the kind of service that I like to get. On opening the box, I was surprised to find that the lens came with a hood but without a cap. However, that was dislodged slightly from my mind when I remembered that I had neglected to order a UV or skylight filter to screw into the 55 mm front of it. In the event, it was the lack of a lens cap that needed sorting more than the lack of a filter. The result was that I popped into the local branch of Wildings, where I found the requisite lens cap for £3.99 and asked about a filter while I was at it. Much to my satisfaction, there was a UV filter that matched my needs in stock, though it was cheap at £18.99 and made by a company of which I hadn’t heard before, Massa. This was another example of good service when the shop attendant juggled two customers, a gentleman looking at buying a DSLR and myself. While I would not have wanted to disturb another sales interaction, I suppose that my wanting to complete a relatively quick purchase was what got me the attention, while the other customer was left to look over a camera, something that I am sure he would have wanted to do anyway. After all, who wouldn’t?

With the extras acquired, I attached them to the front of the lens and carried out a short test (with the cap removed, of course). When it was pointed at an easy subject, the autofocus worked quickly and quietly. A misty hillside had the lens hunting so much that turning to manual focussing was needed a few times to work around something understandable. Like the 18-125 mm Sigma lens that I already had, the manual focussing ring is generously proportioned with a hyperfocal scale on it, though some might think the action a little loose. In my experience though, it seems no worse than the 18-125 mm, so I can live with it. Both lenses have something else in common in the form of the zoom ring having a stiffer action than the focus ring. However, the zoom lock of the 18-125 mm is replaced by an OS (Optical Stabilisation) switch on the 50-200 mm and the latter has no macro facility either, another feature of the shorter lens, though it remains one that I cannot ever remember using. In summary, first impressions are good, but I plan to continue appraising it. Maybe an outing somewhere tomorrow might offer a good opportunity for using it a little more to get more of a feeling for its performance.

Worth the attention?

21st July 2010

The latest edition of Web Designer has features and tutorials on modern trends and new ways to use fonts and typography in websites. One thing that’s at the heart of the attention is the @font-face CSS rule. It’s what allows you to break away from the limitations of whatever fonts your visitors might have on their PCs to use something available remotely.

In principle, that sounds a great idea, but there are caveats. The first of these is browser support for the @font-face rule, though the modern browsers that I have tried seem to do reasonably OK on this score. These include the latest versions of Firefox, Internet Explorer, Opera and Chrome. The new fonts may render OK, but there’s a short delay in the full loading of a web page. With Firefox, the rendering seems to treat the process like an interleaved image, so you may see fonts from your own PC before the remote ones come into place, a not too ideal situation in my opinion. Also, I have found that this is more noticeable on the Linux variant of the browser than on its Windows counterpart. Loading a page that is predominantly text is another scenario where you’ll see the behaviour more clearly; having a sizeable image file loading seems to make things less noticeable. Otherwise, you may see a short delay to the loading of a web page because the fonts have to be downloaded first. Opera is a particular offender here, with IE8 loading things quite quickly and Chrome not being too bad either.

In the main, I have been using Google’s Fonts Directory but, in the interests of supposedly getting a better response, I tried using font files stored on a test web server, only to discover that there was more of a lag with the fonts on the web server. While I do not know what Google has done with their set-up, using their font delivery service appears to deliver better performance in my testing, so it’ll be my choice for now. There’s Typekit too, but I’ll be hanging onto my money in the light of my recent experiences.

After my brush with remote font loading, I am inclined to wonder if the current hype about fonts applied using the @font-face directive is deserved until browsers get better and faster at loading them. As things stand, they may be better than before but the jury’s still out for me with Firefox’s rendering being a particular irritant. Of course, things can get better…

When buttons stop working…

16th November 2009

One of the things that stopped working as it should after my recent Ubuntu 9.10 upgrade was the Eclipse PDT installation that I had in place. Editing files went a bit haywire and creating projects had me pushing buttons with nothing happening. Whether this was a Java or GNOME issue, I don’t know but I found it happening too on openSUSE 11.2 (there should be more on that distro in a later entry). That was enough to get me looking again at Netbeans.

In both openSUSE (NB version 6.5) and Ubuntu (NB version 6.7.1), I plucked the default offering of Netbeans from the respective software repositories and added the PHP plugin in both cases. Unlike when I last gave the platform a go, things seemed to go smoothly and it looks to have replaced Eclipse for PHP development duties. Project scanning may take a little while, but it’s far from annoying; my earlier dalliance with using Netbeans as a PHP editor was stymied by performance that was so sluggish as to make the thing a pain to use. Up to now, Netbeans’ footprint when it comes to its use of PC power never was light, so I am wondering if dual-core and quad-core CPUs help along with a copious supply of RAM. Only time will tell if these initial positive impressions stay the course and I’ll be keeping an open mind for now.

A performance improvement?

4th March 2009

I have just upgraded to VirtualBox 2.1.4 and noticed something surprising: a performance improvement. I didn’t notice this with a Windows 2000 guest, but a Windows XP one now runs freely where it felt like it was immersed in treacle before. Since I had some photos to process for the hillwalking blog, that was a welcome boost and will be well used if it continues. What’s more, a Windows 7 VM that I have doesn’t run so sluggishly now either. These observations do point towards 2.1.2 being a sluggard on my Ubuntu box, though hogs like Norton 360 didn’t help matters either. Whatever the truth was, things now feel much better and any enhancement to system speed has to be a good thing.
