Technology Tales

Adventures & experiences in contemporary technology

Thoughts on eBooks

20th August 2016

In recent months, I have been doing a clear-out of paper books in case the recent European Union referendum result in the U.K. affects my ability to stay here, since I am an Irish citizen. In my two decades in the country, I have not felt as much uncertainty and lack of belonging as I do now. It is as if life wants to become difficult for a while.

What made the clearance easier was that there was a way of making sure that the books were re-used, and eBooks replaced anything that I wanted to keep. However, what I had not realised is that demand for eBooks has flatlined, something that only became apparent in a recent PC Pro article penned by Stuart Turton. He had all sorts of suggestions about how to liven up the medium, but I have some of my own.

Niall Benvie also broached the subject from the point of view of photographic display in an article for Outdoor Photography, because most people are looking at photos on their smartphones and that often reduces the quality of what they see. Having a partiality to photo books, they remain the one class of books that I am more likely to have in paper form, even if I have an Apple iPad Pro (the original 12.9 inch version) and am using it to write these very words. There also is the six-year-old 24 inch Iiyama screen that I use with my home PC.

The two apps with which I have had experience are Google Play Books and Amazon Kindle, both of which I have used on both iOS and Android while I use the Windows app for the latter too. Both apps are simple and work effectively until you end up with something of a collection. Then, shortcomings become apparent.

Search functionality is something that can be hidden away in menus, which is why I missed it for so long. For example, Amazon's Kindle app puts the search box in a prominent place on iOS but hides the same function in menus in its Android and Windows incarnations. Google Play Books consistently does the latter from what I have seen, and it would do no harm to have a search box on the library screen, since menus and touchscreen devices do not mix so well. The ability to search within a book is similarly afflicted, so this also needs moving to a more prominent place; it is really handy for guidebooks or other more technical textbooks.

The ability to organise a collection appears to be another missed opportunity. The closest that I have seen so far are the Cloud and Device screens in Amazon's Kindle app, but even this is not ideal. Having the ability to select some books as favourites would help, as would hiding others from the library screen. Having the ability to re-sell unwanted eBooks would be another worthwhile addition, because you can do just that with paper books.

When I started on this piece, I had reached the conclusion that eBooks too closely mimicked libraries of paper books. Now, I am not so sure. It appears to me that the format is failing to take full advantage of its digital form, and that might have been what Turton was trying to evoke, but the examples that he used did not appeal to me. Also, we could do with more organisation functionality in apps, and the ability to resell could be another opportunity. Instead, we appear to be getting digital libraries, and there are times when a personal collection is best.

All the while, paper books are being packaged in ever more attractive ways, and there always will be some that look better in paper form than in digital formats; that still applies to those with glossy, appealing photos. Paper books almost feel like gift items these days, and you cannot fault the ability to browse them by flicking through the pages with your hands.

Batch conversion of DNG files to other file types with the Linux command line

8th June 2016

At the time of writing, Google Drive is unable to accept DNG files, the Adobe file type for RAW images from digital cameras. The uploads themselves work fine but the additional processing at the end that I believe is needed for Google Photos appears to be failing. Because of this, I thought of other possibilities like uploading them to Dropbox or enclosing them in ZIP archives instead; of these, it is the first that I have been doing and with nothing but success so far. Another idea is to convert the files into an image format that Google Drive can handle and TIFF came to mind because it keeps all the detail from the original image. In contrast, JPEG files lose some information because of the nature of the compression.

Handily, a one line command does the conversion for all files in a directory once you have all the required software installed:

find -type f | grep -i "DNG" | parallel mogrify -format tiff {}

The find and grep commands are standard, with the first getting you a list of all the files in the current directory and below, and sending (piping) it to the grep command so that the list only retains the names of DNG files. The last part uses two commands for which I found installation was needed on my Linux Mint machine. The parallel package is the first of these; it distributes the heavy workload across all the cores in your processor, and this command will add it to your system:

sudo apt-get install parallel

The mogrify command is part of the ImageMagick suite along with others like convert and this is how you add that to your system:

sudo apt-get install imagemagick

In the command at the top, the parallel command works through all the files in the list provided to it and feeds them to mogrify for conversion. Without the use of parallel, the basic command is like this:

mogrify -format tiff *.DNG

In both cases, the -format switch specifies the output file type, with tiff triggering the creation of TIFF files. The *.DNG portion captures all DNG files in a directory, while {} does this in the main command at the top of this post. If you wanted JPEG files, you would replace tiff with jpg. Should you ever need it, a full list of the file types that are supported is produced using the identify command (also part of ImageMagick) as follows:

identify -list format
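Should filenames with spaces or other odd characters ever creep in, a variation that passes null-delimited names from find to parallel is more robust than the grep pipeline; this is a sketch that assumes GNU find and GNU parallel:

# -print0 and -0 keep awkward filenames intact; -iname replaces grep -i
find . -type f -iname "*.dng" -print0 | parallel -0 mogrify -format tiff {}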

Turning off Advanced Content Filtering in CKEditor

3rd February 2015

On one of my websites, I use Textpattern with CKEditor for editing articles on there. This was working well until I upgraded CKEditor to a version numbered 4.1 or newer, because it started to change the HTML in my articles when I did not want it to do so, especially when it broke the appearance of things. A search on Google revealed an unhelpful forum exchange that produced no solution to the issue, so I decided to share one on here once I found it.

What I needed to do was switch off what is known as Advanced Content Filtering. It can be tuned, but I felt that would take too much time, so I implemented something like what you see below in the config.js file within the ckeditor folder:

CKEDITOR.editorConfig = function( config ) {
    config.allowedContent = true;
};

All settings go within the outer function wrapper, and setting the config.allowedContent property to true in there sorted my problem as I wanted. Now, any HTML remains untouched and I am happy with the outcome. It might be better for features like Advanced Content Filtering to be switched off by default and turned on by those with the time and need for it, much like one of the principles adopted by the WordPress project. Still, having any off switch is better than none at all.
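For anyone with the time to tune the filtering instead of disabling it, my understanding is that CKEditor's config.extraAllowedContent setting widens what the filter permits while leaving it switched on. The elements and attributes below are only examples of the rule syntax rather than a recommended set:

CKEDITOR.editorConfig = function( config ) {
    // Keep Advanced Content Filtering on but widen what it allows.
    // Rules are semicolon-separated; attributes go in square brackets.
    // The elements named here are illustrative only.
    config.extraAllowedContent = 'img[src,alt,width,height];cite;small';
};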

Upgrading a 2012 Google Nexus 7 to Android 5.0

19th November 2014

Today, I was lured into upgrading my 2012 Google (ASUS) Nexus 7 to the final version of Android 5.0 (also known as Lollipop) by an icon in the device's top panel. Initially, it felt as if it was working OK, but a certain sluggishness could not be overlooked, and there have been complaints about this, with some questioning the sense of what Google has done. However, there would have been comments about grandfathering the device if they had not let us have the latest release of Android, so there was no victory either way. We humans are fickle creatures, and there is an example of exactly that in a well observed double-ended short story by the Irish writer Maura Laverty.

My impressions of how the upgrade had lumbered the tablet had me wondering about replacing the thing with either an Apple iPad Mini 2 or a Google (HTC) Nexus 9, but a much less expensive option came to mind: doing a full factory reset of the device using its recovery mode. That may sound drastic, but much of what I had on there was in the cloud anyway, so there was nothing to lose. These are the instructions from Google themselves, and I will leave you to use them at your own risk:

  1. If your tablet is on, turn it off.
  2. Press and hold the Volume Down button, then press and hold the Power button at the same time until the tablet turns on. You’ll see the word “Start” with an arrow around it.
  3. Press the Volume Down button twice to highlight “Recovery mode”.
  4. Press the Power button to start Recovery mode. You’ll see an image of an Android robot with a red exclamation mark and the words “No command.”
  5. While holding down the Power button, press the Volume Up button.
  6. Use the volume buttons to scroll to “wipe data/factory reset,” then press the Power button to select it.
  7. Scroll down to “Yes -- erase all user data,” then press the Power button to select it.

Note: If your tablet becomes unresponsive at any point during these steps, you can restart it by holding down the Power button for several seconds.

Once that was completed and the tablet restarted, the set-up routine began and took around an hour to reinstate the various apps that had been lost by the reset. Much of that was down to the time taken for re-installation rather than that taken by the actual downloads themselves over a wired broadband connection. The wait was worth it because the Nexus 7 feels more responsive again. While there are times when little lags are noticeable, they are nothing next to the slowdown that I had witnessed before the reset. It might have been a better option than attempting to return to Android 4.4.4 using a factory image, which was another option that I was considering. So long as there is no deterioration in speed, the effort expended on the reset will have been worthwhile.

Turning off the full height editor option in WordPress 4.0

10th September 2014

Though I keep a little eye on WordPress development, it is nowhere near as rigorous as when I submitted a patch that got me a mention on the contributor list of a main WordPress release. That may explain how the full-height editor setting, which is turned on by default, passed me by without my taking much notice of it.

WordPress has become so mature now that I almost do not expect major revisions like the overhaul received by the administration back-end in 2008. That second interface got so much right that it still is with us, even though there were concerns in my mind at the time as to how usable it would be. Sometimes, those initial suspicions can come to nothing.

However, WordPress 4.0 brought a major change to the editor, and I am unfortunately not sure that it is successful. A full-height editor sounds a good idea in principle, but I found some rough edges in its present implementation that leave me wondering if any UX person got to review it. The first reason is that scrolling becomes odd, with the editor's toolbar becoming fixed when you scroll down far enough on an editor screen. The sidebar scrolling then is out of sync with the editor box, which creates a very odd sensation. Having keyboard shortcuts like CTRL+HOME and CTRL+END not working as they should only convinced me that the new arrangement was not for me and that I wanted to turn it off.

A search with Google turned up nothing of note, so I took to the WordPress.org forum to see if I could get any joy. That revealed that I should have thought of looking in the screen options dropdown box for an option called “Expand the editor to match the window height” so that I could clear its tickbox. Because of the appearance of a Visual Editor control on there, I looked on the user profile screen and found nothing, so the logic of how things are set up is sub-optimal. Maybe the latter option needs to be a screen option now too. Thankfully, the window height editor option only needs setting once for both posts and pages, so you are covered for all eventualities at once.

With a distraction-free editing option already available, I am not sure why someone went for the full-height editor too. If WordPress wants to stick with it, it does need more refinement so that it behaves more conventionally. Personally, I would not build a website with that kind of ill-synchronised scrolling effect, so it is something that needs work, as does the location of the Visual Editor setting. It could be that both settings need to be at the user level, rather than one being above that level while the other is at it. Until I got the actual solution, I was faced with using distraction-free mode all the time and also installed the WP Editor plugin. That remains in place due to its code highlighting, even if dropping into code view always triggers the need to create a new revision. Despite that, all is better in the end.
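Should the screen option ever prove insufficient, my understanding is that WordPress 4.0 passes the behaviour through a wp_editor_expand filter, so a snippet like the following in a theme's functions.php or a small plugin ought to turn the full-height editor off for everyone; treat this as an untested sketch rather than a definitive fix:

// Assumes the 'wp_editor_expand' filter introduced around WordPress 4.0;
// returning false should stop the editor expanding to the window height.
add_filter( 'wp_editor_expand', '__return_false' );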

Turning off seccomp sandbox in vsftpd

21st September 2013

Within the last week, I set up a virtual web server using Arch Linux to satisfy my own curiosity, since the DIY nature of Arch means that you can build up exactly what you need without having any real constraints put upon you. What didn't surprise me was that it took more work than the virtual server that I created using Ubuntu Server, but I didn't expect ProFTPD to be missing from the main repositories. The package can be found in the AUR, but I didn't fancy the prospect of dragging more work on myself, so I went with vsftpd (Very Secure FTP Daemon) instead. In contrast to ProFTPD, this is available in the standard repositories and there is a guide to its use in the Arch user documentation.

However, while vsftpd worked well just after installation, connections to the virtual FTP server soon failed, with FileZilla issuing uninformative messages. In fact, it was the standard command line FTP client on my Ubuntu machine that was more revealing. It issued the following message, which led me to the cause after my engaging the services of Google:

500 OOPS: priv_sock_get_cmd

With version 3.0 of vsftpd, a new feature was introduced, and it appears that this has caused problems for a few people. That feature is seccomp sandboxing, and it can be turned off by adding the following line to /etc/vsftpd.conf:

seccomp_sandbox=NO

That solved my problem, and version 3.0.2 of vsftpd should address the issue with seccomp sandboxing anyway. In case that fix isn't as robust as it should be, because seccomp isn't supported by the Linux kernel that you are using, turning off the new feature still needs to remain an option.
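For completeness, this is roughly how I would apply the change and restart the daemon on Arch, which uses systemd; the final check is an optional extra that assumes your kernel exposes its build configuration at /proc/config.gz:

# Append the setting and restart vsftpd for it to take effect:
echo "seccomp_sandbox=NO" | sudo tee -a /etc/vsftpd.conf
sudo systemctl restart vsftpd.service

# Optionally, check whether the running kernel was built with seccomp support:
zgrep CONFIG_SECCOMP /proc/config.gz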

Piggybacking an Android Wi-Fi device off your Windows PC’s internet connection

16th March 2013

One of the disadvantages of my Google/ASUS Nexus 7 is that it needs a Wi-Fi connection to be of use. Most of the time this is not a problem, since I also have a Huawei mobile Wi-Fi hub from T-Mobile and this seems to work just about anywhere in the U.K. Away from the U.K., though, it won't work because roaming is not switched on for it, and that may be no bad thing given the fees that roaming could introduce. My HTC Desire S could deputise, but I need to watch costs with that too.

There’s also the factor of download caps and those apply both to the Huawei and to the HTC. Recently, I added Anquet‘s Outdoor Map Navigator (OMN) to my Nexus 7 through the Google Play store for a fee of £7 and that allows access to any walking maps that I have bought from Anquet. However, those are large downloads so the caps start to come into play. Frugality would help but I began to look at other possibilities that make use of a laptop’s Wi-Fi functionality.

Looking on the web, I found two options that work on Windows 7 (8 should be OK too): Connectify Hotspot and Virtual Router Manager. The first of these is commercial software, but there is a Lite edition for those wanting to try it out; it does not appear to be a time-limited demo, since it looked as if the only things missing from it were features that you would get by paying for the Pro variant. The second option is an open source one and is free of charge, apart from an invitation to donate to the project.

Though online tutorials show the usage of either of these to be straightforward, my experiences were not all that positive at the outset. In fact, there was something extra that I needed to do, and that is why this post has come to exist at all. That applied even after the restart that Connectify Hotspot needed as part of its installation; it runs as a system service, so that's why the restart was needed. In fact, it was Virtual Router Manager that told me what the issue was, and it needed no reboot. Neither did it disconnect the laptop from the network like the Connectify offering did on me, which was the cause of its ejection from that system; limitations in favour of its paid edition aside, it may have the snazzier interface, but I'll take effective simplicity any day.

Using Virtual Router Manager turns out to be simple enough. It needs a network name (also known as an SSID), a password to restrict who accesses the network and the internet connection to be shared. In my case, that was Local Area Connection on the drop-down list. With all the required information entered, I was ready to start the router using the Start Network Router button. The text on this changes to Stop Network Router when the hub is operational, or at least it should have done for me on the first time that I ran it. What I got instead was the following message:

The group or resource is not in the correct state to perform the requested operation.

The above may not say all that much, but it becomes more than ample information if you enter it into the likes of Google. Behind the scenes, Virtual Router Manager uses native Windows functionality to create a Wi-Fi hub from a PC, and it appears to be the Microsoft Virtual Wi-Fi Miniport Adapter from what I have seen. When I tried setting up an ad hoc Wi-Fi network from the laptop to the Nexus 7 using Windows' own network set-up capability via its Control Panel, it didn't do what I needed, so there is something that third party software can add. The interesting thing about the solution to my Virtual Router Manager problem was that it needed me to delve into the innards of Windows a little.

Firstly, there's running Command Prompt (All Programs > Accessories) from the Start Menu with Administrator privileges. It helps here if the account with which you log into Windows is in the Administrators group, since all you have to do then is right-click on the Start Menu entry and choose Run as administrator from the pop-up context menu. With a command line window now open, you then need to issue the following command:

netsh wlan set hostednetwork mode=allow ssid=[network name] key=[password] keyUsage=persistent

When that had done its thing, Virtual Router Manager worked without a hitch, though it did turn itself off after a while, and that may be no bad thing from the security standpoint. On the Android side, it was a matter of going into Settings > Wi-Fi and choosing the new network that had been created on the laptop. This sort of thing may apply to other types of tablet (dare I mention iPads?), so you could connect anything to the hub without needing to do any more on the Windows side.
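As an aside, the hosted network that the netsh command defines can also be driven directly without any third party software; the following commands should start, stop and inspect it, though I have not leaned on them heavily myself, so treat this as a sketch:

rem Start the hosted network defined with "netsh wlan set hostednetwork":
netsh wlan start hostednetwork
rem Stop it again or check on its status as needed:
netsh wlan stop hostednetwork
netsh wlan show hostednetwork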

For those wanting to know what’s going on behind the scenes on Windows, there’s a useful tutorial on Instructables that shows what third party software is saving you from having to do. Even if I never go down the more DIY route, I probably have saved myself having to buy a mobile Wi-Fi hub for any trips to Éire. For now, the Irish 3G dongle that I already have should be enough.

Changing to web fonts

12th February 2012

While you can add Windows fonts to Linux installations, I have found that their display can be flaky, to say the least. Linux Mint and Ubuntu render them as sharply as I'd like, but I have struggled to get the same sort of results from Arch Linux, while I am not so sure about Fedora or openSUSE either.

That has caused me to look at web fonts for my websites, with Google Web Fonts doing what I need so far through Open Sans and Arimo. There have been others with which I have dallied, such as Droid Sans, but these are the ones on which I have settled for now. Both are in use on this website, and I added calls for them to the web page headers using the following code (lines are wrapping due to space constraints):

<link href="http://fonts.googleapis.com/css?family=Open+Sans:300italic,400italic,600italic,700italic,400,300,600,700" rel="stylesheet" type="text/css">
<link href='http://fonts.googleapis.com/css?family=Arimo:400,400italic,700,700italic' rel='stylesheet' type='text/css'>

With those lines in place, it then is a matter of updating font-family and font declarations in CSS style sheets with “Open Sans” or “Arimo” as needed, while keeping alternatives defined in case the Google font service goes down for whatever reason. A look at a development release of the WordPress Twenty Twelve theme caused me to come across Open Sans, and I like it for its clean lines; Arimo, which I found by looking through the growing Google Web Fonts catalogue, is not far behind. Looking through that catalogue now causes a round of indecision for me since there is so much choice. For that reason, I think it better to be open to the recommendations of others.
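For illustration, the sort of CSS declaration that I mean looks like this, with the fallback fonts after each web font being examples of my own choosing rather than anything prescribed:

body {
    /* fallback stacks are illustrative; pick ones matching your design */
    font-family: "Open Sans", Verdana, Arial, sans-serif;
}

h1, h2, h3 {
    font-family: "Arimo", Helvetica, Arial, sans-serif;
}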

A new phone

4th February 2012

After a few years with a straightforward Nokia 1661 and a PAYG Blackberry 8520, I decided to go and upgrade from the former to an HTC Wildfire S. So far, the new phone has been good to me with only a few drawbacks. Other than working out how to insert a SIM card, the phone has been easy to use with just a few nuances to learn, such as finger pinch zooming and dealing with an onscreen keyboard as opposed to a real one.

The touchscreen interface and the 3G capability are the big changes from my Blackberry, and both make web browsing so much faster too, especially with the larger screen. For instance, checking RSS feeds with Google Reader and emails is so much faster on the move, with the screen being very responsive most of the time that I am using it; it does get dirty like others, so either a screen cover or frequent cleaning with a camera lens cloth would be no bad thing. The onscreen keyboard remains something to which I need to grow accustomed, and it probably is the one area where the Blackberry continues to hold sway, though turning the phone sideways and tapping it on the side to change orientation helps a lot. That makes the keys larger and, while my fingers are not the thickest, there are fewer cases of hitting the wrong key. Even then, you need to get used to switching between alphabetic and numeric keyboards, and that applies also when you need punctuation marks like commas and so on.


Otherwise, the user interface is bright and pleasing to the eye, with the typical presentation of both a clock and the current weather on there. Handily, the screen is locked easily too, with a press of the button at the top right of the phone. That puts a stop to inadvertent phone calls, emailing, web browsing and other things, so it is to be commended. To unlock the screen, all that's needed is to swipe the lock bar to the bottom. Alerts are viewed in a similar way: holding down your finger on the top bar presents an extension that can be pulled all the way down to see what's there.

With an icon for the Android Market on the main screen, I got to adding a few apps, and you can set these to update automatically too, though you need to watch your phone contract's data allowance. The one for WordPress works better than it does on my Blackberry, but it seems that retweeting with UberSocial is much less good on the Android platform. For one thing, feeds for all accounts are presented on the one screen, and swiping left to right is needed for replying, retweeting and other operations; that's not working out so smoothly for me yet, so maybe I'll try an alternative. There are others that I have downloaded too, including one from CrossCountry Trains that seems to be a nice offering, even if it failed to find trains between Macclesfield and Edale of a Sunday morning. For those omissions, I have an alternative in place, and I also have the LinkedIn app, which seems to work well too. Usefully, it is possible to move these to the phone's microSD card to avoid filling up the limited space that's on offer. However, that isn't to say that I will be going mad on these things.

Of course, any phone should be good at making and taking phone calls, and the Wildfire seems to be doing well on this score too. Firstly, contacts were read from the SIM, but they can be transferred from an old phone using a Bluetooth connection too. Sound is good and loud, though you need to be on a call to adjust the speaker volume with the rocker button on the side of the phone; otherwise, that just changes the volume of the ring tone. Without any adjustments, the phone seems to vibrate and ring at the same time, though that may be something that I get to changing in time. The pings emitted when new text messages, emails or tweets arrive fall into the same category.

If there's any downside to this phone, it has to be battery life. Unlike others that I have had, this is a phone that needs charging every night at the very least. Maybe that's the price of having a nice bright responsive screen, but it would be no harm if it lasted longer. Others have found the same thing and reported as much on the web, though some are having worse experiences than others. There are some hints regarding how to conserve battery life, but they include such things as switching off 3G or data capabilities, and neither appeals to me; after all, I might as well use my old Nokia if this is all that can be offered. Instead, I am wondering if acquiring a spare battery might be no bad idea, because that's what I do for my Pentax DSLR (a note in passing: I haven't got to using the phone's own camera, but recent wintry weather had me tempted by the idea, especially with the likes of Twitpic and YFrog out there). Taking things further, others have mentioned getting a larger capacity replacement, but that sounds more risky.

All in all, first impressions of the HTC Wildfire S are good ones. Over time, I should find out more about the ins and outs of the gadget. After all, it is a mini-computer with its own operating system and other software. Since I continue to learn more and more about PCs every day, the same should be the case here too.

Sorting out MySQL on Arch Linux

5th November 2011

Seeing Arch Linux running so solidly in a VirtualBox virtual machine has me contemplating whether I should have it installed on a real PC. Saying that, recent announcements regarding the implementation of GNOME 3 in Linux Mint have caught my interest, even if the idea of using a rolling distribution as my main home operating system still has a lot of appeal. Having an upheaval come my way every six months when a new version of Linux Mint is released is the main cause of that.

While remaining undecided, I continue to evaluate the idea of Arch Linux acting as my main OS for day-to-day home computing. Towards that end, I have set up a working web server instance on there using the usual combination of Apache, Perl, PHP and MySQL. Of these, it was MySQL that went the least smoothly of all because the daemon wouldn’t start for me.

It was then that I started to turn to Google for inspiration, and a range of actions resulted that combined to give the result that I wanted. One problem was a lack of disk space caused by months of software upgrades. Since package managers in other Linux distros allow you to clear disk space of obsolete installation files, I decided to see if it was possible to do the same with pacman, the Arch Linux command line package manager. The following command, executed as root, cleared about 2 GB of cruft for me:

pacman -Sc

The S in the switch tells pacman to perform package database synchronization while the c instructs it to clear its cache of obsolete packages. In fact, using the following command as root every time an update is performed both updates software and removes redundant or outmoded packages:

pacman -Syuc

So I don’t forget the needful housekeeping, this will be what I use in future with the y being the switch for a refresh and the u triggering a system upgrade. It’s nice to have everything happen together without too much effort.

To do the required debugging that led me to the above along with other things, I issued the following command:

mysqld_safe --datadir=/var/lib/mysql/ &

This starts up the MySQL daemon in safe mode if all is working properly, which it wasn't in my case. Nevertheless, it creates a useful log file called myhost.err in /var/lib/mysql/. This gave me the messages that allowed the debugging of what was happening. It led me to installing net-tools and inetutils using pacman; it was the latter of these that put hostname on my system and got the MySQL server startup a little further along. Other actions included unlocking the ibdata1 data file and removing the ib_logfile0 and ib_logfile1 files so as to gain something of a clean sheet. The kill command was used to shut down any lingering mysqld sessions too. To ensure that the ibdata1 file was unlocked, I executed the following commands:

mv ibdata1 ibdata1.bad
cp -a ibdata1.bad ibdata1

These renamed the original and then created a new duplicate of it, with the -a switch on the cp command forcing copying with greater integrity than normal. Along with the various file operations, I also created a link in /etc to my.cnf, the MySQL configuration file on Linux systems, using the following command executed as root:

ln -s /etc/mysql/my.cnf /etc/my.cnf

While I am unsure if that link made a real difference, uncommenting the lines in the same file that pertain to InnoDB tables certainly helped. What directed me to these were complaints from mysqld_safe in the myhost.err log file. All I did was uncomment the lines beginning with “innodb”; these were 116-118, 121-122 and 124-127 in my configuration file, but it may be different in yours.
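For reference, the uncommented block in my.cnf looked something like the following; the values are the small-configuration defaults shipped with MySQL at the time, so yours may well differ:

# indicative values only; the sample file in your MySQL version may differ
innodb_data_home_dir = /var/lib/mysql/
innodb_data_file_path = ibdata1:10M:autoextend
innodb_log_group_home_dir = /var/lib/mysql/
innodb_buffer_pool_size = 16M
innodb_log_file_size = 5M
innodb_log_buffer_size = 8M
innodb_flush_log_at_trx_commit = 1
innodb_lock_wait_timeout = 50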

After all the above, the MySQL daemon ran happily and, more importantly, started when I rebooted the virtual machine. Thinking about it now, I believe it was the lack of disk space, the locking of a data file and the lack of InnoDB support that were stopping the MySQL service from running. Running commands like mysqld start weren't yielding useful messages, so a lot of digging was needed to get the result that I needed. In fact, that's one of the reasons why I am sharing my experiences here.

In the end, creating databases and loading them with data was all that was needed for me to start seeing functioning websites on my (virtual) Arch Linux system. It turned out to be another step on the way to making it workable as a potential replacement for the Linux distributions that I use most often (Linux Mint, Fedora and Ubuntu).
