Technology Tales

Adventures & experiences in contemporary technology

Installing Citrix Receiver 13.0 in Ubuntu GNOME 13.10 64-bit

28th November 2013

Installing the latest version of Citrix Receiver (13.0 at the time of writing) on 64-bit Ubuntu should be as simple as downloading the required DEB package and double-clicking on the file so that Ubuntu Software Centre can work its magic. Unfortunately, the 64-bit DEB file is faulty, so the Ubuntu community how-to guide for Citrix still is needed. In fact, any user of Linux Mint or another distro that uses Ubuntu as its base would do well to have a look at that Ubuntu link.

For the sake of completeness, I still am going to let you in on the process that worked for me. Once the DEB file has been downloaded, the first task is to create a temporary folder where its contents can be extracted:

mkdir ica_temp

With that in place, it then is time to do the extraction, which needs two commands: the first extracts everything apart from the control file, while the second extracts the control file itself.

sudo dpkg-deb -x icaclient- ica_temp
sudo dpkg-deb --control icaclient- ica_temp/DEBIAN

It is the control file that has been the cause of all the bother because it refers to unavailable dependencies that it really doesn’t need anyway. To open the file for editing, issue the following command:

sudo gedit ica_temp/DEBIAN/control

Then change line 7 (it should begin with Depends:) to: Depends: libc6-i386 (>= 2.7-1), lib32z1, nspluginwrapper. There are other software packages in there that Ubuntu no longer supports and they are not needed anyway. With the edit made and the file saved, the next step is to build a new DEB package with the corrected control file:

dpkg -b ica_temp icaclient-modified.deb
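Incidentally, the edit can be scripted instead of being done in an editor. The sed one-liner below is only a sketch that assumes the Depends line still sits on line 7 of the control file, and it would be run before the rebuild above:

sudo sed -i '7s/^Depends:.*/Depends: libc6-i386 (>= 2.7-1), lib32z1, nspluginwrapper/' ica_temp/DEBIAN/control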

Once you have the package, the next step is to install it using the following command:

sudo dpkg -i icaclient-modified.deb

If it fails, then you have missing dependencies and the following command should sort these out before a re-run of the above command:

sudo apt-get install libmotif4:i386 nspluginwrapper lib32z1 libc6-i386

With Citrix Receiver installed, there is one more thing that is needed before you can use it freely: putting Thawte security certificate files into /opt/Citrix/ICAClient/keystore/cacerts. What I had not realised until recently was that many of these already are in /usr/share/ca-certificates/mozilla and linking to them with the following command makes them available to Citrix Receiver:

sudo ln -s /usr/share/ca-certificates/mozilla/* /opt/Citrix/ICAClient/keystore/cacerts/

Another approach is to download the Thawte certificates and extract the archive to /tmp/. From there they can be copied to /opt/Citrix/ICAClient/keystore/cacerts and I copied the Thawte Personal Premium certificate as follows:

sudo cp "/tmp/Thawte Root Certificates/Thawte Personal Premium CA/Thawte Personal Premium CA.cer" /opt/Citrix/ICAClient/keystore/cacerts/

Until I found out about what was in the Mozilla folder, I simply picked out the certificate mentioned in the Citrix error message and copied it over as above. Of course, all of this may seem like a lot of work to non-tinkerers, so I have added a repaired 64-bit DEB package that incorporates all of the above and should need no further intervention aside from installing it using GDebi, Ubuntu’s Software Centre, dpkg or anything else that does what’s needed.

A reappraisal of Windows 8 and 8.1 licensing

15th November 2013

With the release of Windows 8 around this time last year, I thought that the full retail version that some of us got for fresh installations on PCs, real or virtual, had become a thing of the past. In fact, it did seem that every self-respecting technology news website and magazine was saying just that. The release that you would buy from Microsoft or from mainstream computer stores was labelled as an upgrade. That made it look as if you needed the OEM or System Builder edition for those PCs that needed a new Windows installation and that the licence that you bought was then attached to the machine on which it got installed.

As is usual with Microsoft, the situation is less clear cut than that. For instance, there was some back-pedalling to allow OEM editions of Windows to be licensed for personal use on real or virtual PCs. With Windows 7 and its predecessors, it even was possible to install afresh on a PC without Windows by first installing an inactivated copy on there and then upgrading that as if it were a previous version of Windows. Of course, an actual licence for the previous version of Windows was needed for full compliance, if not for the actual installation. At times, Microsoft muddies waters so as to keep its support costs down.

Even with Microsoft’s track record in mind, it still did surprise me when I noticed that Amazon was selling what appeared to be full versions of both Windows 8.1 and Windows 8.1 Pro. Having set up a 64-bit VirtualBox virtual machine for Windows 8.1, I discovered that the same applies to software purchased from the Microsoft website. However, unlike the DVD versions, you do need an active Windows installation if you fancy a same-day installation of the downloaded software. For those without Windows on a machine, this can be as simple as downloading either the 32-bit or the 64-bit 90-day evaluation edition of Windows 8.1 Enterprise and using that as a springboard for the next steps. This not only can be an actual in-situ installation, but there are options to create an ISO or USB image of the installation disk for later installation.

In my case, I created a 64-bit ISO image and used that to reboot the virtual machine that had Windows 8.1 Enterprise on there before continuing with the installation. By all appearances, there seemed to be little need for a pre-existing Windows instance for it to work so it looks as if upgrades have fallen by the wayside and only full editions of Windows 8.1 are available now. The OEM version saves money so long as you are happy to stick with just one machine and most users probably will do that. As for the portability of the full retail version, that is not something that I have tested and I am unsure that I will go beyond what I have done already.

My main machine has seen a change of motherboard, CPU and memory, so it could have de-activated a pre-existing Windows licence. However, I run Linux as my main operating system and, apart possibly from one surmountable hiccup, this proves surprisingly resilient in the face of such major system changes. For running Windows, I turn to virtual machines and there were no messages about licence activation during the changeover either. Microsoft is anything but confiding when it comes to declaring what hardware changes inactivate a licence. Changing a virtual machine from VirtualBox to VMware or vice versa definitely does it, so I tend to avoid doing that. One item that is fundamental to either a virtual or a real PC is the mainboard, and I have seen suggestions that this is the critical component for Windows licence activation; it would make sense if that were the case.

However, this rule is not hard and fast either, since there appears to be room for manoeuvre should your PC break. It might be worth calling Microsoft after a motherboard replacement to see if they can help you, and I have seen reports that they do. All in all, Microsoft often makes what appear to be simple rules only to override them when faced with what happens in the real world. Is that why they can be unclear about some matters at times? Do they still hanker after how they want things to be, even when they are impossible to keep like that?

A way to get Rigo working again in Sabayon

23rd October 2013

After having Sabayon running on a PC until it came to pieces following an attempted version upgrade, I went away from the Linux distro for a while and Linux Mint now runs on the aforementioned machine. It was only a certain curiosity that got me installing it into a virtual machine on VirtualBox, to see whether my command line method of keeping the system up to date was the cause of the breakage or whether rolling and partially rolling distros have a certain fragility that is not seen in their discrete-release counterparts.

Recently, that ran into a hitch: the Sabayon package manager Rigo failed to start up for me. After waiting to see if it sorted itself on its own, I looked into returning to those command line ways, and that line of enquiry led me to a method of restoring Rigo’s functionality from Sabayon’s wiki page on the underlying Entropy. The first step was to issue a command to become root:

su

That needed the appropriate password and the next command issued updated Sabayon’s repositories:

equo update

Once that had done its thing, it was time to install new versions of Entropy and Rigo:

equo install entropy rigo

With that complete, it was time to exit the root session with the exit command. Then, it was time to try running Rigo and it worked as expected. Any thoughts of adding in the superseded Sulfur (Rigo’s predecessor) were banished on seeing that success.

A look at Ubuntu GNOME 13.10

12th October 2013

With its final release near at hand, I decided to have a look at the beta release of Ubuntu GNOME 13.10 to get a sense of what might be coming. A misstep along the way had me inadvertently download and install the 64-bit edition of 13.04 into a VirtualBox virtual machine. The intention to update that to its soon-to-be-released successor was scuppered by instability, so I never did get to try out an in-situ upgrade to 13.10. What I had in mind was to issue the following command:

gksu update-manager -d

However, I found another command when considering how Ubuntu Server might be upgraded without Update Manager, the GUI application. To update to a development version, the following command is what you need:

sudo do-release-upgrade -d

To upgrade to a final release of a new version of Ubuntu, drop the -d switch from the above to use the following:

sudo do-release-upgrade

There is one further option that isn’t recommended for moving between Ubuntu versions but I use it to get updates such as new kernel subversions that are released:

sudo apt-get dist-upgrade

Rather than trying out the above, I downloaded the latest ISO image for the beta release of Ubuntu GNOME 13.10 and installed that onto a VM instead. Though it is the 32-bit version of the distro that is installed on my main home PC, it has been the 64-bit version that I have been trying. So far, that seems to be behaving itself, even if it feels a little sluggish, but that could be down to the four-year-old PC that hosts the virtual machine. For a while, I have been playing with the possibility of an upgrade involving an Intel Core i5 4670K CPU and 16 GB of RAM (useful for running multiple virtual machines at a time), along with any motherboard that supports those, so looking at a 64-bit operating system has its uses.

The Linux kernel may be 3.11 but that is not my biggest concern. Neither is the fact that LibreOffice 4.1.2.3 was included and GIMP wasn’t, especially when the latter can be added easily anyway and it is version 2.8.6 that you get. The move to GNOME Shell 3.8 was what drew me to seeing what was coming, because I have been depending on a number of extensions. As with WordPress and its plugins, GNOME Shell seems to have a tempestuous relationship with some of its extensions and I wanted to see which ones still worked. There also has been a change to the backstage application view: you either get all installed applications displayed when you browse them, or you have to start typing the name of the one you want in order to select it. Losing the categorical view that was there until GNOME Shell 3.6 is a step backwards and I hope that version 3.10 has seen some sort of a reinstatement. There is a way to add these categories back, but the result is not as it once was either; also, it shouldn’t be necessary for anyone to dive into a system’s innards to address things like this. With all the constant change, it is little wonder that Cinnamon has become a standalone entity with the release of its version 2.0 and that Debian toyed with not going with GNOME for its latest version (7.1 at the time of writing, which picked a good GNOME Shell version in 3.4).

Having had a look at other distributions that already have GNOME Shell 3.8, I knew that a few of my extensions worked with it. The list includes Frippery Bottom Panel, Frippery Move Clock, Places Status Indicator, Removable Drive Menu, Remove Rounded Corners (not really needed with the GNOME Shell theme that I use, Elementary Luna 3.4, but I retain it anyway), Show Desktop Button, User Themes and Ignore_Request_Hide_Titlebar. Because of the changes to the backstage view, I added Frippery Applications Menu in preference to Applications Menu because I have found the latter to be unstable. Useful new discoveries have included Curtains Up and GNOME Shell Open Terminal, while Shell Restart User Menu Entry has made a return and found a use this time around too.

There have been some extensions that were not updated to work with GNOME Shell 3.8 that I have got working nonetheless. In some cases, it was as simple as adding the new version numbers 3.8 and 3.8.4 to the list associated with the shell-version property in an extension’s metadata.json file. All extensions are to be found in the .local/share/gnome-shell/extensions location in your home directory and each has a dedicated folder containing the aforementioned file.
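To illustrate, here is a rough sketch of what a metadata.json might look like after such an edit; the uuid, name and description are made up for the sake of example, and it is the shell-version list that needs the new entries:

{
  "uuid": "example-extension@example.org",
  "name": "Example Extension",
  "description": "An illustrative entry only",
  "shell-version": ["3.4", "3.6", "3.8", "3.8.4"],
  "version": 1
}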

With others, it was a matter of looking in the Looking Glass (execute lg in the box that ALT + F2 brings up on your screen to access this) and seeing what error messages were to be found in there before attempting to correct these in either the extensions’ extension.js files or whatever other JavaScript (*.js) file was causing the problem. With either or both of these remedies, I managed to port the four extensions below to GNOME Shell 3.8. In fact, you can download these zip files and install them yourself to see how you get on with them; a sketch of a manual installation follows the list.

Advanced Settings in User Menu

Antisocial Menu

Remove App Menu

Restart Shell Entry
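As promised above, here is a rough sketch of installing one of those zip files by hand; the file name and UUID are made up for illustration, and the destination folder name must match the uuid field in the extension’s metadata.json:

mkdir -p ~/.local/share/gnome-shell/extensions
unzip antisocial-menu.zip -d ~/.local/share/gnome-shell/extensions/antisocial-menu@example.org

After that, restarting GNOME Shell (ALT + F2, then r) and enabling the extension with GNOME Tweak Tool should be all that is needed.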

There is a Remove Panel App Menu that works with GNOME Shell 3.8 but I found that it got rid of the Places menu instead of the panel’s App Menu so I tried porting the older extension to see if it behaved itself and it does. With these in place, I have bent Ubuntu GNOME 13.10 to my will ahead of its final release next week and that includes customising Nautilus too. Other than a new version of GNOME Shell, it looks as if it will come with less in the way of drama and a breather like that is no bad thing given that personal computing continues to remain in a state of flux these days.

Installing Nightingale music player on Ubuntu 13.04

25th June 2013

Ever since the Songbird project concentrated its efforts on supporting only Windows and OS X, the Firefox-based music player has been absent from a Linux user’s world. However, the project is open source and a fork called Nightingale now fulfils the same needs. Intriguingly, it too is available for Windows and OS X, so I am left wondering why that overlap has happened. However, Songbird also is available as a web app and as an app on both Android and iOS, while Nightingale sticks to being a desktop application.

To add it to Ubuntu, you need to set up a new repository. That can be done using the Software Centre but issuing a command in a terminal can be so much quicker and cleaner so here it is:

sudo add-apt-repository ppa:nightingaleteam/nightingale-release

Apart from entering your password, there will be a prompt to continue by pressing the carriage return key or to cancel with CTRL + C. For our purposes, it is the first action that’s needed and, once that’s done, you can execute the following command:

sudo apt-get update && sudo apt-get install nightingale

This is in two parts: the first updates the repositories on your system and the second actually installs the software. When that is complete, you are ready to run Nightingale and, with the repository in place, staying up to date is no chore either. In fact, using the above commands brings another advantage: they should work in any Ubuntu derivative, such as Linux Mint.

A little look at Debian 7.0

12th May 2013

Having a virtual machine with Debian 6 on there, I was interested to hear that Debian 7.0 is out. I decided to give it a go in another VM. Installing it on there using the Net Install CD image took a little while but proved fairly standard with my choice of the GUI-based option. GNOME was the desktop environment with which I went and all started up without any real fuss after the installation was complete; it even disconnected the CD image from the VM before rebooting, avoiding a common failing in many Linux operating system installations that lands you back into the installation cycle unless you kill the virtual machine.

Though the GNOME desktop looked familiar, a certain amount of conservatism reigned too, since the version was 3.4.2. That was no bad thing, since raiding the GNOME Extensions site for a set of mature extensions was made all the easier. In fact, a certain number of these were included in the standard installation anyway and the omission of a power off entry on the user menu was corrected as a matter of course without needing any intervention from this user. Adding to what already was there made for a more friendly desktop experience in a short period of time.

Debian’s variant of Firefox, Iceweasel, is version 10, so a bit of tweaking is needed to get the latest version. LibreOffice is there now too and it’s version 3.5 rather than 4. Shotwell too is the older 0.12 and not the 0.14 that is found in the likes of Ubuntu 13.04. As it happens, GIMP is about the only piece of software with a current version and that is 2.8; a slower release cycle may be the cause of that though. All in all, the general sense is that older versions of current software are being included for the sake of stability, which is sensible, so I am not complaining very much about this at all.

The reason for not complaining is that the very reason for having a virtual machine with Debian 6 on there is to have Zinio and Dropbox available too. Adobe’s curtailment of support for Linux means that any application needing Adobe AIR may not work on a more current Linux distribution. That affects Zinio, so I’ll be retaining a Debian 6 instance for a while yet unless a bout of testing reveals that a move to the newer version is possible. As for Dropbox, I am not sure that I can recall why I moved it onto Debian, but it’s working well on there so I am in no hurry to move it over either. There are times when slower software development cycles are better…

Getting an Epson Perfection 4490 Photo scanner going with Ubuntu GNOME Remix 12.10

7th March 2013

My Epson Perfection 4490 Photo scanner has been in my possession for a while now and it’s impossible to justify any replacement, given that it works well and that digital photography has taken over from film for me. Every time I install an operating system afresh, I need to reinstate it again, and last year’s installation of Ubuntu GNOME Remix 12.04 only saw me do the deed recently. When I did so, it was brought back to me that I’d never documented on here how this was done. Given that I sometimes use this place as a repository of stuff to which I need to refer again in the future, it seemed remiss of me, so here it is for you all.

Though I had XSane and SimpleScan already installed on the system, Sane wasn’t on there so I went and added it and a few other extras using the following command:

sudo apt-get install sane sane-utils libsane-extras

Then, it was onto the Epson website for their Perfection 4490 Photo Linux drivers since Sane’s support for this scanner seemingly remains incomplete even though it pre-dates my move to Linux in 2007. Three files were needed and the following commands install them (depending on when you do this, the file names may be different so just change them to whatever they are for you; it can be done with a single command too but there is not enough girth for that here):

sudo dpkg -i iscan-data_1.22.0-1_all.deb
sudo dpkg -i iscan_2.29.1-5~usb0.1.ltdl7_i386.deb
sudo dpkg -i iscan-plugin-gt-x750_2.1.2-1_i386.deb

With those in place, there was one other task that needed doing so that scanning could be done without resorting to running scanning software with sudo privileges. To free up access to a normal user account, I needed a HAL device information file. These normally are in /usr/share/hal/fdi/ but they change with every installation, so any modifications that you make are going to be lost. Therefore, there is no point modifying either /usr/share/hal/fdi/preprobe/10osvendor/20-libsane.fdi or /usr/share/hal/fdi/preprobe/10osvendor/20-libsane-extras.fdi, where scanner information usually is to be found.

The first task in creating an fdi file was to issue the lsusb command and look for a line corresponding to my scanner. This is the one that I got:

Bus 001 Device 004: ID 04b8:0119 Seiko Epson Corp. Perfection 4490 Photo
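If the output is lengthy, filtering it makes the right line easier to spot; assuming the manufacturer name appears as Epson, a command like the following would do that:

lsusb | grep -i epson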

From this, I gleaned the manufacturer ID and model ID as 04b8 and 0119, respectively. These are needed later on. Next, I needed to create the hal/fdi/preprobe/ folder structure under /etc since it was not there already. Then, I created epson4490photo.fdi in the bottom folder of the tree (/etc/hal/fdi/preprobe/epson4490photo.fdi) as follows:

sudo mkdir -p /etc/hal/fdi/preprobe && cd /etc/hal/fdi/preprobe/ && sudo touch epson4490photo.fdi

Then, I edited the new file using the following command:

gksu gedit epson4490photo.fdi &

When open, I added in the following text:

<?xml version="1.0" encoding="UTF-8"?>
<deviceinfo version="0.2">
  <device>
    <match key="info.subsystem" string="usb">
      <!-- Epson Perfection 4490 Photo -->
      <match key="usb.vendor_id" int="0x04b8">
        <match key="usb.product_id" int="0x0119">
          <append key="info.capabilities" type="strlist">scanner</append>
          <merge key="scanner.access_method" type="string">proprietary</merge>
        </match>
      </match>
    </match>
  </device>
</deviceinfo>

It’s all in XML so the place to look is immediately beneath the scanner name comment. The int attributes of the two match elements immediately following the comment line are populated using the information from the lsusb command output with 0x prefixing both the manufacturer and model identifiers. The element with a key attribute of usb.vendor_id is the former and that with a key attribute of usb.product_id is the latter. With epson4490photo.fdi saved, I rebooted the machine to restart HAL and all was as I wanted it to be apart maybe from XSane making complaints that seemed not to be of any actual consequence. With Epson’s Image Scan! and Simple Scan on the PC, there’s no need to be bothered with those messages. Choice is good when you have it, especially when you have expended some effort to get that far.

Installing the Cinnamon Desktop Environment on Sabayon Linux

26th January 2013

During the week, I did an update on my Sabayon system and GNOME 3.6 came on board without too much bother. There was no system meltdown or need for an operating system re-installation. However, there was one matter that rankled: adding and updating extensions from extensions.gnome.org was impossible. The process would create a new folder in ~/.local/share/gnome-shell/extensions/ but not fill it with anything at all. Populating it from my other machine, which runs Ubuntu GNOME Remix 12.10, didn’t seem to achieve the needful and I am left wondering if it is down to the version of GNOME Shell being 3.6.2. However, even adding an entry for the current version of GNOME Shell to metadata.json for one plugin didn’t appear to do what I wanted, so resolving this issue needs further enquiry.

In the meantime, I added the Cinnamon desktop environment using the following command and will be using that from now on. If the GNOME Shell extension issue ever gets sorted, I may move back, but there is no rush. GNOME 3.8 sounds like it’s bringing an interesting option that makes use of the approach Linux Mint took for version 12 of that distribution and I can await that, especially if it avoids the need for adding extensions on a personal basis as now.

sudo equo update && sudo equo install cinnamon

With the installation completed by the above command, it was a matter of logging out and choosing the Cinnamon entry (there is a 2D version too) from the session dropdown menu on the login screen to get it going. Then, it was a matter of tweaking Cinnamon to my heart’s content. Getting a two-panel layout required logging out and in again, as well as choosing the appropriate setting in the Cinnamon Panel options tab. Next, I decided to check on what themes are available at cinnamon.linuxmint.org before settling on Cinnamint 1.6. It all feels very comfortable, apart from not having an automatically growing list of workspaces, which is a default offering in GNOME Shell. That goes against the design principles of Cinnamon though, so all that is left is the hope that someone makes an extension to do it.

Moving a Windows 7 VM from VirtualBox to VMware Player

14th October 2012

Seeing how well Windows 8 was running in a VMware Player virtual machine, and that without installing VMware Tools in the guest operating system, I was reminded of how sluggish my Windows 7 VirtualBox VM had become. Therefore, I decided to try migrating the VM from VirtualBox to VMware. My hope was that it would be as easy as exporting to an OVA file (File > Export Appliance… in VirtualBox) and importing that into VMware (File > Open a VM in Player). However, even selecting OVF compatibility was insufficient for achieving this and the size of the virtual disks meant that the export took a while to run as well. The solution was to create a new VM in VirtualBox from the OVA file and use the newly created VMDK files with VMware. That worked successfully and I now have a speedier, more responsive Windows 7 VM for my pains.
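As an aside, there is another possible route that skips the OVA export: VirtualBox’s own command line tool can clone a virtual disk straight into VMDK format. The file names below are made up and this assumes that the existing disk is a VDI file, so treat it as a sketch rather than the method that I used:

VBoxManage clonehd "Windows 7.vdi" "Windows 7.vmdk" --format VMDK

The resulting VMDK then needs to be attached to a new virtual machine created in VMware Player.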

Access to host directories needed reinstatement using a combination of the VMware Shared Folders feature and updating drive mappings in Windows 7 itself to use what appear to it like network drives in the Shared Folders directory on the \\vmware-host domain. For that to work, VMware Tools needed to be installed in the guest OS (go to Virtual Machine > Install VMware Tools to make available a virtual CD from which the installation can be done), as I discovered when trying the same thing with my Windows 8 VM, where I dared not install VMware Tools due to their causing trouble when I last attempted it.

Moving virtual machine software brought its side effects though. Software like Windows 7 detects that it’s on different hardware, so reactivation can be needed. Windows 7 reactivation was a painless online affair but it wasn’t the same for Photoshop CS5. That meant that I needed help from Adobe’s technical support people to get past the number of PCs for which the software already had been activated. In hindsight, deactivation should have been done prior to the move but that’s a lesson that I know well now. Technical support sorted my predicament politely and efficiently while reinforcing the aforementioned learning point. Moving virtual machine platform is very like moving from one PC to the next and it hadn’t clicked with me quite how real those virtual machines can be when it comes to software licencing.

Apart from that and figuring out how to do it, the move went smoothly. An upgrade to the graphics driver on the host system and getting Windows 7 to recheck the capabilities of the virtual machine even gained me a fuller Aero experience than I had before then. Full screen operation is quite reasonable too (CTRL + ALT + ENTER activates and deactivates it) and photo editing now feels less boxed in.

Setting up MySQL on Sabayon Linux

27th September 2012

For quite a while now, I have had offline web servers for doing a spot of tweaking and tinkering away from the gaze of the web users who visit what I have online. Therefore, one of the tests that I apply to any prospective main Linux distro is the ability to set up a web server on there. This is more straightforward for some than for others. For Ubuntu and Linux Mint, it is a matter of installing the required software and doing a small bit of configuration. My experience with Sabayon is that it needs a little more effort than this, so I am sharing here what it took to install MySQL.

The first step is to install the software using the commands that you find below. The first pops the software onto the system, while the second completes the set-up. The --basedir option is needed with the latter because it won’t find things without it; it specifies the base location on the system, which is /usr in my case.

sudo equo install dev-db/mysql
sudo /usr/bin/mysql_install_db --basedir=/usr

With the above complete, it’s time to start the database server and set the password for the root user. That’s what the two following commands achieve. Once your root password is set, you can go about creating databases and adding other users using the MySQL command line; a short example follows the commands below.

sudo /etc/init.d/mysql start
mysqladmin -u root password 'password'
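Connecting with mysql -u root -p opens the command line client; from there, a sketch of creating a database and a dedicated user might look like the following, with the database name, user name and password all made up for illustration:

CREATE DATABASE webdev;
CREATE USER 'webuser'@'localhost' IDENTIFIED BY 'anotherpassword';
GRANT ALL PRIVILEGES ON webdev.* TO 'webuser'@'localhost';
FLUSH PRIVILEGES;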

The last step is to set the database server to start every time you start your Sabayon system. The first command adds an entry for MySQL to the default run level so that this happens. The purpose of the second command is to check that this has happened before restarting your computer to discover if it really does. This procedure also is needed for having an Apache web server behave in the same way, so the commands are worth having and may even have a use for other services on your system. ProFTP is another that comes to mind, for instance.

sudo rc-update add mysql default
sudo rc-update show | grep mysql
