Technology Tales

Adventures & experiences in contemporary technology

Installing Citrix Receiver 13.0 in Ubuntu GNOME 13.10 64-bit

28th November 2013

Installing the latest version of Citrix Receiver (13.0 at the time of writing) on 64-bit Ubuntu should be as simple as downloading the required DEB package and double-clicking on the file so that Ubuntu Software Centre can work its magic. Unfortunately, the 64-bit DEB file is faulty, which means that the Ubuntu community how-to guide for Citrix is still needed. In fact, any user of Linux Mint or another distro that uses Ubuntu as its base would do well to have a look at that Ubuntu link.

For the sake of completeness, I am still going to let you in on the process that worked for me. Once the DEB file has been downloaded, the first task is to create a temporary folder where its contents can be extracted:

mkdir ica_temp

With that in place, it is time to do the extraction, which needs two commands: the first extracts everything apart from the package control information, while the second extracts the control file itself.

sudo dpkg-deb -x icaclient- ica_temp
sudo dpkg-deb --control icaclient- ica_temp/DEBIAN

It is the control file that has been the cause of all the bother because it refers to unavailable dependencies that it really doesn’t need anyway. To open the file for editing, issue the following command:

sudo gedit ica_temp/DEBIAN/control

Then change line 7 (it should begin with Depends:) to: Depends: libc6-i386 (>= 2.7-1), lib32z1, nspluginwrapper. The other packages listed there are ones that Ubuntu no longer supports and they are not needed anyway. With the edit made and the file saved, the next step is to build a new DEB package with the corrected control file:

dpkg -b ica_temp icaclient-modified.deb
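
As an aside, the control file edit itself can be scripted rather than done in gedit; run something like the following sed one-liner before the dpkg -b step above. It is only a sketch and assumes that the Depends line matches the description given earlier:

sudo sed -i 's/^Depends:.*/Depends: libc6-i386 (>= 2.7-1), lib32z1, nspluginwrapper/' ica_temp/DEBIAN/control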

Once you have the package, the next step is to install it using the following command:

sudo dpkg -i icaclient-modified.deb

If it fails, then you have missing dependencies and the following command should sort these out before a re-run of the above command:

sudo apt-get install libmotif4:i386 nspluginwrapper lib32z1 libc6-i386
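
To check that the client actually got installed, a quick query of the package database should list it; this is only a spot check and the version string will differ according to the release that you have:

dpkg -l | grep icaclient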

With Citrix Receiver installed, there is one more thing that is needed before you can use it freely: putting Thawte security certificate files into /opt/Citrix/ICAClient/keystore/cacerts. What I had not realised until recently was that many of these are already in /usr/share/ca-certificates/mozilla, and linking to them with the following command makes them available to Citrix Receiver:

sudo ln -s /usr/share/ca-certificates/mozilla/* /opt/Citrix/ICAClient/keystore/cacerts/
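
A quick listing of the keystore directory should confirm that the links are in place before you try connecting again:

ls -l /opt/Citrix/ICAClient/keystore/cacerts/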

Another approach is to download the Thawte certificates and extract the archive to /tmp/. From there they can be copied to /opt/Citrix/ICAClient/keystore/cacerts and I copied the Thawte Personal Premium certificate as follows:

sudo cp "/tmp/Thawte Root Certificates/Thawte Personal Premium CA/Thawte Personal Premium CA.cer" /opt/Citrix/ICAClient/keystore/cacerts/

Until I found out about what was in the Mozilla folder, I simply picked out the certificate mentioned in the Citrix error message and copied it over as above. Of course, all of this may seem like a lot of work to non-tinkerers, so I have added a repaired 64-bit DEB package that incorporates all of the above and should not need any further intervention aside from installing it using GDebi, Ubuntu’s Software Centre, dpkg or anything else that does what’s needed.

Surveying changes coming in GNOME 3.10

20th October 2013

GNOME 3.10 came out last month, but it took until its inclusion in the Arch and Antergos repositories for me to see it in the flesh. Apart from the risk of instability, this is the sort of thing at which rolling distributions excel: they can give you a chance to see the latest software before it is included anywhere else. For the GNOME desktop environment, it might otherwise have meant awaiting the next release of Fedora in order to glimpse what is coming. That wait is not always a bad thing because Ubuntu GNOME seems to be sticking with a release behind the latest version. With many GNOME Shell extension writers not updating their extensions until Fedora has caught up with the latest stable release of GNOME, this is no bad thing, and it means that a version of the desktop environment has been well bedded in by the time it reaches the world of Ubuntu too. Debian takes this even further by using a stable version from a few years ago and there is an argument in favour of that from a solidity perspective.

Being in the habit of kitting out GNOME Shell with extensions, I have a special interest in seeing which ones still work, or could work with a little tweaking, and which have fallen from favour. In the top panel, the major change has been to replace the sound and user menus with a single aggregate menu. The user menu in particular has been in receipt of the attentions of extension writers, so their efforts either need rework or dropping after the latest development. The GNOME project seems to have picked up an annoying habit from WordPress in that the GNOME Shell API keeps changing and breaking extensions (plugins in the case of WordPress). There is one WordPress habit that does need copying though, and that is documentation, especially of that API, for it is hardly anywhere to be found.

GNOME Shell theme developers don’t escape either: a large border appeared around the panel when I used Elementary Luna 3.4, so I turned to XGnome Enhanced (found via GNOME-Look.org) instead. The former is no longer being maintained since its developer no longer uses GNOME Shell and has not got the same itch to scratch; maybe someone else could take it over because it worked well enough until 3.8. So far, the new theme works for me, so that will be an option should there be a move to GNOME 3.10 on one of my PCs at some point in the future.

Returning to the subject of extensions, I had a go at seeing how the included Applications Menu extension works now since it wasn’t the most stable of items before. That has improved and it looks very usable too, so I am not awaiting the updating of the Frippery equivalent. That the GNOME Shell backstage view has not moved on much from how it was in 3.8 could be seen as a disappointment, but the workaround will do just fine. Aside from the Frippery Applications Menu, there are other extensions that I use heavily that have yet to be updated for GNOME Shell 3.10. After a spot of success ahead of a possible upgrade to Ubuntu GNOME 13.10 and GNOME Shell 3.8 (though I remain with version 13.04 for now), I decided to see if I could port a number of these to the latest version of the user interface. Below, you’ll find the results of my labours, so feel free to make use of these updated items if you need them before they are updated on the GNOME Shell Extensions website:

Frippery Bottom Panel

Frippery Move Clock

Remove App Menu

Show Desktop

There have been more changes coming in GNOME 3.10 than GNOME Shell, which essentially is a JavaScript construction. The consolidation of application title bars in GNOME applications continues, but a big exit button has appeared in the affected applications that wasn’t there before. Also, there remains the possibility of applying the previously shared modifications to Nautilus (also known as Files) and a number of these usefully extend themselves to other applications such as Gedit too. Speaking of Gedit, this gains a very useful x of y numbering for the string searching functionality, with x being the number of the current occurrence of a piece of text in a file and y being its total number of occurrences. GNOME Tweak Tool has got an overhaul too and has lost the setting that makes a folder path entry box appear in Nautilus in place of the breadcrumb bar; opening Dconf-Editor, going to org > gnome > nautilus > preferences and completing the tick box for always-use-location-entry will do the needful.
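
The same preference can be flipped from a terminal with gsettings too, should that prove more convenient than Dconf-Editor; a one-liner along these lines should do it:

gsettings set org.gnome.nautilus.preferences always-use-location-entry true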

Essentially, the GNOME project is continuing along the path on which it set out a few years ago. Though I would rather that GNOME Shell were more mature, invasive changes are still coming and it leaves me wondering if or when this might stop. Maybe that was the consequence of mounting a controversial experiment when users were happy with what was there in GNOME 2. The arrival of Fedora 20 should bring with it an increase in the number of GNOME Shell extensions that have been updated. So long as it remains stable, Antergos is a good way to have a look at the latest version of GNOME for now, and Cinnamon fans may be pleased that Cinnamon 2.0 is another desktop option for the Arch-based distribution. An opportunity to say more about that may yet arrive once the Antergos installer stops failing at a troublesome package download; a separate VM is being set aside for a look at Cinnamon because it destabilised GNOME during a previous look.

Getting rid of a Dropbox error message on a Linux-powered PC

24th September 2012

One of my PCs has ended up becoming a testing ground for a number of Linux distributions. The list has included openSUSE, Fedora, Arch and LMDE, with Sabayon being the latest incumbent. From Arch onwards in that list though, a message has appeared on loading the desktop with every one of these whenever I have Dropbox’s client set up on there:

Unable to monitor entire Dropbox folder hierarchy. Please run “echo 100000 | sudo tee /proc/sys/fs/inotify/max_user_watches” and restart Dropbox to correct the problem.

Even applying the remedy that the message suggests won’t permanently fix the problem. For that, you need to edit /etc/sysctl.conf with superuser access and add the following line to it:

fs.inotify.max_user_watches = 100000
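
If you would rather stay out of a text editor, appending the line with tee achieves the same thing; this is a sketch of that approach:

echo "fs.inotify.max_user_watches = 100000" | sudo tee -a /etc/sysctl.conf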

With that in place, you can issue the following command to fix the problem in the current session (assuming your user account is listed in /etc/sudoers):

sudo sysctl -p && dropbox stop && dropbox start

A reboot should demonstrate that the message no longer appears. For a good while, I had ignored it, but curiosity eventually got me to find out how it could be stopped, which led to what you find above.

Synchronising package selections between Linux Mint and Linux Mint Debian Edition

18th April 2012

To generate the package list on the GNOME version of Linux Mint, I used the Backup Tool. It simply was a matter of using the Backup Software Selection button and telling it where to put the file that it generates. Alternatively, dpkg can be used from the command line like this:

sudo dpkg --get-selections > /backup/installed-software.txt

After transferring the file to the machine with Linux Mint Debian Edition, I tried using the Backup Tool on there too. However, using the Restore Software Selection button and loading the required file only produced an irrecoverable error. Therefore, I set to looking around the web and found a command line approach that did the job for me.

The first step is to load the software selection using dpkg by issuing this command (it didn’t matter that the file wasn’t made using the dpkg command, though I suspect that’s what the Linux Mint Backup Tool was doing behind the scenes):

sudo dpkg --set-selections < /backup/installed-software.txt

Then, I started dselect and chose the installation option from the menu that appeared. The first time around, it fell over, but trying again was enough to complete the job. Packages available to the vanilla variant of Linux Mint but not found in the LMDE repositories were overlooked as I had hoped, and installation of the extra packages had no impact on system stability either.

sudo dselect

Apparently, there is an alternative to using dselect that is based on the much used apt-get command but I didn’t make use of it so cannot say more:

sudo apt-get dselect-upgrade
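
For reference, the whole command line route strings together like this; it is only a sketch and assumes that the selections file sits at /backup/installed-software.txt as above:

sudo dpkg --set-selections < /backup/installed-software.txt
sudo apt-get dselect-upgrade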

All that I can say is that the dpkg/dselect combination did what I wanted, so I’ll keep them in mind if I ever need to synchronise software selections between two Debian-based distributions again. The standard edition of Linux Mint may be based on Ubuntu rather than Debian, but Ubuntu is itself based on Debian, so the description holds here.

Uninstalling VirtualBox Guest Additions on a Linux Guest OS

8th April 2012

Within the last few days, I updated my Linux Mint Debian Edition virtual machine to Update 4. Between not following the instructions closely enough and problems with the update server, a re-installation preceded the update itself. When all was done, no desktop environment appeared and issuing the startx command revealed that one of the VirtualBox drivers was the cause of the problem. With my being unable to see any files on the VirtualBox virtual CD, something else needed doing, and executing the following command (replacing [VboxAddonsFolder] with VBoxGuestAdditions-4.1.12 in my case, though it is different for each VirtualBox version) resolved the situation:

/opt/[VboxAddonsFolder]/uninstall.sh
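
If the exact folder name is not to hand before running the script, listing the contents of /opt should reveal it; a quick check like this is all that is needed:

ls /opt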

When it was complete, a scrambled desktop began to appear so a reboot was in order to set things to rights. Then, I could set to looking at what Update 4 had brought to Linux Mint Debian Edition.

Adding Software to Arch Linux from the AUR

3rd December 2011

There are packages absent from the Arch Linux repositories that could come in useful. When you are after one of these, it’s time to search the Arch User Repository (AUR). In here, I have found the likes of Microsoft Core Fonts, Adobe Reader and Dropbox. There may be others, but these examples are what come to mind as I write this. In time, it may be that packages make it from the AUR into the Arch community repository, but you have to use the former if you cannot wait.

Just search the AUR for what you want and download the tarball (tar.gz file) from the webpage where you find it. Then, I recommend extracting it to /tmp, where clearance at boot time means that you don’t need to do it yourself. After that, go into the appropriate subfolder in /tmp (acroread for Adobe Reader, for instance) and issue the following command:

makepkg

This will attempt to create a package file, in the directory where you are working, for installation by pacman. If dependencies are absent, you will be told, and these may need another AUR search in some cases, though most are included in the repositories. Once dependencies have been sorted, just issue the makepkg command again to create the xz file that pacman needs to perform the installation. To do so, issue the following command from the same directory, either as root or by using sudo if your user account has such privileges:

pacman -U *.xz

There usually is but one xz archive in a package folder so I have been taking the easy route of not looking up the name all of the time. Of course, you can do that for safety if you want.
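
Putting the whole manual routine together, the sequence looks something like the following; the tarball name here is only an example and the name of the resulting archive will vary too:

cd /tmp
tar -xzf ~/Downloads/acroread.tar.gz
cd acroread
makepkg
sudo pacman -U *.xz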

With pacman not looking at the AUR, you have to do more work to get upgrades to happen if you want to avoid repeating the above process all of the time. There is a package in the AUR called yaourt that needs package-query from the same place as well. Before either of these, yajl needs to be installed from one of the default repositories. Once yaourt is in place, the following does the updates for you:

yaourt -Syu --aur

Again, it might be best to run this as root or using sudo, though that produces messages from makepkg about not running it as a privileged user. However, I reckon that those can be ignored. When I tried it, the Citrix update failed though the Dropbox one succeeded, an experience that might be worth bearing in mind. Saying that, I have found installing and updating software from the AUR not to be too onerous a process so far. Anything that gives a little more freedom can only be a good thing.

Sorting out MySQL on Arch Linux

5th November 2011

Seeing Arch Linux running so solidly in a VirtualBox virtual machine has me contemplating whether I should have it installed on a real PC. Saying that, recent announcements regarding the implementation of GNOME 3 in Linux Mint have caught my interest, even if the idea of using a rolling distribution as my main home operating system still has a lot of appeal for me. Having an upheaval come my way every six months when a new version of Linux Mint is released is the main cause of that.

While remaining undecided, I continue to evaluate the idea of Arch Linux acting as my main OS for day-to-day home computing. Towards that end, I have set up a working web server instance on there using the usual combination of Apache, Perl, PHP and MySQL. Of these, it was MySQL that went the least smoothly of all because the daemon wouldn’t start for me.

It was then that I started to turn to Google for inspiration and a range of actions resulted that combined to give the result that I wanted. One problem was a lack of disk space caused by months of software upgrades. Since package managers in other Linux distros allow you to clear obsolete installation files, I decided to see if it was possible to do the same with pacman, the Arch Linux command line package manager. The following command, executed as root, cleared about 2 GB of cruft for me:

pacman -Sc

The S in the switch tells pacman to perform package database synchronization while the c instructs it to clear its cache of obsolete packages. In fact, using the following command as root every time an update is performed both updates software and removes redundant or outmoded packages:

pacman -Syuc

So I don’t forget the needful housekeeping, this will be what I use in future with the y being the switch for a refresh and the u triggering a system upgrade. It’s nice to have everything happen together without too much effort.

To do the required debugging that led me to the above along with other things, I issued the following command:

mysqld_safe --datadir=/var/lib/mysql/ &

This starts up the MySQL daemon in safe mode if all is working properly, and it wasn’t in my case. Nevertheless, it creates a useful log file called myhost.err in /var/lib/mysql/. This gave me the messages that allowed the debugging of what was happening. It led me to installing net-tools and inetutils using pacman; it was the latter of these that put hostname on my system and got the MySQL server startup a little further along. Other actions included unlocking the ibdata1 data file and removing the ib_logfile0 and ib_logfile1 files so as to gain something of a clean sheet. The kill command was used to shut down any lingering mysqld sessions too. To ensure that the ibdata1 file was unlocked, I executed the following commands:

mv ibdata1 ibdata1.bad
cp -a ibdata1.bad ibdata1

These renamed the original and then created a new duplicate of it, with the -a switch on the cp command forcing copying with greater integrity than normal. Along with the various file operations, I also created a link in /etc to my.cnf, the MySQL configuration file on Linux systems, using the following command executed as root:

ln -s /etc/mysql/my.cnf /etc/my.cnf

While I am unsure if this made a real difference, I also uncommented the lines in the same file that pertained to InnoDB tables. What directed me to these were complaints from mysqld_safe in the myhost.err log file. All I did was to uncomment the lines beginning with innodb; these were 116-118, 121-122 and 124-127 in my configuration file, but it may be different in yours.
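
For illustration, the uncommented block in a stock my.cnf of that era looked something along these lines; the paths and sizes are only defaults and may well differ in your file:

innodb_data_home_dir = /var/lib/mysql
innodb_data_file_path = ibdata1:10M:autoextend
innodb_log_group_home_dir = /var/lib/mysql
innodb_buffer_pool_size = 16M
innodb_log_file_size = 5M
innodb_log_buffer_size = 8M
innodb_flush_log_at_trx_commit = 1
innodb_lock_wait_timeout = 50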

After all the above, the MySQL daemon ran happily and, more importantly, started when I rebooted the virtual machine. Thinking about it now, I believe that it was a lack of disk space, the locking of a data file and the lack of InnoDB support that were stopping the MySQL service from running. Running commands like mysqld start wasn’t yielding useful messages, so a lot of digging was needed to get the result that I needed. In fact, that’s one of the reasons why I am sharing my experiences here.

In the end, creating databases and loading them with data was all that was needed for me to start seeing functioning websites on my (virtual) Arch Linux system. It turned out to be another step on the way to making it workable as a potential replacement for the Linux distributions that I use most often (Linux Mint, Fedora and Ubuntu).

All Change?

19th September 2011

Could 2011 be remembered as the year when the desktop computing interface got a major overhaul? One part of this, Windows 8, won’t be with us until next year but there has been enough happening so far this year that has resulted in a lot of comment. With many if not all of the changes, it is possible to detect the influence of interfaces used on smartphones. After all, the carryover from Windows Phone 7 to the new Metro interface is unmistakeable.

Two developments in the Linux world have spawned a hell of an amount of comment: Canonical’s decision to develop Unity for Ubuntu and the arrival of GNOME 3. While there have been many complaints about the changes made in both, there must be a fair few folk who are just getting on with using them without complaint. Maybe there are many who even quietly like the new interfaces. While I am not so sure about Unity, I surprised myself by taking to GNOME Shell so much that I installed it on Linux Mint. It remains a work in progress as does Unity but it’ll be very interesting to see it mature. Perhaps a good number of the growing collection of GNOME Shell plugins could make it into the main codebase. If that were to happen, I could see it being welcomed by a good few folk.

There was little doubt that the changes in GNOME 3 looked daunting, so Ubuntu’s taking a different approach is understandable until you come to realise how much change that involves anyway. With GNOME 3 working so well for me, I feel disinclined to dally very much with Unity at all. In fact, I am writing these words on a Toshiba laptop running UGR, effectively Ubuntu running GNOME 3, and that could become my main home computing operating system in time.

For those who find these changes not to their taste, there are alternatives. Some Linux distributions are sticking with GNOME 2 as long as they can and there apparently has been some mention of a fork to keep a GNOME 2 interface available indefinitely. However, there are other possibilities such as LXDE and XFCE out there too. In fact, until GNOME 3 won me over, LXDE was coming to mind as a place of safety until I learned that Linux Mint was retaining its desktop identity. As always, there’s KDE too but I have never warmed to that for some reason.

The latest version of OS X, Lion, also included some changes inspired by iOS, the operating system that powers both the iPhone and iPad. However, while the current edition of PC Pro highlights some disgruntlement in professional circles regarding Apple’s direction, they do not seem to have aroused the kind of ire that has been abroad in the world of Linux. Is it because Linux users want to feel that they are in charge and that iMac and MacBook users are content to have decisions made for them so long as everything just works? Speaking for myself, the former description seems to fit me though having choices means that I can reject decisions that I do not like so much.

At the time of writing, the release of a developer preview of the next version of Windows has been generating a lot of attention. It also appears that changes are headed for the Windows user too. However, I get the sense that a more conservative interface option will be retained and that could be essential for avoiding the alienation of corporate users. After all, I cannot see the Metro interface gaining much favour in the working environment when so many of us have so much to do. Nevertheless, I plan to get my hands on the developer preview to have a look (the weekend proved too short for this). It will be very interesting to see how the next version of Windows develops and I plan to keep an eye on it as it does so.

It now looks as if many will have their work cut out if they are to avoid where desktop computing interfaces are going. Established paradigms are being questioned, particularly as a result of touch interfaces on smartphones and tablets. Wii and Kinect have involved other ways of interacting with computers too so there’s a lot of mileage in rethinking how we work with computers. So far, I have been able to deal with the changes in the world of Linux but I am left wondering at the changes that Microsoft is making. After Vista, they need to be careful and they know that. Maybe, they’ll be better at getting users through changes in computing interfaces than others but it’ll be very interesting to see what happens. Unlike open source community projects, they have the survival of a massive multinational at stake.

Do we need to pay for disk partitioning tools anymore?

29th November 2010

My early explorations of dual-booting Windows and Linux led me into the world of disk partitioning. It also served another use since, at one point, any Windows 9x installation (that dates things a bit…) that I had tended not to last longer than six months; putting the data on another partition meant that a fresh Windows installation didn’t jeopardise any data should a mishap occur.

Then, Partition Magic was the favoured tool and it wasn’t free of charge, though it wasn’t extortionately priced either. For those operations that couldn’t be done with Windows running, you could create bootable floppy disks to get the system going in order to perform those. Thinking about it now, it all worked well enough and the usual caveats about taking care with your data applied as much then as they do now.

For the last few years, many Linux distributions have been coming in the form of CDs or DVDs from which you can boot into a full operating system session, complete with near enough the same GUI as an installed version. When a PC is poorly, this is a godsend and makes me wonder how we managed without it; having that visual way of saving data sounds all too necessary now. For me, the answer is that I misspent too many hours blundering blindly with the very limited Windows command line to get myself out of a crux. Looking back on it now, it all feels very dark compared to today.

Another good aspect to these live distribution discs is that they come with hard disk partitioning tools such as the effective GParted. They are needed to configure hard drives during the actual installation process, but they serve another purpose too: they can be used in place of the old proprietary software disks that were in use not so long ago. Being able to deal with the hard disk sizes available today is a very good thing, as is coping with NTFS partitions along with the usual Linux options. The operations may be time consuming, but they have seemed reliable so far and I hope that it stays that way in spite of any warnings that get issued before you make any changes. Last weekend, I got to see a lot of what that means while setting up my Toshiba Equium laptop for Windows/Ubuntu dual booting.

With the capability that is available both free of charge and free of limitations, you cannot justify paying for disk partitioning software nowadays and that’s handy when you consider the state of the economy. It also shows how things have changed over the last decade. Being able to load up a complete operating system from a DVD also serves to calm any nerves when a system goes down on you, especially when you surf the web to find a solution for the malady that’s causing the downtime.

A look at Slackware 13.0

5th June 2010

Some curiosity has come upon me and I have been giving a few Linux distros a spin in VirtualBox virtual machines. One was Slackware and I recall a fellow university student using it in the mid/late 1990s. Since then, my exploration took me into Red Hat, SuSE, Mandrake and eventually to Ubuntu, Debian and Fedora. All of that bypassed Slackware, so it was time to give the thing a look.

While the current version is 13.1, it was 13.0 that I had to hand so I had a go with that. In many ways, the installation was a flashback to the 1990’s and I can see it looking intimidating to many computer users with its now old-fashioned installation GUI. If you can see through that though, the reality is that it isn’t too hard to install.

After all, the DVD was bootable. However, it did leave you at a command prompt and I can see that throwing many. The next step is to use cfdisk to create partitions (at least two are needed, swap and normal). Once that is done, it is time to issue the command setup and things look more graphical again. I picked the item for setting the locale of the keyboard and everything followed from there but there is a help option too for those who need it. If you have installed Linux before, you’ll recognise a lot of what you see. It’ll finish off the set up of disk partitions for you and supports ext4 too; it’s best not to let antique impressions fool you. For most of the time, I stuck with defaults and left it to perform a full installation with KDE as the desktop environment. If there is any real criticism, it is the absence of an overall progress bar to see where it is with package installation.
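
In practice, the partitioning and setup stages boil down to a pair of commands like these; the device name is an assumption and will depend on your own system:

cfdisk /dev/sda
setup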

Once the installation was complete, it was time to restart the virtual machine and I found myself left at the command prompt. Only the root user was set up during installation, so I needed to add a normal user too. Issuing startx was enough to get me into KDE (along with included alternatives like XFCE, there is a community build using GNOME too), but I wanted to have that loading automatically. To fix that, you need to edit /etc/inittab to change the default run level from 3 to 4 (hint: look for a line with id:3:initdefault: in it near the top of the file and change that; the file is well commented so you can find your way around it easily without having to look for specific esoteric text strings).
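
For the record, adding the user and making the run level change amount to something like the following; Slackware’s adduser script asks its questions interactively and the inittab line is shown here as it should read after editing:

adduser
# in /etc/inittab, the default run level line should then read:
# id:4:initdefault: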

After all this, I ended up with a usable Slackware 13.0 installation. Login screens have a pleasing dark theme by default, while the desktop is very blue. There may be no OpenOffice, but KOffice is there in its place and SeaMonkey is an unusual inclusion along with Firefox. It looks as if it’ll take a little more time to get to know Slackware, but it looks good so far; I may even go about getting 13.1 to see how things might have changed and report my impressions accordingly. Some will complain about the rough edges that I describe here, but comments about using Slackware to learn about Linux persist. Maybe Linux distributions are like camera film: some are right for you and some aren’t. Personally, I wouldn’t thrust Slackware upon new Linux users if they have to install it themselves, but it’s not at all bad for all that.
