Do not let social networks destroy your social skills

You see them every day. People at restaurants, bars, pubs, clubs, or even private parties, constantly looking at their mobiles… Rather than living their lives, they are validating their lives on social networks. They share their current location, tag their friends, upload pictures of their meals (invariably altered with some cheap filter), and, since they are already at it, answer messages and leave likes and comments here and there… You name it. All this when they could actually mingle with the people around them. :)

Seriously, folks… Social networks are destroying your social skills. Do not let them win. Tonight, turn off your mobile. Spend a night out with the people you really love, talk to them, laugh with them, exchange positive vibes in a way that is possible only in the real world. That is socialising. The rest is just a surrogate for it.

The sad story of the vCard format and its lack of interoperability

I have tried to reach the zen of address book synchronisation for many years. However, I have always found that some contact information, especially instant messaging and social networking addresses, gets lost or corrupted during synchronisation.

The most widely adopted format for representing contact information is vCard, whose latest version is 4.0 (see IETF’s RFC 6350, 2011), while the most widely adopted protocol for accessing contact information is CardDAV (see IETF’s RFC 6352, 2011), which is based on the vCard format. Hence, I performed a little empirical study of the actual interoperability of the vCard format.

First, I defined a sample contact, where the contact information is meant to be for home:

Joe Bloggs
+44 20 1234 5678
1 Trafalgar Square, WC2N London, United Kingdom
Skype: joe.bloggs
Twitter: @joebloggs

Second, I added this contact to four different address books: Apple Contacts, Cobook, Google Contacts and Memotoo.

Third, I exported each of the address books to a vCard file.

Fourth, I created a sample vCard file based on the vCard format 4.0.

Finally, I compared the exported vCard files and the sample vCard file with one another. The differences between these files blew my mind.

In the following, I show these vCard files and discuss the properties which are not interoperable. Note that I stripped the irrelevant properties and rearranged the remaining properties in order to make the comparison easier.

Sample vCard file

FN:Joe Bloggs
TEL;VALUE=uri;TYPE="cell,home";PREF=1:tel:+44-20-1234-5678
ADR;TYPE=home;PREF=1:;;1 Trafalgar Square;London;;WC2N;United Kingdom

The specification of the vCard is kind of shocking. Believe it or not, it does not support social networking addresses yet. Even worse, it supports constructs which are not interoperable, namely grouped properties and non-standard properties.

Grouped properties are properties prefaced with the same group name. They should be grouped together when displayed by an application. I will show examples of grouped properties later.

Non-standard properties are properties defined unilaterally or bilaterally outside the standard. They may be ignored by an application.

Hence, I was forced to represent the Twitter address by a non-standard X-SOCIALPROFILE property:
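
For example, it could look something like this (my own reconstruction; since the property is non-standard, the exact value format is a guess):

X-SOCIALPROFILE;TYPE=twitter:http://twitter.com/joebloggs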


Apple Contacts (version 7.1)

FN:Joe Bloggs
TEL;type=CELL;type=VOICE;type=pref:+44 20 1234 5678
ADR;type=HOME;type=pref:;;1 Trafalgar Square;London;;WC2N;United Kingdom

The vCard file exported by Apple Contacts is only partially based on the vCard format 3.0 (see IETF’s RFC 2425 and RFC 2426, 1998) and its extension for instant messaging (see IETF’s RFC 4770, 2007).

The web address is represented by a standard URL property grouped together with a non-standard X-ABLabel property:
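
A sketch of such an export (the URL is a placeholder, and the exact label syntax may differ):

item1.URL;type=pref:http://www.example.com
item1.X-ABLabel:_$!<HomePage>!$_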


This issue can be solved by changing the type of the web address from “home page” to “home”. This leads to a vCard file where the web address is represented by a standard URL property:
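
Something along these lines (URL placeholder):

URL;type=HOME;type=pref:http://www.example.com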


The Twitter address is represented by a non-standard X-SOCIALPROFILE property:
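
For instance (reconstructed; the value format may differ):

X-SOCIALPROFILE;type=twitter:http://twitter.com/joebloggs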


Cobook (version 1.1.6)

FN:Joe Bloggs
item2.TEL;type=VOICE:+44 20 1234 5678
item3.ADR:;;1 Trafalgar Square;London;;WC2N;United Kingdom

The vCard file exported by Cobook is only partially based on the vCard format 3.0. With the exception of the name, all the contact information is represented by either grouped properties or non-standard properties.

Google Contacts (15 November 2012)

FN:Joe Bloggs
TEL;TYPE=CELL:+44 20 1234 5678
ADR;TYPE=HOME:;;1 Trafalgar Square;London;;WC2N;United Kingdom

Google Contacts does not support social networking addresses natively, so I was forced to add them as URLs.

The vCard file exported by Google Contacts is only partially based on the vCard format 3.0 (see IETF’s RFC 2425 and RFC 2426, 1998).

The colon in all the URLs is unnecessarily escaped.

Similar to Apple Contacts, the web address is represented by a standard URL property grouped together with a non-standard X-ABLabel property:
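
A sketch (URL placeholder; the exact label value may differ):

item1.URL:http://www.example.com
item1.X-ABLabel:Home Page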


I guess this is because Google Contacts specifically targets Apple Contacts when exporting to a vCard file. This issue can be solved by changing the type of the web address from “Home Page” to “Home”. This leads to a vCard file where the web address is represented by a standard URL property:
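
Something like (URL placeholder):

URL;TYPE=HOME:http://www.example.com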


The Skype address is represented by a non-standard X-SKYPE property:
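
For instance (reconstructed):

X-SKYPE:joe.bloggs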


The Twitter address is represented by a standard URL property grouped together with a non-standard X-ABLabel property:
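
A sketch (the group name and label value are my guesses):

item2.URL:http://twitter.com/joebloggs
item2.X-ABLabel:Twitter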


Memotoo (15 November 2012)

FN:Joe Bloggs
TEL;HOME;CELL:+44 20 1234 5678
ADR;HOME:;;1 Trafalgar Square;London;;WC2N;United Kingdom

The vCard file exported by Memotoo is only partially based on the vCard format 2.1 (see Versit Consortium’s specification, 1996).

The Skype address is represented by a non-standard X-SKYPE-USERNAME property:
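
For instance (reconstructed):

X-SKYPE-USERNAME:joe.bloggs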


The Twitter address is represented by a non-standard X-TWITTER property:
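
Something like the following (reconstructed; the exported value may be the handle or a full URL):

X-TWITTER:@joebloggs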



Given the results of this study, it is not surprising that the import/export of vCard files as well as the synchronisation via CardDAV do not behave as expected most of the time.

Common contact information such as email addresses, telephone numbers, postal addresses, web addresses, and instant messaging addresses can be represented in two ways: by means of standard properties, or by means of standard properties grouped together with non-standard properties. The second way is currently used by Apple (and other vendors targeting Apple); it is unnecessary, prevents interoperability, and promotes vendor lock-in.

Other common contact information, such as social networking addresses, is not supported at all.

So what should be done? Here is my suggestion:

First, the IETF should remove grouped properties and non-standard properties from the specification, since open standards should promote interoperability and prevent vendor lock-in. Second, the IETF should add social networking properties to the specification. Third, the IETF should provide an official validator for vCard files. Finally, the vendors should implement the latest version of the vCard format, and they should do it right.

Update 22 November

I have shared my concerns on the IETF’s vCardDAV mailing list. You can follow the thread here.

Why I abandoned GNU/Linux on the desktop

I experimented with GNU/Linux in dual boot with Windows from 1997 to 2000, and I have had only GNU/Linux installed on my computers, both at home and at work, since 2001. I have changed distribution relatively often — 3 years with Red Hat (now Fedora), 2 years with Mandrake (now Mandriva), a few weeks with SuSE (now openSUSE), 1 year with Slackware (don’t ask…), 3 years with Gentoo, a few weeks with Debian and 6 years with Kubuntu — but I have not changed desktop environment as much — 1 year with FVWM ’95 (those were the days…), 1 year with Enlightenment, 1 year with GNOME 1, and 12 years with KDE.

I have been a loyal KDE user, contributor and advocate since the release of KDE 2.0. I have donated 100€ to the KDE e.V. organisation each year since the announcement of the “Join the Game” campaign. Last but not least, I acknowledged the KDE community in my PhD thesis (at the end of the Preface, page xi). This was just in case anybody wonders about my credentials as a GNU/Linux and KDE user…

Unfortunately, KDE does not satisfy my needs any more, and I was forced to look into other solutions. This post attempts to explain why I came to this decision, and I hope that the GNU/Linux and KDE communities will perceive this as constructive criticism.

KDE 4.0 was released before it had reached feature parity with KDE 3.5. This is because KDE developers intended KDE 4.0 as a technological preview aimed at developers, testers and early adopters only. However, the majority of KDE users did not really understand that, which is legitimate considering that a .0 release means at least feature complete in any other project. As a consequence, many KDE users (including Linus Torvalds) found themselves with a desktop environment which was just half-baked, and eventually ditched KDE.

I expected KDE developers to adopt a more conservative release strategy in future major upgrades, but apparently they had not learned their lesson. In fact, KDE 4.4 was released together with a new version of KAddressBook which was rewritten from scratch and based on the Akonadi storage service. The new version introduced several regressions compared to the previous version shipped with KDE 4.3. As a consequence, once again, many KDE users found themselves with a half-baked KDE PIM suite, and eventually ditched KDE.

This time, my frustration also started to mount. I wrote two verbose posts on the KDE forum questioning this release management and proposing a system of bounties for bug fixing. These two initiatives triggered a lively discussion in the community, but in the end nothing really happened.

KDE 4.9 was released one month ago. There are still many small nuisances with it, especially with the KDE PIM suite. And do not blame me or the packagers, please. Try to access an IMAP e-mail account with an unstable Internet connection: in the best case, Akonadi will spam the KDE notification system with connection error messages, which will eventually crash KNotify; in the worst case, Akonadi itself will crash. Try also to synchronise contacts and calendars with Google or any other well-known social network: if you manage to make it work, consider yourself lucky if you do not lose any information.

Despite these years-old bugs, KDE developers keep spending resources on applications the world could probably do without, like the Rekonq browser and the Calligra office suite. Sometimes I ask myself whether KDE developers use these applications for real, and apparently some do not: as you can see in the official screenshot for the KDE 4.9 release, some prefer Chrome and LibreOffice over Rekonq and Calligra, which is not surprising at all. I often read complaints about the lack of resources to maintain the KDE project. Why not focus on fewer applications of higher quality rather than more applications of questionable quality, then?

I tried to look into other distributions and desktop environments, but the situation seems to be even more tragic. Let us have a look at the top ten distributions on Distrowatch:

And these are just ten distributions out of hundreds, as well as just seven desktop environments out of tens — among which I cannot resist mentioning Trinity, which is a fork of KDE 3…

Am I the only one thinking that this fragmentation is beyond ridiculous? The developers of these distributions and desktop environments are spending massive amounts of resources to develop redundant software and compete over a mere 2% of market share. Why not focus on fewer distributions and desktop environments of higher quality rather than more of questionable quality, then?

Maybe there is a question of ego, or maybe there is a problem with the bazaar itself. But the fact remains: GNU/Linux has missed all the chances to become a mainstream desktop operating system, and I do not want to use a niche operating system any more. This was a very difficult decision, and I am really sorry for that, but I need something that just works, and I need it now.

So long GNU/Linux, so long KDE, you served me well.

My new desktop operating system? Mac OS X. Do I love it? No, I actually hate it at times, but I will come back to that another day.

How to blow 1.6 million EUR

The University of Smallville needs to build a new student centre. The centre will offer services to students such as programme enrolment and exam registration, and will provide a new auditorium, library, swimming pool, gym, etc.

On the 14th of September 2006 the University Board decides to initiate a project called MegaCentre for the new student centre, to which it allocates half a million EUR. The University has, among others, a Department of Architecture, a Department of Engineering and a Department of Facility Management. One might expect that the University Board would assign the management of MegaCentre to one of these Departments. On the contrary, however, the University Board assigns the management of MegaCentre to the Student Affairs Centre. The Student Affairs Centre forms a working group composed of a project leader, a project co-leader, a technical leader and two co-workers. Again, one might expect that someone from the Department of Architecture, Engineering or Facility Management would cover one of these roles. On the contrary, however, all members of the working group, except for the technical leader, belong to the Student Affairs Centre. The project leader and co-leader do not have specialist educations in architecture or engineering. The technical leader of the working group has an education in engineering but does not belong to the University. The working group spends more than one year and half a million EUR planning MegaCentre.

On the 13th of September 2007 the working group presents the plans for MegaCentre to the University Board. According to these plans, the construction of the building will be assigned to the external construction company Nonchalant, which guarantees the use of state-of-the-art construction techniques. Moreover, once the building comes into service, the maintenance will be assigned to the Department of Facility Management. The University Board accepts the plans and allocates an additional 0.9 million EUR to the project. Nonchalant spends more than one year on construction of the building, on completion of which it presents a bill of 1.1 million EUR.

On the 4th of February 2009 the building is inaugurated with due ceremony, after which it enters into service. Unfortunately, faults in the building’s design immediately become evident, with problems such as poor insulation, a leaky roof, an unreliable alarm system and poor handicap access, to name but a few. Both employees and students soon become frustrated. Again, one might expect that the working group of MegaCentre would demand that Nonchalant honour its contractual agreement, repair all faults and pay any necessary fines for damage caused. On the contrary, however, the working group simply allows the Department of Facility Management to deal with the faults as they see fit. The Department of Facility Management hires construction workers and assigns them to the repairs and alterations. The construction workers do what they can, but after one year many design issues remain unresolved. The head of the Department of Facility Management, who has an education in engineering, decides to perform a thorough evaluation of the building. On doing so, he discovers that the building is constructed with obsolete, rather than state-of-the-art, techniques, and that these would not guarantee minimal safety in the event of a natural disaster. Finally, he concludes that it will in fact be necessary to reconstruct the building from scratch using appropriate techniques.

On the 29th of April 2010 the head of the Department of Facility Management presents the evaluation to the University Board. At this point the University Board finally acknowledges that severe action must be taken and sues Nonchalant for damages, excludes the Student Affairs Centre from the project, hands the management of MegaCentre to the Department of Facility Management and fires the employees responsible for public money wasted hitherto.

Do you find this story unbelievable? Well, now replace the name Smallville with Bergen, MegaCentre with EksternWeb and Nonchalant with Bouvet, and read it again here.

The big 3-0

Yesterday the universe had plenty of happenings: a winter solstice, a total lunar eclipse, the darkest night in 400 years, and, last but not least, the last day of my twenties… Yes, it had to happen: I turned 30 today. “What is important is to be young at heart”, some might say… Bullshit! I honestly hate this big 3-0 and all the social expectations that it implies. Anyway, entering a new decade always triggers some self-reflection. I have looked back at the last decade of my life, and, inspired by the novel Caos calmo, I have written down some of the things I have done during these years:

Countries visited

United Kingdom

Mountains climbed

Fløyen (400m)
Ulriken (640m)
Rundemanen (560m)
Sandviksfjellet (417m)
Lyderhorn (396m)
Damsgårdsfjellet (350m)
Løvstakken (477m)
Corno grande (2912m)
Preikestolen (604m)
Kjerag (1110m)

Airlines taken

Estonian Air

Laptops owned

Sony Vaio PCG-FX801
Toshiba Satellite A100-703
ASUS Eee PC 1101HA
Dell Latitude E6500

Mobiles owned

Nokia 5110
Nokia 6110
Nokia 7110
Sony Ericsson Z1010
Siemens MT50
Siemens C55
Sony Ericsson K610i
Nokia 5800 XpressMusic

Cameras owned

Sony Cyber-shot DSC-P72
Canon EOS 350D
Olympus μ 1030 SW

Cars owned

Škoda Fabia 1.9TDI (2001 ed.)

Motorcycles owned

Yamaha FZR600 (1994 ed.)

There are actually many other things I could write down, but all in all I have been lucky to have had so many opportunities. I am curious to see what these lists will look like in ten years’ time…

Asus Eee PC 1101HA, Intel GMA500 (Poulsbo) and the shattered dream of the out-of-the-box GNU/Linux support

Do not buy an Asus Eee PC 1101HA or any netbook/laptop with an Intel GMA500 (Poulsbo) video chipset if you plan to run GNU/Linux on it.

Unlike Intel’s other video chipsets, the GMA500 is not developed in-house but is based on Imagination Technologies’ PowerVR, which is barely supported under GNU/Linux. The GMA500 drivers are so messy that it is challenging even to get the native display resolution. You can read more about how Intel is ruining its relationship with the GNU/Linux community on Linux Journal and Ars Technica.

I spent about 3800 NOK (460 EUR) on an Asus Eee PC 1101HA last Saturday. Now I cannot return it to the reseller. In other words, I am screwed.

NWPT 2009 and Danish language

I have not written any post about my summer vacations in Italy, Spain and Hungary, but they are now far behind and I will skip them. I just want to share my experience at the last conference I participated in, namely the Nordic Workshop on Programming Theory in Lyngby, north of Copenhagen, Denmark.

As always, I travelled together with Adrian, and this time I had to share the hotel room with him, since my travel budget for 2009 has been in the red since July… Fortunately, Adrian is employed not at the University of Bergen but at the Bergen University College, and it seems that funding is less problematic there. ;)

The conference was very well organised and covered a great many topics of computer science. Adrian and I spent a lot of time modifying the slides rather than listening to the talks, but the presentations of our two abstracts went fine in the end.

During my stay I had the chance to test my skills in Scandinavian languages with some locals. Just for the record, written Danish and Norwegian (in the bokmål variant) are rather similar, so similar that reading Danish is not a problem for me… But the spoken counterparts are definitely very different. Spoken Danish sounds to me like a continuous stream of (guttural) sounds, with no chance of telling where one word stops and the next starts. :) I hope that no one will take it personally if I say that it seems like Danes do not make any effort to pronounce words clearly.

But there is even more… Danish has a rather weird number system. The tens from fifty onwards are not based on the number ten, as is the case in most European languages (French being another notable exception). This strange system combines two archaic ways of counting: twenty-based instead of ten-based, and fossilised expressions for two and a half, three and a half and four and a half. This is the result:

50 halv-tred-s(ind-s-tyve) half-third-t(imes-of-twenty)
60 tre-s(ind-s-tyve) three-t(imes-of-twenty)
70 halv-fjerd-s(ind-s-tyve) half-fourth-t(imes-of-twenty)
80 fir-s(ind-s-tyve) four-t(imes-of-twenty)
90 halv-fem-s(ind-s-tyve) half-fifth-t(imes-of-twenty)
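
In other words, each of these tens is a multiple of twenty, where the “half” forms mean half a twenty short of the next full multiple:

50 = (3 − ½) × 20 = halvtreds
60 = 3 × 20 = tres
70 = (4 − ½) × 20 = halvfjerds
80 = 4 × 20 = firs
90 = (5 − ½) × 20 = halvfems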

After this experience, I think that these Norwegian comedians are not so far from reality. ;)

TOOLS 2009

This time it was the TOOLS 2009 conference in Zurich, Switzerland. Adrian and I arrived on a Saturday, without any particular plan for the evening. Many locals suggested that we go to Lucerne because of the first edition of the Lucerne festival. Adrian managed to convince me to go, and I have to admit it was a good idea. Plenty of people, plenty of music, plenty of local food and drinks. And right after sunset, the best fireworks I have ever seen: 25 minutes of pyrotechnic show with lights coming from both sky and lake… Amazing!

All the stereotypes about Swiss precision and efficiency were destroyed in one go on the way back to Zurich. We were supposed to take the train from Lucerne at 2:30, but too many people probably shared the same plan. :) The result was kilometric queues at the ticket machines and people packed into trains like in India. The train we took did not even arrive in Zurich, and despite the promises of the railway personnel there, no further train came for an hour. In the end, tired of waiting, we took a taxi back to the city.

Well, despite this “original” start, the conference went very well. The ETH, which hosted the conference, is located on top of a hill with a nice view over the city. The event was well organised and composed of several co-located conferences and workshops. Adrian gave a brilliant presentation of our latest work “A Diagrammatic Formalisation of MOF-Based Modelling Languages”, which prompted many questions. I feel that the goals of our participation in the conference were all fulfilled. The city was lovely and welcomed us with warm summer weather. The food was also great; to eat authentic Fondue and Rösti once more was a pleasure. :)

I left Zurich by train on Saturday, and my destination was not Bergen but Tortoreto, my home town in Italy. The trip home was a sort of odyssey. The train I took in Milan had a broken air-conditioning system, and I had to stay inside it for five hours with no chance to open the windows… And if this was not enough, the catering services of the Italian railways were on strike that very same day, i.e., it was not even possible to buy water! Italy somehow manages to remind me every time that the choice of moving abroad was the right one.

Iomega UltraMax Plus — A Linux-friendly External Hard Drive with RAID support

Lately my home folder began to run out of space, so I started to look around for an external hard drive. I wanted a solution comprising at least 1TB of space, RAID 1 support and a USB 2.0 connector (since the last NAS I tried did not transfer at more than 10Mbit/s). Obviously, the drive had to work out of the box with GNU/Linux.

The most popular solution seemed to be the WD MyBook Mirror, but various GNU/Linux forums, including the Ubuntu ones, had many posts reporting compatibility problems. The RAID control software is Windows-only, and the drive tends to spin itself down under GNU/Linux, causing the kernel to give up on it and disconnect the device.

It seemed almost like there was no other solution, but then I found out that Iomega produced exactly what I was searching for. The Iomega UltraMax Plus includes eSATA, USB and FireWire interface connections plus RAID 0, RAID 1 and JBOD features. And it even looks cool! ;) I could not find any information about possible compatibility issues with GNU/Linux, but I decided to buy it anyway. I have not experienced any issue since I received it one week ago. The RAID configuration is chosen via a hardware switch on the back of the drive, and GNU/Linux seems to handle it properly. I recommend it to anyone.


NWPT 2008

As always with some delay, I can finally write a bit about my experience at the NWPT’08 workshop in Tallinn, Estonia. The trip started immediately with some strong emotions: my colleague Adrian realised that he had left his passport at home just before catching the taxi to the airport, so we had to ask the driver to rush to his place first and to the airport next. Fortunately we made it, and late in the night we were in the old city of Tallinn.

The workshop encompassed several theoretical presentations, and I have to admit that I did not understand many of them, but this is probably (hopefully? :) ) normal when people come from very different areas of research. I finally had my first presentation as well. We had two extended abstracts accepted at the workshop, and I presented the one titled “Version Control in MDE”. Despite the initial jitters, I have to say that the presentation was smooth.

I had the chance to wander around the old town of Tallinn during the weekend, and I loved it. There is a lot of history everywhere, and it sometimes feels like stepping back in time. However, despite its old-fashioned look, Tallinn is really ahead of its time when it comes to the Internet. In fact, Internet access is available for free everywhere through wifi. Note that by “for free” I do not mean that you can steal the connection from some unwary network owner, but that it is provided by the municipality. The Estonian people I met seemed very helpful and friendly, and most of them were able to speak English fluently. In the end, it was a very nice experience, except for one last detail…

The journey back to Bergen was a sort of odyssey. We had a connecting flight to Bergen, with a stopover in Copenhagen. On the day we were supposed to leave, we woke up in the middle of an extreme snow storm. Our flight was not cancelled, so we had to reach the airport at 16:30, with expected departure at 18:00. The taxi driver even had problems getting to the airport because of the loads of snow all over the streets. The situation did not look promising at all, but the airport kept delaying our flight rather than cancelling it. After waiting many hours at the gate with no precise information, the flight was eventually declared cancelled at 00:00. The airline could not provide us with a hotel, since they had had to handle so many cancellations during the day. After having our flight rescheduled for the day after, we had to go back to our hotel, where fortunately they had two rooms available for the night. But the story does not end here… We had exactly the same schedule the day after, but luck was not with us. The flight took off at 21:00 instead of 18:00, and we obviously missed the connection in Copenhagen. We hoped that they could reschedule us onto the last flight from Copenhagen to Bergen at 22:45, but guess what? It was cancelled due to another snow storm in Stockholm… So another night abroad, this time at least at a Radisson SAS hotel. :) After more than two days of journey, we finally landed in Bergen the following morning…