Saturday, May 30, 2009

More Waves

For me and many of my friends, the licensing of Wave is essential. The patent license has already been posted; it is short and it is sweet.

"Subject to the terms and conditions of this License, Google and its affiliates hereby grant to you a perpetual, worldwide, non-exclusive, no-charge, royalty-free, irrevocable (except as stated in this License) patent license for patents necessarily infringed by implementation of this specification. If you institute patent litigation against any entity (including a cross-claim or counterclaim in a lawsuit) alleging that the implementation of the specification constitutes direct or contributory patent infringement, then any patent licenses for the specification granted to you under this License shall terminate as of the date such litigation is filed."

In addition to the license, a contributor to the protocol has to sign either an individual or an organisational contributor license agreement. And in order to get a change to the protocol accepted, a production-quality patch for the reference implementation has to be provided as well.

For Wave technology to succeed, it must be easy to adopt and be adopted as widely as possible. Given that Wave is based on peer-to-peer technology, cooperation can only be ensured by making client and server applications freely available. Restricting the use of the software prevents cooperation in a Wave and reduces the usefulness of the functionality.
Thanks,
      GerardM.

Friday, May 29, 2009

Google Wave

I just watched yesterday's developer presentation of Google Wave. It stole the thunder of Bing, and deservedly so. It is the difference between a completely proprietary search system and an open source system that integrates many of the paradigms of the modern Internet. It is the difference between a "me too" system and something "new".

I think Wave has the potential to make a big difference. The one thing that struck me was how many of the aspects that make a wiki are already integrated. The usability of this still very early product is awesome. When you see this demonstration and you think Wikipedia, you may find, like me, that much of the functionality is already there... Sure, I have not seen "flagged revisions" or "citations" yet, but both are the kind of functionality that so many people and organisations will need that I am sure they will come.

What do you think: is this the kind of open source functionality that would fit Wikipedia?
Thanks,
GerardM

Thursday, May 28, 2009

Encoding the Pahawh Hmong script

Pahawh Hmong is a script used to write the Hmong language(s). Distinct scripts are quite the norm among Asian languages, and it is for this reason that the Hmong take pride in the existence of "their" script. Pahawh Hmong was developed in 1959 and has since evolved into three variants, which complicates things a lot.

At this time there is an opportunity for this script to be included in Unicode. It is a good time to do this: there is still someone alive who knew Mr Shong Lue Yang, and all it takes is some plane tickets and a salary for the person researching and encoding the script into Unicode.

It is not just Wikipedia that requires Unicode for a script; most applications do. Without it you may still write the language with a fountain pen, but effectively it is not part of this modern world.
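As a minimal illustration of what is at stake (a sketch of mine in Python, nothing to do with the actual encoding proposal): an encoded character carries a standard name and category that software can act on, while an unencoded script is at best parked in the Private Use Area, where none of that information exists.

    import unicodedata

    # An encoded character has a standard identity that software can use
    # for fonts, searching, sorting and spell checking.
    print(unicodedata.name("a"))      # LATIN SMALL LETTER A
    print(unicodedata.category("a"))  # Ll: letter, lowercase

    # A Private Use Area code point, the only refuge for an unencoded
    # script, has no standard name and no script identity at all.
    pua = "\uE000"
    print(unicodedata.category(pua))  # Co: private use
    print(unicodedata.name(pua, "<no standard name>"))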

As it is, there is no money for the encoding of the Pahawh Hmong script...
Thanks,
       GerardM

Requests to implement the Babel extension

The Babel extension is one of my favourites. There are several reasons why:
  • it is implemented and supported at translatewiki.net
  • there are many localisations for the messages involved
  • when you install it, all the localisations are instantly available
  • it is an extension 
  • it is not based on templates
The good news is that many people appreciate Babel for what it is. Several Wikipedias have requested for this extension to be implemented. On the German Wikipedia they are even considering putting it to a vote.
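On a user page the extension is invoked with a parser function such as {{#babel:nl-3|en-1}}. As a minimal sketch of the idea (in Python, for illustration only; the extension itself is PHP and this is not its code), such a tag decomposes into a language code and a proficiency level from 0 to 5, with N for native:

    import re

    # Hypothetical parser for Babel tags such as "nl-3", "zh-hans-2" or "en".
    BABEL_RE = re.compile(r"^([a-z]{2,3}(?:-[a-z]+)*?)(?:-([0-5N]))?$", re.IGNORECASE)

    def parse_babel(tag: str) -> tuple[str, str]:
        match = BABEL_RE.match(tag)
        if not match:
            raise ValueError(f"not a Babel tag: {tag!r}")
        language, level = match.groups()
        return language.lower(), (level or "N").upper()

    print(parse_babel("nl-3"))  # ('nl', '3')
    print(parse_babel("en"))    # ('en', 'N')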

There is one problem: the code needs to be assessed. It already has the endorsement of all the developers at translatewiki.net...
Thanks,
      GerardM

Tuesday, May 26, 2009

Free localisation support for five Wikis

When you install the 1.15 stable version of MediaWiki on your system, you only get the localisations available at the time of the release; after that, you do not get regular updates any more. This is not because no more localisation work is done at translatewiki.net for the many relevant messages; it is just because there is no mechanism to get it to you in a timely manner.

The LocalisationUpdate extension will be a game changer; it will bring you all the latest localisations from the Wikimedia Foundation's SVN. It only brings in a localised message when the English definition in SVN is EXACTLY the same as the English definition on your local wiki. So when you have a MediaWiki wiki in a language that is in need of improved localisation support, you can choose to become part of the beta test of LocalisationUpdate.
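To make that rule concrete, here is a minimal sketch (my own illustration in Python; the extension itself is PHP and this is not its code) of the comparison being described:

    # A localised message is only pulled in when the English definition in
    # SVN is exactly the same as the English definition the local wiki
    # already has, so a translation is never applied to a changed source.
    def safe_updates(local_en: dict, svn_en: dict, svn_translated: dict) -> dict:
        return {
            key: text
            for key, text in svn_translated.items()
            if local_en.get(key) == svn_en.get(key)  # source unchanged
        }

    local_en = {"welcome": "Welcome, $1!", "save": "Save page"}
    svn_en = {"welcome": "Welcome, $1!", "save": "Save changes"}  # changed upstream
    svn_nl = {"welcome": "Welkom, $1!", "save": "Wijzigingen opslaan"}

    print(safe_updates(local_en, svn_en, svn_nl))  # only "welcome" is updated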

TheDevilOnLine is willing to support five wikis in five different languages that want to test this new functionality. To be eligible, you will have to upgrade to release 1.15.
Thanks,
       GerardM

Friday, May 22, 2009

Update on LocalisationUpdate

When you are testing something nice like the LocalisationUpdate extension, as I am at the moment, you have the privilege of following its development. A lot has changed since it first materialised in Subversion; so much, in fact, that it may be cheaper to rewrite the code with the lessons learned in mind. It is not much code...

So what has happened already:
  • currently there is an internationalisation file
  • the performance has improved a lot
  • it just works.
While there is still a need for code clean-up, it works quite sweetly. The latest test run just reported that it had updated 2296 messages; for comparison, the MediaWiki core currently has some 2275 messages...
Our expectation is that with timely updates of the localisation, more people will help out with this necessary work. There are bets going around about what its impact will be; I have heard numbers as far apart as 5% and 40% more messages half a year after the implementation on the MediaWiki projects.
Thanks,
      GerardM

Wednesday, May 20, 2009

Approaches to mass digitisation in the Netherlands

I found this presentation on SlideShare by Paul Keller about mass digitisation in the Netherlands. I want us to cooperate with museums, libraries and archives; I want much of their material to be available on Commons. I love the idea that the cultural heritage of the Netherlands is being digitised; it is one of the best ways of ensuring that something is left when disaster strikes.
 
My problem with the project presented by Paul is that it does not state what license will be used for the digitised material. This may be on purpose, and I expect that it is, but the consequence is that nobody outside a narrow band of people can safely use the material. It may be that CC-BY-NC-SA is intended, but as this is not stated explicitly, it is not safe for anyone to use. Even then, Wikipedia would not touch it because of the restriction imposed by the "non-commercial" part.
Thanks,
       GerardM

PS it would make sense for all this digitised material to be stored somewhere high and dry ... above sea level :)


Tuesday, May 19, 2009

Waiting for your localisations to materialise

One of the more frustrating things for the localisers at translatewiki.net is the waiting: the waiting for the localisations to go live.

The workflow for internationalisation and localisation is as follows:
  • Messages are imported from SVN into translatewiki.net
  • The messages are checked and possibly improved
  • The messages are made available for localisation
  • The localised messages are checked by the translatewiki.net developers
  • The messages are submitted to SVN
  • The messages go into production when MediaWiki is updated
The last step is a bottleneck: updates happen when they happen, and the current revision, 48811, has been in production for a few weeks now. This is not good for the motivation of the people who localise, so a mechanism is needed to make sure that the messages become available in a timely manner.

A new extension, "LocalisationUpdate", is being created that will make a difference. In its current incarnation it can be run either from the command line or as a cron job. This is going to make it possible to move the messages to production sooner. For the technical people: only those localised messages will be updated where the English message in SVN and in the target MediaWiki installation are exactly the same.
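As a hypothetical example of the cron route (the script name and path are my assumptions for illustration, not a documented interface), a nightly crontab entry could look like this:

    # Hypothetical: fetch updated localisations from SVN every night at 03:00.
    0 3 * * * php /var/www/wiki/extensions/LocalisationUpdate/update.php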

LocalisationUpdate will first be available for release 1.15 and 1.16alpha. We hope to make it available for older releases at a later date.

A big thank you to TheDevilOnLine; he is realising this dream...
Thanks,
      GerardM

Wednesday, May 13, 2009

Djatoka with OpenLayers

Some of the pictures at Commons are huge. They take forever to load, and they start loading at the top, typically not the most exciting part of a picture. It makes more sense to load a low-resolution picture initially and download the higher-resolution bits as and when required; this gives the average user a much more responsive experience. In this demonstration of the Djatoka software, you will find many pictures by Ansel Adams of the Japanese-American internment at Manzanar. The scaling of the pictures is done by Djatoka in combination with OpenLayers.
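The idea behind such viewers is a tile pyramid: every level halves the resolution, and the client only fetches the tiles that cover the current viewport at the current zoom. A minimal sketch of the arithmetic (my own, in Python; Djatoka itself serves JPEG 2000 image regions):

    import math

    # Build the levels of a tile pyramid, from full resolution down to a
    # single tile, halving the dimensions at each step.
    def pyramid_levels(width, height, tile=256):
        levels, level, w, h = [], 0, width, height
        while True:
            cols, rows = math.ceil(w / tile), math.ceil(h / tile)
            levels.append((level, w, h, cols * rows))
            if cols == 1 and rows == 1:
                break
            w, h = math.ceil(w / 2), math.ceil(h / 2)
            level += 1
        return levels

    # A 12000 x 9000 pixel scan needs about 1700 tiles at full resolution,
    # but the coarsest level is a single small tile: a cheap first view.
    for level, w, h, tiles in pyramid_levels(12000, 9000):
        print(f"level {level}: {w}x{h}px, {tiles} tiles")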
Thanks,
GerardM

Monday, May 11, 2009

Usability improvements for translatewiki.net

The Special:Translate page on translatewiki.net has been revamped. When there is a hierarchy like the one for MediaWiki messages, it now shows this in a much improved way. A relatively small change in the presentation makes it easier to understand what is going on and what the relevance of particular messages is. Not only does it look better, it also makes it easier to navigate all the messages and projects that are supported in translatewiki.net.
Thanks,
      GerardM

Sunday, May 10, 2009

OpenLayers is now supported in translatewiki.net

OpenLayers is software used by OpenStreetMap, and the functionality and content of OpenStreetMap are going to be used by Wikipedia. One of the MediaWiki Google Summer of Code projects is the "Semantic Layers" extension; it will provide an integration layer between OpenLayers and Semantic MediaWiki.

All these facts lead to the inescapable conclusion that OpenLayers is, or will be, relevant for the projects of the Wikimedia Foundation. A big thank you is due to RobertL, who wrote the functionality that makes it possible to internationalise OpenLayers and localise it in translatewiki.net. Consequently, the whole stack for the support of maps can now be in your language.

Please help us localise OpenLayers in translatewiki.net.
Thanks,
GerardM

Saturday, May 09, 2009

Inaugural meeting of the Concept Web Alliance

The Concept Web Alliance (CWA) is a new organisation that aims to bring together the best that science has to offer, to make for better science, and to inform better about science.

The key to the activities of the CWA is the realisation that there is an overwhelming amount of data around, available in many formats: data that could be of use in many environments, and that would be of far more use if there were commonalities between the data in those different environments.

The Semantic Web, triples, disambiguation, authentication, death switches, multilinguality, community involvement and Wikipedia were just a few of the many, many subjects mentioned and discussed that had my interest.

What made this meeting so powerful was the presence of so many influential people representing so many countries, organisations and domains.

I had the pleasure of presenting a demonstration project that showed how XML data can be imported into an integrated text-and-data environment with Semantic MediaWiki, and how the data can then be exported as triples...
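To give a flavour of that round trip (a minimal sketch of mine in Python, not the actual demonstration code; the element names are made up), structured XML goes in and subject-predicate-object triples come out:

    import xml.etree.ElementTree as ET

    # Hypothetical source data: a record with two facts attached.
    doc = """
    <proteins>
      <protein id="P04637" name="p53">
        <organism>Homo sapiens</organism>
      </protein>
    </proteins>
    """

    triples = []
    for p in ET.fromstring(doc).findall("protein"):
        subject = p.get("id")
        triples.append((subject, "name", p.get("name")))
        triples.append((subject, "organism", p.findtext("organism")))

    for s, pred, o in triples:
        print(f'<{s}> <{pred}> "{o}" .')  # e.g. <P04637> <name> "p53" .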

Thursday, May 07, 2009

Referata in anything but English

I am in New York and I am meeting Yaron Koren. Yaron is one of the Semantic MediaWiki greats and he is the man behind the referata.com websites. I consider the referata websites "best of breed" and I am a fan of the Food finds wiki. Food finds is about finding restaurants all over the world.

When you are gathering information from all over the world, it makes sense to use a "universal language", and as such it can be argued that English qualifies. This does not mean that there is no room for Semantic MediaWikis in other languages. Yaron told me that he was impressed by the large number of localisations that came his way when all his extensions became supported at translatewiki.net.

My challenge is to have a best-of-breed Semantic MediaWiki in another language, any other language. But would it not be great if that language were something "exotic" like Arabic, Tagalog or Gujarati? I am not suggesting that Semantic MediaWiki is completely localised in those languages yet, but it can be.

Would this not be cool?
Thanks,
       GerardM

Friday, May 01, 2009

Everything can be compared. It still represents 'the known universe'

It is the month of May again, and the new "group statistics in time" of translatewiki.net are available again. The numbers do not lie, they say. When you compare the current numbers with last year's, you will find progress. The progress that impresses me most is in the "Extensions 65%" group; it grew from 17 to 30, and consequently Siebrand's goal for 2009 has been reached.

These numbers demonstrate that when people get into localisation, coming back to do some more work becomes a habit. They also demonstrate that, slowly but surely, the extensions that are not used by the WMF are being localised. When you notice that only 35 languages have 90% of the WMF-used extensions localised, you may appreciate why I appreciate the dedication that makes MediaWiki such a wonderful platform.
Thanks,
GerardM