Tuesday, November 29, 2011

#MediaWiki release 1.18 is out

The new MediaWiki stable release has been announced and the list of new features is impressive. If you run your own MediaWiki installation, you will definitely want to consider upgrading.

There are several innovations that are quite relevant:
  • jQuery is included as standard and is actually used
  • several new languages have been added
  • several extensions are now included as part of the distribution
  • the qqx messages trigger a screen that helps identify specific messages
add &uselang=qqx at the end of the URL
An "in your face" message is often hard to identify. Adding &uselang=qqx to the URL will give you the information you need to identify the message at translatewiki.net for localisation or local modification. For localisers this is a very significant tool.
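As a sketch (the wiki URL here is made up), the parameter can be appended safely even when the URL already carries a query string:

```python
from urllib.parse import urlparse, parse_qsl, urlencode, urlunparse

def with_qqx(url):
    """Return the URL with uselang=qqx added, preserving existing parameters."""
    parts = urlparse(url)
    query = dict(parse_qsl(parts.query))
    query["uselang"] = "qqx"
    return urlunparse(parts._replace(query=urlencode(query)))

print(with_qqx("https://example.org/wiki/Main_Page"))
# → https://example.org/wiki/Main_Page?uselang=qqx
```

Reload the page with that URL and the interface shows message keys instead of message texts.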

Given that extensions are now included in the release, it would be a good idea to also bundle, in the next release, the three extensions that matter most for wikis whose user interface is not in English: the ones that keep localisations up to date and that provide input methods and fonts.
In this way a language will always be up to date with the latest localisations and will be supported with input methods and fonts that make a wiki accessible to as wide a group of people as we can. What would be really awesome is if new and improved keyboard methods and fonts could be distributed in a similar way as the messages...

Saturday, November 26, 2011

A #translation sprint and an intro to #MediaWiki hacking

The verdict on the records is not in yet; the day is not done. The biggest moment was when a Panjabi woman was thrilled to find she could just type Panjabi in the Symbiosis computer lab.

This was a revelation shared by other people in the room for other languages. Support for many Indic languages is available in many operating systems. Never mind how cool it is to be supported by Narayam and WebFonts; it is much better to have support locally on your own device. This provides you with the support you deserve not only in Wikipedia but also in all the other applications you run. You know what, installing this is not that hard.

By this time 40+ new people had created a profile at translatewiki.net, and many people experienced that it is not that hard to support Free Software by making it available in their own language.

The second part of the day, the technical part, was too short, but many people got an idea of what MediaWiki and its infrastructure are about. Being in Pune has been a privilege for us, and it is really cool to see the functionality we are so passionate about being used in earnest.

Friday, November 25, 2011

#Translation sprint in Pune is one for the whole world

The #Wikimedia #Localisation team is in Pune, #India. Tomorrow we have a translation sprint at the university and our intention is to seriously beat some records. The record for the most translations in an hour, the record for the most translations in a day, the record for the most translators contributing in a day; they are all up for grabs and we intend to break them.

We expect some 50 people at the university. They are all going to translate and they are all going to start together. Our team will be available to explain, support and authorise. But if we can do it for the people in Pune, we can also do it for the people of India who want to help us. And while we are at it, we can do it for people from wherever.

When we are done at the university, we will be flying home, happy to see our families again. But we are not the only ones who support the translatewiki.net community. It will be interesting to see how we cope, how well the server copes and how hard it will be to break these records, now and in the future.

So help us today, localise for your own language and break every record we have in our book.

Thursday, November 24, 2011

The #Tifinagh script for #Berber languages can be typed

In the #Wikimedia #Incubator, the Tifinagh script is used for one of the Berber languages. It is the original script for these languages and it is there to be used. Using it is hard when your computer does not support it.

The Narayam extension allows the use of input methods on top of an international keyboard. For a script like Tifinagh to be supported, it helps when an organisation puts its weight behind a particular input method. For the Tifinagh script that organisation is the Royal Institute of the Amazigh Culture; they defined a keyboard, they provide a manual on how to use it, and we are happy to support this keyboard in Narayam.

The Royal Institute has produced many beautiful fonts; it would be nice if we could use these as well and truly promote the Amazigh culture worldwide.

Wednesday, November 23, 2011

Hackasaurus or let's edit #HTML

Wikisyntax: it has people waking up screaming at night. Developers hate to touch the MediaWiki parser; rumour has it there are maybe only one or two people who have a fighting chance of working on it.

When Hackasaurus was introduced to me, it provided me with a tool to see what the expected HTML looks like for a page with content in multiple languages: a language tag on content that is in a different language from its surroundings.

<p lang="nl"> and finally </p>
The fun thing about this tool is that you can change things easily. It is meant to be easy, as it is intended to teach kids HTML. We specified the language in the "p" tag. It is very much a visual editor and even adults love it. Some people say you can do everything in HTML5 that wikisyntax allows you to do. As wikisyntax was meant to hide the HTML, it would be nice to consider whether we have not come full circle: drop the wikisyntax and just use this later version of HTML.
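The pattern behind it is simply the lang attribute: mark content that is in another language so browsers and screen readers can handle it correctly. A tiny illustrative helper that builds such a fragment:

```python
from html import escape

def lang_span(text, lang):
    """Wrap text in a paragraph tagged with its language code."""
    return f'<p lang="{lang}">{escape(text)}</p>'

print(lang_span("en tenslotte", "nl"))
# → <p lang="nl">en tenslotte</p>
```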

Narayam will look like this

The #India hackathon produced excellent results; one of the highlights was the online keyboard for Narayam. It will need some modifications but it at least allows you to SEE where these bleeding keys are on an International keyboard.

The day after the hackathon we tried to install the patch only to find that some files were sadly lacking. The good news is that after a few e-mails there is not only a screenshot but the missing files are now available to us as well. A big thank you to Abhijeet Pathak for his contribution and his kind support.

Monday, November 21, 2011

#Agile is about getting functionality out

Many software development projects fail to deliver functionality in a timely manner. Often all functionality is released in one go and it has to function to the nth degree. Usually it doesn't, and usually the users of the new functionality grumble and object. The Wikimedia Localisation team does not produce everything in one go. What it does is produce functionality that is described in a story:
Alolita wants to edit to localise the translatewiki.net user interface in Hindi and uses the Narayam functionality to mimic a Devanagari keyboard. She is not able to type blind and she uses a mock-up of the keyboard to find the keys she needs to press
In Agile this is called a "user story" and for us it is part of the Narayam "epic story". Such a story is analysed, it is divided into tasks, the tasks are given points, and every fortnight it is decided which stories to implement in the next sprint. Those stories can be quite rough around the edges; consider that a story that takes as its premise that the user can touch-type Hindi is already in production.

The point of the stories is that they describe precise functionality and this is what is realised in a limited time frame. These stories are selected at the beginning of a sprint and consequently not only does new functionality get frequently realised and implemented, priorities for what the language team is to do are also constantly re-assessed.

Our hope is that our users, the wiki communities, will find us responsive. Our aim is to make the gap between the user experience in English and in any other language as small as we can make it.

Sunday, November 20, 2011

The #Wikipedia challenge for an AppStore

When an application has been developed and tested and is ready, it gets into an AppStore. Except that a developer develops, and when he is done the application still needs to be localised.

As Wikipedia exists in over 270 languages, you can imagine that the localisation of any application is a continuous process at translatewiki.net. The challenge for all the AppStores: supporting this continuous stream of new localisations.

Even existing users of a Wikipedia application may be in need of new and improved localisations. Really, how are these stores going to cope?

Typing made easier for six languages

The #MediaWiki Narayam extension provides keyboard mapping. This means that when you have an international keyboard, the keystrokes are mapped to what they would be on a keyboard with another layout. This is a great help when you can touch-type.
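The core of such a mapping is a longest-match substitution from keystroke sequences to characters of the target script. A minimal sketch with a made-up three-rule Devanagari table (not the actual Narayam rules):

```python
# Hypothetical mapping from Latin keystroke sequences to Devanagari.
RULES = {"ka": "क", "kha": "ख", "a": "अ"}

def transliterate(text):
    """Apply longest-match rules left to right; pass unknown characters through."""
    out, i = [], 0
    while i < len(text):
        for length in (3, 2, 1):  # try the longest keystroke sequence first
            chunk = text[i:i + length]
            if chunk in RULES:
                out.append(RULES[chunk])
                i += length
                break
        else:
            out.append(text[i])
            i += 1
    return "".join(out)

print(transliterate("kha"))  # → ख
```

A real input method also handles conjuncts and dead keys, but the longest-match idea is the same.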

The Mumbai Hackathon has provided us with new keyboard methods for Urdu, Panjabi, Gujarati, Bodo, German and Marathi. All these keyboard methods are available now for testing in translatewiki.net.

Photo Victor Vigras
These keyboard methods are a step towards making it as easy to use MediaWiki in these languages as it is in English. This challenge is one for the Localisation team. The challenge of enhancing usability for English is outside its remit; that is what all the other developers are working on, but keeping up with their good work will keep the team occupied.

Saturday, November 19, 2011

Hackathon #India day one

With a large group like this, with so much talent, a lot can be done in one day. When the achievements of the day were presented, they amounted to quite a lot.

  • Kiwix has been localised in four languages
  • The new Lohit fonts were loaded on translatewiki.net and tested
  • new bugs for the Lohit font were discovered and bugs were reported
  • an SMS gateway for sending Wikipedia articles was tested
  • an input method was created for Marathi
  • the Android app for MediaWiki is now supported at translatewiki.net

Getting the knack of Narayam

Narayam provides input methods for several languages of India. This is watched with keen interest by many Wikimedia communities; their need for an input method for their language is no less profound.

At the India hackathon people expressed interest to create an input method for Marathi or Panjabi. At the WikiConference India there was interest to create an input method for the Angika language.

When technical people and language people get together, they can do awesome stuff. The language people know what it should look like while the technical people can make it happen. Doing this in an organised way however is the difference between a hack and a solution.

To create solutions we have our language support teams; once hacks are tried and tested and meet all the initial requirements, they can become a solution by being passed on to downstream applications. When a Marathi keyboard method has been defined, why not have it in a Linux distribution as well, or defined in the applicable ISO standard?

Yes, it is great to start with hacks but our languages are not only used in MediaWiki for Wikipedia.

Friday, November 18, 2011

WebFonts are ready for your comments

At the WikiConference #India, one of the main topics is support for the languages of India. The government of India acknowledges in its best practices that not everybody speaks English. MediaWiki may not be used by the Indian government, but we certainly aspire to bring these best practices into reality. In summary, we aim to make Wikipedia and all its sister projects as easy to edit in, for instance, Malayalam or Oriya as in English.

One of the stumbling blocks for us is that many of the computers sold do not come with support for Indic fonts or with keyboard mappings for Indic languages. The Localisation team has worked hard to overcome these two very basic issues. For many languages the Narayam extension is now live and allows people to type. The next phase is providing WebFonts: fonts sent to your computer along with the data for the articles.
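Under the hood, serving a web font boils down to a CSS @font-face rule that points the browser at a font file, plus a style that applies it. A sketch that builds such a rule; the font name and path are made up, and the real WebFonts extension does considerably more:

```python
def font_face_css(family, url, selector="body"):
    """Build a minimal @font-face rule and apply the family to a selector."""
    return (
        f"@font-face {{ font-family: '{family}'; src: url('{url}'); }}\n"
        f"{selector} {{ font-family: '{family}', sans-serif; }}"
    )

print(font_face_css("LohitDevanagari", "/fonts/LohitDevanagari.woff"))
```

The browser downloads the font on demand, so readers see the script even when no suitable font is installed locally.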

We can only provide web fonts when fonts are available for a language and when these fonts are freely licensed. Luckily the fonts used by the Linux operating system are freely licensed and we are happy to provide these, as well as other fonts, to you.

At this time we are testing the use of WebFonts at translatewiki.net and we intend to bring WebFonts into production for all Wikimedia projects in Indic languages on December 12. We will start off with WebFonts for the Indian languages; other languages will follow at a later date. In the run-up to production day, we need you to assess the fonts for us. The questions we have are:
  • do these fonts represent the script as defined in the latest Unicode specification?
  • are the characters readable?
  • are you comfortable with the user interface that allows you to configure WebFonts?
  • is the default we have chosen for your language the best freely licensed font available?
Given that our aim is to bring information to everyone, you will appreciate that our prime objective is to enable more readers and help more people find their way to our projects. We are quite happy to use any freely licensed font that meets our technical requirements.

To help you assess the options available to your language, there should be a page called WebFonts that contains a copy of the India article in your language. 

Thursday, November 17, 2011

#Internationalisation - #India Government best practices

I was pointed to a PDF about "best practices for e-government applications for India". It is a wonderful document. Many of its messages are completely applicable to MediaWiki and to supporting the languages of all our wiki projects.

A few choice quotes:
  • It’s a simple question to ask when you build applications: Who is the user?
  • Most of us don’t speak English
  • Applications that are simple, seamless and complete for every user in India

When Wikipedia is to achieve its purpose, we can paraphrase another great line:
It is our objective now to demonstrate everything stated above available to all Indians regardless of their knowledge of English. It will be native to users’ experience and expectations.
The underlying infrastructure is ready and the implementation can be done. The outcome is cost-effective delivery of useful services to all people using the internet.

Tuesday, November 15, 2011

Gender distinction for the #Portuguese language

Making a distinction based on gender is not a #Wikipedia thing. In selected situations it is, however, a #MediaWiki thing. Gender, and particularly linguistic gender, determines how a man or a woman is addressed or referred to. For several languages the “User” namespace is named differently depending on the gender set in the user preferences.

When our messages support gender, it becomes possible to address people using natural language. This improves readability and consequently it makes MediaWiki a more friendly environment.
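What gender support means for a message can be sketched as per-gender forms with a fallback when no gender is set; the Portuguese namespace forms below are illustrative:

```python
# Illustrative per-gender forms of the Portuguese "User" namespace;
# "unknown" is the fallback when no gender is set in the preferences.
NAMESPACE = {"male": "Usuário", "female": "Usuária", "unknown": "Usuário"}

def user_namespace(gender):
    """Pick the namespace form matching the gender in the user preferences."""
    return NAMESPACE.get(gender, NAMESPACE["unknown"])

print(user_namespace("female"))  # → Usuária
```

MediaWiki messages do the same kind of selection with their GENDER syntax; the localiser supplies the forms, the software picks the right one.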

For the Portuguese language this has recently been implemented. Changing a namespace is different from programmatic changes; it requires someone who can configure the servers. This means that Portuguese did support gender before; it is just that now one of the more visible aspects of the user interface supports gender as well.

Supporting gender is an iterative process. Many MediaWiki messages pre-date gender support, and at translatewiki.net we get a steady stream of requests on our “Support” page to add gender support to specific messages. With so many Portuguese-speaking ladies becoming aware of our gender support, they are bound to find occasions where messages could address them even better.

We welcome any lady from any language (as well as any gentleman) to help us improve the MediaWiki experience and point out more messages that could do with a bit of genderification.

Monday, November 14, 2011

#MediaWiki code review leads to refactoring

Once software is written, it needs review. When you get a great review, it may include both specific instructions and general suggestions. Krinkle did a wonderful job reviewing the current iteration of the WebFonts extension and his instructions and suggestions equally apply to other applications we are working on.

This realisation that instructions and suggestions are equally relevant in other code is powerful; it also indicates that the existing code is written in a consistent way. For the Localisation Team it is not that surprising as it does its part in the development and the maintenance of coding standards particularly for internationalisation and localisation.

Given that all Wikimedia developers review each other's code and given that everyone has their strengths and weaknesses, quality is achieved by reviewing widely. For many Wikimedia developers, reviewing the code of other developers is a reminder that they grew into their role thanks to the people who mentored them and reviewed their code.

Sunday, November 13, 2011

#CLDR, the name of languages in Serbian is in lower case

At #Translatewiki.net we prefer to use information from the CLDR. Sometimes we get a query why a specific message that uses CLDR information is incorrect.
Is there any parser that converts the words between it to lowercase? Value of $1 variable in this message, in Serbian, should be lowercase. So, Translation statistics for Serbian would be Статистика превода за Српски, which won't be grammatically correct since language names in Serbian are written lowercase.
We use the names of languages from the CLDR and we encourage our localisers and our language support teams to look at the content of the CLDR, because we do use it in our MediaWiki software.
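The Serbian issue above can be sketched like this; the CLDR lookup is reduced to a literal, and the sentence is the one from the quote:

```python
# Language name as delivered by a CLDR-style lookup (reduced to a literal
# here), and the lowercase form Serbian grammar requires mid-sentence.
cldr_name = "Српски"

def mid_sentence(name):
    """Serbian writes language names in lowercase inside a sentence."""
    return name.lower()

print(f"Статистика превода за {mid_sentence(cldr_name)}")
# → Статистика превода за српски
```

Whether such a transformation can be applied automatically depends on the language; that is exactly why we prefer fixing the data upstream over hacking it locally.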

When the data provided by the CLDR is problematic, we can hack around it. The fact is that we really, really do not like doing that.

Holy cow, mobile devices morph #Wikipedia

Communicating that Wikipedia will change is the surest way to get into a fight. As far as I am concerned, Wikipedia has to change, and one of the best reasons why is that people are using different devices to access a MediaWiki site.

Yes, Wikipedia will have to change drastically but how will it change? One of the early signals is in what happens in the mobile development. Brandon gave a "brown bag" meeting in the office. It has been recorded and really, if you do not want to be surprised, look at the recording, chew on it and look again.

It touches on recent changes, interwiki links, and whether or not to allow editing. The behaviour on devices is discussed and I really recommend watching it. Watch it again and rethink all the things that are taken as Wikipedia gospel. It is good that things are reconsidered, revisited and reimplemented.

Mobile is where we grow; it is what we need to support more and more in the future.

Friday, November 11, 2011

#Wikimedia highlights for October

Many people are eager to know how the Wikimedia Foundation is doing. Every month there is a long publication in English with everything you want and do not want to know. It can be translated, but it is considered too much of a burden on our translator community. In order to get the main message out anyway, there are highlights that are available for translation at Meta.

Once a translation is ready, the highlights can be referred to on the local village pumps of the projects for a language.

The latest Wikimedia highlights, the ones for October, are looking for translators. When you make a request for translation, the translator should be guided easily to what needs doing. Two improvements to the original request have been implemented:
  • When you click on the article link, you will see the text in the language you prefer
  • When you click on the translate link, you will be shown the text fragments for your language
Meta is a multi-lingual project and it can be expected that people have set their user preferences to their mother tongue. When they do, we can provide them with translations when we have them.

Thursday, November 10, 2011

The page views for #Wikipedia in #India

With a new server collecting the page view data, anomalies are of interest. Do they represent reality or do they indicate something that is wrong? The page view statistics for the Indic languages show anomalies; check the growth for Marathi and Malayalam. Not only do they outpace the Hindi Wikipedia; when you extrapolate the numbers for November and compare them to the already impressive numbers for October, it appears to be too good to be true.

We do expect the Indic languages to do well. The size of the traffic compared to the number of people who speak these languages is really small. When these numbers prove to be correct, it will be really interesting to learn what triggered this.

NB the traffic numbers for English are not restricted to India.

Thank you Domas ! - #Wikipedia stats are here again

As our projects mature, the amount of data they generate grows, and the infrastructure needed to handle that data grows as a consequence. The result is that hardware solutions provided by some of our finest reach the end of the line. Domas provided us with the hardware on which the raw data of our traffic was stored; a whole infrastructure of statistics depended on it.

This included the official Wikimedia statistics. This is where the need for a Wikimedia data solution became all too obvious. In moments of very heavy traffic the old server could not cope, and finally it just ran out of steam. With the new solution we will have more consistent data and more of a growth path, obviously because the WMF is now in control of the whole chain.

Finally, a word of thanks to Domas. As a result of his generous gift we have a lot of statistical data starting at a time when statistics did not have the priority they have today.

Wednesday, November 09, 2011

#Kashmiri uses #Devanagari too

We are familiar with the fact that a language may use only some of the characters that are part of a script. Just have a look at the Vietnamese Wikipedia, for instance; many of its diacritics really look weird, and that is what you get when you insist on using a script that was not designed for a tonal language.

The Devanagari script is used for many languages, and the assumption that all these languages use exactly the same characters is wrong. Kashmiri uses characters not used for, for instance, Hindi. The result is that a font that does not include the additional characters does not suffice, and the keyboard method will not be the same either.

We often do not have enough knowledge to support the languages that have a Wikipedia, and there are also languages waiting in the Incubator that need proper support. We want to support all languages well, and we want to support a language well from the start. Join the "language support team" for your language and help us ensure that we can support your language properly.

Tuesday, November 08, 2011

The #Wikia translation jam

Wikia has projects in more than 100 languages and many people working on the localisation of its software at translatewiki.net. To make this possible, it is vitally important that the developers write code that is both translatable and well documented.

To make this point to their developers, the developers at the Polish office were all invited to join in a "translation jam". This resulted in 253 translations in just an hour. The comments by the developers were really interesting:
  • "Wow, the interface is nice!"
  • "This is fast..."
  • "Oh, so that's the qqq documentation?"
  • "We should give people links to trac in docs."
Translations, documentation and awareness of the internationalisation process; this is indeed the kind of activity that makes Wikia more effective and responsive.

Monday, November 07, 2011

Improved #usability means more work gets done at #translatewiki.net

One item on the "to do" list of Niklas was making it easier to become a translator at translatewiki.net. We have had a procedure to follow for a long time. It is intended to ensure that people know at least some English, which is important because all the software we localise at translatewiki.net starts off in English.

The procedure is now more rational and easier to understand, and a newbie still needs to understand some English. The usability improvements serve their purpose; more people are being admitted as localisers for their language. These new people are certainly as enthusiastic as newcomers to translatewiki.net have always been, and as more people become active, all the projects we support benefit.

As always, there is more work that needs doing. Please consider spending some of your time localising freely licensed software at translatewiki.net.

Freely licensed fonts are ugly, now what!

The #MediaWiki WebFonts extension exists to enable people to read their language. The #Tamil fonts that were initially selected do not deserve the "beauty" prize. Many in the Tamil community are calling for better looking fonts.

The issue is that at the Wikimedia Foundation, we can only use freely licensed fonts. Consequently we are really happy that people are reaching out to the government of Tamil Nadu to make their fonts available under a free license.

At the same time people are working hard to get more existing fonts ready under a free license; they are completing the fonts and they are reaching out to the developers of fonts. They are working hard to make the use of Tamil on any digital device a reality.

In many ways, Wikipedia is a "downstream" user of the technology developed for a language and its script. As there are so many languages, we can stimulate developments, for instance in our "language support teams". The best results come when technology and fonts are available under a free license. That will help us support your language, and it will help any and all other applications that care to provide the internationalisation and localisation needed to make use of it.

Sunday, November 06, 2011

#WCN11 - Wiki loves Monuments

At the WCN11 it was time for the jury to announce the winners of the Dutch Wiki loves Monuments challenge. This was the second time Wiki loves monuments hit the Netherlands and this time there were even more pictures taken.

Many pictures just showed the object, some were of technical excellence but the most remarkable included something extra, something special as well. These were the ones that impressed the jury most. The jury report in Dutch.

A picture with many details: the talking neighbours, the goat going into the pen, the laundry, the beautiful light ... the number one picture, by Harm Joris ten Napel


Celebrating Eid al-Adha in the #Netherlands

If politics has its way, this year will be the last in which Muslims are allowed to celebrate Eid al-Adha in the traditional way. Traditionally a sheep is butchered and its meat is shared with the poor. According to a single-issue political party, the "Party for the Animals", butchering animals in the halal way should not be allowed.

The law that effectively prevents the eating of meat by both Muslims and Jews still needs to go through the Dutch senate. My fear is that this law, which may be well intentioned, will have quite nasty results. At present animals are slaughtered in slaughterhouses. With this law in place, I am convinced that many animals will be butchered just anywhere, because people will celebrate their festivals and they want to celebrate them in the traditional way.

PS Ied mubarak

Old #Hebrew shown as it should

When #Wikisource is to host old sources, it helps when the resulting digital text looks old. Particularly when the way a language is written changed over time, using a different font helps.

In Hebrew, signs may be added to a text that indicate how a text is to be chanted. Such signs are not well supported in most operating systems. Amir has added a default font created by the Culmus Project for texts in old Hebrew (language code hbo).

Testing the waters in this way, a next addition for old Hebrew may be a font that does justice to manuscripts like the Dead Sea Scrolls. It means that our digital text will closely resemble what the manuscript looks like. Having all the characters show properly and in the old style will add respect for the original text.

Once a transcribed text visibly looks the same, it will give added validity to the use of our digital text for comparative research. A project transcribing the Dead Sea Scrolls is the kind of project that will make Wikisource shine.

#WCN11 - Uploading files to #Wikimedia #Commons

A hard nut is being cracked: selecting multiple files to be uploaded to Commons. There is a call out for testing this new functionality.

Not only multiple file upload but other things have changed in the upload process as well. Read what needs testing, give it a go and see how you like it.

Please do create a profile, but also change the language of the user interface to your language.

#WCN11 - Attribution is what a #GLAM needs, what #Wikipedia can provide

In Wikipedia we add references to the facts mentioned in an article; this is best practice in many Wikipedias. The argument for adding references applies just as much to pictures. Our pictures are freely licensed and consequently they are open to abuse by people who “improve” on the original. It is allowed. Many pictures improve as an illustration, and this becomes clear only when you compare them with the original.

The original is not on Commons. Many original pictures are kept in museums and archives. Wikipedia will improve not only its references but also its relations with GLAMs when access to the original picture is made easy. To realise this we need to remove all the unnecessary layers between the picture as it is used on a Wiki and the GLAM.
Using images from Commons is something we know how to do well. What we need to do is add enough of the meta-data to provide more information and a link to the original at the GLAM website.
It will make us friends and it will help us clean up our act; we can include the license information. This will get us closer to what the intention of the creative commons license is.

#WCN11 - #MediaWiki gadgets

At the #Wikimedia Conference 2011, one of the best bits of news was in the presentation by Roan and Timo. They have been working on the rejuvenation of our JavaScript support; this has brought us a lot of improvements in the past, and now they have their sights firmly set on the MediaWiki gadgets.

In summary: the syntax to support gadgets is worse than wikisyntax on steroids, the same gadget may exist in a gazillion manifestations on any of the 800 wikis, internationalisation is a joke, and gadgets as they are are awful. Awful because their infrastructure sucks, not because of a lack of hard work by a community of people fulfilling a need.

Consequently the syntax has to be cleaned up, gadgets are to be shared among the many wikis, they should be able to use all the JavaScript improvements, and as gadgets are to be shared, internationalisation is necessary for them to work well on all those wikis.

The best news; Roan and Timo are already testing gadgets with many of the improvements in place. The gadgets are able to use MediaWiki messages and Siebrand and Roan are talking about implementing gender and plural support for JavaScript.

#WCN11 #Tropenmuseum knows the stories of expeditions

As colonial powers extended their reach, they sent expeditions to those parts of the world where they might find an “el dorado”. Many of these expeditions had a dual purpose, and expanding the knowledge of the world was one. When you were lucky, you lived to tell the tale. You arrived home heavy with notes and artefacts: human, animal, plant and inanimate. These artefacts were analysed and, when that was done, they ended up in the depots of museums. Museums like the Tropenmuseum.

The Dutch anthropological museums all got “their” part of these expeditions and now is the time when thanks to digitisation all the notes and artefacts can come together. Just bringing them together is already a lot of fun but it is only by telling the story to the public that it gains in relevance.

The Tropenmuseum is really happy with many of the results of its donations of Indonesian materials; it is on the Indonesian Wikipedia that its material is used most. A project about expeditions may become just as relevant, as they often contain the first systematic references to a subject in a written language.

Digitising the documents, the photos and combining them in Wikipedia articles is worthwhile in itself however this is not telling the stories of the expeditions and the stories of the collections. Telling the stories is what will make all the material come alive.

The pictures go into Commons, the subjects into Wikipedia, the sources into Wikisource. Can we have a Wikistories to tell the story and provide a framework for all the content of the expeditions?

Thursday, November 03, 2011

Learning the gender of Dutch words

It is hard to learn the gender of all the nouns of the Dutch language; there are so many nouns. There are three genders, masculine, feminine and neuter, but in practice masculine and feminine mostly behave the same.

People who are not originally Dutch can be recognised because they always get "de" and "het" wrong. A really simple program could turn reading a Wikipedia article into a training exercise. All it has to do is remove the words "de" and "het" and make that a toggle for the reader.
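A minimal sketch of that simple program, assuming whole-word matching is enough:

```python
import re

def blank_articles(text, placeholder="___"):
    """Replace the Dutch articles "de" and "het" with a placeholder,
    matching whole words only and catching capitalised forms too."""
    return re.sub(r"\b([Dd]e|[Hh]et)\b", placeholder, text)

print(blank_articles("De kat zit op het dak."))
# → ___ kat zit op ___ dak.
```

Toggling between the blanked and the original text would let the reader check their guesses.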

The objective: having more readers for the Dutch Wikipedia AND giving them an additional reason to read it. Many people are learning to write Dutch, and they are often subject matter experts in areas where this Wikipedia is weak.

Tuesday, November 01, 2011

#Chrome #security is a bit too much

The logo for the Hindi #Wikipedia is used as an illustration on a previous post. Illustrations like this can be saved from the "page view info" that is available by right clicking a web page.

Chrome apparently does not like the Hindi Wikipedia; it not only prevents me from accessing the page view info, the information it gives is incorrect as well.

I did use Chrome before today. I am quite happy for Chrome to provide me with a security warning; I hate it when it prevents me from doing my job.

Particularly nasty is the warning:
The site uses SSL, but Google Chrome has detected either high-risk insecure content on the page or problems with the site’s certificate. Don’t enter sensitive information on this page. Invalid certificate or other serious https issues could indicate that someone is attempting to tamper with your connection to the site.
Such warnings should be taken seriously and reported, but I am quite happy to use Firefox instead.

A #Wikipedia article in #Hindi about the Hindi language

To illustrate how WebFonts will work for Hindi Wiki projects, I looked for an article on the Hindi Wikipedia about the Hindi language. Apparently there is no such article; at least there is no interwiki link to the Hindi Wikipedia.

It really amazed me.

As the objective is to have a sizeable article, I copied the article in Hindi about India to the test page on the Hindi portal at translatewiki.net. A similar exercise is needed for all the official languages of India.

We are very much interested in learning how using WebFonts is received by the communities of the associated Wiki projects. Having a sizeable article allows people to test reading and editing in their language with the latest version of the WebFonts software.