Saturday, 30 January 2010

Will Apple accept Google NaCl on iPhone, iPod, iPad?

Apple did it again - they released a great product called the iPad. The main reason people buy these products is that they are of good quality and do things better than the competition. However, in order to protect its business model, and in order to ensure a good overall consumer experience, Apple restricts which applications can run on its devices. Google Voice was not initially accepted into the Apple App Store, but Google has a strong interest in making sure that its services are universally available. Their solution was to create a Google Voice web application that does the job, and now Google Voice is available on the iPhone.

Google Native Client (NaCl) expands the capabilities of the web, so that web applications can contain codecs, 3D graphics and computation-heavy code, and still run well. HTML5 includes offline capabilities, and in combination the two provide a complete framework for writing offline, native-speed apps delivered via the internet. The app store will no longer be needed by those who provide apps for free, unless they have very specific needs for access to local hardware.
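
As a sketch of how the offline part works (the filenames here are hypothetical): an HTML5 page declares a cache manifest in its root tag,

    <html manifest="myapp.manifest">

and the manifest lists the resources the browser should keep available offline; a NaCl module (a .nexe file) could be listed like any other resource:

    CACHE MANIFEST
    # change this comment to force clients to re-download
    index.html
    myapp.js
    myapp.nexe

Once cached, the app starts and runs without a network connection.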

Apple has a choice, but if they refuse Google NaCl, their products will do less than the competing Android and Chrome OS products. Apple's products will be inferior. Refusing will only work as long as there are no significant apps for Google NaCl.

Google Chrome OS includes NaCl and is basically a lowest-common-denominator OS: all of its applications can also run on other OSes. By providing Google Chrome for Mac, Windows and Linux, Google provides a platform that is large enough to be interesting for app developers. It's standards-based, it's Open Source, it's easy and capable, it's free, it's cross-platform, and it is huge. Apple can be in or out.

Apple is free to pick the default choices on their devices, and can limit web apps' access to local storage. But the competition is catching up quickly, and Apple needs to keep inventing. I would not be surprised to see Apple become very active in delivering a cohesive web service offering similar to Google's. What's next, Apple buying Yahoo? Does Microsoft have anything to offer, or have they completely lost the consumer market?

Saturday, 23 January 2010

VMware Converter review

Recently I upgraded to a newer laptop, and I considered virtualizing my old laptop, because a software development setup includes many settings that are not easily copied: search paths that depend on the drive configuration, installed tools (Cygwin etc.), and many other things. In order to be 100% productive at all times, I simply need the old laptop to be available until the new one is fully set up.

This is possible using VMware and VirtualBox, but I found no full description of the prerequisites and consequences, so I just went ahead with the solution that seemed simplest: using VMware Converter and VMware Player. This is a short summary.

VMware Converter can convert a physical PC into a VMware image, ready to be played inside VMware Player. This gives you access to your old PC from the new PC, in a window, which can be great until you have moved all the settings and files, and installed all the apps, onto the new PC. The first thing you need to do is to identify a PC that is suitable to receive the result of the conversion. It must:

* Have enough free hard disk space to keep the image of the old computer. If your old laptop had a 100GB hard disk with 70GB allocated (30GB free), you should have at least 80GB available.

* Have the ability to share a directory as a Windows file share. Not all PCs are set up for this.

* Have the ability to let the VMware Converter app listen on port 80 and port 443. It is possible to use other ports if you already have a web server installed (see the quick check after this list).

* Be on a wired network, because the transfer then takes hours instead of days.
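
A quick way to verify up front that nothing is already listening on ports 80 and 443 (assuming a Windows command prompt on the receiving PC):

    netstat -ano | findstr LISTENING | findstr ":80 :443"

If either port shows up as LISTENING, something like an existing web server is occupying it, and you will have to configure the converter to use other ports.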

Your old computer should also use a wired network, of course, and should run on mains power (not on battery) before you start. You need to install VMware Converter on both computers, and then the rest is rather easy: it is installed as a service on the "server", and on the old PC you are guided through the process. After a while, maybe 30 minutes, a progress indicator will tell you how long the entire process will take.

My old laptop was fully TrueCrypt-encrypted, and I chose to decrypt it before the conversion, just to make sure that the encryption did not interfere with the conversion process. I am not sure that this is necessary, though.

When the process is finished, you have 2 files on the server, and you can uninstall the converter again. You can then move the files to your new PC, install VMware Player, and start using it. My old laptop was running Windows XP, and when I started it, it used 640x480 resolution with 256 colors or something like that, and demanded to be re-activated. It took ages to load, but after installing VMware Tools, it worked smoothly and nicely, with full screen resolution and full color, except that the bridged VMware network connections did not always work. After switching to VMware NAT connections, that problem was solved.

Windows XP has to be reactivated within 3 days after doing this, because of the significant changes in hardware. I reconfigured the amount of RAM it should use, CPU settings, hardware settings etc., so that I had the final hardware configuration before reactivating Windows XP over the internet. That worked perfectly, and I could archive the old laptop. After a week, when I was completely sure that everything worked perfectly and that I hadn't missed anything important, I retrieved the old laptop and wiped it by installing Ubuntu.
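
For reference, these hardware settings are stored as plain text in the VM's .vmx file next to the disk image, so they can also be edited there while the VM is powered off. A minimal sketch (the values are examples, not my actual configuration):

    memsize = "1024"
    numvcpus = "1"
    ethernet0.connectionType = "nat"

memsize is the guest RAM in MB, numvcpus is the number of virtual CPUs, and ethernet0.connectionType is where "bridged" can be changed to "nat".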

Except for the bridged networking problem, which was easily solved by switching to NAT, and the lack of up-front documentation, everything went perfectly. I will do this again next time.

ANSI in Delphi is not about the ANSI character set

One of the most frequent misunderstandings I have seen about Delphi's Unicode migration is that many consider the ansistring of Delphi 2007 and older, and all the Ansi* functions in the APIs, to be about the ANSI character set.

If you live in the USA or any other country that uses Windows-1252 (aka ANSI) as the default character set, it all fits together: an ansistring contains text in the ANSI character set. However, in the rest of the world, things are much more complicated. The default 8-bit character set in Windows is not Windows-1252 in countries like Greece, Hungary, Russia, Japan and China. These countries use letters that need values >= 128, or sometimes multiple bytes, for their encoding. This means that:

* Document filenames inside ZIP files probably use characters that are not shown correctly if the ZIP file is opened on a U.S. computer.

* Uppercase() and similar string operations do not work correctly on normal ansistring texts.

* Simple Windows text files are not compatible with PCs from countries that use other character sets.

* Ansi* functions exist, but don't necessarily use the ANSI character set.

For Delphi 1-2007 developers, it has always been important to use Uppercase(ansistring), Lowercase(ansistring) etc. for machine-readable text (identifiers etc.), and AnsiUppercase(ansistring), AnsiLowercase(ansistring) etc. for all human text (text from a TEdit etc.), in order to have an app that localizes well. AnsiUppercase will use the current local character set for its conversion, whether it is Windows-1252 or not, so that AnsiUppercase('æ') becomes 'Æ' etc. Basically, all the functions that are prefixed with "Ansi" are the locale-sensitive versions, whereas the functions without the Ansi prefix are useful for machine-readable stuff, where the result needs to be 100% deterministic and locale-independent.
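
A minimal console sketch of the difference, assuming a pre-Unicode Delphi (2007 or older) where string is an ansistring, running on a Danish (Windows-1252) system:

    program CaseDemo;
    {$APPTYPE CONSOLE}
    uses SysUtils;
    begin
      // Locale-independent: only the letters a-z are converted.
      Writeln(Uppercase('græsk'));     // prints 'GRæSK' - the 'æ' is left alone
      // Locale-sensitive: uses the current Windows character set,
      // so 'æ' is correctly converted to 'Æ'.
      Writeln(AnsiUppercase('græsk')); // prints 'GRÆSK'
    end.

The same two-function split exists for comparisons (CompareText vs. AnsiCompareText) and the rest of the string routines.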

This also means that, in a well-written app, every string variable contains either locale-independent text or locale-dependent text, but never both. It was important to make this distinction in order to know whether to use Uppercase() or AnsiUppercase() on the variable.

With Delphi 2009, Unicode is often mentioned as a localization thing, so many people struggle to get this right. However, it's still the same problem: if your app is only meant to work in the USA, you can disregard all the localization stuff, and it's VERY easy. If your app was already well internationalized, the conversion to Unicode is also rather easy. It only gets really complicated if your app was not internationalized and you now want it to be. But that's not about Unicode strings - it's about internationalization.

Saturday, 16 January 2010

English - the superlanguage

My first programming language (ND80) stored all identifiers by reference in order to save RAM, which was scarce. Using the swap instruction, it was possible to replace any identifier with another, so basically the entire programming language could be translated to Danish, my native language. Sounds ridiculous, right? It was. Later, Microsoft did the same: Excel functions were translated to Danish, even VB programming was Danish-ified, COM APIs were localized, etc. This caused a huge number of problems: it made support difficult, it made it difficult to find help on the internet, and localized APIs meant that some apps did not work with MS programs that were localized to other languages. Of course there were workarounds and solutions for most of the "problems", but the problems were real and sometimes caused real havoc.

One of the 5 Danish regional administrations just introduced ODF as the standard format for document interchange between MS Office 2003, MS Office 2007 and OpenOffice, because this solves problems like date format mismatches (ddmmyy in some programs, the localized ddmmåå in others). It does not solve everything: if a spreadsheet contains an expression with 'ddmmåå' in it, that expression may not work in a non-Danish spreadsheet at all, no matter how you save it. The easy solution was to do everything in English, using U.S. notation (decimal point instead of decimal comma) etc. I guess everybody has now realized that this is the way to go for source code, APIs, XML files etc.

However, in recent years, the evolution of the internet has expanded this problem. Humans are increasingly interfacing directly with software, specifying parameters. The most common interface is the search engine. How do you easily explain to a 6-year-old how the tilt of Earth's rotation axis creates summer and winter? YouTube, of course. But don't use Danish words for your search - it will probably not yield a single good result. So even though my daughter can write on a computer, she still cannot use YouTube. She doesn't know English. I encounter this problem many times per week.

The problem is not just limited to searching. Many electronic devices are not localized, a lot of software is not localized, and what language do you use on Facebook if your friends don't all understand Danish? Wikipedia is another good example: by far the biggest Wikipedia is the English one, and it is 3 times bigger than number 2, the German one. Wikipedia has become a significant provider of information, and you simply need to know English to use it.

In order to understand all the implications of international contracts, English is the language of choice. The EU has made a guide for European English, which defines a terminology that may not always match that of any English-speaking country, and many terminologies are translated from English-language originals. English has become the new Latin.

Google Translate tries to solve some of this. However, when I read Chinese web pages in Danish using Google Translate, it is obvious that the text was translated to English before it was translated to Danish. There can be many reasons for this, but it surely helps comprehensibility when I use Chinese->English instead of Chinese->Danish. In any case, Google Translate cannot solve all problems; it's merely a patch.

Only 1-2 decades ago, you would have looked at the size of countries, measured by population and economy, in order to find out which language to learn. Today, English is much larger than the sum of the English-speaking countries.

The latest statistics indicate that languages other than English are currently losing popularity in Danish schools. That's a problem: most people in the world don't speak English well. If you want to target those people, you need to localize. Even when you meet a person who seems to speak and understand English well, you need to realize that this sometimes requires that person's full brainpower. In other words, if you ask this person to solve a complicated task that involves the use of English, like programming an HD recorder, it is much harder than if the HD recorder had been localized. Also, just because a person knows how to express himself/herself in English in a given context, it doesn't necessarily mean that this person can do so in another context that would work out fine in his/her native language. In order to localize well, an application specialist should know the target language well enough to be able to inspect the localized result.

So, remember to localize, learn languages, and remember to teach your children English. And in the unlikely case that your native language is English, here is a sign not to laugh at; it's very serious:



If you're in doubt about what it means, use Google Translate.