Monday 27 July 2009

3D full screen game using O3D

I just ran across this cute little 3D game on the internet:

http://blog.largeanimal.com/demo/

Remember to click the full-screen button in the bottom left for the full experience. You will need the O3D plugin, which will soon be included in Google Chrome.

The game seems to be made entirely in JavaScript. If you want to see how it looks in Google Chrome's app mode on Windows, create a Windows shortcut like this:

"C:\Documents and Settings\Username\Local Settings\Application Data\Google\Chrome\Application\chrome.exe" --app=http://blog.largeanimal.com/demo/

The game does not adapt to the window size and still shows some HTML elements when not in full-screen mode, and it is also a bit slow to load compared to what we can expect in the future. But even in its current state, it makes running setup programs or using MSI files seem awfully old-fashioned.

Buying CO2 credits doesn't make IT CO2 neutral

More and more IT companies try to become CO2 neutral by buying CO2 credits matching the amount of power their IT equipment uses. The argument is that if they emit 1 ton of CO2 running their servers, they buy 1 ton of CO2 credits from the market, removing 1 ton of CO2 emission elsewhere.

A nice thought, but that's not how it works. Buying CO2 credits like that just means that you need 2 tons of CO2 credits to emit 1 ton of CO2, basically increasing the price of emitting CO2. If everybody did this, CO2 emissions would be cut by 50%, but not by 100%. The good thing is that the increased price of emitting CO2 creates additional incentives for developing new energy technologies. However, the IT company still emits CO2.
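The arithmetic behind the 50% claim can be sketched in a few lines; the market cap figure below is an arbitrary assumption for illustration:

```python
# Under a fixed cap-and-trade market, a company that buys one credit for
# every ton it emits effectively consumes two allowances per ton of CO2.
total_allowances = 100.0   # hypothetical market cap, in tons of CO2
allowances_per_ton = 2.0   # 1 allowance to emit + 1 retired as an "offset"

max_emissions = total_allowances / allowances_per_ton
reduction = 1.0 - max_emissions / total_allowances
print(max_emissions, reduction)  # -> 50.0 0.5 : a 50% cut, not 100%
```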

What makes matters even worse is that the market economy ensures that if you can spend your money either on energy for operating equipment or on energy for manufacturing equipment, you will spend it where you get the most value. And if the equipment is produced in a country outside the CO2 market (like the USA or China), you basically just push the problem out of the market, but not away from planet Earth.

It is good that companies use the CO2 emissions topic to profile themselves, but nobody gets CO2 neutral by burning coal.

Friday 24 July 2009

Delphi apps as Chrome apps?

Google is doing a lot of great stuff with Chrome these days. The first version of the browser included its own task manager, process administration, sandboxing etc., but Google has now also announced accelerated 3D support and sandboxed native code. I would not be surprised to see Google Earth become one of the first applications to stop being a Windows application and start being a Google Chrome application: install Chrome and visit a specific URL, and it works. No more "download this app" or "install this app", except for Chrome itself.

This is all great, of course, but what development tools do we use for that? C/C++ is notoriously unproductive, and so are web apps - Web 2.0 apps even more so. We need a tool that can create cross-platform apps, delivered using Google Chrome or similar, that perform well and are easy to write. Delphi has previously shown that it can compile to several platforms, and Delphi Prism officially targets Mono, so why not take on this one?

Jim McKeeth on Delphi Prism in Copenhagen

For those who are not fortunate enough to be able to read Danish: the Danish Delphi users group is holding a workshop on September 15th and 16th, introducing Delphi Prism to experienced Delphi developers. I assume that members have first priority, but I also assume that Jim hasn't taught himself Danish and will keep everything in English. I will be there, too.

Sunday 19 July 2009

The difference between craftsmanship and engineering in software development

(Inspired by Jeff Atwood's latest post about Software Engineering)

Definitions:
* Craftsmanship
* Engineering
* Software engineering

Instead of commenting on Jeff's article, I would like to give a real engineer's view on software engineering. I usually say that a craftsman can be good at creating something of good quality that is similar to something that has been created before. An engineer can create something that has not been done before, and predict how long it will take and how it will work.

If we look at software development, it is often possible to assign each part to one of two types:

* Doing something that is easy to specify and/or predict (designing a GUI etc.)
* Doing something that is not easy to specify/predict (research, unknown territory)

If things are relatively easy to specify, you can do calculations like "100 forms of n hours each = 100*n hours". This will be a calculation with some uncertainty, of course, but it gives you a good idea of the size of the project. If the tools and methods are well known, you can illustrate the predicted result to others by showing them similar applications. Good craftsmanship is needed for tasks like these, and these processes can be measured easily using metrics; quality is easy to measure and control.
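The back-of-the-envelope estimate above can be written out directly; the hours-per-form value and the plus/minus 20% margin below are hypothetical assumptions, not figures from the post:

```python
# Craftsmanship-style estimate: 100 forms of n hours each = 100*n hours,
# with a simple uncertainty band around the point estimate.
n_forms = 100
hours_per_form = 3.0   # hypothetical value of n
margin = 0.2           # assumed +/-20% uncertainty

estimate = n_forms * hours_per_form
low, high = estimate * (1 - margin), estimate * (1 + margin)
print(estimate, round(low, 1), round(high, 1))  # -> 300.0 240.0 360.0
```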

If things get hard to predict and/or specify, engineering starts. Then you need models, knowledge about how things work, knowledge about many options, ideas, and all the other things that we know from traditional engineering. This is also where architecture comes in - and I prefer the definition of architecture as "The decisions that we make early".

When doing engineering, the skills and knowledge of the central decision-makers can make a huge difference. This is where the productivity difference between two "programmers" can become more than a factor of 1000 - and where wages differ a lot. QA, QC, metrics etc. are usually difficult, and the lack of predictability can hurt a big organization's ability to cooperate. If marketing needs to prepare a big launch, they need to know when the product will be finished.

A project can choose the craftsmanship path by picking an existing development platform, so that all components of a software product can be produced by craftsmen. This is usually preferred by large organizations because it is predictable and well documented. It may also be more expensive, but in the overall budget, that is less important.

The Engineering approach makes sense, if the product's specifications are not met by any existing platform, or if the software project's costs are significant to the organization. This does not mean that engineering can make your costs lower - it means that sometimes your costs can be lower using engineering.

So, what is good engineering? Basically, making the unpredictable predictable, making things really cheap, and delivering what is wanted. What do you need to know in order to do that? My list is this:

* Knowledge about computer hardware, network, programming methods, abstraction methods
* Organizational theory, management knowledge, psychology
* Mathematics, statistics, economy
* Decision theory, risk management, training in using the scientific method
* The knowledge of the users of the product (!)

You don't need all this knowledge in one person, but the combined knowledge is needed in order to achieve good engineering.

Saturday 4 July 2009

Jeff Atwood is wrong about performance

Jeff Atwood likes referring to his blog post "Hardware is cheap, programmers are expensive", where he writes: "Given the rapid advance of Moore's Law, when does it make sense to throw hardware at a programming problem? As a general rule, I'd say almost always."

I totally disagree, of course, but here is why:

* The parts of hardware that comply with Moore's law are usually so fast that they are not the bottleneck. I cannot remember when 100Mbps ethernet was introduced, or when they started to deploy 10Gbps networks, but many new virtual servers are limited to 10Mbps these days, and that does not smell like Moore's law. Does anybody except me miss the good old days, when new harddisks had a lower seek time than the old ones?

* If you have upgraded all your servers without consolidating, year after year, you will notice that your electricity bill is going through the roof. It's very simple: Today, you are running that 1995 app on an extremely fast server, even though it was built for a slower server. You are simply using more energy to solve the same problem, and that's why everybody tries to reduce the amount of hardware these days. Many data centers are looking into energy efficiency, and they won't just put up a new physical server because you want to save programmer wages.

* Many speed improvements are not what they seem. 100Mbps ethernet is not always 10 times faster than 10Mbps ethernet, it's more complicated than that. The 20-stage pipeline of the Pentium 4 was also not an improvement for everybody.

* Many performance problems are not related to the number of bits per second, but to the latency. Sometimes latency goes up when speed goes up - I have seen several examples of things getting slower as a result of upgrading performance. The best-known example is probably the first iPhone using GPRS instead of 3G: GPRS actually made some things faster than if Apple had implemented 3G right away.

* If programming generally disregards performance, the performance problem is not solved by improving hardware speed 10 or 100 times. A large application that is written totally without regard for performance can easily be more than 1000 times too slow. I have been a troubleshooter on several projects where the application performed more than 1000 times faster when I left.

But here is the most important reason why programmers should be solving performance problems:

* It takes one programmer little time to design things well, but it takes technicians, users, other programmers, testers, etc. a lot of time to wait when the software is slow. Bad performance costs a huge amount of money.
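A rough cost model makes this point concrete; every figure below is a hypothetical assumption chosen only for illustration:

```python
# One-off programmer cost vs. the recurring waiting cost of slow software.
programmer_hours = 40     # one-time effort to fix a bottleneck (assumed)
users = 200               # people affected (assumed)
waits_per_day = 50        # slow operations per user per day (assumed)
seconds_per_wait = 3      # time saved per operation (assumed)
workdays_per_year = 220

wasted_hours_per_year = (users * waits_per_day * seconds_per_wait
                         * workdays_per_year) / 3600
print(wasted_hours_per_year)  # ~1833 hours wasted per year vs. 40 hours once
```

Even with modest numbers, the yearly waiting cost dwarfs the one-time programming cost.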

Here are some good tips:

* Performance improvements of less than a factor 2 are often not worth spending time on. Do the math and find out how big the performance improvement factor really is, before spending time or money on solving it. If your best finding improves less than 2 times, then you need to search harder.

* Do the math on the benefits. If your turnover improves 2 times because your website becomes more responsive, the entire organization should support your efforts.

* The later you start caring about performance, the more expensive it gets.
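One way to "do the math" on a proposed optimization is Amdahl's law: the overall speedup is limited by the share of total runtime the optimization actually touches. A minimal sketch (the 30% and 10x figures are made-up examples):

```python
def overall_speedup(fraction, local_speedup):
    """Amdahl's law: fraction is the share of total runtime (0..1) spent in
    the part being improved; local_speedup is how much faster that part gets."""
    return 1.0 / ((1.0 - fraction) + fraction / local_speedup)

# Making a part that is 30% of runtime 10x faster gives only ~1.37x overall -
# below the factor-2 threshold, so keep searching for a bigger win.
print(round(overall_speedup(0.3, 10.0), 2))  # -> 1.37
```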

Anyway, it seems that Jeff Atwood has realized that he can run into problems, too.