I was on the train, fighting against that Locust general, trying to reach the solar bomb to get rid of all his vicious breed… Well, I did it… at least virtually. I completed Gears of War, Game of the Year, winner of several prizes and mentions, and earned the status of “Mercenary”. So far, so good. Having played the never-ending Serious Sam (I and II) I expected the game to be somewhat longer, but apparently gone are the times when a game lasted for tens of hours… or maybe I’m getting too good at playing. Unlikely, I would say.
In this game you play Marcus Fenix, who begins his quest in a prison (Unreal, anyone?), freed by an old friend who tells him the army needs his help. And in fact any help is desperately needed – the world has been taken over by the Locusts, a cruel race of reptile-like creatures. They savaged the cities so badly that the government decided to bomb everything (smart move, isn’t it?). Roughly, your task is to map the caves of the Locusts and then activate a “solar bomb” to destroy them all.
The game is a 3rd person shooter and one of the best yet seen. The graphics are gorgeous, really delivering full immersion to the player; the quality is movie-like and the suspension of disbelief comes very easily. Classic realtime graphics defects (such as polygons cut by the camera plane, textures revealing their pixel-based nature, blocky objects) are basically non-existent. For this reason, and for the abundance of gore, this title is really not suitable for kids.
The only dissatisfying aspect is that bulky objects (such as the choppers) apparently have no mass; their movements are not as smooth and… inertial as they should be. The best simulation of bulk and mass in a videogame remains Halo 2.
The camera (aside from never, ever letting you down) sports an involving war-footage style.
The gameplay is slightly innovative, which is quite an achievement given that the genre counts tons of titles. New is the need for the player to look for cover in a firefight: you have to plan your moves quite accurately if you don’t want to be blown away.
You can carry two weapons. Aside from grenades, there is a basic set of weaponry – the standard machine gun (with a chainsaw), a sniper rifle, a bow with explosive darts, a rocket launcher and a shotgun. You also have a non-standard weapon, the “Hammer of Dawn”, a targeting device for calling a satellite beam attack down on your enemies. The satellite attack takes quite a long time to aim and works only outdoors with a clear sky (and satellite coverage).
You have a team, usually just another guy that fights at your side. His AI is pretty brilliant – he usually doesn’t get into your line of fire and doesn’t stop you from moving around (or, worse, block you while you are retreating, as happened in Half-Life 2). At the beginning of the game he usually hints at the direction in which the game proceeds. The game is never too difficult on the brain side – the hardest puzzle you have to solve is finding the switch beside the door you want to open – nor do you risk getting lost: maps are pretty big, but the path is so clearly marked you have no chance of getting it wrong.
The only downside of the gameplay is that occasionally the “take cover” action interferes with movement, or with the crouched run you attempt to save your life. Just occasionally annoying. Also, if you are picky enough you may note some bugs here and there, such as an enemy boss slamming into invisible barriers, or the multipurpose floating robot Jack appearing out of thin air. The worst bug I encountered was against a boss: I was expected to lure the boss onto a carriage with a gasoline tank, then leave the cart and throw a grenade to make the carriage explode with its annoying passenger. Unfortunately, the split second I threw the grenade the boss jumped onto my wagon, causing the trap to explode and leaving me without any weapon to finish him off.
But these are just minor quirks for a great game I really enjoyed.
The Design and Evolution of C++
The C++ language, despite the powerful mechanisms it supports, is not a language for the faint-hearted. Two forces drive its peculiar concept of friendliness (it is not unfriendly, just very selective): backward compatibility with C and the effort not to get in the way of performance. This book, written by the father of the language, presents and analyzes the language’s history and design decisions. And, given the writer, the perspective you get from reading the book is very interesting and more than once helps to shed some light on the dark corners of the language.
The history is very interesting, since it details how the language’s genesis and marketing went from the AT&T labs to academia and industry.
C++ design principles are presented, and the most notable is ease of teachability. Several times, proposed or existing features were modified or dropped entirely because they were not easy to teach.
Another very interesting principle is “you don’t pay for what you don’t use”, meaning that the features added to the C language in order to define C++ were designed so that the programmer would not incur any penalty when not using them. That’s why, if a class has no virtual methods, the pointer to the virtual method table is not included, saving the pointer space in the class instance’s memory footprint.
Aside from answering many questions, the book opens up a bunch of new ones. For example, the very first implementation of C++ was developed practically around a threading library. Now, more than 30 years later, in a world with an increasing presence of multi-core machines, the C++ standard still lacks a multithreading/multiprocessing facility.
Also, Stroustrup asserts more than once that garbage collection as a way of managing memory could be added by a specific implementation, but he fails to explain how this non-deterministic way of ending the life of dynamic memory could deal with the deterministic needs of destructors. Likely I’m just too dumb to figure it out myself.
The big miss I found in the book is a comparison with the Java language, basically one of the great contenders for the title of most widely used programming language. Java, on its side, takes some interesting approaches to language design that conflict with those of C++ (e.g. the C compatibility issue). Therefore it would have been nice to hear Bjarne voice his thoughts about it. In his defense, it has to be noted that by the date this book hit the streets, the Java hype had only just started.
My last complaint about the book is the lack of conclusions. The book seems cut off a couple of chapters before the real end. Aside from the stylistic point of view, some words about future evolution and perspectives would have been in their place at the end of the book.
So Long and Many Thanks Folks
I don’t like the way it is being put. By now everyone understands there’s something really wrong with the climate. You don’t need to be a genius – just have a look at the calendar and the thermometer. Rather than being some degrees below zero, this winter we have blossoms, buds and shoots. That’s not the way it’s expected to be in mid-January.
So the media are starting to cover the issue. And I don’t like the way they are handling it. They are telling us that things will unavoidably get worse and worse – desertification, seas rising and submerging lands and cities, drought, and so on. Now, even more alarmingly, they speak of huge numbers of casualties and economic collapse in the Mediterranean area.
What I don’t like in this message is the implicit statement that all this is ineluctable and unavoidable and always has been. In other words – you are going to die, start considering it… but, hey, you wanted comfort, a car and a TV set? Don’t complain… die quietly, please.
Since I was young (that’s more years ago than I’m keen to mention), environmentalists and scientists have been telling us, our governments and our industry leaders that we were polluting too much, that we had to choose a more sustainable way of exploiting our planet’s resources. We also had a number of summits and round tables (Kyoto, anyone?) on the matter, where light or no actions were agreed.
If now it is too late, the only fault we can be blamed for is not having given our vote to different political leaders, more attentive to these problems.
But is it true that nothing can still be done?
It is true that things change quickly, often at an exponential rather than a linear rate. This means it could well be that we become aware of a certain effect only when it is too late to act on removing its cause.
To explain exponential growth I really like the chessboard example. Everyone knows the legend: it is about the payment a wise man asked of his King for having taught him to play chess. The exact amount of the payment had to be computed with the chessboard, starting with one grain of wheat on the first square and doubling the amount at each square, that is 1, 2, 4, 8, and so on.
Here comes the most interesting point – the King granted him a couple of sacks, thinking that was enough, but the wise man objected that the exact quantity was more than the entire harvest of the whole kingdom.
What’s so amusing? That exponential growth fights and defeats our intuition. You can consider it a slow-paced linear process until it really explodes.
Back to the weather: it is well possible that it is now too late to cancel mankind’s nefarious effect on the environment, but I am strongly convinced that it is never too late to do the right thing.
I have the impression that the media are just relaying propaganda. What is the most convenient move? Just do nothing; let the poor die while those who hold the power survive. It won’t be hard to survive if you can live anywhere you like, if you can afford water, food and energy at any price, if you can have your personal army.
“Dear passengers, this is the captain speaking. The smoke on the right side of the aircraft is the engine burning; the little spot below is me, with the only parachute that was on board… I hope you enjoyed the flight.”
Back from Sciliar
I had some great days on the Alpe di Siusi (Alp of Siusi), or Seiser Alm as those who live there call their home. It is a lovely, smooth plateau at around 1800m, braced by imposing Dolomite peaks. Sasso Piatto (“Flat Stone”, a sort of understatement) bounds the east side, while the Denti di Terra Rossa (“Red Soil Teeth”) bound the south side and end with the tooth-shaped Sciliar peak. We had sun for nearly seven days and, despite the warm winter, there was enough snow to ski.
Alas, in order to appreciate great things, we have to compare them with the grey, dull industrial landscape of Castellanza; that’s why (I guess) I’m back home and at work.
The first interesting surprise hitting me at work was that the milestone I was working toward had been moved up. Our customer’s product has been selected for a design prize, so we are expected to deliver the working product earlier. Anyway, we’re working hard, against time and a hardware shortage, to hit the milestone nonetheless.
At home, Santa (in the person of my wife) gave me an Xbox 360 and I started playing a not-so-Xmas-spirit game: Gears of War. I’m at about the first boss and I must say that it’s great. From the technical viewpoint I think this is one of the first real next-gen games. It runs on the Unreal 3 engine and the look is as detailed as it is awesome. The gameplay is based on taking cover, i.e. as soon as enemies are encountered you should take cover or you get badly shot. This is somewhat different from the classical shooter, where the player drives a Rambo-like bullet-proof character (well, in Serious Sam, this was intentional). The first boss is a chase: run away from the monster, let him smash the doors for you and eventually take him down. Great.
While I was still fresh from the holidays and relaxed from Gears of War, I decided to update my notebook to the latest Linux available. I gave a brief look at Sabayon Linux, only to discover that it behaves badly with the Toshiba touchpad and apparently has no support for my wireless adapter (I can’t believe that today’s distros still do not support the Centrino wireless adapter, which is so widespread and at least two years old). So I turned to what I know quite well – Fedora Core 6.
I opted for the upgrade option instead of a fresh install. Years ago I used to upgrade, only to find that the resulting system was neither completely new nor old, and often prone to glitches. A friend of mine suggested I never upgrade, but rather back up the /home directory, install, and restore it. This time I felt so light from the holidays that I decided an upgrade would do.
Well, I was wrong.
Yes, I got a sort of FC6 tailored on my previous FC4 installation, and, yes, the wireless adapter sort of worked. But I could only browse the Google website. No matter how I set the firewall/SELinux properties, there was no way to browse the rest of the Internet. But that is another story.
It’s Christmas… again
Happy Christmas and a Great New Year to you all.
What if everything were just fake?
Sometimes it happens that reality baffles my expectations, and my ego pretends that electronics, physics, computer science or whatever is not responding the way I expect works just by chance. No matter what you do, the result will always be random. Consider a wifi USB dongle. It behaves pretty badly indoors. Let’s face it: all manufacturers claim a 100m range but, in practice, in a real apartment environment it works fine within 10m, becomes unreliable at 15m, and is stone dead at 20m. If you want to do real-time communication, in my experience, you’d better halve the ranges above.
Ok, it may seem stupid to try to reduce the range, but it is handy when you want to test what happens at the range limit without wearing out your shoes.
Now, well over a century of science tells you about Faraday cages; roughly put, a metal sheet shields radio waves. So if I enclose my wifi dongle in a pretty solid metal box I would expect either no radio communication or a dramatic cut in range. Right?
Well, time to rethink.
I experienced the very same range of the unboxed dongle.
With the same reliability.
It is as if the Laws of the Universe consider themselves superior to my request and disregard my experiment, saving themselves for more worthy causes.
BTW, in the meantime, I have read Stefano Benni’s “La compagnia dei Celestini” (“The Company of the Celestines”), Roberto Ierusalimschy’s Programming in Lua, Eoin Colfer’s “The Secret safe”, and Alfredo Castelli’s “Rama’s Left Eye”… I suppose it’ll take me quite some time to write a review for each of them :-).
Secrets and Lies
Some days ago I helped a coworker with an oddly behaving Makefile. I am a long-time user of this tool and am no longer surprised at ‘make’ doing the unexpected in many subtle ways. This time the problem was that a bunch of source files in a recursively invoked Makefile were compiled with the host C compiler rather than the cross-compiler as configured. Make, in an attempt to ease the poor programmer’s life, pre-defines a set of dependencies with a corresponding set of re-make rules. One of these implicit rules states how to build an object file (.o) from a C source file (.c). The rule is somewhat like this:
%.o: %.c
	$(CC) -c $(CPPFLAGS) $(CFLAGS) $< -o $@
And by default, the CC variable is set to ‘cc’, i.e. the default C compiler on Unix systems. Bear in mind that this is a recursively invoked make, therefore the problem is hidden at least one level away from the programmer. On the other hand, the build has configured the top-level make to use the cross-compiler arm-linux-gcc. The problem can arise because ‘make’ has local scope for variables, i.e. variables are not exported by default to recursively invoked makefiles.
The hard part in spotting the problem is that everything appears to work as expected, i.e. the build operation completes without a glitch and you are left wondering why your shared libraries are not loaded on the target system.
Once you know it, the problem is easily fixed, but if you are an occasional Makefile user you may spend some bad hours seeking out what the heck is going on.
Hiding isn’t always bad – you need to hide details to get abstraction and to consider complex objects as black boxes, simplifying their handling. One of the three pillars of OOP is “encapsulation”, which basically translates to data opaqueness: the object’s user is not allowed to peek inside the object being used.
The question arises – how much “hiding” is good and how much is bad?
The C compiler hides the nuts and bolts of assembly programming from the programmer, so that he/she can think about the problem with a higher-level set of primitives (variables instead of registers, structs instead of memory, and so on).
If you want to go up with the abstraction level you must accept two things:
- you are losing control of details;
- something will happen under the hood, beyond your (immediate) knowledge;
Going up another level we meet the C++ language, with a great deal more happening below the horizon. For example, constructors implicitly call parent-class constructors; destructors of objects instantiated as automatic variables (i.e. on the stack) are invoked when execution leaves the scope where the objects were instantiated.
If you are a bit fluent in C++, these implicit rules are likely not to surprise or harm you. If you consider a traditional programming language such as C, Pascal, or even Basic (!), you will notice quite a difference. In a traditional language, you cannot define code that is executed without an explicit invocation. C++ (and Java, for that matter) is more powerful and expressive by hiding the explicit invocation.
In many scripting languages (such as Python, Lua, Unix shell, PHP… I think the list could go on for very long) you don’t have to declare variables. Moreover, in several of them, if you use a variable that has not yet been assigned you get it initialized by default – usually to an empty string, a null value, or zero, depending on the language. This could be considered handy, letting the programmer save a bunch of keystrokes and concentrate on the core of the algorithm. I prefer to consider it harmful, because it can hide one or more potential errors. Take the following pseudo-code as an example:
# the array a[] is filled somewhere with numbers.
while( a[index] != 0 ) {
    total += a[index];
    index++;
}
print total;
If uninitialized variables can be converted to the number 0, then the script will correctly print the sum of the array’s contents. But what if, some days later, I add some code that uses a ‘total’ variable before that loop?
I will get a hard-to-spot error. Hard because the effect I see can be very far from the cause.
Another possible error comes from mistyping. If the last line were written as:
print tota1;
(where the last character of “tota1” is a one instead of a lowercase L)
I would get no parsing error and no execution error, but the total would always be computed as zero (or, with some variations in the code, as the last non-zero element of the a[] array). That’s evil.
I think one of the worst implicit variable definitions is the one made in Rexx. By default, Rexx variables are initialized to their own name in upper case. At least 0 or nil is a pretty recognizable default value.
Time to draw some conclusions. You can recognize a pattern – evil hiding aims to save the programmer coding time but doesn’t scale; good hiding removes details that would prevent the program from scaling up.
As you may have noticed lately, the world is not black or white; there are many shades, and compromises are like the Force – they have both a light side and a dark side. E.g. C++ exceptions offer an error-handling abstraction, at the cost of defensive programming nearly everywhere to avoid resource leaks or worse.
Knowing your tools and adopting a set of well-defined idioms (e.g. explicitly initializing variables, or using constructors/destructors according to the OOP tenets) are your best friends.
Spaghetti Alla Disperata
I’m no cook, but I see this post has gotten quite a few views over the years. If you enjoy it, you are welcome to leave a like or a comment.
You are back home late and you have to dine. You open the fridge, wondering why it is so hard to open. The answer strikes you immediately when the door pops open – the fridge is full of vacuum and it… sucks. So you need to arrange something… what’s in there? A tomato desperately fighting against mold. A not-so-expired mozzarella… well, let’s have a look at what you can cook.
This is, more or less, what happened yesterday evening. Here is the recipe I recovered from a book, requiring exactly (lucky me, isn’t it?) what I had in the fridge.
The name is Spaghetti alla Caprese; I renamed them Spaghetti alla Disperata (Desperation Spaghetti). For 4 people:
- 400g spaghetti
- 100g mozzarella cheese
- 70g tuna
- 2 fillets of anchovy
- ground pepper
- 6 black olives
- 3 spoons of olive oil
- 350g of tomatoes
Peel the tomatoes, cut them into small cubes, and remove the seeds. Put them in a saucepan where the olive oil is already hot. Cook the tomatoes for about 15 minutes, stirring occasionally.
Mash the anchovies and the tuna in a mortar (you can use a chopping board and a chopping knife instead). When the mixture is well blended, add the olives cut into little pieces.
Drain the pasta a bit early and put it in with the tomatoes, add the fish/olive mixture and some ground pepper. Finally, add the mozzarella cut into small cubes.
Enjoy, and then go shopping for something to eat tomorrow.
Summer holidays travel log
I have just uploaded my latest summer holidays [downloads/Diario_Grecia_Naxos_2006.pdf|travel log] (Italian only). It is about our travel to and stay on the island of Naxos. I hope you enjoy it at least as much as we enjoyed our holidays. Feedback and comments are much appreciated.
Frustrated programmers on Linux
Life is hard, especially during working hours, and harder still when you have to do your job on Linux, for Linux. If you think about it, that’s odd. Back in the day, Unix was the result of a young team that sought a programming (and hacking) environment. At the time they had very programmer-unfriendly environments, and Unix was very successful in this respect – text editors, interpreters, advanced shells, programming tools and the like flourished in the Unix filesystems.
Today it is still like that, in the sense that those tools are still there… but the rest of the world, most notably Microsoft, has caught up with and overtaken the Unix command line.
First, suppose you want a project-aware C/C++ editor. In the Windows world you may not have much choice, but the answer is easy: the latest Visual Studio is quite good, allowing you to do nearly everything in a comfortable way. Linux is lagging behind; there are vim, emacs and Eclipse. Eclipse is indeed good (considering the price), but its C/C++ editing capabilities are far inferior to basic Visual Studio. Maybe you can get it working the same way, but this requires a great effort for those who are not fluent in this development environment.
Suppose now that you want to interface your application with audio. If you use the basic operating system functionality (likely OSS) you can do it rather quickly, at the price of audio exclusivity: while your application is running, no one else can access the audio device.
This is a known problem and has a known solution – a sound daemon that performs mixing for multiple audio applications. This is reasonable.
What is unreasonable is that Linux sports a wide variety of such daemons; every desktop has its own. What is yet more unreasonable is that both Redhat/Fedora and Ubuntu use the eSound daemon, which has no documentation.
So you are deprived of a standard choice and, what is worse, the choice forced on you has no documentation whatsoever.
Frustrating, isn’t it?