Recently I did a fresh install of Fedora 10 + GNOME on a system and couldn't find an obvious way to enable GDM autologin.

Anyway, adding the following lines to /etc/gdm/custom.conf and restarting GDM did the trick:
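These are the standard GDM autologin keys; `yourusername` is a placeholder for the account to log in automatically:

```ini
[daemon]
# Skip the greeter and log the given user in automatically
AutomaticLoginEnable=true
AutomaticLogin=yourusername
```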



I’ve been thinking for a while about the right tool to create a home media centre: today I’m giving ELISA a try.
It is a fairly mature all-media player, written in Python and based on GStreamer.

However, due to problems in the plugins’ code review, the official Fedora 10 repositories still lag behind at version 0.3. See “Elisa doesn’t start” and “Update ELISA” on Bugzilla.

It is possible to install it on Fedora 9 by following these instructions. For Fedora 10 you’ll find that the repository has moved to

To install it I just:

  1. Downloaded elisa.repo from its original location and copied it to /etc/yum.repos.d
  2. Imported the repository’s GPG key
  3. Installed “elisa” from the graphical front end together with all the related packages (plugins good, bad, and ugly)
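For reference, a `.repo` file dropped into /etc/yum.repos.d generally looks like the sketch below; the name and URLs here are placeholders for illustration, not the actual Elisa repository details:

```ini
[elisa]
# Human-readable name shown by yum
name=Elisa media center (placeholder)
# baseurl and gpgkey below are illustrative placeholders
baseurl=http://example.com/fedora/10/$basearch/
enabled=1
# With gpgcheck=1, yum verifies packages against the imported GPG key
gpgcheck=1
gpgkey=http://example.com/RPM-GPG-KEY-example
```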

After installation it crashed with the error:

Elisa failed to initialize: [Failure instance: Traceback: <type 'exceptions.ImportError'>:
No module named cssutils
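Failures like this can be caught up front with a small import probe. A generic sketch of the idea (my own, not Elisa’s actual startup code):

```python
import importlib


def missing_modules(names):
    """Return the subset of module names that cannot be imported."""
    missing = []
    for name in names:
        try:
            importlib.import_module(name)
        except ImportError:
            missing.append(name)
    return missing


# 'cssutils' will show up in the result until python-cssutils is installed
print(missing_modules(["cssutils"]))
```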

Installing the python-cssutils package made it work for me. After looking at the Internet section, I tried to play the latest Apple movie trailers directly from the program interface, but ELISA complained that it could not find any MPEG-4 or H.264 decoder.

Being GStreamer based, I thought that simply installing all the gstreamer plugins (again good, ugly, and bad, plus some others) would solve the problem, but… nope, still no H.264 video.

I eventually installed PackageKit-gstreamer-plugin and fired up Totem to see if it could play one of the Apple trailers: it magically detected the missing plugin and asked me to install gstreamer-ffmpeg.
Now everything is working like a charm!

Hi again,

Eventually I discovered a wonderful piece of software that simulates the physiology of the whole human body: it was created by the Department of Physiology at the University of Mississippi Medical Center.

The real plus is the Laboratory manual, which is full of guided experiments to perform on the simulated human being and carefully explains how to run each simulation.

There are two different applications: QCP and QHP. So far they seem similar to me, and I have mainly used QCP because I was following the laboratory manual closely.

The user interface is not very intuitive, and it can be laborious to find the variables you need. Moreover, there seems to be no way (but I’ve just started to use the software) to make the main program generate a custom presentation of the results in order to, say, follow a selected group of variables or replay a script with a set of interventions.

There is a model editor and solver available here, and a list of created models here.

On the other hand, I noticed that the QHP software ships all the modules it uses to simulate the body as XML documents, so it is easy to inspect them and learn how the simulation is structured. It is also well documented: there are over 250 pages of documentation for the XML schema.

However, I wonder why they don’t use the emerging SBML or CellML standards.

I think that the equation-declaration part of the XML schema, especially, would benefit from a move to MathML.
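As an illustration, an equation such as cardiac output = heart rate × stroke volume could be declared in MathML content markup roughly like this (a sketch of the idea only; the variable names are mine, and the actual QHP schema is different):

```xml
<apply>
  <eq/>
  <ci>CardiacOutput</ci>
  <apply>
    <times/>
    <ci>HeartRate</ci>
    <ci>StrokeVolume</ci>
  </apply>
</apply>
```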

I bet I’ll learn a lot from it!

Today I found out about the OpenCog project.

I usually skip over projects claiming to be the Next Big Thing(r) in AI, but this time I noticed that the project had been admitted to GSoC 2008, so, I thought, it couldn’t be that bad, could it?

Actually it looked very interesting, and before I even realized it, I was delving into the documentation…

It seems that the underlying architecture is thoroughly documented but quite complicated. I’ll need some time to study it, but at first glance they provide:

  • A knowledge representation called AtomSpace (maybe similar to the one provided by OpenCyc)
  • A plugin container for the learning algorithms
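To give a flavour of the idea, here is a toy sketch of a hypergraph-style atom store in Python (my own simplification for illustration, not the actual OpenCog API):

```python
class Atom:
    """A node or link in a toy hypergraph knowledge store."""

    def __init__(self, atom_type, name=None, outgoing=()):
        self.atom_type = atom_type       # e.g. "ConceptNode", "InheritanceLink"
        self.name = name                 # set for nodes
        self.outgoing = tuple(outgoing)  # set for links: the atoms they connect


class AtomSpace:
    """Stores atoms and lets you query links by the atoms they contain."""

    def __init__(self):
        self.atoms = []

    def add(self, atom):
        self.atoms.append(atom)
        return atom

    def links_containing(self, atom):
        return [a for a in self.atoms if atom in a.outgoing]


# Build a tiny knowledge base: "cat inherits from animal"
space = AtomSpace()
cat = space.add(Atom("ConceptNode", "cat"))
animal = space.add(Atom("ConceptNode", "animal"))
space.add(Atom("InheritanceLink", outgoing=(cat, animal)))
```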

The two things that captivated me were that they ship an improved version of the LinkGrammar parser that I used in my thesis, and that they are working on a connection to OpenSimulator, an open-source reimplementation of Second Life, thus allowing the system to be embodied in a full 3D world…

I have always thought that embodiment is fundamental to intelligence, because having a body enables the intelligent agent to ground in reality all the words and abstract categories it uses to reason. Notice that I didn’t use the word “program” but “agent”, because I think that even humans use grounding to understand the words and concepts they use day to day, but that probably deserves a post of its own! 🙂

While waiting for intelligent robotic life to conquer virtual worlds like Second Life, I will watch one of my favourite films about AI:

Ever wondered what the LHC experiments are about? (Apart from destroying the world 🙂 )

Thanks, Thomas, for a very enjoyable video:

Who says scientists are boring?

Sorry, where exactly are we going this Saturday afternoon? To a wellness centre??? What are we going to do there?

On my first Friday since arriving in Spain: the week had been full of things to do, bureaucratic errands, and settling in both at home and at the office. I really needed something to switch off… and then, right on cue, an email arrived:


Anyone up for watching the stars tonight with Mikel and his telescope?