Summer report

For some reason I never quite understood, I always tend to be extremely busy in the summer, when I would much rather enjoy the fresh air and take it slow, and less busy during the winter, when heading out is less attractive. This summer was no exception. After the traveling, I started a new contract with a new client, and that brought my busyness to a whole new level.

In my last post, I mentioned a lot of wiki-related events happening over the summer and said I would attend them all. It turned out to be an exhausting stretch. Too many interesting people to meet and not enough time, even in days that never seem to end in Poland. As always, I faced a constant dilemma between attending sessions, joining the open space, and starting spontaneous hallway discussions. There was plenty of room for discussion: Gdansk is not very large, at least not the touristic area where everyone stayed, so entering just about any bar or restaurant, at any time of day, meant sitting down with another group of conference attendees. WikiMania did not end until the plane landed in Munich, apparently the connection city everyone used, at which point I had to run to catch my slightly tight connection to Barcelona.

I know, there are worse ways to spend part of the summer than having to go work in Barcelona.

I came to a few conclusions during WikiSym/WikiMania:

  • “Sociotechnical” is the word academics have chosen for what the rest of us call the social web or Web 2.0.
  • Adding a graph does not make a presentation look any more researched. It most likely exposes the flaws.
  • Wikipedia is much larger than I knew, and it still has plenty of ambition.
  • Some people behind the scenes really enjoy office politics, which most likely creates a barrier with the rest of us.
  • One would think open source and academic research have close objectives, but collaboration remains hard.
  • The analyses being performed lead to fascinating results.
  • The community is very diverse, and Truth in Numbers is a very good demonstration of it for those who could not be there.

As I came back home, I had a few days to wrap up projects before starting work for a new client, all of which had to happen while fighting jet lag. I have not yet found time to catch up with the people I met, but I still plan to.

One of the very nice surprises I had a few days ago is the recent formation of Montréal Ouvert (the site is also partially available in English), which held its first meeting last week. The meeting appeared to be a success. I’m very bad at counting crowds, but somewhere between 40 and 50 people seemed to attend. Participants came from various professions and included some city representatives, which is very promising. The next steps are still a little fuzzy, and how one may get involved is unclear, but the organizers seemed to have matters well in hand. There will likely be some sort of hack fest in the coming weeks or months to build prototypes and make the case for open data. I don’t know how related this was to Make Web Not War a few months prior. It may just be one of those ideas whose time has come.

I also got to spend a little time in Ottawa to meet with the BigBlueButton team and discuss further integration with Tiki. At this time, the integration is minimal because very few features are fully exposed. Discussions were fruitful, and a lot more should be possible with version 0.8, now in development. Discussing the various use cases showed that we did not approach the integration using the same metaphor, partly because the metaphor is not quite explicit in the API. The integration in Tiki is based on the concept of rooms as permanent entities that can be reserved through other mechanisms, which maps quite closely to how meeting rooms work in physical spaces. The intended integration was mostly built around the concept of meetings happening at a specific moment in time. Detailed documentation cannot always convey the larger picture.

Improvement of the regression tool

Today, I completed and deployed a minor release of TaskEstimation. Attendees at my express session on estimation at PHP Quebec earlier this month got to see the very first addition since the original release: the graphical representation. It seems that for most people without much statistics in their background, and for most who have forgotten it, R² is not very meaningful. A visual representation of the dots and the regression line, however, makes everything more obvious: the dots simply have to be close to the line.
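As an illustration of what sits behind that picture, here is a minimal sketch of the least-squares fit and R² computation. This is not TaskEstimation’s actual code (the tool is PHP), and the sample data is made up for the example:

```python
# Illustrative sketch, not TaskEstimation's code: fit a least-squares line
# through (estimate, actual) pairs and compute R-squared.
def fit_line(xs, ys):
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    sxx = sum((x - mean_x) ** 2 for x in xs)
    sxy = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    slope = sxy / sxx
    intercept = mean_y - slope * mean_x
    # R-squared = 1 - SSres/SStot: the closer the dots sit to the line,
    # the closer this gets to 1 -- exactly what the graph shows visually.
    ss_res = sum((y - (slope * x + intercept)) ** 2 for x, y in zip(xs, ys))
    ss_tot = sum((y - mean_y) ** 2 for y in ys)
    r_squared = 1 - ss_res / ss_tot
    return slope, intercept, r_squared

# Hypothetical historical data: estimated hours vs. actual hours.
estimates = [1.0, 2.0, 3.0, 4.0, 5.0]
actuals = [1.2, 1.9, 3.3, 3.8, 5.1]
slope, intercept, r2 = fit_line(estimates, actuals)
```

With dots this close to the line, R² comes out near 1, which is the numeric way of saying what the scatter plot makes obvious at a glance.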

I made this addition during the conference, the day before I had to present. I must say I am really impressed by the output eZ Components Graph can produce. Hacking around in the code to produce a scatter plot was not too much trouble, but it did remind me how hard generating good graphs is. I have done it before, and going back to my own code did not seem attractive. Looking at the internals, I could see it faces the same problems: rendering graphics to multiple output formats without tangling the code is just hard.

I then remembered a voice in the back of my head: Steve McConnell repeating that “simplistic single-point estimates are meaningless because they don’t include any indication of the probability associated with the single point.” I knew I would have to address this eventually; I just had to wrap my head around the math involved and do it. In addition to the forecast, the regression tool now displays the confidence range for the desired probability. The range also gets displayed on the graphic.
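For the curious, here is a sketch of the standard prediction-interval formula for simple linear regression, as I understand it from the same sources. This is not the tool’s actual code; `t_crit` is the two-tailed Student-t critical value t(alpha/2, n-2), looked up separately:

```python
import math

# Sketch of the standard prediction interval around a regression forecast:
#   y0_hat +/- t_crit * s * sqrt(1 + 1/n + (x0 - mean_x)^2 / Sxx)
def prediction_interval(xs, ys, x0, t_crit):
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    sxx = sum((x - mean_x) ** 2 for x in xs)
    slope = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys)) / sxx
    intercept = mean_y - slope * mean_x
    forecast = slope * x0 + intercept
    # Residual standard error: s = sqrt(SSres / (n - 2)).
    ss_res = sum((y - (slope * x + intercept)) ** 2 for x, y in zip(xs, ys))
    s = math.sqrt(ss_res / (n - 2))
    # The (x0 - mean_x)^2 term is what widens the range when extrapolating
    # beyond the known data points.
    half_width = t_crit * s * math.sqrt(1 + 1 / n + (x0 - mean_x) ** 2 / sxx)
    return forecast - half_width, forecast + half_width
```

Note how the half-width grows as x0 moves away from the mean of the historical data: the further you extrapolate, the wider the range.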

It turns out Wikipedia was the most accurate source of information I had available at the time. It can be hard to find a basic formula when you don’t have a textbook nearby.

Looking scary? It’s really not that bad, except that a few details are missing and I suspect there is an error in the first formula. I’d have to check more sources, but alpha alone does not make sense, and using the alpha with the hat looks much more accurate graphically.

Generated graphic

The major missing detail was how to calculate the t-alpha/2,n-2 term. I wasn’t the only one searching for it: someone linked to it on Wikipedia while I was reading. After a while longer searching for the formula, because I don’t like magic numbers, I settled for including the table itself. Those numbers are quite hard to obtain otherwise, and that much accuracy is not required anyway.
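The table approach can be sketched as a simple lookup. The values below are copied from a standard two-tailed Student-t table at the 95% level; the nearest-smaller-df fallback is my own choice, not necessarily what the tool does:

```python
# Two-tailed Student-t critical values for a 95% confidence level,
# i.e. t(0.025, df), copied from a standard t table.
T_TABLE_95 = {
    1: 12.706, 2: 4.303, 3: 3.182, 4: 2.776, 5: 2.571,
    6: 2.447, 7: 2.365, 8: 2.306, 9: 2.262, 10: 2.228,
    15: 2.131, 20: 2.086, 30: 2.042,
}

def t_critical_95(df):
    # Fall back to the nearest smaller tabulated df. Since the critical
    # value shrinks as df grows, this errs on the conservative (wider) side.
    candidates = [d for d in T_TABLE_95 if d <= df]
    return T_TABLE_95[max(candidates)]
```

For the small sample sizes typical of personal estimation history, interpolating beyond this would indeed be accuracy that is not required.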

TaskEstimation now provides confidence ranges to accompany the estimate, making the expected accuracy more obvious and reflecting the fact that extrapolating beyond known values carries more uncertainty.

Pushing it out the door

Parkinson’s law states that work expands to fill the time allocated to it. Too often, managers abuse this to squeeze developers into a short, unrealistic timeline for the project. While it is often abused, the underlying idea is still sound. Given too much time and no pressure, some developers will create grand abstractions and attempt to solve all the problems of the universe. Objectives are always good to have. Allocating a limited but reasonable amount of time for a project is a good way to ensure no gold plating happens while still allowing for sufficient quality in the context.

A reasonable amount of time may be hard to define. It requires a new skill from developers: estimation. Done on an individual basis, estimation can serve developers as a personal objective to attain, but it can also be part of a greater plan towards self-improvement. The practice of up-front estimation has huge benefits in the long term, even if the estimates are far off target. When a task completes with a huge variation, it triggers awareness: what went wrong? Constantly answering that question and making an effort to resolve the issues leads to a better process, higher-quality estimates, and less stress in accomplishing tasks.

A long time ago, after reading Watts Humphrey’s Personal Software Process (PSP), I became convinced of the value of estimation as part of my work. In Dreaming in Code, Scott Rosenberg reflects on Humphrey’s technique:

Humphrey’s success stood on two principles: Plans were mandatory. And plans had to be realistic. They had to be “bottom-up”, derived from the experience and knowledge of the programmers who would commit to meeting them, rather than “top-down”, imposed by executive fiat or marketing wish.

A few initial attempts in 2006 gave me confidence that high-precision estimates were possible and not so hard to attain. However, when my work situation changed, I realized that the different projects I was working on did not have the same quality constraints. This led to splitting up my Excel sheets in multiple ways. The task of estimating became so tedious that I eventually dropped all tools, not because I was dissatisfied with the results I obtained, but because of the time it took to get them. I reverted to paper estimates and my gut feeling for scale. Still, the simple act of performing analysis, design, and rough measurements gave me significant precision. Not everything was lost.

Estimation Interface

However, one thing I did lose was traceability. Paper gets buried under more paper, or lost altogether. Personal notes are not always clear enough to be understood later. I no longer had access to my historical data. I wanted my spreadsheet back, but couldn’t bear having to organize it. Over a year ago, searching for a reason to try out new things, I started a personal project to build a tool that would satisfy my needs for organization and simplicity. A few features were crucial to me.

  1. It had to make it easy to filter data and find the parts relevant to the task at hand
  2. It had to be flexible enough to let me try out new estimation techniques
  3. It had to be quick and fun to use, otherwise it would just be another spreadsheet

I achieved a first usable version last summer, working on it in my spare time, and gave it a test run in the following months. It was not good enough. Too linear. Too static. It did not accomplish what I needed, and I found myself reverting to paper all over again. What a terrible failure.

Spatial Editor

A few months later, I figured I had to make it paper-like and gave it a little more effort. After a dozen hours sitting in airports over the last two weeks, I think I finally documented my work well enough for others to understand. Sadly, even if the application is somewhat intuitive, the prerequisite skills required to perform estimation are not.

Today, I announce the first public beta release of TaskEstimation, a tool for developers to estimate their work on a per-task basis and work towards self-improvement. Make no mistake: this is not built for project management. While it probably is flexible enough for it, any project manager using it should have their own self-improvement in mind. Feedback is welcome on both the application and the documentation. I expect the latter to be lacking in details, so feel free to ask questions.

American Way of Life

USA flag

Well, Bush has once again declined to sign the Kyoto treaty. The excuse this time is that it would cause job losses. It’s quite sad to see that jobs today are worth more than life tomorrow. I don’t actually see how it would cause job losses in the first place; it’s probably just another lame excuse. I can only hope it does not reflect the opinion of the average American.

Is the American way of life they are trying to protect really about egocentrism? As a general pattern, every single decision made totally ignores the rest of the world. They elected Bush once again, even though the entire world was against it.

It’s nice to see the way major problems are simply ignored. Kyoto was not a key element during the year-long campaign (which probably cost more than signing Kyoto would). Less than a week later, the decision is made. Has everything else been done, or do I simply not understand what the priorities are?

Presentation in November

PHP Quebec Logo

During the November meeting of PHP Quebec in Montreal, I will be giving a presentation about documentation methods. Expect it to be focused on PHP, but it will go way beyond the traditional JavaDoc (or phpDoc in this context). So far, the plan looks like this:

  • Why?
  • Why not?
  • Types of documentation
    • API
    • Vision
    • Changes
    • Data structures
    • Bugs and anomalies

The meeting takes place at the École Polytechnique on November 4th, 2004, at 19:00. Admission is totally free and everyone is invited. Sessions are given in French due to the audience, but we usually do a good job of translating on request. More details about the location are available on PHP Quebec’s website.

Are we ever going to see them?

OpenGL Logo

I just stumbled upon another 3D desktop environment. The Croquet Project‘s goals seem to be greater than simply being a desktop environment, but it still looks like one. The open source project uses OpenGL and is supposed to run on all major platforms (Linux, OS X, Windows). Reading the descriptions on the main page, I ended up thinking it was a complete operating system; I had to dig into the FAQ to find out that it is nothing more than an application. Those guys are definitely good at playing with words.

The first thing I thought about was Sun’s Looking Glass Project. They both seem to do similar things, except that Sun’s really is a desktop environment designed for the Java Desktop System (the project core is available on SourceForge under the GPL), while Croquet is “a combination of computer software and network architecture that supports deep collaboration and resource sharing among large numbers of users within the context of a large-scale distributed information system”, whatever that means. I also saw another similar project back when Looking Glass made some waves with the release on SourceForge, but I couldn’t find it again.

Are we ever going to see one of these in action? I haven’t heard any news from Looking Glass in months, and this new project does not even have an official release: how long will it live? Both projects seem to be very serious, but is this only hype that will be dropped before reaching the public? I really wonder how they will fare in terms of usability, and whether they will be good enough for everyone to drop their good old 2D desktop interface.

Distributions taking positions

Linux Logo with Tux

Distributions have been growing for years. As Linux got more popular, the number of packages in distributions also increased. Recent distributions like Fedora and Mandrake span 4 CDs or are distributed on DVD, and maintainers saw their workload increase drastically. Yesterday, an article on Slashdot indicated that Slackware could drop support for Gnome. This was in no way official, but it definitely marks a major change in the way Linux will be distributed.

A few new distributions saw the light of day recently, such as Ubuntu Linux. The distribution is based on Debian, but it filters the packages that are accessible and gives the base a breath of fresh air. From what I could see, Gnome 2.8 is the only available desktop environment. Yoper divided its distributions based on the purpose they will serve: instead of making everything available, things are filtered for higher quality. Sun’s Java Desktop System also had a reduced number of packages. Novell’s SuSE is pushing for Ximian (I might be wrong on that one) and Evolution as a solution.

I think this is all for the best. I was recently asked what the difference was between all the distributions. I have to admit that my best answer was the way they are distributed, because they are all pretty much the same. Once you have a distribution selected, you still need to select the packages you want. The new generation simply cuts off one of those choices. This is probably going to help migration to Linux, by making it easier for new users but also by assuring higher quality.

This changes nothing to the fact that Debian and Gentoo will remain present, with their overwhelming number of packages, for those who like their Freedom. As long as no single desktop environment wins the war, holds a monopoly, and imposes its standards, I will be happy.

Not Looking Back

Mozilla Logo

With the new set of W3C standards implementations coming up, it feels like the browser war will get a second round. The Mozilla project has been very active on SVG support, and the results are very impressive. Web applications will be more dynamic than ever. The problem developers encounter is that very few browsers actually support this new technology, and millions of Internet Explorer users will be left behind.

Is there a real problem? All Gecko browsers are Free and accessible to anyone on about every known platform. Is there anything wrong with stating requirements for an application? The website can remain accessible even if not all the fancy effects are present. This does mean having two versions of the application, but that is simply a normal transition phase. Overhead is always present when major changes occur: it was the same during the 16-to-32-bit transition and while JavaScript stabilized, and even right now, 64-bit processors support the 32-bit instruction sets. It’s simply a sacrifice that has to be made. Unless people actually use these technologies massively, they simply won’t spread, and we will remain stuck with the current partial CSS2 support.

SVG is only the tip of the iceberg. Mozilla’s XForms implementation, which is supposed to replace forms in XHTML 2.0, has already begun. A few CSS3 elements are starting to appear. Right now, XHTML 1.1 is impossible to use if IE support is required. Can a single browser really slow down the technology development process?

We can only hope Microsoft will soon release a new version that at least complies with the standards. I wouldn’t expect new technology anytime soon.

Mozilla has been mentioned a lot, but the growing popularity of Macs since OS X also helps the cause. Safari also offers good support for new technologies.