About keyboard layouts

When I moved to the UK a couple of years ago I gladly switched to the UK keyboard layout, which, unsurprisingly, is far better than the Italian one for typing all those nasty non-alphabetical characters you need when programming.

Despite the wide availability of on-the-fly layout switching tools, I decided to use the UK layout exclusively; after all, the Italian layout is not even particularly good for typing Italian! For instance, it’s impossible to type capitalized accented characters. Fail.

Luckily, I found a good alternative. Apparently most Linux distributions include a nice UK keyboard layout that makes AltGr followed by most punctuation keys behave as a dead key. That way, you can type not only «è» and «È», but also «ñ» (AltGr+]) and even crazier stuff very easily. Unfortunately the default UK layout in Windows doesn’t behave this way, and the “dead keys” variant makes «’» itself a dead key, which in my opinion is very inconvenient. So in the end, thanks to the excellent Microsoft Keyboard Layout Creator, I “ported” the behavior of the default Linux layout to Windows.
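
The mechanics of a dead key can be illustrated with a tiny sketch: pressing the dead key produces no output by itself, and the next keystroke is looked up in a compose table. The table entries and key names below are made up for the example, not taken from any real layout file:

```python
# Hypothetical compose table: (dead key, following letter) -> composed character.
COMPOSE = {
    ("dead_grave", "e"): "è",
    ("dead_grave", "E"): "È",
    ("dead_tilde", "n"): "ñ",
}

def press(dead_key, letter):
    # If no combination is defined, fall back to the plain letter.
    return COMPOSE.get((dead_key, letter), letter)
```

A real layout defines many more combinations, but the lookup logic is essentially this.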

While I was at it I also decided to learn touch typing, using KTouch, an excellent little program. All in all I was a bit disappointed to find that touch typing courses are really optimized for plain English text, while, again, I was hoping to also cover heavy use of punctuation and accented characters. So I eventually wrote a small script to generate new lectures for KTouch according to my requirements.

Should someone be interested, both tools are now available here.

Note: all this stuff is based on QWERTY. I’m not that hardcore.


Ouch, two full years have passed since my last post here… and I wholeheartedly hate abandoned websites! In the meanwhile I graduated, moved to a different country, got a job in the visual effects industry and also got married. Quite a time.

In this period my amateur interests moved away a bit from 3d and onto digital painting, a passion that found an outlet in my other blog-turned-kinda-webcomic (sorry, in Italian).

Someone may still have this blog in their feeds from my short spell as a Blender developer. I’m unlikely to contribute anything substantial anytime soon, but luckily, since then, the evolution of the rendering side of Blender has been in the safe hands of Brecht, who is definitely heading in the right direction. At the moment I’d rather devote my scarce resources to help out Gimp, which gets a lot less developer love!

In the meanwhile I’ll start reusing this blog to post code snippets, scripts and the odd technical rant mainly to have them handy for my own convenience but also in the hope that someone might find them useful occasionally. Blender aggregators who wish to keep this feed are advised to track only the Blender category as most other posts wouldn’t be of interest.

About getters and setters in Python

Recently I’ve been developing a fairly large Python application involving a somewhat elaborate and deep class hierarchy. Over time, I ended up writing a fair number of getters and setters, which felt suspiciously unpythonic. Looking for better solutions, I stumbled upon a number of sources making fun of people who write getters and setters in Python (e.g. Python Is Not Java), because:

  • there’s no real data member protection in Python anyway (“we are all consenting adults here”)
  • if you later want to embed some logic in the retrieval of an attribute, you can keep the same interface by just adding a property
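
The second point deserves a quick illustration. Here is a minimal sketch (class names are made up for the example) of how a plain attribute can later grow validation logic behind exactly the same interface:

```python
# A plain attribute: no getter/setter ceremony needed.
class Circle(object):
    def __init__(self, radius):
        self.radius = radius

# Later, the same interface can hide validation behind a property.
class ValidatedCircle(object):
    def __init__(self, radius):
        self.radius = radius  # goes through the setter below

    @property
    def radius(self):
        return self._radius

    @radius.setter
    def radius(self, value):
        if value < 0:
            raise ValueError("radius must be non-negative")
        self._radius = value
```

Callers keep writing `c.radius = 3` either way, which is the whole point of the argument.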

These arguments were so compelling that they made me feel pretty stupid. I immediately started converting all my getters and setters into plain attributes or, when needed, into properties.

Unfortunately, it soon became apparent that I had not considered all the facets of the problem: as I found out the hard way, avoiding getters/setters does not play well with inheritance.

First of all, overriding access to an attribute becomes a bit quirky, whereas it is perfectly straightforward when using getters. In particular, the child class must know whether a certain name is a plain attribute or a property, which already breaks the promise that switching from attribute to property is a purely local change.

class Foo(object):
	@property
	def foo(self):
		return 5

class Bar(Foo):
	@property
	def foo(self):
		return Foo.foo.fget(self) + 1

print(Foo().foo)  # 5
print(Bar().foo)  # 6

The syntax gets really cumbersome. For instance, if you want to override only the getter or only the setter, verbosity ensues:

class Foo(object):
	@property
	def foo(self):
		return 5

	@foo.setter
	def foo(self, val):
		print(val)

# one way to do it
class Bar1(Foo):
	@property
	def foo(self):
		return Foo.foo.fget(self) + 1

	@foo.setter
	def foo(self, val):
		Foo.foo.fset(self, val)

# another way
class Bar2(Foo):
	def foo_get(self):
		return Foo.foo.fget(self) + 1

	def foo_set(self, val):
		Foo.foo.fset(self, val)

	foo = property(foo_get, foo_set)

Foo().foo = 4   # prints 4
Bar1().foo = 3  # prints 3
Bar2().foo = 2  # prints 2

Granted, true Pythonistas would say that this is not Python’s fault but mine, since I evidently made some wrong, or at least unpythonic, design decisions at a higher level. That’s acceptable criticism, but I still think the organization of my program was sound when it employed getters and setters. Most importantly, though, the consequences of dropping them in class hierarchies where overriding is common were not sufficiently explained by the aforementioned Internet sources.

In other words, sometimes “practicality beats purity”!

Still here

I can’t believe it’s already six months since my last post here. It’s definitely time to resume regular blogging — also broadening the scope of this blog from Blender to other technical stuff as well.

In Milan on Wednesday 17

Even though I’ve not been active as a Blender developer in the last few months, I kept lurking on the mailing lists in an effort to stay up to date on the latest developments, and on 2.5 in particular.

On Wednesday 17, a bunch of fellow Italian blenderheads and I will be talking about our dear 3D suite as part of the warm-up meetings of the Italian hackmeeting 2009.

Plans for the Lightcuts branch

Recent efforts, especially those led by the awesome Cuban coder nicknamed Farsthary, have made it clear that sooner or later (probably sooner), true global illumination will be available in Blender Internal.

Apart from that, in a number of discussions on the mailing list it became apparent that the problem with Blender Internal is that its amazingly broad feature set lacks a bit of overall vision, so that it’s not always clear how any two features interact. It seems that the post-2.5 period will be dedicated to some sort of design refactoring of Blender Internal, so that different render paths can be handled more smoothly, on both the coder’s and the user’s side.

I think that now it makes sense to wait for that to happen before attempting the integration of the Lightcuts stuff.

Moreover, I’m also very glad to hear that André Susano Pinto, a.k.a. Jaguarandi, a great developer and GSoC colleague of mine from last year, is a GSoC student again this year, working on performance improvements to the internal raytracer. He’s the right guy for the job, after the convincing work he did last year on the BVH code. His 2009 work will be extremely beneficial for my Lightcuts branch, which relies quite heavily on the raytracer.

So my next steps are: 1) finishing the “reconstruction cuts” part of the algorithm and 2) porting the branch over to 2.5 — but not necessarily in this order.

In the meanwhile, I would like to spend my small amount of spare time to help out on 2.5 which is coming along really well: I’m truly impressed! Congratulations to everyone that has been working on it so far.

It finally arrived

It took a while but it finally arrived!

My collection of geeky t-shirts just got a lot better!


Even though this much-awaited t-shirt is now securely in my hands, I’m still committed to improving my project. Admittedly I haven’t committed much recently (sorry for the pun), but I’m still working on it, albeit at a slow pace. Stay tuned!

Egocentric pumpkin

I’m back from the Blender Conference, which was very stimulating as always. For some reason, it’s very easy to hang out with the other participants, and to relate to people you’ve never heard of as if they were close friends! My opinion is that Blender hits a sweet spot between technical and artistic interests (and possibly even ethical ones, because of the open source part): this makes the overall atmosphere one of… tasteful nerdiness, I would say 🙂

By the way, let me stress once again how kind and helpful all the top Blender heads are, despite their success in the field. @ndy and Bassam are always up for a chat, and Brecht is always painstakingly giving guidance to hordes of ravenous developers fighting their way through Blender’s codebase (I was one of them, by the way!).

Anyway. It suddenly occurred to me that since my initial involvement in the Summer of Code, I have almost completely stopped using Blender, apart from setting up test scenes.

This is why today I took the time to carry out a small Halloween-themed project. Actually, I don’t care at all about Halloween, but anthropomorphic pumpkins are undeniably a fun subject.

Egocentric pumpkin


The project eventually turned out to be mostly a programming effort after all. The blend file contains two scenes:

  • the pumpkin scene, where a Python script link moves the camera and the pumpkin’s eyes randomly at each frame, also setting random values for a couple of shape keys and changing the background color
  • the “Polaroids” scene, where a number of randomly rotated objects display the frames rendered in the other scene (all assignments were carried out through Python scripting again)
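
The gist of the per-frame randomization can be sketched without any Blender API calls; the function name and the returned fields below are invented for the illustration. Seeding with the frame number keeps every frame’s “random” pose reproducible across re-renders:

```python
import random

def frame_setup(frame, jitter=0.5):
    # One deterministic RNG per frame: re-rendering frame N always
    # reproduces the same camera offset, shape key values and colors.
    rng = random.Random(frame)
    return {
        "camera_offset": tuple(rng.uniform(-jitter, jitter) for _ in range(3)),
        "eye_rotation": rng.uniform(0.0, 360.0),
        "shape_keys": (rng.random(), rng.random()),
        "background_rgb": tuple(rng.random() for _ in range(3)),
    }
```

In the actual blend file these values would be applied to the camera, the eye objects and the world settings through the (old 2.4x) Blender Python API.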

The artistic quality of the result is pretty questionable, but, hey, if someone wants to improve the shading, modeling, or whatever, here is the halloween.blend file to play with.

Egocentric pumpkin (monocolor)


Leaving for Blender Conference

Tomorrow I’ll head towards Amsterdam to attend my third Blender Conference in a row. This time, besides meeting the usual lot of cool people, I’m also going to talk about my work on lightcuts — and I hope to get some useful feedback as well.

Anyway: in the last couple of days I took the time to implement some minor stuff I had been postponing for a while.

First of all, support for baking is now in. Just bake with the “Enable Lightcuts” toggle pressed and it should work. I have to admit this did not go through much testing, so please report any strange behavior.

A second addition is compatibility with the SSS feature of Blender Internal. Previously it would crash outright; now it gives pretty good and smooth results:

SSS + Lightcuts (actually 1024 lights, not 8k as stamped)

(Apologies to Maurice R., who asked for this a long time ago… and it was actually a pretty easy fix!)

The possibility to have meshlights was another frequently requested feature. I just committed some initial code to support it.


Suzanne is the only light in this scene

It works automatically for all materials with an Emit value > 0.0. Unfortunately the emitting object itself gets completely flat shading. This will probably require different handling UI-wise in the future, avoiding the Emit value altogether.
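
The basic idea can be sketched as follows. This is an illustration, not the actual branch code: each face of an emitting mesh becomes a point light at its centroid, and for simplicity the light power here is just the Emit value, ignoring face area:

```python
def faces_to_lights(faces):
    """Sketch: turn the faces of an emitting mesh into point lights.

    A face is a hypothetical (vertices, color, emit) tuple; only faces
    with emit > 0 produce lights.
    """
    lights = []
    for verts, color, emit in faces:
        if emit <= 0.0:
            continue
        n = len(verts)
        centroid = tuple(sum(v[i] for v in verts) / n for i in range(3))
        lights.append({"position": centroid, "color": color, "power": emit})
    return lights
```

A real implementation would also scale the power by face area and subdivide large faces, otherwise big emitters would be badly undersampled.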

Finally, I also implemented some algorithmic improvements recently described in a paper by the original authors of the Lightcuts algorithm. According to them, this should lead to faster tree building and better-quality trees, which in turn has a positive effect on rendering times. In practice the improvements are barely noticeable most of the time, but at the very least they simplify the code.

Lightcuts: multiple representatives

I finally surrendered and added “multiple representatives” to my Lightcuts implementation. This essentially means that you have an option to select noise instead of banding.

When computing the contribution of a cluster, you need to select a representative point light, whose position is used to compute visibility and whose color is used to shade the pixel. The original Lightcuts paper selects this representative once, at a global level; with multiple representatives, a different representative is chosen at each sample.
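
The selection step can be sketched as follows; the data layout is hypothetical (a cluster as a list of lights with an `intensity` field), and the real implementation is in C inside the branch:

```python
import random

def pick_representative(cluster, rng=None):
    """Pick a representative light from a cluster, weighted by intensity.

    With rng=None, the single precomputed representative (here simply the
    first light) is reused for every sample, which can show up as banding.
    With a per-sample rng, each sample draws its own representative, which
    turns the banding into noise instead.
    """
    if rng is None:
        return cluster[0]
    weights = [light["intensity"] for light in cluster]
    return rng.choices(cluster, weights=weights, k=1)[0]
```

Intensity-weighted selection keeps the estimate unbiased while favoring the lights that matter most within the cluster.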

The change required some code refactoring, so I ended up being a bit cautious about correctness (read: a number of optimizations are still missing). A lot of testing is still required, but, as always, I rely on my friends at blenderartists, who are doing a wonderful job stress-testing the system and reporting their feedback.

Quick comparison of original algorithm vs. multiple representatives option; the latter is currently significantly slower but a number of optimizations are still missing

In other news, the insane amount of bug fixing that went on at the end of the Apricot project led many developers to plan a new release before diving headfirst into the enormous 2.50 undertaking. As far as I understand, Blender 2.48 will include not only bug fixes but also new features, including some of the amazing work my fellow GSoC students did this summer.

As for my project, while the code itself is in fairly acceptable shape, it will take more time to merge. Even though the code is not invasive at all, the concept of lightcuts is not easy to integrate cleanly into the current Blender workflow. And we all know that Ton would rather miss a feature than pollute the user experience with an alien concept; this mistake has been made in the past, and the core developers are now stricter about it, which I don’t blame them for at all. I’m waiting for the Blender Conference to get some input, and hopefully some enlightening discussions, on this delicate part of the work.

The future of the Lightcuts GSoC project

Next week the Google Summer of Code 2008 will be officially over. I will sum up my thoughts on this wonderful experience in a future post, adding some suggestions for prospective applicants as well.

Here, instead, I would like to detail my plans for the future of this project.

Development — Development will continue, but necessarily at a slower pace. At the end of October I’ll be attending the Blender Conference where I would like to show a couple of new features. Apart from that, though, what this project needs now is some “boring” refinement.

Blender Internal features — I’ll have to go through all the Blender Internal features and, for each of them, consider whether it can be supported. Unfortunately the very nature of the algorithm makes it impossible to support every feature (some of which are pure hacks), but what’s in and what’s left out must be clearly stated to users.

Workflow design — In my opinion, if you want to use Lightcuts, and especially if you want indirect lighting, you have to plan ahead a bit; in general, I can’t see the same light rigs being successful in both “vanilla” Blender and Blender with Lightcuts. Matt Ebb, and many others for that matter, would be happier to see Lightcuts as a less invasive tool. Of course mine is a statement of fact while Matt’s is a wish, so it’s easy to agree with both. Anyway, however this “fight” ends, I need to figure out how to integrate Lightcuts into Blender from a UI point of view. In this particular area I would especially like to hear users’ opinions.

Merging — Some people are afraid that a project living in its own branch is at greater risk of dying; it is certainly at greater risk of going out of sync. While it’s true that this project is fairly isolated code-wise, and that development could continue in trunk, I feel freer to experiment in my own branch. As my confidence with the Blender codebase increases, I could also attempt some bolder refactoring of the rendering engine internals, and that would be better tested in a branch.

My idea is to merge after the Blender Conference, with the aim of having the entire project released in Blender 2.50.

Ok, enough words. Here’s a Sponza test, provided by Melon on blenderartists:


A Sponza rendering lit by a single area light, with 5 bounces of indirect lighting

The entire thread is worth reading in my opinion.

P.S. — Next week I won’t be working on this project, but this is a planned break as I resume my other duties, so don’t think I’m running away!

More indirect lighting stuff

This week I have been busy completing the indirect lighting part of my project, which turned out to be harder than expected. The problem was not the algorithm itself, as I’m currently implementing a naive variant. As a reader pointed out, the state of the art in instant radiosity is currently Metropolis Instant Radiosity by Benjamin Segovia, but implementing that is not trivial and is left as future work.

What I actually had a hard time with was figuring out how to read surface colors, which I need in order to color the indirect lights and ultimately obtain color bleeding. In the end I came up with some very hackish code that would surely never be accepted into trunk. On the other hand, I can’t see a cleaner way to do it without some changes to the existing Blender codebase; I’ll have to talk to the other developers and see what’s best.
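
The naive variant of instant radiosity can be sketched briefly. This is a toy illustration of the first bounce only, with a made-up one-plane “scene” (an infinite ground plane with a constant reddish albedo) instead of a real raytracer; the actual branch of course traces against the full scene:

```python
import random

def trace_ground(origin, direction):
    # Hypothetical minimal scene: a ground plane at y = 0 with a fixed albedo.
    ox, oy, oz = origin
    dx, dy, dz = direction
    if dy >= 0:          # ray pointing away from the plane: no hit
        return None
    t = -oy / dy
    return (ox + t * dx, 0.0, oz + t * dz), (0.8, 0.3, 0.3)

def make_vpls(light_pos, light_color, n=16, rng=None):
    """Naive instant radiosity: scatter rays from the light and drop a
    virtual point light (VPL) at each hit, tinted by the surface albedo,
    which is what produces color bleeding."""
    rng = rng or random.Random(0)
    vpls = []
    for _ in range(n):
        # Crude downward direction sampling, for illustration only.
        d = (rng.uniform(-1, 1), -rng.random() - 0.01, rng.uniform(-1, 1))
        hit = trace_ground(light_pos, d)
        if hit is not None:
            point, albedo = hit
            color = tuple(lc * a for lc, a in zip(light_color, albedo))
            vpls.append((point, color))
    return vpls
```

A second bounce is obtained the same way, by tracing further rays from the first-bounce VPLs; the resulting point lights then simply feed into the Lightcuts tree like any others.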

Anyway, right now I can generate first- and second-bounce indirect lighting from area lights, taking colours into account. I was able to obtain a rendering like this one:

"Unfinished Match" with indirect lighting - Tree creation time: 00:04.64 - Lights: 19968 (0l + 0s + 19968o) (10000d + 9968i) - Error rate: 0.020 - Max cut: 1000 - Average cut size: 362.88 - Shadow rays: 345.63 (1.73%)

The interesting thing is that this model was previously lit by a very complex lighting setup in order to fake indirect lighting. This rendering was obtained by placing a single area light behind the window.

Here are some more tests:

Color bleeding test

Here there's an area light near the wall, turned upwards at 45°.

Hangar test - Here I exaggerated indirect lighting through post-process.