Rendering WordWars on radiators

Qarnot Blender Interface

TL;DR: Testing the private beta Python API from Qarnot

About a month ago, while keeping an eye on the Libre Software Meeting (RMLL) in Beauvais and watching the news about WordWars spread on some blogs and magazines, I stumbled on a video presentation about open source software tools for animators and Blender 3D rendering using Qarnot (in French). And I was struck.

Qarnot Computing is a high performance computing service provider, which means they can do a lot of complex calculations, super fast, using a large number of computers. This service, of course, is not new. 3D content creators, for example, often use such external services, also called “render farms”. Some 3D renderings require a lot of computing power: complex scenes, long movies or highly detailed realistic simulations. By externalizing that work to a large pool of processing power, the job can be done in a fraction of the time it would take on a single machine.

Where Qarnot stands out from its competitors is that instead of spending a lot of money on cooling and infrastructure, they decided to use the excess heat produced by those computers to maintain temperatures in private houses and buildings. So instead of having one giant warehouse with thousands of computers that needs constant refrigeration, they have installed a network of “radiators”, made of the same computers, in hundreds of private interiors. No extra energy is required to cool down the computers, since the heat is used to make a home comfortable; and there are no more horrible window-less farms either, since the infrastructure is distributed across a city. Using this approach, they claim to have reduced the carbon footprint of such services by 78%. And I could not be happier to test this on my own projects.

The next cool thing about Qarnot is that they provide rendering services for Blender. The Blender Foundation uses them for their latest open movie projects. They offer some free credits to test their services. And they are cheap too. That made plenty of good reasons to give them a try.

My WordWars project uses Blender to render the daily war news from The New York Times into a movie that looks like the intro scene from Star Wars. So every day, a new Blender file is generated and rendered. On my home machine, it takes about an hour and a half to render, but on my dedicated server, it takes up to 8 hours. And I’m running this project from the server because I want to keep my home computer for other things; it’s also not constantly up or connected. The server, well, is up 24/7. So I decided to contact Qarnot.

I needed to get in touch with them because, so far, the only way to launch a rendering on their system is via a web interface. That’s really practical for a one-time project, but a file that automatically updates every day needs a different approach: I do want to keep things automated. So I asked if they had an API, any way to interact with the render farm via scripts. And as a response, I was happy to get invited to test their Python API, still in private beta.

They provided me with the necessary Python files, documentation, and a small but feature-complete example that I could start with. It took me a while to figure out how to operate it correctly though, maybe because I had never interacted with a render farm before, or maybe because I got a bit confused by some of the parameters and the limitations of the private beta.

One thing that I found confusing was the ‘framecount’ parameter. The ‘advanced_range’ parameter indicates which part of the movie you want to render, but I did not understand, at first, how it related (or not) to ‘framecount’. ‘framecount’ is also named ‘frame_nbr’ in another function (‘create_task’), and that also puzzled me for a while as to what this variable really represented. After testing different approaches, I understood ‘framecount’ as the maximum number of frames that you want to process at the same time (the whole point of speeding up the rendering process). I say maximum, because it “feels” different depending on the amount of processing power available for the task. And in the end, the whole range of frames you asked for will be rendered; it might just take more time.
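To make that concrete, here is roughly how my daily submission looks. Only ‘create_task’, ‘frame_nbr’ and ‘advanced_range’ come from their beta documentation; the module name, connection, upload and download helpers below are placeholder names of my own, so read this as a sketch of the workflow rather than their actual API.

import qarnot  # the private beta module they sent me (name assumed)

conn = qarnot.Connection("my-auth-token")  # hypothetical authentication step

# Render frames 1 to 300 of today's generated file, processing
# at most 50 frames at the same time.
task = conn.create_task(name="wordwars-daily", profile="blender", frame_nbr=50)
task.advanced_range = "1-300"       # the part of the movie to render
task.upload("wordwars.blend")       # hypothetical upload helper
task.submit()
task.wait()                         # blocks until the whole range is rendered
task.download_results("renders/")   # hypothetical retrieval helper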

As this is still a private beta, I could only get access to part of their processing power and was limited in storage space for the resulting renderings (limitations that you don’t have normally). So in the end, I could only run portions of my project, 300 frames per task, and each took around 10 minutes to process (rendering the whole clip would take a little more than an hour using this method). So I’ll keep all these as tests for now and wait for the public release, which they should announce by the end of this year. Testing the same render (at full power) using their web interface, the whole clip was rendered in less than 10 minutes. And the cost of the service would be around 0.75€ for that particular case.

So using Qarnot, I could be delivering a WordWars video clip 10 minutes after the news has come out, at an affordable cost of less than a euro, and without feeling guilty about double-heating the planet.

I’m also guessing that, since the Python API they provided runs under Python 3.4, you could expect a Qarnot plugin that integrates directly into the Blender interface. And if they are not the ones to create it, someone surely will.

I want to thank them for allowing me to test this and for the patience with which they responded to my many questions. I also wish them success, as this is truly an inspiring way to build a computing company.

Blender bpy/bge Brussels Workshop 2013-03-29: zoom

Last Friday was our monthly Blender workshop, and the day before, my attention was caught by this post on Reddit. Rorts mentioned the intriguing video “Zooms From Nowhere” by Chris Timms and was wondering how to do something like that in Blender. I thought this might be a good challenge for the Blender bpy/bge workshop since it seemed to involve Python scripting.

Well, it was easier than we thought, since there was already a script (addon) called “Import images as planes” available in Blender. We then had to figure out how the particle system worked (this is where working on the same project with more than one brain pays off).
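For the record, the addon can be driven from Python too. This is a minimal sketch for the Blender 2.6x series we were using (the folder path is a placeholder):

import os
import bpy

# Enable the bundled addon (2.6x operator; newer Blender versions
# use bpy.ops.preferences.addon_enable instead).
bpy.ops.wm.addon_enable(module="io_import_images_as_planes")

# Import every image in a folder, one textured plane per image.
folder = "/tmp/zoom_images"  # placeholder path to the scraped images
images = [{"name": f} for f in os.listdir(folder)
          if f.lower().endswith((".jpg", ".png"))]
bpy.ops.import_image.to_plane(files=images, directory=folder)

The resulting planes can then be used as the objects emitted by the particle system.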

Python scripting came later in the afternoon, to scrape images from the internetz and feed the particle generator with “interesting” content. (See the googlesucker01.py file, hacked together in no time by Frankie Zafe.)
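If you want the general idea without digging into the repo, a scraper of that kind boils down to something like this (not Frankie’s actual script; the search URL is a placeholder):

import os
import re
import urllib.request

# Grab a results page, find image URLs in it, download them all.
os.makedirs("/tmp/zoom_images", exist_ok=True)
page = "https://example.com/imagesearch?q=internetz"  # placeholder URL
html = urllib.request.urlopen(page).read().decode("utf-8", "ignore")

for i, img_url in enumerate(re.findall(r'src="(https?://[^"]+\.jpg)"', html)):
    urllib.request.urlretrieve(img_url, "/tmp/zoom_images/img%03d.jpg" % i)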

The final render was baked over the weekend.

All the project files are available on our GitHub repo, with the necessary explanations to help you make your own “zooms into internetz culture”. If you make one, please send us a link. We’d love to watch.


Blender bpy/bge Brussels workshop #4


So, François and I are organizing monthly Blender workshops at Constant Variable, where we play around with Python code and the Blender Game Engine (BGE). The point of these meetings is to get together and exchange ideas and experience around the use of code and 3D realtime rendering with open source software. The last meeting was this Friday, and here’s a quick explanation of what was achieved.

François came with a little .blend file he had set up, with a cube rotating randomly. We looked through the code, studied it and improved it by making it shorter and easier to duplicate. The file is available on our shared GitHub repository and is called bge_cube_animation. The cube can be duplicated without changing the code, so that all cubes perform their own random animation.
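The trick is to keep all per-cube state in game properties instead of hardcoding it. I won’t paste the repo file here, but a reconstruction of the idea looks like this, attached as a module controller triggered by an Always sensor in pulse mode:

import random

def rotate(cont):
    own = cont.owner  # the cube this controller is attached to
    # Give each cube its own random rotation speed on first run, stored
    # as a game property, so duplicates need no code changes at all.
    if "speed" not in own:
        own["speed"] = [random.uniform(-0.05, 0.05) for _ in range(3)]
    own.applyRotation(own["speed"], True)  # True = rotate in local space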

The next project was to get the BGE to send and receive OSC messages. We looked through different Python OSC libraries and studied the great example provided by Labomedia. We didn’t pick OSC by chance: it’s a well-known protocol for exchanging data between realtime visual or audio programs, and it’s also what comes out of Melon (a Kinect-based controller) that François has been working on at Numediart.

To get it all working together, the pyOSC library had to be updated a little to handle OSC messages that were not yet implemented. The new version of the code is available on my Gitorious fork (pyOSC v0.3.7), waiting to be merged into the official one.
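The receiving side, roughly reconstructed with the patched pyOSC (the real setup is in the repo; the port and the globalDict key are placeholders), polls the OSC server once per logic tic so the game loop never blocks:

import OSC  # pyOSC
from bge import logic

def handle_melon(addr, tags, data, client_address):
    # Keep the latest values from Melon where any BGE script can read them.
    logic.globalDict["melon"] = data

def listen(cont):
    if "osc_server" not in logic.globalDict:
        server = OSC.OSCServer(("127.0.0.1", 9000))  # port is a placeholder
        server.timeout = 0  # never block the game loop
        server.addMsgHandler("default", handle_melon)  # catch-all handler
        logic.globalDict["osc_server"] = server
    # Poll once per logic tic; handles at most one pending message.
    logic.globalDict["osc_server"].handle_request()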

We then rushed to finish the workshop by recording a self-satisfying video of François pushing blocks around using the Kinect -> Melon -> pyOSC -> BGE setup. The file is also available on our GitHub repo. (See bge_cube_osc.blend in the user/frankiezafe/ directory.)

The next meeting will be in a month or so and will be announced on the Constant and Blender Brussels Meetup websites. Feel free to join us.

Blender Brussels Workshop – 21/12/2012

So we had our second Blender Brussels workshop today. The first one happened about a month ago. This time, we spent the day at Constant Variable and talked about the basics of Python and its use within Blender. We worked on scripts to generate random meshes and then manipulate them in the game engine.
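In the spirit of what we wrote that day (a sketch, not the exact workshop file), generating a mesh from Python comes down to feeding random vertices and a few faces to bpy:

import random
import bpy

# Four random vertices wired into two triangles, linked into the scene.
verts = [(random.uniform(-1.0, 1.0),
          random.uniform(-1.0, 1.0),
          random.uniform(-1.0, 1.0)) for _ in range(4)]
faces = [(0, 1, 2), (0, 2, 3)]

mesh = bpy.data.meshes.new("random_mesh")
mesh.from_pydata(verts, [], faces)
mesh.update()

obj = bpy.data.objects.new("RandomMesh", mesh)
bpy.context.scene.objects.link(obj)  # 2.6x API (2.8+ uses collections)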

Here’s a short video of what the output might look like.

If you want to play with the file, download it here. You’re also welcome to join us for the next round, in January 2013. The exact date will be announced later.

Blender python friendly workshop

Frankie Zafe asked me to give him an introduction to Blender and Python. So we met at my apartment this Friday, and he had the great idea to bring two of his friends to this improvised workshop. We went through the basics of Python and Blender in the morning, then hacked on some scripts in Blender in the afternoon, with the intent of generating random meshes with the software. The picture above is the result of Frankie’s work. Go see some more here.

We’ve decided to repeat this experience next month (and maybe every month). But I wanted to open up the invitations. So if you’d like to join us and play around with Python code in the Blender environment, feel free to drop a mail or a comment and I’ll send you an invitation. (Oh, and we are Brussels-based, in case you didn’t know.)
