Highlights of the Blender Conference 2016

Here are three video recordings of presentations given at the latest Blender Conference that I think are worth watching.

MAD Entertainment Studio

I’ve never had the chance to attend the Blender Conference, but they have always provided live streaming and published recordings afterwards. The diversity and quality of the presentations invite me every time to go through them. There is a lot to discover in the field of 3D creation and, as with any open source software, people have used the tool in many different ways for their own purposes and research. So, after watching a few videos, here is my selection for this year.

Paul Melis uses Blender to demonstrate how path tracing works. Path tracing is a rendering method based on the physical properties of light, which lets it simulate realistic lighting of a scene. The Blender Foundation has developed a rendering engine called Cycles using this method. But how does it work? Paul Melis has modified Blender to literally show us how rays move around in 3D before being turned into colored pixels. Along the way, he shows us how and why certain things might influence the final render.
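
To make that a bit more concrete, here is a toy Python sketch of the idea − nothing from Paul Melis’s talk or from Cycles internals, just the bare Monte Carlo loop at the heart of a path tracer, with completely made-up scene parameters:

```python
import random

# A toy illustration of the path-tracing loop described above: follow one
# light path through a few random bounces and accumulate whatever light it
# encounters along the way.

MAX_BOUNCES = 4

def sample_path(emitted=2.0, albedo=0.7, hit_light_prob=0.3):
    """Estimate the light reaching the camera along one random path.

    'emitted', 'albedo' and 'hit_light_prob' are made-up scene parameters:
    a light's brightness, how reflective the surfaces are, and the chance
    that a bounce reaches the light.
    """
    throughput = 1.0            # fraction of light the path still carries
    radiance = 0.0
    for _ in range(MAX_BOUNCES):
        if random.random() < hit_light_prob:
            radiance += throughput * emitted  # the path reached a light
            break
        throughput *= albedo                  # a surface absorbed some light
    return radiance

# Averaging many random paths per pixel is what makes the image converge,
# and it is why Cycles renders start out noisy and clean up over time.
pixel_value = sum(sample_path() for _ in range(256)) / 256
print(pixel_value)
```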

For her PhD in media studies, Julia Velkova focuses on open-source animation film production. She does not use Blender herself, so this is not a technical presentation. Instead, she focuses on the community, economy and mechanics behind the production of free culture using free / libre software. Her presentation puts contemporary media production in perspective with the history of art and technology. She also raises good questions for the Blender community and for free / libre art and technology practitioners as a whole. I’m looking forward to the conclusion of her PhD in a couple of months.

MAD Entertainment is an Italian animation studio that has gradually switched to a full Blender production pipeline over the last couple of years. While this is more and more common in small studios across the world, Ivan Cappiello presents here the latest projects they have been working on and shares the methodology they use when you don’t have a big production budget but still want to make big animated feature films.

I’ve singled out their presentation because of the particularly poetic look and feel they have achieved in their work (the illustration at the top of this article is a still from one of their productions), but also for the Kinect-based motion capture they use to help animators quickly set up poses for secondary character animation. All very inspiring.

There are of course a lot more presentations to watch if the subject and the software are of any interest to you. So let me know which ones are your favorites from this year and why.

How to get the latest Blender running on a Pi

Arch Linux ARM - RPi2

How do you get the most up-to-date applications running on a Raspberry Pi and other ARM-based single-board computers?

Update 25 Oct 2016: I have written a small step-by-step tutorial to get you through the install process on a Raspberry Pi 2 and up. It’s available here.

Update 16 Feb 2016: Blender 2.76b is now available for the armv7h architecture from the package manager.

For a project I’m working on, I need a small computer that can just run some scripts 24/7 while being connected to the net. Performance is not a key issue, although it’s always nice to have a fast system. But in this case, since we’re trying to keep a low budget, a computer under $50 should do the trick. And that’s why we went for the Raspberry Pi 2.

While I was developing the project, I used the latest version of Blender (who doesn’t anyway?) and some other Python libs. Things got messy when I moved the whole project to the Raspberry Pi.

I’ve been using Arch Linux as my main system for a year and am really happy with it (thanks Chris). So naturally, I used Arch Linux ARM for the Pi. I had already used it for other projects, so I felt comfortable with it. For those who don’t know Arch yet, it’s a bleeding-edge rolling-release distribution. That means you always get the latest shit as soon as it’s available, and you don’t need to do big upgrades of your system every six months or two years. It also has a very technical and dedicated community that takes pride in making good documentation.

What I did not expect is that Blender was not available in the repositories for Arch ARM, although it’s of course available for the i686 and x86_64 architectures. So I started looking for a distribution that had Blender already packaged, which Raspbian has. (Raspbian is a modified Debian crafted for the Raspberry Pi and thus promoted by the Raspberry Pi Foundation as the go-to distribution for their hardware.)

But Raspbian, based on Jessie, only packages Blender 2.72, a version released in 2014. And that’s pretty far back in Blender spacetime. So my hand-made Blender scripts were suddenly buggier and not performing as well. Bummer. Since I’m kind of used to Debian systems, and since Debian also has a bleeding-edge rolling release, I thought, “No problem, I’ll just switch to stretch/testing and I’ll get Blender 2.76.” Well, that did not go too well on the Raspberry Pi. I’m not sure why. I guess Raspbian makes too many modifications to the Debian core, but after switching to testing, Blender was no longer available in the package list.

So back to square one. Where do I go from here? Some people online were saying Blender was not buildable on the ARM architecture. But I found packages for Blender 2.76 in the Fedora branch for ARM, and Blender is available for Raspbian. So what am I missing here? Then I stumbled on this post from Popolon, where he had managed to patch and compile Blender on the ARMv7 architecture using Arch (exactly what I needed for my Pi). He even provided a link to his build, but that was unfortunately too old to run on the current version of Arch.

But that’s where the power of Arch comes to the rescue. Arch is a system with a lot of pre-compiled packages, and for whatever is missing, there is the AUR (the Arch User Repository). What you get from the AUR is a set of scripts that helps you compile a specific application for your system. Of course, you could compile anything yourself on any Linux system, but since Arch already has the latest packages installed, compiling new ones is a little easier: you don’t really have to worry about having the right version of a library. It’s always going to be the latest one, which is usually the one needed for the application you’re trying to install.

With a slight modification of the PKGBUILD I found for Blender, I started the compilation on the Raspberry Pi 2. Six hours later, I had the latest Blender running. Super. I could move on with the project.

Now, I also sent feedback to the Arch Linux ARM community about this, and I have heard that Blender is in the pipeline to be added to the official repositories. That’s great news: it means next time I may not need to compile it, and others can benefit from it too. But if this story tells you only one thing, it’s to trust Arch Linux for running the latest software on an ARM-based computer. Even if something is not yet in the repositories, you’ll probably have a better chance of getting it running on that system than on any other.

Rendering WordWars on radiators

Qarnot Blender Interface

TL;DR: Testing the private beta Python API from Qarnot

About a month ago, while keeping an eye on the Libre Software Meeting (RMLL) in Beauvais and watching the news about WordWars spread on some blogs and magazines, I stumbled on a video presentation about open source tools for animators and Blender 3D rendering using Qarnot (in French). And I was struck.

Qarnot Computing is a high-performance computing service provider, which means they can do a lot of complex calculations, super fast, using a large number of computers. This kind of service, of course, is not new. 3D content creators, for example, often use such external services, also called “render farms”. Complex scenes, long movies and highly detailed realistic simulations require a lot of computing power, and by outsourcing that work to a large pool of processors, a rendering can be done in a fraction of the time it would take on a single machine.

Where Qarnot stands out from its competitors is that instead of spending a lot of money on cooling and infrastructure, they decided to use the excess heat produced by those computers to heat private houses and buildings. So instead of one giant warehouse with thousands of computers that needs to be constantly refrigerated, they have installed a network of “radiators”, made of the same computers, in hundreds of private interiors. No extra energy is required to cool down the computers, since the heat is used to make a home comfortable, and there are no horrible windowless farms either: the infrastructure is distributed across a city. Using this approach, they claim to have reduced the carbon footprint of such services by 78%. And I could not be happier to test this on my own projects.

The next cool thing about Qarnot is that they provide rendering services for Blender. The Blender Foundation uses them for their latest open movie projects. They offer some free credits to test their services. And they are cheap too. All good reasons to give them a try.

My WordWars project uses Blender to render the daily war news from The New York Times into a movie that looks like the intro scene from Star Wars. So every day, a new Blender file is generated and rendered. On my home machine, it takes about an hour and a half to render, but on my dedicated server, it takes up to 8 hours. And I’m running this project from the server because I want to keep my home computer for other things, and it’s also not constantly up or connected. The server, well, is up 24/7. So I decided to contact Qarnot.

I needed to get in touch with them because, so far, the only way to launch a rendering on their system is via a web interface. That’s really practical for a one-time project, but a file that automatically updates every day needs a different approach; I do want to keep things automated. So I asked if they had an API, any way to interact with the render farm via scripts. In response, I was happy to get invited to test their Python API, still in private beta.

They provided me with the necessary Python files, documentation, and a small but feature-complete example that I could start with. It took me a while to figure out how to operate it correctly, though, maybe because I had never interacted with a render farm before, or maybe because I got a bit confused by some of the parameters and the limitations of the private beta.

One thing I found confusing was the ‘framecount’ parameter. ‘advanced_range’ indicates which part of the movie you want to render, but I did not understand, at first, how it related (or not) to ‘framecount’. ‘framecount’ is also named ‘frame_nbr’ in another function (‘create_task’), and that also puzzled me for a while as to what this variable really represented. After testing different approaches, I came to understand ‘framecount’ as the maximum number of frames you want to process at the same time (the whole point of speeding up the rendering process). I say maximum because it “feels” different depending on the amount of processing power available for the task. In the end, the whole range of frames you asked for will be rendered; it might just take more time.
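
For reference, here is roughly how a task ends up being launched. The parameter names (‘create_task’, ‘frame_nbr’, ‘advanced_range’) come from the beta documentation, but every call below should be read as illustrative: the module name, method signatures and workflow may well change for the public release.

```python
import qarnot  # the Python files provided for the private beta

conn = qarnot.Connection(api_token="MY_TOKEN")  # credentials elided

# One task renders a 300-frame portion of the daily WordWars clip.
# 'frame_nbr' is the *maximum* number of frames processed in parallel;
# with less processing power available, the same range just takes longer.
task = conn.create_task("wordwars-daily", profile="blender", frame_nbr=300)
task.add_resource("wordwars.blend")   # the scene to render
task.advanced_range = "1-300"         # which part of the movie
task.submit()
task.wait()                           # block until the frames are done
task.download_results("renders/")
```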

As this is still a private beta, I only got access to some of their processing power and was limited in storage space for the resulting renderings (limitations you normally don’t have). So in the end, I could only run portions of my project − 300 frames per task − and each would take around 10 minutes to process (rendering the whole clip would take a little more than an hour using this method). I’ll keep all this as tests for now and wait for the public release, which they should announce by the end of this year. Testing the same render (at full power) using their web interface, the whole clip was rendered in less than 10 minutes, and the cost of the service would be around 0.75€ for that particular case.

So using Qarnot, I could be delivering a WordWars video clip 10 minutes after the news comes out, at an affordable cost of less than a euro, and without feeling guilty about double-heating the planet.

I’m also guessing that, since the Python API they provided runs under Python 3.4, we can expect a Qarnot plugin that integrates directly into the Blender interface. And if they are not the ones to create it, someone will do it for sure.

I want to thank them for letting me test this and for the patience with which they responded to my many questions. I also wish them success, as this is truly an inspiring way to run a computing company.

Making of “Word Wars – News from the Empire”

Sharing here the thought and tool process that led me to create the project called “Word Wars – News from the Empire”.

Word Wars Blender Scene

I’ve been playing around for a while with Blender scripting and even organized monthly workshops about it to share the experience with other artists in a group called “Blender-Brussels”. Since the beginning of these workshop sessions, my goal has been to turn one of those one-day projects into a daily video generation tool.

Then last month, while preparing the class I gave to a couple of artists in New York City, I started writing a small example script that would grab some text from the web and turn it into a 3D object inside a Blender scene. And while playing around with the script, the idea came to me to turn this into a very resource-hungry news reader. Basically, from then on, the rest followed.

As a Star Wars fan, I’ve always been puzzled by the countless memes and reinterpretations it has generated. It somehow reflects how Hollywood culture can really take over our imagination and even become the mythological stories of our Western society. But it also portrays Hollywood’s fascination with war stories, an important part of U.S. culture in general. I can’t think of any other Western country where the war hero is so present in politics and everyday life. Then again, when you know that the U.S. has been almost constantly at war since its creation, this comes as no surprise.

So it became clear to me that I wanted to address these subjects in a simple and buzzworthy manner. Following the path of the YES MEN, I chose The New York Times as my only source of war news.

Practically, the whole project consists of a prepared Blender scene with a starry night (I’ll come back to that later), some placeholders for the texts, a modified Star Wars logo and the “Main Title Theme” music by John Williams.

Then I have two Python scripts. The first fetches the RSS feeds from the NYT and filters the news, searching for war-related keywords. If at least one article is found, it adds the text to the scene and adjusts the animation keyframes to fit the music (because I always want the text to start at a certain time and vanish into the infinite emptiness at another particular time). The first script finishes by rendering the clip. The second script takes care of uploading to YouTube, adding the title and filling in the description.
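
Here is a condensed sketch of the first script’s fetch-and-filter step. The feed URL and the keyword list are illustrative; the real scripts are in the repository linked at the end of this post.

```python
import feedparser  # pip install feedparser

WAR_KEYWORDS = ("war", "troops", "airstrike", "militants", "rebels")

feed = feedparser.parse("http://rss.nytimes.com/services/xml/rss/nyt/World.xml")

# Keep only the entries whose title or summary mentions a war-related word.
war_news = [
    entry for entry in feed.entries
    if any(word in (entry.title + " " + entry.get("summary", "")).lower()
           for word in WAR_KEYWORDS)
]

for entry in war_news:
    print(entry.title)  # in the real script, this text becomes the 3D crawl
```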

To get the feeling right, I carefully studied the original intro from the first Star Wars movie. The dedicated Wikipedia page also helped me figure out some things, but in the end, I took some creative liberties that maybe only a hardcore Star Wars fan would notice.

The original font used by Lucas is “News Gothic” by American type designer Morris Fuller Benton. But since I’m a big fan of open source fonts, I preferred to use “News Cycle” by Nathan Willis, also an American author and − full disclosure − a good friend. The font is similar to News Gothic and fit the job perfectly. Using it, I was also happy to promote his excellent work in the open font world.

For the logo, I downloaded the SVG version from Wikipedia and searched through amateur Star Wars fonts for the missing characters (O & D). In the end, I found myself redrawing almost all of it, point by point, in Inkscape, until I reached the desired look.

Word_Wars_Logo

For the music score, it was quickly out of the question to try to find a replacement for the original score by John Williams. First, it’s so iconic − the music is a meme in itself − that it would be pointless to look for a remix or a different version of it. I really wanted to stay as close as possible to the real feel of a Star Wars intro, and, well, you can’t do without John and the brass of the London Symphony Orchestra. Second, if you worry about copyright issues, there are two arguments that made me stick with this choice. One is that, since I was uploading to YouTube, I knew they would let me use the music and would certainly pay the necessary royalties to John for me. The other is that, if anyone still complained, I could certainly make a case, given all the clips and remixes you can find online, that the song could be considered public domain. (I know that last argument is a bit far-fetched, but there should be room for a case like this in copyright law.)

Then, for the star field background, I wanted to pay my respects to those fans scrutinizing every official Star Wars trailer looking for a detail or a key that would unlock a piece of the plot or a new character from the coming movie. So I searched for the real star field that you can see from Earth and found it on Paul Bourke’s page, luckily in very high resolution. Since there was no license mentioned on the page, I contacted him by email. Here’s his response:

No license … go wild.
Acknowledgements welcome.

I could not be happier. Adding this little detail − which, I guess, until now I was the only one able to see − really tied the whole project together for me. It’s subliminal to the viewer, but they are watching those flying, vanishing news items from Earth.

After all this, I polished the scripts a little, moved it all to a small dedicated server, cried a little when I saw the difference in rendering times between the server and my desktop, reworked the scene to pre-render the parts that never change, gained a couple of hours, then patiently waited 20 days (for 20 videos to be generated) before releasing it to the public.

For those interested, you can download the project files from this repository. Feel free to use and modify them, as this project is released under a Free Art License, and let me know if you make anything of it.

And while I was writing this post, YouTube announced the latest video, “Episode XXVII”:

Blender shortcuts right in your search engine

Blender Cheat Sheet

Two months ago, I attended a community meetup in NY called Quack & Hack. Its aim was to gather people to code improvements for the DuckDuckGo search engine.

If you don’t know DuckDuckGo (DDG) by now, it’s the “search engine that does not track you”. And it’s been my tool of choice when I want to search for something on the web. It has nice features like the !bang mechanism and instant answers, which I find really handy. But most importantly, I feel good not being followed by an all-seeing eye like You-Know-Who.

Another great thing about DDG is their program called DuckDuckHack, where they invite coders to submit improvements to the search engine. If approved, these end up, for example, as new instant answers. I personally like the weather one, the password generator and the GIMP cheat sheet. But there are many, many more − it’s impossible to know them all.

So, as a user and fan of Blender, I thought it would be nice to have the crazy number of shortcuts for that open source 3D software directly available in my search engine of choice. It would not only be helpful to me, but maybe also to the larger Blender community. And in the end, it might also make some of those Blenderheads care a bit more about not being tracked when they avidly search for the latest hot features of Blender.

In the end, the process of getting those shortcuts live on DuckDuckGo took longer than I expected. But as of yesterday, it’s live. So try it out: search for “Blender cheat sheet” and tell me what you think.

You can take a peek at all the effort it took by looking at the conversation I had with the DuckDuckGo team on GitHub. What mainly happened is that I first submitted the “Blender Cheat Sheet” the classic way for instant answers, while they were working on a new, simpler system just for submitting cheat sheets. So I had to port my contribution to the new system and adapt the code until their new infrastructure was stable and ready to ship.

But I’m glad to have gone through all this. It was an interesting experience to work with the DuckDuckGo team, a nice and friendly crowd. It also feels rewarding to know that my little contribution might reach thousands of users. And I’ll be glad to have that feature in my toolbox when I give the next Blender workshop here in NY at the end of June.

Blender Cheat Sheet all

Writing a 200-page manual in 5 days

Libérathon BGE

That’s the experience I had at the beginning of August, with 13 other people, locked up in the new premises of F/LAT, in Brussels. In the jargon, this is called a book sprint (or a “libérathon”). In short, the goal was to write a complete manual, in French, on the use of a free software application − in this case, Blender.

And I must say it was an intense and enriching experience. I did not believe it was possible to write, so fast and with so many hands, a book that − without wanting to pat ourselves on the back too much − seems to me a good working base and a good support for anyone who would like to teach or learn how to make video games with Blender.

The beauty of the project is that this manual is published entirely under a free license. So it can effectively be used by anyone, in any context. It can even be published and sold by any publisher. But above all, it can keep being improved by future readers or enthusiasts, since all the sources and tools necessary for editing it are available online. This also gives it, a priori, a better chance of longevity, as software (and Blender in particular) evolves constantly.

This project is an initiative of the French-speaking branch of Flossmanuals, a non-profit association (loi 1901) dedicated to the creation and distribution of French-language manuals for free software. Élisa de Castro, as mistress of ceremonies, had assembled a diverse team of 3D artists, illustrators, coders and other creators from France and Belgium, which she steered with a light touch throughout the project. Imagine 14 people around the table, each with a different knowledge of the software and a not-always-precise idea of how to teach it: that can drag a group into endless discussions. Except we only had 5 days.

intro_un-jeu-utilisant-le-game-engine

And as if the bar was not already high enough, the group quickly agreed that, since we were writing a manual about making video games, we should create a video game at the same time − at least to have some nice screenshots to put in the book, and example files a little more interesting than playing with cubes and triangles. That game is also published under a free license, and its development continues on GitHub. An official release is planned for the first print publication of the manual.

As with any intense experience driven by a team with a precise goal, what you remember most is meeting the different personalities involved. I won’t name them all here (read the book’s credits if you’re interested), but I do hope to cross paths with some of them again at a festival or, who knows, at a future book sprint. And if the adventure of learning to create a video game with Blender tempts you, I am quite willing to organize a workshop on the topic at F/LAT or at the educational institution of your choice.

Blender-Brussels “lookback” workshop

blender-lookback

Last Saturday, we had our monthly Blender BPY/BGE workshop, where we explore creative coding through the use of Python in the context of Blender. This time, the workshop was called “Blender lookback”, mocking the famous “Facebook lookback”. The goal was to explore the Blender video sequencer to automate the creation of video clips from a folder containing movies and images.

The idea was suggested by Sophie from Radio Panik. She has a weekly program and would like to illustrate it with a generated GIF animation. This could be extended to any movie based on an RSS feed, for example. But for this workshop, we had a background video and a sequence of drawings representing an animation, and we had to composite it all together into a neat GIF file. The animation and the video would change every week, but the process would stay the same.

The script is, as always, available from our GitHub repository for you to study. It places the video and the images from the anim/ directory on two separate video tracks, changes the contrast, adds alpha and colorizes the animation, then exports the corresponding number of frames. ImageMagick is then used to convert all these into a neat GIF.
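
For the curious, here is a trimmed-down sketch of the sequencer part. File names are illustrative and the contrast/colorize tweaks are left out; the real script is in the repository.

```python
import os
import bpy

scene = bpy.context.scene
seq = scene.sequence_editor_create()

# Channel 1: the background video.
seq.sequences.new_movie("background", "//background.mp4",
                        channel=1, frame_start=1)

# Channel 2: the drawings from anim/, one frame per image,
# composited over the video.
frames = sorted(os.listdir("anim"))
for i, name in enumerate(frames):
    strip = seq.sequences.new_image("anim_%03d" % i,
                                    os.path.join("anim", name),
                                    channel=2, frame_start=1 + i)
    strip.frame_final_duration = 1
    strip.blend_type = 'ALPHA_OVER'

scene.frame_end = len(frames)
scene.render.filepath = "//frames/"
bpy.ops.render.render(animation=True)  # ImageMagick then makes the GIF
```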

The Blender-Brussels BPY/BGE sessions are a monthly series of free workshops organised by Julien Deswaef and François Zajéga. They are open to everyone interested in playful scripting with Python and Blender. Announcements are usually made on the Constant vzw newsletter and the Belgian Blender User Group (BBUG) forum.

Blender-Brussels november report

It’s been more than a year since the first free *cough* monthly *cough* Blender workshop organized by François Zajega and me. And since I committed myself to taking notes during the last one, here’s a quick report of what was discussed and achieved.

Morning time is for coffee and “show and tell”. Since François and David had both attended the #BConf for the first time, we took the opportunity to get their feedback on this major event in the Blender community. Here’s what they had to say.

With an overall positive feeling about the conference, they pinpointed the fact that it was about a single piece of software, without any discussion about the operating system it runs on, and that it focused on finding solutions, methodology and good practices, and on how everyone runs Blender to achieve their goals. They were also surprised by the variety of profiles attending the event: from university researchers to 3D professionals, amateurs and even sales representatives, although the main crowd revolved around education. And as with any free software conference, feeling part of a community is a great motivation boost.

On the downside, they were a bit disappointed by the uneven quality of the different panels, wondering if there had been any screening at all. They were also a bit skeptical about the location of the event. More rooms, each focused on a topic, are on their wish list.

To end the conversation, I asked them which presentation they would pick if they could remember only one:

François chose Shitego Maeda’s “Generative Modelling Project”.

shitego-GMP

And David: Helmut Satzger’s “How to render a Blender movie on a supercomputer”

After that, and for the rest of the day, we jumped into some Blender manipulation. For those who don’t know about our workshop yet, we focus on Blender from a coding/scripting perspective. This time, we wanted to explore Blender from the command line, with the goal of finding a way to extract useful elements out of a .blend file (without opening the Blender interface).

$: man blender is your friend, of course.

$: blender --background --python myscript.py is the way to start a chrome-less Blender with some custom Python script filled with bpy API goodies.

Still, it took us about two frustrating hours to figure out what is clearly written at the end of the manual: “Arguments are executed in the order they are given.” Which means you have to pass your .blend file BEFORE the Python script that will act upon it.
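
So, for the record, the working invocation looks like this (file names are just examples):

$: blender --background myfile.blend --python myscript.py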

After that, it was just a matter of reading the API docs to come up with a ~40-line script that does the extraction trick well enough to call it a successful workshop.

You can download it, test it and file issues if you find any on our code repository. The script will extract, into separate folders, any text, Python script, image and/or mesh (as .obj and .stl files) from any packed .blend file you supply. Put it to good use.
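
To give you an idea, here is a shortened sketch of the extraction logic for texts and images (the full ~40-line script in the repository also handles meshes as .obj/.stl; paths here are illustrative):

```python
import os
import bpy

os.makedirs("texts", exist_ok=True)
os.makedirs("images", exist_ok=True)

# Dump every text datablock (including embedded Python scripts).
for text in bpy.data.texts:
    with open(os.path.join("texts", text.name), "w") as f:
        f.write(text.as_string())

# Save out every packed image.
for image in bpy.data.images:
    if image.packed_file:
        image.filepath_raw = os.path.join("images", image.name + ".png")
        image.file_format = 'PNG'
        image.save()
```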

If you liked this, share it and come to our next workshop. Announcements will be made on the BBUG.

#Loop 5 @ Recyclart

loop-5-recyclart

Tomorrow evening (Thursday, June 6, 2013),

a series of ultra-short presentations on experimental projects developed at the ‘Variable‘ studio in Schaerbeek. Come discover quality creations, their artists or designers, and their technology.

I will be presenting uHbench and the Blender bpy/bge workshops organized (more or less once a month) with François.

It starts at 8:30 pm, at Recyclart.

Blender bpy/bge Brussels Workshop 2013-03-29: zoom

Last Friday was our monthly Blender workshop, and the day before, my attention was caught by this post on Reddit. Rorts mentioned the intriguing video “Zooms From Nowhere” by Chris Timms and wondered how to do something like that in Blender. I thought this might be a good challenge for the Blender bpy/bge workshop, since it seemed to involve Python scripting.

Well, it was easier than we thought, since there is already a script (add-on) called “Import Images as Planes” available in Blender. We then had to figure out how the particle system worked (this is where you appreciate working on the same project with more than one brain).
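
For the record, here is roughly how the add-on can be driven from a script. File names are illustrative, and the operator names are those from the Blender version we were using at the time; the full workshop code is in the repo linked below.

```python
import bpy

# The add-on ships with Blender but is disabled by default.
bpy.ops.wm.addon_enable(module="io_import_images_as_planes")

# Import a batch of scraped images, each as a textured plane.
bpy.ops.import_image.to_plane(files=[{"name": "image01.jpg"},
                                     {"name": "image02.jpg"}],
                              directory="/tmp/scraped/")
```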

Python scripting came later in the afternoon, to scrape images from the internetz and feed the particle generator with “interesting” content. (See the googlesucker01.py file, hacked in no time by Frankie Zafe.)

The final render was baked over the weekend.

All the project files are available on our GitHub repo, with the necessary explanations to help you make your own “zooms into internetz culture”. If you make any, please send us a link. We’d love to watch.

zoom-scr-01