Highlights of the Blender Conference 2016

Here are three video recordings of presentations done at the latest Blender Conference that I think are worth watching.

MAD Entertainment Studio

I’ve never had the chance to attend the Blender Conference, but they have always provided live streaming and published the recordings afterwards. The diversity and quality of the presentations invite me every time to go through them. There is a lot to discover in the field of 3D creation and, as with any open source software, people have used the tool in many different ways for the purposes of their own research. So, after watching a few videos, here is my selection for this year.

Paul Melis uses Blender to demonstrate how path tracing works. Path tracing is a rendering method based on the physical properties of light, and it thus simulates realistic lighting of a scene. The Blender Foundation has developed a rendering engine called Cycles that uses this method. But how does it work? Paul Melis has modified Blender to literally show us how the rays move around in 3D before being turned into colored pixels. Along the way, he shows us how and why certain things might influence the final render.
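To give a rough idea of what those rays are doing, here is a deliberately tiny sketch of the core path tracing loop in Python. This is not how Cycles is implemented; it only illustrates the principle: shoot many random light paths through a pixel, bounce them around the scene, and average what they carry back. The single sphere, the uniform sky light and every constant are made up for the example, and the bounce weighting is simplified.

    import math
    import random

    # A made-up scene: one diffuse sphere floating under a uniform "sky" light.
    SPHERE_CENTER = (0.0, 0.0, -3.0)
    SPHERE_RADIUS = 1.0
    ALBEDO = 0.7            # fraction of incoming light the surface reflects
    SKY_RADIANCE = 1.0      # light arriving from any direction that misses the sphere
    MAX_BOUNCES = 4
    SAMPLES_PER_PIXEL = 256

    def sub(a, b): return (a[0] - b[0], a[1] - b[1], a[2] - b[2])
    def add(a, b): return (a[0] + b[0], a[1] + b[1], a[2] + b[2])
    def scale(a, s): return (a[0] * s, a[1] * s, a[2] * s)
    def dot(a, b): return a[0] * b[0] + a[1] * b[1] + a[2] * b[2]
    def normalize(a): return scale(a, 1.0 / math.sqrt(dot(a, a)))

    def hit_sphere(origin, direction):
        """Distance along the ray to the sphere, or None if the ray misses it."""
        oc = sub(origin, SPHERE_CENTER)
        b = dot(oc, direction)
        disc = b * b - (dot(oc, oc) - SPHERE_RADIUS ** 2)
        if disc < 0:
            return None
        t = -b - math.sqrt(disc)
        return t if t > 1e-4 else None

    def random_bounce(normal):
        """Pick a random direction on the hemisphere around the surface normal."""
        while True:
            d = (random.uniform(-1, 1), random.uniform(-1, 1), random.uniform(-1, 1))
            if 0 < dot(d, d) <= 1.0:
                d = normalize(d)
                return d if dot(d, normal) > 0 else scale(d, -1.0)

    def trace(origin, direction):
        """Follow one light path and return the amount of light it carries back."""
        throughput = 1.0
        for _ in range(MAX_BOUNCES):
            t = hit_sphere(origin, direction)
            if t is None:
                return throughput * SKY_RADIANCE   # the ray escaped and sees the sky
            hit_point = add(origin, scale(direction, t))
            normal = normalize(sub(hit_point, SPHERE_CENTER))
            throughput *= ALBEDO                   # the surface absorbs part of the light
            origin, direction = hit_point, random_bounce(normal)
        return 0.0                                 # too many bounces: abandon this path

    # One pixel of the final image is simply the average over many random paths.
    camera = (0.0, 0.0, 0.0)
    ray_through_pixel = normalize((0.0, 0.0, -1.0))
    samples = [trace(camera, ray_through_pixel) for _ in range(SAMPLES_PER_PIXEL)]
    print("pixel brightness:", sum(samples) / SAMPLES_PER_PIXEL)

The more paths you average per pixel, the less noisy the result gets, which is also why adding samples in a path tracer cleans up the render.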

For her PhD in media studies, Julia Velkova is focusing on open-source animation film production. She does not use Blender herself, so this is not a technical presentation. Instead, she focuses on the community, the economy and the mechanics behind the production of free culture made with free/libre software. Her presentation puts contemporary media production in perspective with the history of art and technology. She also raises good questions for the Blender community and for free/libre art and technology practitioners as a whole. I’ll be looking forward to the conclusion of her PhD in a couple of months.

MAD Entertainment Animation is an Italian animation studio that has gradually switched to a full Blender production pipeline over the last couple of years. While this is becoming more and more common in small studios across the world, Ivan Cappiello presents here the latest projects they have been working on and shares the methodology they use when you don’t have a big production budget but still want to make big feature animation films.

I’ve singled out their presentation because of the particularly poetic look and feel they have achieved in their work (the illustration at the top of this article is a still from one of their productions), but also for the Kinect-based motion capture they use to help animators quickly set up poses for secondary character animation. All very inspiring.

There are, of course, a lot more presentations to watch if the subject and the software are of any interest to you. So let me know which ones are your favorites from this year and why.

A patch for the Github centralization dilemma

Github 404

Github, with its 75,000,000 repositories, has become a central place for open source development and is well known for having popularized Git among programmers and other code-hungry fellas. The irony is not lost on anyone that we are once again relying on a centralized service for our decentralized Git workflow. And with any centralization comes the risk of putting too much power in the hands of just a few.

Of course, a central service such as Github has its benefits. We all know where to search for code. We all know, more or less, how the service works, and we can jump more quickly from one project to another. Third parties can even build upon this resource and push things in new directions, maybe attracting early adopters faster.

But… Centralized services can turn against you. They can censor and be censored. They can also disappear. Maybe Github will not disappear soon, but a user on Github could decide to delete all their repositories, and there would not be much you could do about it. You don’t think that has happened? Check RGBDToolkit or Gravit, for example. (You’ll have to put those names into your preferred search engine to verify that I’m not bullshitting you and that these projects did exist on Github at some point.)

So, in order to restore balance in the Force, I’ve decided to adopt a few habits that I want to share with you. They are not going to solve the centralization problem, but they can maybe provide some safeguards against the major risk described in the previous paragraph. These tricks apply to projects you have not created. For your own projects, it’s up to you to decide where you want to host them.

The solution I’m using is based on the mirror feature from Gitlab. Gitlab is an open source clone of Github: it provides the same functionality, but you can install it on your own server, and many groups are running public instances across the web. Gitlab, the company, develops the software and also offers hosting of public and private repositories at Gitlab.com.

So now, every time I find a nice open source project on Github, especially one with few stars, forks or developers, I create a mirror of it in a public Gitlab repository. The advantage over a simple git clone on my machine or elsewhere is that I’m not just creating a copy of the project at a certain point in time. The mirror feature keeps watching the original project and pulls in all the changes that happen after I created the mirror. So I’m confident that, whatever happens to the original repository, all the history and changes will be saved elsewhere.
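As a side note, if your Gitlab instance doesn’t offer the mirror feature, or if you want an extra copy on a machine you control, the same idea can be approximated with plain Git. Below is a minimal sketch in Python; the two repository URLs are invented for the example, and the update part is meant to be run on a schedule (cron or similar). Gitlab’s mirror feature essentially does this for you on its own schedule.

    import os
    import subprocess

    # Invented URLs for the example: the upstream project and where the backup lives.
    SOURCE = "https://github.com/someuser/some-project.git"
    BACKUP = "git@gitlab.com:myaccount/some-project.git"
    LOCAL = "some-project.git"

    def git(*args):
        """Run a git command and stop at the first error."""
        subprocess.run(["git", *args], check=True)

    # First run: take a bare copy with every branch, tag and ref included.
    if not os.path.isdir(LOCAL):
        git("clone", "--mirror", SOURCE, LOCAL)

    # Every run after that (cron, systemd timer, ...): fetch whatever changed
    # upstream and push it all to the backup location.
    git("-C", LOCAL, "remote", "update", "--prune")
    git("-C", LOCAL, "push", "--mirror", BACKUP)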

Because those repositories are just backups, I also disable issue tracking, wikis and any other unnecessary feature that could mislead visitors. The point is not to divert development. Each mirror also clearly states that it is a mirror and links back to the canonical repository.

So next time, instead of starring a project you like, mirror it. You’ll do everyone a favor. The ones I keep are here. But feel free to choose any other hosting service elsewhere. Let’s keep things distributed.

Git versioning and diff visualizing tools for designers

Git for Designers (1st slide)

Here is the video of my presentation at the Libre Graphics Meeting 2016 in London. For the most part, I present my quest for a Git-based visualization tool that could help designers integrate a version control workflow.

The slides are viewable from here. You can also download them from this Gitlab repo.

If you find this video interesting, or find it lacking in-depth information about the subject, please have a look at these detailed blog posts:

  1. Collaborative tools for designers – Part 1
  2. Collaborative tools for designers – Part 2 : Dropbox
  3. Collaborative tools for designers – Part 3: Pixelapse
  4. Collaborative tools for designers – Part 4
  5. Collaborative tools for designers – Part 5: Adobe Creative Cloud
  6. Collaborative tools for designers – Part 6: the Githosters
  7. Github, why u show no more media files

 

“Pointillism”, live coding at #PdCon16~

IOhannes M Zmölnig is an active member of the Pure Data development scene. So it’s no surprise that he attended the Pure Data Conference that took place last week in NYC.

Pointillism IOhannes PdCon16

Pure Data (Pd) is a visual programming language […] for creating interactive computer music and multimedia works.
Wikipedia

You also might have heard of Pd as an alternative to Max/MSP or VVVV.

The conference brought a panel of enthusiasts from all over the world to discuss the development and future of the software. I was especially pleased to hear Mark Edward Grimm‘s experience teaching Pure Data as a multimedia creation tool to some college students here in the US. (See Mark Edward Grimm’s website for more info.)

At night, the same people gathered at the Shapeshifter Lab in Brooklyn to enjoy live experimental music from some of the participants. And this is where IOhannes blew my mind, Friday night, with a performance he calls Pointillism.

There is a recording of a previous performance from 4 years ago that you can watch online. But seeing it live, not knowing what to expect, with a big projection over IOhannes’ shoulder, was a totally different game. Also, in those 4 years, IOhannes has performed it multiple times and has refined the setup and process with each iteration. (The video here does not do justice to the performance.)

To explain what is going on: IOhannes uses Pure Data to create a musical instrument. He does this by adding boxes with distinct functions and linking them together. Everything is done live, in front of the audience, and we can all see what he is doing on a screen mirroring his own computer screen. Nothing is hidden.

So far, nothing new here. This is often referred to as live coding.

The trick IOhannes plays here is that he writes all the boxes from memory and uses a braille font to display their names. This means that nobody in the audience, and barely IOhannes himself, can read what is going on. Nobody reads braille off a screen anyway. Mistakes are hardly permitted, because it would be difficult to find where they happened. The music itself fiddles around a theme inspired by Morse code (IOhannes told me afterwards that the music is actually a reading of the dots of each box). Sooner or later, the musical and graphical composition becomes a giant knot of boxes, dots and lines moving in erratic ways. But it all ends beautifully in a rapid deconstruction and closes on a black screen.

Needless to say, the performer was greeted with warm applause and had to come back on stage as the crowd would not stop. I’ve rarely seen such joy and amazement in the eyes of an audience at events like this.

IOhannes respects all the rules of the genre, but with a twist that makes it accessible to people outside of the community. Pointillism is clever, brilliantly executed and a pleasure to watch. I could not recommend it more to any tech festival looking for an original performance, and I do hope you’ll be able to experience it live some day.

#IOhannes right side up (a photo posted by Sofy Yuditskaya, @horus_vacui)

A conversation with my banker

Today, I called my bank about a problem with my card. Once the problem was solved, I got, as always, the little promo speech about the latest product that might interest me. It went roughly like this:

− Would you be interested in installing the EasyBanking app for smartphone and tablet?

− No thanks, I’m not interested.

− You know you can make all your transactions through the app, in a completely secure way.

− Thanks, but I trust my computer more than my phone with the security of my data. A phone gets lost easily.

− The app is protected by a password, and if you close the app, you need the password to open it again.

− Thanks, but I’m not interested.

− Are you sure you don’t want to try it? It’s very convenient.

(Since she insists, I figure it might be interesting to pivot the conversation.)

− Can you tell me which permissions the app asks for when it is installed?

− The app only allows transfers between your own accounts or to accounts that have been saved beforehand. So it’s not possible to make transactions to unknown accounts.

− No, I was asking what rights the app has over the data on my phone. I’m on Android here, and when I install a program, it asks for access to certain features…

(Meanwhile, I take the opportunity to look it up on the Google Play store and continue.)

− I see that the app asks for access to GPS coordinates, access to photos, media and files, the ability to make phone calls, and access to the phone’s camera. Do you find it normal that the app asks for all these permissions?

− Ah, you’ve already installed the app?

− No, I got this information from Google Play, and above all I’m asking whether you find it necessary for the bank to know where you are at all times and to be able to access your photos?

− Well, it’s like with the Facebook app.

− Maybe, but why would the bank need this information?

− I have no idea.

− Doesn’t it bother you that the bank could turn on your phone’s camera without you knowing?

(She hesitates.)

− I understand. So you’re not interested in trying the app?

− For all these reasons, no thanks.

Make sharing bookmarks delicious again

Del.icio.us shaarli theme

One of the pioneers of Web 2.0 was the amazing del.icio.us website. It made sharing bookmarks an incredibly rich and fun experience. With its catchy domain name and well-conceived minimalist interface, it attracted a horde of web enthusiasts ready to share their best links with the rest of the world.

Although Flickr is often mentioned as the inventor of tagging, del.icio.us ingeniously applied folksonomy to bookmarking, making it the first social network that could compete with search engines in terms of content discovery. And yes, we’re talking about a time when Facebook was not even born, baby.

As you can tell, 12 years later, I’m still excited by the possibilities and the experience del.icio.us offered. Unfortunately, the rest of the story is a sad, slow descent into crappy interface design choices and successive owner changes. Although it was fun for a while, by 2011 I was actively looking for an alternative: one that I knew for sure was not going to be sold, and one whose development I could keep some control over.

Luckily, when you’re angry at something on the web, there is a good chance other people are too. And, with luck, something good comes out of it. I found my angry creative man ruling his own corner of the web under the name Seb Sauvage. Seb was angry at del.icio.us, StumbleUpon, Diigo and all the social bookmarking clones he had tried. So he coded his own, the way he always crafts his tools, in a Keep-It-Simple-and-Secure manner. Then he released it to the world.

Shaarli, as in “share links”, is an open source bookmarking application written in PHP that lets you keep, tag, organize and share your collections of bookmarks without hassle. It also imports from del.icio.us, so you can switch services easily: all you have to do is export your bookmarks file and import it into your Shaarli.

Why am I bringing this subject up today?

Because, since the beginning of this year, del.icio.us has been actively pushing advertising without anybody realizing it. See this Twitter search (and screenshot), where hundreds of people are sharing the same link to sponsored content. Yes, people linked their del.icio.us accounts with Twitter and forgot about it.

del.icio.us spamming twitter

I had kept my old account alive as a trace of the past. But seeing that it was now being used to promote products under my name, I went and put an end to it, and decided to inform others about it.

Ricardo, from Manufactura Independente, picked up the info and moved his account to his own instance of Shaarli. Then we chatted with a few other designers about how the old delicious user experience needed to be revived. And Ricardo spent last Sunday making our wish come true: a delicious theme for Shaarli, like it’s 2004 all over again.

Shaarli is an amazing project, now supported by a community of developers on Github. New features and improvements are added regularly. Although it does not have the same convenience as competitors in terms of social functionality, it does provide an RSS feed, so you can subscribe to your friends’ valuable links or connect it to IFTTT or any service that supports it.

The web still needs to be organized, and more than ever it needs to be done in the open. Notes and bookmarks are valuable information, not just for oneself but for everyone. Let’s just not make this a profitable business for one company by keeping it behind walled gardens. It does not have to be complex. The web from 2004 still works great today.

So here are my bookmarks, free for all, since 2004.

Le Soir édité

Logo @lesoir_diff
L̶e̶ ̶S̶o̶i̶r̶ édité (@lesoir_diff) is a Twitter bot that tries to capture the changes and corrections made to articles published on the front page of the Le Soir newspaper’s website.

We all know it: these days, information moves faster than the time we have to read it. Newsrooms have become fully computerized and connected, from writing to publishing. That obviously makes many things possible: offering an article to readers as soon as it is written, but also correcting or expanding it after it has already been published. Sometimes articles are even deleted entirely, as the RTBF spotted with this hoax about a teenager who had supposedly sued her parents over photos published on Facebook.

What has always interested me about digitized information is the possibility of rewriting. Publishing a piece of information at a given moment and then slightly, or completely, modifying it afterwards is so simple and, at the same time, not at all trivial. For instance, browsing the “history” tab of a Wikipedia article can, in some cases, teach us much more than the article itself. On some online articles, you sometimes see “update” notes, often at the top or bottom of the page, dated and accompanied by a comment explaining how the article was edited.

It is this interest in manipulation, shall we say, that led me to discover @nyt_diff, a Twitter bot developed by Juan E. D. that follows the changes on the front page of the New York Times.

Since I found the approach very interesting, I contacted Juan to ask whether he shared the code of his project and whether I could adapt it to other news sites. Which he did, very generously. I then hesitated a bit about which media outlet to follow. I wanted to see what this would look like on the French-speaking Belgian side. I finally picked the one with the largest circulation and modified Juan’s program accordingly.

Technically, it works quite simply. The program connects at regular intervals to the RSS feed of Le Soir’s front page. It fetches the 10 most recently published articles and stores the title, the URL, the summary and the author of each article in a database. When the script comes back a little later to do the same operation, it checks whether new articles have been published but, above all, it checks whether the articles already recorded have been modified. If so, the modified versions are also added to the database and one or more tweets are posted showing the recorded changes.
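For the curious, here is roughly what such a loop can look like, reduced to a minimal Python sketch. This is neither Juan’s code nor the exact code of @lesoir_diff: the feed address is a placeholder, the feedparser library is an assumption, and it simply prints the changes where the real bot posts tweets.

    import sqlite3
    import time

    import feedparser  # assumption: any RSS parsing library would do

    FEED_URL = "http://www.lesoir.be/..."  # placeholder, not the real feed address
    CHECK_EVERY = 15 * 60                  # seconds between two passes over the feed

    db = sqlite3.connect("articles.db")
    db.execute(
        "CREATE TABLE IF NOT EXISTS versions "
        "(key TEXT, title TEXT, url TEXT, summary TEXT, seen_at REAL)"
    )

    def last_known(key):
        """Most recent version of an article we have already recorded, or None."""
        return db.execute(
            "SELECT title, url, summary FROM versions "
            "WHERE key = ? ORDER BY seen_at DESC LIMIT 1",
            (key,),
        ).fetchone()

    def check_front_page():
        for entry in feedparser.parse(FEED_URL).entries:
            key = entry.get("id", entry.link)          # stable identifier for the article
            current = (entry.title, entry.link, entry.get("summary", ""))
            previous = last_known(key)
            if previous is None:
                pass                                   # new article: just record it
            elif previous != current:
                print("edited:", previous[0], "->", current[0])  # the real bot tweets here
            else:
                continue                               # nothing changed, keep the old row
            db.execute(
                "INSERT INTO versions VALUES (?, ?, ?, ?, ?)",
                (key, *current, time.time()),
            )
            db.commit()

    while True:
        check_front_page()
        time.sleep(CHECK_EVERY)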

When I launched this project last week, I really had no idea what to expect. At first I even wondered whether the bot would find any modifications at all, and how often. So it was a surprise to discover that Le Soir’s front page was, in fact, edited very regularly. To be perfectly clear, this is not an in-depth analysis of the modifications made by the Le Soir newspaper. Only the RSS feed, with its 10 articles, is consulted. The program does not check the full content of the articles, nor can it determine whether an article has been deleted. So the project works somewhat at the surface of things, mainly on how the newsroom titles and summarizes its articles to attract readers.

I leave it to the motivated and to the analysts who, starting from this project, would want to run more substantial checks on the correspondence between surface changes and changes in the substance of an article. This open source project is there to help them.

Nevertheless, @lesoir_diff reveals a side of Le Soir that I, for one, did not know and that is amusing to follow day by day. Besides the articles written live, and therefore modified minute by minute, some additions or corrections are sometimes made several hours after the first publication.

I also discovered that the URLs of the articles were immediately modified to reflect the changes in the titles.

This practice, often discouraged on the web, puzzled me quite a bit and made me dig a little further. In general, when you publish on the net, you are careful not to modify a URL, because a visitor who only knows the old one might not find the page they are looking for. That is the famous 404 error that makes us grumble when we are looking for content. Of course, Le Soir has planned for this eventuality: whatever the URL changes, the article can always be found. I won’t dwell on the details, this mostly concerns web professionals and SEO, as they say, but it could have a perverse side effect, it seems to me. I could, for instance, write my own URLs for Le Soir’s articles.

http://www.lesoir.be/833806/article/CHARLES-MICHEL-EN-A-UNE-PETITE

Note that this does not change the content of the article, nor does the modified URL survive once you are on Le Soir’s site. What’s the point, you may ask, other than trolling your friends? Maybe… Although, for a little Google bombing… Who knows? In the meantime, since tracking the URL changes of articles did not add anything, I removed that feature from the bot. It also makes the tweets a bit easier to read.

After a little under a week in operation, L̶e̶ ̶S̶o̶i̶r̶ édité is reaching its 600th tweet. If the B-side of the media interests you, don’t hesitate to follow it, or to adapt its code to make it work on other Belgian or foreign news sites. You can also contact me on Twitter (@xuv) for more info or to share your ideas. I plan to run the program for other media outlets, so give me a sign so we don’t step on each other’s toes.

I’ll leave you with this last tweet…

Tony Zhou’s film lessons

Tony Zhou

Since we are well into a civilization of the image, and it doesn’t look like that’s about to change, it is probably important to know how to “read” an image and, perhaps even more, to know how to “read” a moving image.

Whether you are into cinema, YouTube videos or anime, whether you want to make them or just consume them, I can only recommend this wonderful series of documentaries by Tony Zhou: Every Frame a Painting.

Tony Zhou is a film editor by profession and, in these videos of a few minutes each, he tackles specific filmmaking techniques, diving into references both current and past, to teach us and guide our eye toward what is good, or less good, in a cinematographic image.

It’s delightful. It bears watching several times. It’s in English, but you can turn on the French subtitles. And above all, you come out of it feeling like you’ve learned something. Don’t blame him if, afterwards, you understand better why you get bored during certain films.

Below are two or three that I particularly like. But it’s all good, it’s really hard to choose.

A little question to see if you’ve been following: did you know about Spielberg?

How to get the latest Blender running on a Pi

Arch Linux Arm - RPi2

How do you get the most up-to-date applications running on a Raspberry Pi and other ARM-based single-board computers?

Update 25 Oct 2016: I have written a small step-by-step tutorial to walk you through the install process on a Raspberry Pi 2 and up. It’s available here.

Update 16 Feb 2016: Blender 2.76b is now available for the Armv7h architecture from the package manager.

For a project I’m working on, I need a small computer that can just run some scripts 24/7 while being connected to the net. Performance is not a key issue, although it’s always nice to have a fast system. But in this case, since we’re trying to keep a low budget, a computer under $50 should do the trick. And that’s why we went for the Raspberry Pi 2.

While I was developing the project, I used the latest version of Blender (who doesn’t anyway?) and some other Python libs. When moving the whole project to the Raspberry Pi, that’s when things got messy.

I’ve been using Arch Linux as my main system for a year and am really happy with it (thx Chris). So naturally, I used Arch Linux ARM for the Pi. I had been using it for other projects, so I felt comfortable with it. For those who don’t know Arch yet, it’s a bleeding-edge rolling release distribution. That means you always get the latest shit as soon as it’s available, and you don’t need to do big upgrades of your system every 6 months or {2|4} years. It also has a very technical and dedicated community that takes pride in making good documentation.

What I did not expect is that Blender was not available in the repositories for Arch ARM, although it’s of course available for the i686 and x86_64 architectures. So I started looking for a distribution that had Blender already packaged, which Raspbian has. (Raspbian is a mod of Debian crafted for the Raspberry Pi and promoted by the Raspberry Pi Foundation as the go-to distribution for their hardware.)

But Raspbian, based on Jessie, only packages Blender 2.72, a version of Blender released in 2014. And that’s pretty far back in Blender spacetime. So my hand-made Blender scripts were suddenly buggier and not performing as well. Bummer. Since I’m kind of used to Debian systems, and since Debian also has a bleeding-edge rolling release, I thought: “No problem, I’ll just switch to stretch/testing and I’ll get Blender 2.76.” Well, that did not go too well on the Raspberry Pi. I’m not sure why. I guess Raspbian makes too many modifications to the Debian core, but after switching to testing, Blender was no longer available in the package list.

So back to square one. Where do I go from here? Some people online were saying Blender could not be built on the ARM architecture. But I found packages for Blender 2.76 in the Fedora branch for ARM, and Blender is available for Raspbian. So what am I missing here? Then I stumbled on this post from Popolon, where he managed to patch and compile Blender on an ARMv7 architecture using Arch (and that’s exactly what I need for my Pi). He even provided a link to his build, but it was unfortunately too old to run on the current version of Arch.

But that’s where the power of Arch comes to the rescue. Arch is a system with a lot of pre-compiled packages, and for whatever is missing, there is the AUR (the Arch User Repository). What comes from the AUR is a set of scripts that will help you compile a specific application for your system. Of course, you could do any compilation yourself on any Linux system, but what I find easier here is that, since you already have the latest packages installed, compiling new ones takes less effort: you don’t really have to worry about having the right version of a library. It’s always going to be the latest one, which is usually the one needed for the application you’re trying to install.

With a slight modification of the PKGBUILD I found for Blender, I started the compilation on the Raspberry Pi 2. Six hours later, I had the latest Blender running. Super. I can move on with the project.

Now, I have also sent feedback to the Arch Linux ARM community about this, and I have heard that Blender is in the pipeline to be added to the official repositories. That’s great news. It means that next time I may not need to compile it, and others can benefit from it too. But if this story tells you only one thing, it’s to trust Arch Linux for running the latest software on an ARM-based computer. Even if something is not yet in the repositories, you’ll probably have a better chance of getting it running on that system than on any other.