It’s not so uncommon for an educational institution to adopt free software in its programs. But last month Krita’s website was brought down by the curious minds of the Internet, hundreds of comments were posted in various forums, crazy theories evolved, and old myths about free software got a fresh run in a giant positive feedback loop.
The news of the Art and Technology of Image (ATI) department at Paris 8 University switching to Krita, Blender, and Natron caused quite a stir in CG communities around the world. Some discussions were quite fun to read, like this one at Model Mayhem:
— Sounds like they’re short of cash and trying to make a virtue out of a necessity. I suspect this will last for a year or so till numbers starts to dwindle, with potential students going elsewhere to be given proper training with industry standard software…
— As professionals, although our day-to-day job is busy, we should really be open to how new players – even Open Source ones – solve old (and new) problems. The paradigms might be shifting…
In some other cases, discussions (Hacker News, Linuxfr.org) quickly evolved into myth-debunking sessions between users and developers of both Krita and GIMP.
But here’s the problem: the original post at krita.org simply didn’t contain enough information about the decision.
Inadequate support from Adobe? In what way? Being pushed around to make choices that go against teachers’ beliefs? What does that mean? How did this go over with the students? Were they interested? Did they rebel and demand to be taught the software actually used in VFX studios? The questions just keep coming.
So LGW attempted to fix this the only way we know: by interviewing François Grassard, a teacher at Paris-8 and a CG professional with 17 years of experience in the industry who’s the primary person behind this initiative. It’s a long read, so sit tight. We’ll get to the bottom of it, that’s a promise.
François, thank you for finding a slot in your super busy schedule. Let’s start with assessing the situation with Paris-8, its ATI department, and Adobe. What exactly was meant in the quotes about inadequate support and being pushed around?
First of all, I’d like to clarify some points about this news item at krita.org, because according to my friend David Revoy, the blog post seems to have gone viral on the Internet over the past few weeks, and I’ve read a lot of comments on several social networks and forums by authors who are far, far away from the facts.
Here’s a disclaimer: I’ve been a user of Adobe products for a long, long time. I’ve been using After Effects since its first commercial release about 20 years ago. I love this software, and I’ve made so many animations with it in so many contexts (motion design, TV identity, VFX). As a freelance artist, I’ve been using Blender and After Effects together for about 10 years. For me, it is one of the most efficient duets I can use to get my work done fast and efficiently.
So, as compositing teachers at Art and Technology of Image, why did we decide to stop teaching Adobe products to our students? The answer is a single word: budget.
Editing a shader in Blender, from the “Le Désert Du Sonora” short movie production
Our university is one of France’s public schools, which are mostly free, apart from registration fees that are quite reasonable compared to fees in some other countries.
To acquire all licenses of software that we teach, we have to rely on our own budget only. And our budget is ridiculously low. So we always have to talk to software vendors about the price of licenses.
Some companies are pretty fair with us, and we can, for instance, buy all the Maya licenses we need. In other cases, companies can’t lower the price of their products. That was the case with Adobe. We asked for a discount, they said they couldn’t do that. End of story.
We don’t have to blame Adobe for that. As a commercial entity, they have to make choices and decide whether they can accept this kind of deal or not. Last year, I was an After Effects and Nuke teacher. This year, I teach my compositing courses with Nuke only.
So, after the meeting with Adobe we discussed this problem and tried to find solutions. When we said “we don’t want to make choices that go against our beliefs”, it means we refused to choose the worst solution in this case: crack Adobe’s software and continue to work like before.
The philosophy of ATI is to teach our students to be flexible in any kind of situation. That’s why most of our previous students now work as TDs [technical directors] in several animation/VFX companies. We always try to find good solutions — technically, ethically, and compatible with the budget of a production.
So what did you decide to do?
I called my friend David [Revoy] who helped me discover Krita a few years ago. He kindly agreed to come to Paris to introduce Krita to our students.
After 3 hours of showcasing, our students were mostly impressed by the capabilities of the software. Of course, it’s not so easy to learn new software when you have used Photoshop for such a long time. But they were fine with that, because we were lucky enough to have wonderful students.
As a result, we switched from Photoshop to Krita. We didn’t do it for some kind of revolutionary ideology, even though, as you know, we are experts in revolution here in France.
We did it to continue using high quality software, and Krita was the best option. I think it’s going to be a pretty good experience for everybody at ATI. And we know that a lot of other animation schools in France are watching us.
This was, in fact, one other puzzling bit in the original news text for those who aren’t CG professionals living in France. Could you please tell more about RECA and their involvement?
It seems that we are not the only ones having difficulties dealing with Adobe (or other vendors) over the price of licenses.
RECA stands for “Réseau des écoles françaises d’animation” (“Network of French schools of animation”). It’s a group of 25 animation and visual FX schools that are well known in France for the quality of their work.
The goal of this group is to discuss plans and strategies of each school and ways for us to deliver the best knowledge to our students. We share ideas and methodology all year long during several meetings.
Around June 2014, we met up and talked about license prices for each product. We discovered that buying all the licenses for all of the students was difficult not only for our public institution, but also for private schools.
When we presented our plans for Blender, Krita, and Natron, a lot of schools in the RECA said they were really interested too. But it is obviously more difficult for a private school to integrate this kind of experiment when they have strong partnerships with software vendors.
Compositing a sequence from “Le Désert Du Sonora” in Blender
As a public school, we have more flexibility to do that. This means that we are the first school in the RECA to integrate open source software. But if this experience is a success, a lot of other schools could follow the same path.
Originally, the news focused on Photoshop/Krita, but there’s also Blender and Natron involved. What’s the story with them?
I learned Maya, 3ds Max, LightWave, XSI, and many more applications from their first releases, but I’ve been a Blender user for years now, and it’s what I mostly use for 3D these days.
We don’t have an official teaching schedule for Blender at ATI. We sometimes organize master classes about specific topics, such as organic modeling, character animation, dynamic animation for VFX, etc.
For 3D, we’ve been primarily teaching Maya since 1998. But for the past few years, most of our students have arrived at ATI with some preliminary 3D knowledge of Blender. And a lot of them already have pretty good skills. And that’s a really interesting point! Because even if all instructors at ATI really love Blender, the choice of this software came naturally from our students!
You mean, students who’d probably die for a job in a big studio actually wanted to use Blender?
They make choices on their own. We never pushed them to use it. That’s why a lot of projects at ATI massively use Blender for 3D purposes — sometimes in conjunction with other 3D packages, and sometimes alone.
Yes, after the switch from Photoshop to Krita, we had to resolve the issue of finding compositing software. Previously, we used After Effects to introduce compositing to our students. The second software we use is Nuke. It’s a powerful tool, but it’s sometimes difficult to understand all the technical theory behind it.
At some point I heard about an open source compositing application named Natron. Even better, the team is French, from the INRIA lab! I contacted them, and they promptly replied. The project was not so stable back then, but it was being developed at an incredible speed, and it was getting better and better every day.
I tested it a lot during the summer and made a lot of suggestions and bug reports. The team has been listening to users really patiently and with a lot of interest. So I called my friends at ATI and proposed using Natron to introduce compositing to our students. As it happens, Natron shares exactly the same philosophy and shortcuts as Nuke. It just couldn’t get better: perfect timing!
ATI focuses on teaching our students how to use 2D and 3D software, but also how to code their own applications with C++, C#, and Python. Cédric Plessiet, who is one of our resident code masters and teachers at ATI, now teaches them how to write OpenFX plugins for Natron. And the best part is that these plugins are fully compatible with Nuke, Fusion, Resolve, and many more commonly used production tools.
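To give a rough idea of what students are learning to write: an OpenFX plugin is a C/C++ shared library, but at its heart the render action boils down to a per-pixel operation over an image buffer with some user-set parameters. A minimal sketch of that idea in plain Python (this is an illustration only; `render_gain` is a hypothetical name, and the real OpenFX API is a C plugin interface defined in `ofxCore.h` and `ofxImageEffect.h`):

```python
# Illustrative sketch only: this models just the per-pixel render step
# that an OpenFX image effect plugin would implement, not the actual
# OpenFX C API or its host/plugin handshake.

def render_gain(rgba_pixels, gain):
    """Multiply R, G, B by `gain`, clamp to 1.0, and leave alpha untouched.

    `rgba_pixels` is a list of (r, g, b, a) float tuples in the 0.0-1.0 range.
    """
    out = []
    for r, g, b, a in rgba_pixels:
        out.append((min(r * gain, 1.0),
                    min(g * gain, 1.0),
                    min(b * gain, 1.0),
                    a))
    return out

# A host such as Natron, Nuke, or Fusion would invoke the plugin's render
# action once per frame, passing the source image and parameter values.
frame = [(0.2, 0.4, 0.1, 1.0), (0.5, 0.5, 0.5, 0.5)]
print(render_gain(frame, 2.0))
```

Because the OpenFX standard fixes this host/plugin contract, the same compiled plugin binary loads into any conforming host, which is exactly why the students’ plugins work across Nuke, Fusion, and Resolve as well as Natron.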
By the way, our students dived into Natron, and a lot of them decided to use it in their future projects. Sometimes out of curiosity, sometimes only because it’s open source, and sometimes for both reasons.
Compositing a sequence from “Le Désert Du Sonora” in Natron
The news mentions that the whole Art and Technology of Image department has switched to Blender, Natron, and Krita. How many teachers are involved with this?
There are 3–4 teachers involved in teaching these applications, including me and occasionally some external professionals. But the same people also teach other software like Maya or Nuke at the same time. We mix them all together, because the techniques behind applications of the same kind are very similar.
Understanding those techniques is what we teach in the first place. Using actual software comes afterwards. We think it is the way to be more flexible and to switch easily from one software to another.
One correction, though: I don’t think the term “switched” is right here. “Integrated” would be more appropriate. And in some cases — “deeply integrated”, depending on the students.
How is that?
We are really flexible about the software our students can use during their intensive projects, as far as licenses allow. We try to listen to them, to their suggestions and ideas. Even though we have to prepare them for their professional life, we are an Art university where experimentation has to take place.
We still teach a lot of proprietary software at ATI: Maya, Houdini, Unity and Unreal Engine for real-time applications and games, and much more. We keep in mind that our students have to know most of the software that is being used in big studios.
But at the same time, we try to teach them how to integrate some alternatives, such as Krita, Blender, Natron, DJV View, FFmpeg, and many more. Depending on the ideas each student has, this integration of open source tools into his or her pipeline will be partial or exclusive. We try to expose them to as diverse a range of software as we can.
But it’s not so difficult, because most of our students are really open-minded and sometimes teach new stuff to each other during their own “internal masterclass”, during the night or the week-end 🙂
So you experiment a lot?
We do, all the time. We know we are in a complex triangle involving schools, companies, and software vendors. Each part influences the others a lot! Vendors want to push their software into schools, schools want to prepare their students for the software they will use in studios, and studios buy software known by students.
In this situation, we don’t want to break the system. We only want to integrate new concepts and software into this system. We think open source software has a place in production pipelines, partially or in a more radical way, depending on the size of the studio and the kind of project.
The news also mentions a three-week intensive project that involves using Krita, Blender, and Natron. Are there any other courses that will be rewritten to use free software instead of proprietary counterparts?
The project we talked about at the time is now finished. It is the work of a group of three students: an experimental short film named “The Desert of Sonora” (“Le Désert Du Sonora”). The team behind this project already won a Blender Suzanne Award with the “Jonas” short animated movie a few months ago.
In both cases, these movies are artistic as well as technical experiments. For “The Desert of Sonora”, they chose to work only with open source software to test the efficiency of a completely free-software-based pipeline.
Gäel Labousse, one of the students, told me that they are currently writing a report about the creation of this project, to expose the advantages and difficulties of this kind of pipeline: what was cool, what wasn’t, and what could be improved. I think it’s a pretty good approach. They don’t idealize this open source pipeline; instead they stress-test it and change the things that are wrong with it.
Other groups during these three intensive weeks used Blender and Krita, but only this group decided to work exclusively with free software. Fortunately, the results of their efforts are quite impressive. That’s really encouraging.
As far as I can tell, you have tons of experience using and customizing/automating After Effects. You did a video course on motion design with After Effects 5.5 in the past, and you currently work for a company that automates Ae workflows. How and why did you start exploring free software options?
Because my father was a programmer, I started writing code at the age of 7. It was difficult in the early 1980s to code graphical stuff on my old ZX81. But it was a time when everybody experimented. It was an era right before the boring MacOS/Windows battle. New computers came out every month, and none of them were compatible with the others. It was obviously not really productive, but it was fun to test and discover new alternatives.
I think I still follow this philosophy of always looking for new ways to do things. I’m probably just a paranoid guy who tries to find alternatives all the time, even though I’m happy with the software I already use.
The reason I do that is that sometimes changing one’s habits and point of view makes it possible to solve previously unsolved problems. That’s why I decided to explore free software after having used proprietary software for many years.
The reason why I use libre software is not because it’s gratis. I decided not to use Maya or 3ds Max anymore and mostly switched to Blender, because it is an efficient tool for my job.
So, really, you are just a pragmatic guy?
I’m a very pragmatic guy. If you give me inefficient software to work with, I’ll refuse to use it, even if it comes at no cost. I choose the software I use not because of the price, but because it’s good for my business. Blender, Krita, Natron, FFmpeg, DJV View, and many more apps can be used in production. I use them every day! The way we can use them in a pipeline depends on the project, of course.
As you said, I work for a company called “Ivory” on Automate-IT, a solution based on After Effects that automates TV promos and motion graphics for TV channels. Why After Effects? Because at this time it is the software used by 95% of TV channels for this purpose, and most of the projects created by artists in this area are done in After Effects.
But because we work a lot with TV channels, we see the share of free software like Blender growing every day in all kinds of production. I think television can adopt free software more quickly than the cinema industry right now, because of the size of the projects and teams. But it’s a work in progress, where the problems are usually more psychological than technical.
Do you know that when you sit in a modern digital theater to watch a movie like Transformers, there is an 80% chance that an open source player is right behind you, inside the dark projection room? Take a look at this page of the leading digital cinema player vendor, and read the really small grey line at the bottom. Your favorite movies are played back by FFmpeg!
How much experience with Nuke or other node-based compositing apps did you have prior to using Natron?
For compositing purposes, I previously used Commotion, Combustion, Shake… and After Effects, since its very first release. Today I mainly use After Effects, Nuke, Fusion, Natron, and sometimes Blender, for really simple cases.
That’s not because Blender doesn’t have a good compositor. It’s really powerful; it’s only about efficiency. Time versus money. Deadlines in this industry are shrinking more and more every day. We have to find the best solution for a specific problem or job.
One of the scenes from “Le Désert Du Sonora” in Blender
How often is free software the best choice, in your experience?
Sometimes it’s the best option, sometimes it isn’t. I try to spend a (huge) part of my time improving free software by giving feedback to the developers, and developing solutions around free software to create my own alternatives.
From my point of view, the Blender compositor is not the fastest and most efficient way to work, and I prefer other solutions. That’s why I’m really happy that Natron is now available! For me, it’s the first real alternative for high quality compositing. Of course, it’s still at an early stage of development, but I’m quite impressed by all the features that are already available, and that’s after just one year of development by two programmers!
Earlier in the interview you mentioned that you use Blender in conjunction with After Effects. Why?
Well, After Effects is really powerful, but it lacks a lot of features in some specific areas. For instance, particle simulation has to rely on plugins such as Red Giant / Trapcode Particular. Without this plugin, it is really difficult to quickly create complex and good-looking particle simulations.
But I can use Blender’s incredibly useful particle system, render out image sequences, and integrate my particles in After Effects. Thanks to Bartek Skorupa, I use the After Effects exporter addon for Blender every day!
The same goes for integrating 3D objects right inside After Effects. Cinema 4D Lite, now bundled with Ae, could be a solution, but its full raytracer is most of the time pretty slow compared to a traditional After Effects composition. The Element 3D plugin from Video Copilot seems to be the only good solution for that, even if it has some limitations compared to a complete 3D package like Blender.
And with Blender, I can export my camera and empties to the camera and null objects in After Effects in a flash! It’s a really efficient way to work and saves me a lot of time. Most of the time, when I have to render an animation, 50% of the work is done in Blender, and the other 50% is done in After Effects with a lot of tricks to speed up the process.
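The core of any such camera/null export is a change of coordinate conventions between the two applications. A minimal sketch of the idea (purely illustrative; `blender_to_ae` and the `SCALE` factor are assumptions for the example, not the actual code or conventions of Bartek Skorupa’s addon):

```python
# Illustrative sketch of the axis conversion at the heart of a
# Blender-to-After-Effects export.
#
# Blender: right-handed world space, +Z points up.
# After Effects: +X right, +Y points DOWN, +Z goes away from the camera.

SCALE = 100.0  # assumed pixels-per-Blender-unit factor, purely illustrative

def blender_to_ae(x, y, z, scale=SCALE):
    """Map a Blender world-space point to After Effects comp coordinates."""
    return (x * scale,    # X stays X
            -z * scale,   # Blender "up" becomes negative Y (AE's Y is down)
            y * scale)    # Blender's Y axis becomes AE depth

# An empty one unit above the origin in Blender lands 100 px "up"
# (negative Y) in After Effects space:
print(blender_to_ae(0.0, 0.0, 1.0))  # (0.0, -100.0, 0.0)
```

Run per frame over the camera and every empty, this kind of mapping is what lets keyframed motion tracked or animated in Blender drive null objects and a camera in an After Effects composition.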
Because I use Blender, the After Effects limitation on including 3D objects in a composition is not a problem. I export a Z-depth pass from Blender to After Effects, sometimes a greyscale mask, most of the time fetched directly from the OpenGL renderer to speed up the process even more!
If you, and only you, can manage the 2D and 3D parts at the same time, After Effects has, from my point of view, no limitations. The Blender/AE duet helps me work really, really fast.
For people who only manage the After Effects part and don’t touch 3D at all, the software obviously needs some extra stuff.
So, what can we do in the open source/free world to overcome this kind of problems and limitations?
Well, we can devote part of our time to developing alternative solutions. For instance, a 3D workspace is planned for integration into Natron in future releases. But when can we hope to get a full particle system in Natron? We don’t know. Maybe really soon, if we can pay someone to develop it.
Still from “Le Désert Du Sonora”
While doing a few “best commercials made with Blender” curated lists, I spoke to various studios using Blender + After Effects. It seems that availability of custom addons for Ae is one of the major roadblocks to adopting Blender and Natron for compositing, or just Blender for everything. Do you see some way to deal with that? Or do you think it’s not really a problem?
It’s always the same question: who will contribute to a project like Natron or any other kind of free software? We can think of several kinds of contributions: documentation, tutorials, code, and, of course, money. If we want to include free/libre software in pipelines in a more radical way, we have to detach the word “gratis” from the concept of open source software.
If we want to have a real alternative to each software we use today in production, we have to pay for that. We have to consider the work of talented developers who provide a new vision of modern production. The goal is not to break the system in place, but to create an alternative shaped by the artists and for the artists, even if they can’t code anything.
We speak with Alex [Gauthier], the main developer of Natron, about the software and its future every day, defining a realistic roadmap. Alex told me a couple of days ago that I use his software better than he does. But he develops so fast that I look like a snail next to him!
So we have to remember: it’s team play, where we can combine each other’s skills in specific domains. Using graphics tools is not only a technical topic; it’s all about methodology. With a good methodology and good knowledge of your work, you can usually switch easily from one application to another. I see in free software a real potential to re-establish direct communication between developers and users. It could take a lot of time, but I’m confident about the future.
ATI students during the talk by David Revoy
When I look at my students, I can see that learning mainstream 3D applications and experimenting with free software at the same time is not a problem. They can handle it, because they are curious enough.
This way, without us pushing them at all, open source applications are slowly taking their place in production pipelines. Of course, it’s going to start with small projects, because it’s easier to create a new pipeline from scratch in that case.
Isn’t it the dream of every second free software enthusiast that studios like Weta Digital step up their game and start using free software for content production on their Linux boxes currently running Maya and Houdini? Or that small studios using Blender et al. make it big at the box office?
I often hear people saying that they don’t understand why big studios can’t replace software like Maya in a snap! Obviously, none of those people have ever worked as technical directors in such a big studio. That was my job for about ten years before I switched to freelancing.
When you have to manage a project with more than 200 artists and a lot of complex shots, you have to plan everything, sometimes 2 years before the first artist arrives at the studio. Once the project is launched, you can change some details in the pipeline, but you never break it completely.
Excerpt from a short animated movie “Herakles, Aux Origines De La Crau” by Les Fées Spéciales
We have to prove that we can use free software and build a solid pipeline on top of it. Some new studios are trying to do it, like the newly born Les Fées Spéciales in France, who decided not only to use open source software exclusively, but also to improve it in a production environment. That’s how we can make sure free software improves through a constant exchange between users and developers. I hope to see more and more studios of this kind in the future.