An open standard for representing rich material and look development content in computer graphics, MaterialX enables artists to describe materials in a common, networked form, providing assets with a consistent look across renderers and platforms. Originally developed at Industrial Light & Magic (ILM), it was first deployed in production on “Star Wars: The Force Awakens” in 2015, and in the real-time theme park experience “Millennium Falcon: Smugglers Run.” Following the open sourcing of MaterialX in 2017, creative and technology partners – including Sony Pictures Imageworks, Pixar, Autodesk, Adobe, and SideFX – continued to shape its ongoing development. The project was adopted by the Academy Software Foundation in July 2021, with a dedicated Technical Steering Committee (TSC). Initially in incubation, MaterialX has now achieved all technical milestones required to be certified as a hosted project of the Academy Software Foundation.
To learn more about MaterialX’s history and evolution, we spoke with a handful of creatives and engineers who have been involved with the initiative since its early days. Surprisingly, we discovered that the origin of MaterialX traces back to 2008’s “Iron Man,” among other interesting facts. Read on to find out how a smart, but seemingly unachievable idea, became a reality.
Special thanks to Jonathan Stone, Doug Smythe, Niklas Harrysson, and Guido Quaroni!
In late 2007, ILM CG Supervisor Doug Smythe (a three-time Academy SciTech Award winner and an Academy Award winner for Best Visual Effects) was in the thick of production on the film that launched the Marvel Cinematic Universe: “Iron Man.” At the time, most VFX studios had proprietary shaders and asset texturing workflows. Each studio had its own secret sauce, which made collaboration tricky. Needing to share hero “Iron Man” model assets with another VFX vendor, Smythe started writing a Python script to decode ILM’s texture assignments on the geometry to enable the other vendor to match ILM’s work.
Smythe: I remember thinking to myself that there must be a better way to share these files. With outsourcing and subcontracting work across multiple vendors becoming more common in the VFX industry, I thought it would have been really nice if I could just send over the asset that I’m running in a certain package or renderer and have it work in a different package, even within the same company. There wasn’t any way to do that at the time. It was a problem that needed solving, but I had a show to work on, so the idea sat unexplored for a while.
A spark re-ignited
Fast forward to early 2012, when Smythe was asked to look at a proposal on simple portable graphs by Jack Greasley of Foundry. It was a GraphML-based specification for connecting shading nodes. While Smythe found the proposal intriguing, he noticed that it lacked texturing or geometry material assignments. Re-inspired, Smythe put pen to paper on April 19th, 2012, and drafted the first proposal for a network-based shader look, taking all these different components into account to create reusable shader definitions with different parameters. He named the theoretical open standard MaterialX and presented the topic at a meeting in the Westin Bonaventure Hotel at SIGGRAPH that year. There he consulted with industry colleagues and continued adding ideas.
Without the proper coding resources to move it forward, however, MaterialX remained a dream – until 2013, when Lucasfilm kicked off its Unified Assets Initiative. Implemented for “Star Wars” projects, the initiative set out to define a new set of standards for generating assets that could be used across multiple projects and in different formats; effectively the goal was to create a digital backlot of content. Without a meaningful approach for expressing material networks, Jonathan Stone, a Lead Rendering Engineer for Lucasfilm’s Advanced Development Group, turned to Smythe’s work and found the missing puzzle piece.
Stone: The Unified Assets Initiative was a major focus for the Advanced Development Group at Lucasfilm, which was launched in 2013 with a leadership team including Kim Libreri and Roger Cordes. While we had ways of addressing unified geometry with Alembic, color spaces with OpenColorIO (OCIO), and textures with OpenEXR, there was no equivalent for material graphs. We didn’t have a good way to express the intricate network of nodes that artists would create in applications like Mari, Substance, or Katana. This meant the data couldn’t travel between rendering environments, let alone from offline to real-time pipelines. We took this idea for MaterialX, which was very promising but not yet implemented, and really embraced it. Once we got the green light to move forward, we started developing the code to make a solution that we could use in Lucasfilm productions. A big inspiration for MaterialX within Lucasfilm was that it could express materials in a way that would transfer between media, not just between applications, and that remains a strong driving force for the project today.
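As a rough illustration of the kind of data involved, a minimal MaterialX document connecting a pattern graph to a surface shader might look like the following sketch. The node, graph, and file names here are hypothetical; the syntax follows the published MaterialX specification.

```xml
<?xml version="1.0"?>
<materialx version="1.38">
  <!-- Pattern graph: a texture lookup feeding a color output -->
  <nodegraph name="NG_hull">
    <image name="diffuse_tex" type="color3">
      <input name="file" type="filename" value="hull_diffuse.png" />
    </image>
    <output name="base_color_out" type="color3" nodename="diffuse_tex" />
  </nodegraph>
  <!-- Surface shader: consumes the pattern graph's output -->
  <standard_surface name="SR_hull" type="surfaceshader">
    <input name="base_color" type="color3" nodegraph="NG_hull" output="base_color_out" />
  </standard_surface>
  <!-- Material: binds the shader so it can be assigned to geometry -->
  <surfacematerial name="M_hull" type="material">
    <input name="surfaceshader" type="surfaceshader" nodename="SR_hull" />
  </surfacematerial>
</materialx>
```

Because the file describes only the network of nodes and their connections, rather than renderer-specific shader code, any application that implements the standard can reconstruct the same look.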
Smythe: MaterialX took off when Jonathan got involved. He’s an incredible programmer and knows how to organize a library and design an API so that it’s reusable modular code that you can document, share, and have multiple engineers work on in parallel. I did the first conceptual design for MaterialX, but Jonathan oversaw all the coding work and brought a new set of ideas to the project.
Lucasfilm’s first internal test case for an early version of MaterialX was on “Star Wars: Episode VII – The Force Awakens” (2015).
Stone: “The Force Awakens” was a show where all of us were involved very early on from pre-production and we saw MaterialX as being an important element in that process. The problem that we were trying to tackle was a relatively simple one, and that was, can we express the material library that artists paint ‘Star Wars’ assets with in a uniform way that would work across multiple applications, and would have persistent meaning over time? All the assets for “The Force Awakens” were painted using material presets that were very painstakingly crafted by art supervisors and engineers and then captured as MaterialX files. Initially, these presets were used by our texture artists to paint in Mari, using a custom layered material framework we developed at ILM, but over time our goal was to make these presets part of a common material library that could be used across all of our look development tools.
In order to achieve the full potential of MaterialX, we realized early on that it couldn’t be developed entirely within Lucasfilm, and that it would ultimately need to become an open project in the spirit of Alembic (developed by ILM and Sony Imageworks) and OpenEXR (developed by ILM). The content creation tools available to artists were constantly evolving, and ultimately it would only be possible to maintain support if the vendors developing those tools embraced MaterialX as a core feature that evolved along with them.
Finding common ground
While Lucasfilm was developing MaterialX, Autodesk was also working to solve the problem of portable materials via its Abstract Material Graph (AMG) project. The intention was to create a format to describe materials in an agnostic, abstract way that wouldn’t require implementation details in the description itself, making it transportable to different architectures. Niklas Harrysson, principal engineer at Autodesk, together with then-Autodesk colleagues David Larsson (now at Adobe) and Anders Lindqvist (now at NVIDIA), wrote a white paper outlining a solution to this bottleneck.
Harrysson: At Autodesk we had actually looked at this problem ourselves for a while, the problem of having portable materials. Transferring geometry and other components of a scene graph was feasible, but portable materials remained a big problem that required solving. With more than 100 products and a million users at Autodesk, we saw this as an increasing problem across M&E, AEC, product design, and manufacturing. All of these sectors found it difficult, if not near impossible, to move an asset between products without compromising on quality. About a year or so into our AMG project, Lucasfilm shared its MaterialX specification with us. We compared our approaches to see where we could help each other.
Stone: Around 2016, we began collaborating with Autodesk more seriously as a result of their Abstract Material Graph project. Both teams quickly realized that it was a great opportunity to align our efforts, take the best aspects of both, and build a much more powerful MaterialX. Autodesk was one of the first companies to start implementing MaterialX independently from our work at Lucasfilm, and that became one of our most important early partnerships.
Smythe: It was gratifying to have a major software vendor get behind our idea and support it in a meaningful way. It turns out that there’s a common need for representing the look of surface materials and having descriptions that can translate from one package to the next. We found the core idea behind MaterialX – to have a single format that can describe everything – was a universal need, and we were pleasantly surprised to partner with Autodesk on developing it.
Niklas came up with the idea to break information down into blocks and connect them in a way that not only describes the shading graph, but also generates code to output OSL or GLSL. We had been focused entirely on describing the pattern graph side of the problem with nodes, while the surface shading was still monolithic C++. Niklas created a proof of concept with his approach and showed that the shaders generated from these blocks were quite performant. We merged that into MaterialX, which was key to its adoption. It meant that you could describe not just how your textures connect to your shader, but also the actual shading mechanism.
Harrysson: We were really trying to solve the same problem, but we noticed quite quickly that we had taken different approaches and tackled different areas of it. We figured out that our two solutions were complementary, so we should merge them and collaborate.
Taking the leap into open source
At SIGGRAPH 2016 during a Birds of a Feather event, Doug and Jonathan gave a presentation on MaterialX and publicly announced the publication of the open specification, followed by the open sourcing of the complete MaterialX codebase at SIGGRAPH 2017.
Stone: There was a great deal of support and positive interest in the project once it was open sourced. In the first year, the community was mainly contributing ideas for improvements, but a few teams began contributing development work to the codebase, and this momentum has been building steadily over time. Today, we have a very robust pattern where external teams are contributing almost every day, but it took quite a while to build this level of community involvement.
Smythe: Open sourcing MaterialX was always on our minds from the very beginning, based on our experiences with OpenEXR and the idea that contributions could improve the standard. We also thought that open sourcing MaterialX would accelerate adoption, because it makes integration a lot easier for applications under the hood. A standard is limited if it isn’t widely supported and its source code is proprietary.
Progress through mutual collaboration
CG material descriptions break down into pattern generation – describing what the material looks like from a color and surface perspective – and light integration. Lucasfilm had focused on developing fixed uber shaders to solve light integration, whereas Autodesk approached the light integration model in a more abstract way. Lucasfilm and Autodesk merged the two and moved forward together.
Harrysson: At Autodesk, we did the actual implementation layer where you take the abstract material description and translate that into something the computer understands, and we did that by code generation. This means you can generate shaders for many different applications and architectures from the MaterialX file. For example, real-time game engines that use a GPU to run the code need a specific target language, versus offline rendering, which might require another type of shader. That was another large piece developed internally at Autodesk and open sourced on our side; our manager, Eric Bourque, was a driving force for this initiative. Then in 2019, we merged the code into MaterialX main and consolidated it into one open source repo on GitHub.
Around this same time frame, then-Pixar VP of Engineering Guido Quaroni (currently Senior Director of Engineering for 3D and Immersive at Adobe) proposed creating an open source standard around scene description, which would ultimately become Universal Scene Description (USD).
Quaroni: Initially, we thought about calling the standard Layered Scene Description, because of the ability to “compose” layers containing 3D data. But LSD wasn’t a great acronym, so we went with Universal Scene Description. Pixar held a lot of the patents around the USD concepts, so the idea was always to release them as open source as well. As soon as we realized that MaterialX was steering toward 3D pattern generation, in late 2018, we started internal conversations on possible adoption at the studio. The first step for adoption was adding MaterialX pattern generation graphs within USD. We had conversations with Autodesk and other studios, and collectively agreed MaterialX had the potential to become an industry standard. There were so many aspects of MaterialX that already aligned with what USD could do, and it made sense to leverage these building blocks for materials instead of redefining the specifications as part of USD.
Smythe: I think making a clear connection between MaterialX and USD is important for widespread adoption. We’ve continued working with the USD team to make sure we’re drawing a clear dividing line between the two standards, and that where there’s overlap, there’s functional parity and we’re using the same semantics. This way, you can use either representation interchangeably, and the translation between the two is as seamless as possible.
Academy Software Foundation backing fast tracks development
In early 2021, USD updates gave users the option to build the MaterialX codebase inside the USD distribution, creating a streamlined path for USD to adopt the MaterialX standards and encode them into USD files. Shortly thereafter, the Academy Software Foundation accepted MaterialX as a hosted project and established a working group around continued development, including strengthening the connection between MaterialX and USD.
Stone: The Academy Software Foundation’s backing was critical to transforming MaterialX from a small team collaboration over Zoom into a community discussion where everyone could participate. Initially it was primarily Lucasfilm and Autodesk developing the codebase, and we didn’t yet have the diversity of voices that we wanted. The Academy Software Foundation seemed like a natural home for the project, and since MaterialX has joined the Foundation, its development has become a true community effort, which is what we had always envisioned.
Smythe: Having MaterialX accepted as an Academy Software Foundation project was huge, because it was like a seal of approval acknowledging that what we’re doing is good for the industry as a whole. It indicated that they believe in this particular slice of technology and that others should too, and I think it was the impetus for software vendors and production studios that were on the fence about MaterialX to look more closely at supporting it.
Quaroni: What I really like about the MaterialX standard is that it decouples the definition of what the material is and what a pattern is from the actual language that is used later on by a renderer to actually compute it. Having MaterialX be part of Academy Software Foundation provides the right framework to move it forward. In particular, I appreciate the role of the working groups around USD, including a dedicated one exploring USD and MaterialX integration. It’s an excellent service to the community.
Harrysson: USD adoption was a really positive development for MaterialX. Instead of inventing their own materials system inside of USD, they decided to support MaterialX inside of USD. It’s quickly become the de facto standard for scene graph descriptions, so I think that was key. We continue to work closely with Pixar, and they join our TSC meetings.
An eye on the future
As MaterialX has grown, it’s now featured during Open Source Days, where the Academy Software Foundation presents the most significant open source developments from across the industry. Autodesk and Lucasfilm remain key participants of the MaterialX TSC, with teams from SideFX, Isotropix, Khronos Group, Epic Games, and NVIDIA also playing strong collaborative roles.
Stone: In terms of project evolution, we believe it’s important to continue integrating new ideas from the community while maintaining backwards compatibility, so that we can embrace new techniques while robustly maintaining the look of existing content. In recent years, teams such as SideFX, NVIDIA, Autodesk, and Apple have contributed important new features to the MaterialX codebase, including new procedural nodes and supported languages, providing artists and developers with new options for content creation and rendering. On the Lucasfilm side, we have recently contributed new shader translation technologies to MaterialX, allowing content to be authored in one shading model and rendered in another.
Harrysson: One key thing we’re working on right now is standardizing on a library of shader closures (functions for light integration). In particular, Open Shading Language (OSL) has decided to implement and support these natively, which gives us much better results when generating OSL shaders. In addition to OSL, we have added support for NVIDIA’s Material Definition Language (MDL). This was a joint effort between Autodesk and NVIDIA. And as a result, renderers using MDL can now live in the same ecosystem as those using OSL. Materials described in MaterialX can be translated to both targets.
Stone: Building upon the support for shader code generation that Niklas and Autodesk developed, the Lucasfilm ADG has developed an open MaterialX Viewer, which allows you to directly visualize generated GLSL code in a lightweight application with an OpenGL viewport. A more recent project is the open MaterialX Graph Editor, developed by Emma Holthouser at the ADG, which allows you to create and edit MaterialX content directly as a visual network. We’re particularly excited about the potential for this project to allow independent developers to create new node definitions that can be shared with the broader community.
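A shareable node definition of the sort described above consists of a nodedef interface plus a functional graph implementing it. A minimal sketch follows; the node and graph names are hypothetical, while `noise3d` is a node from the MaterialX standard library.

```xml
<?xml version="1.0"?>
<materialx version="1.38">
  <!-- Interface: declares the custom node's inputs and output -->
  <nodedef name="ND_mynoise_color3" node="mynoise">
    <input name="amplitude" type="float" value="1.0" />
    <output name="out" type="color3" />
  </nodedef>
  <!-- Implementation: a functional graph bound to the nodedef -->
  <nodegraph name="NG_mynoise" nodedef="ND_mynoise_color3">
    <noise3d name="n1" type="color3">
      <input name="amplitude" type="float" interfacename="amplitude" />
    </noise3d>
    <output name="out" type="color3" nodename="n1" />
  </nodegraph>
</materialx>
```

Once such a definition is loaded, `mynoise` can be instantiated in any graph like a built-in node, which is what makes community-shared node libraries practical.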
Another important aspect of the MaterialX roadmap will be improvements in its integration with ecosystems such as USD and glTF. We’ve been working with Guido and the USD team for a number of years to build alignment between USD and MaterialX, but it’s only in more recent years that we’ve begun focusing on glTF as well. Where USD is especially effective for content authoring, glTF is more focused on efficient content delivery in web applications, and it’s common to encounter situations where a network of procedural nodes can express a material more efficiently than baked textures. We’re very interested in future collaborations with Khronos Group and other teams in the glTF ecosystem to find ways to harmonize these two initiatives going forward.
MaterialX has come a long way from its origins at Industrial Light & Magic, and it’s continuing to evolve; its most recent release is version 1.38.7. Also on the horizon: Adobe is planning to support MaterialX across its 3D applications, such as Substance Painter and Designer, alongside USD support initiatives.