Epic Games and its partners, including The Coalition, have created the closest thing to an interactive motion picture we’ve ever seen, with new levels of fidelity in character realization, environmental rendering, lighting quality, and post-processing. Yes, we are talking about The Matrix Awakens.
You owe it to yourself to check this out if you own a PlayStation 5 or Xbox Series console, especially if you’re suffering from cross-gen fatigue. Beyond the obvious visual spectacle, Epic’s goals with this demo are numerous and varied.
“Let’s just release it, let’s release it to the public, right?” says Jeff Farris, the Technical Director of Special Projects. “[Let’s allow] developers and customers to put their hands on it and hold ourselves accountable for making sure that this tech is 100 percent ship-ready. And that’s what we pushed through to make sure that yes, you can put this on real-world hardware – this level of graphics and this level of interactivity is achievable.”
“We didn’t want to see a YouTube comment that says, oh, it’s running on a massive PC, it’s not a real PS5,” adds Jerome Platteux, Art Director Supervisor. “Yes, it’s on every next-gen console, including Xbox Series S.”
Kim Libreri, Epic’s CTO, joined the company after a successful career working on major Hollywood films, including The Matrix trilogy, and has stayed in contact with Lana Wachowski. That relationship is what gave Epic access to the IP and its assets.
This access is how The Matrix Awakens manages to deliver a shot-for-shot real-time remake of the classic scene in which Neo is awoken by Trinity reaching out to him through his computer. Epic had the original assets and increased the resolution on Neo himself, with UE5’s Chaos physics engine used to accurately map the slow-motion movement of his coat.
It’s also how the team was able to recreate the original Neo and Trinity despite scanning today’s Keanu Reeves and Carrie-Anne Moss for their MetaHuman character rendering system.
Epic’s goals were shaped by the very notion of the Matrix megacity. The idea of a large open environment with vast view distances had been hinted at, but never achieved, in previous UE5 demos. The Matrix Awakens is built around a procedurally generated, then hand-customized open world – one that can be produced by a small development team.
The city itself is a blend of New York, San Francisco, and Chicago, and is based on real-world assets (adapted and altered to minimize copyright restrictions). Because this open world, like The Matrix Awakens’ main character IO, isn’t part of Warner Bros. IP, the entire demo, including assets, will be given away with the full release of Unreal Engine 5 in the spring of next year, along with tutorials for developers on how to create their own open worlds.
“I’m excited to see what the community is going to do with this, right?” says Jeff Farris. “I mean one of the big goals with the engine itself is to just facilitate creators. What’s the friction and how do we make it easy? You look at the tech like Lumen and Nanite, and that stuff really helps with that, but just releasing this to the community and letting people do what they’re going to do with it is amazing.”
In addition to delivering the open world hinted at in previous Unreal Engine 5 demos, the demo significantly improves the Lumen real-time global illumination system. Epic has upgraded it from an impressive but limited software implementation to a hardware-accelerated ray tracing solution, which improves both performance and fidelity in indirect and diffuse lighting.
Realistic reflections and area light shadows are also possible with hardware RT. Epic uses this on its characters in The Matrix Awakens in the same way a director of photography would place a light card next to an actor to improve the lighting in a scene. “When Trinity is in the car, for example, and you know the light isn’t bouncing because the seats are a little too dark, we simply add a white card [off-screen]. The white card is analyzed by Lumen, and you can see light bouncing back,” Jerome Platteux explains.
Epic also boasts about its massive AI system. After the chase set-piece, there’s a period of the demo that highlights a variety of technologies, and the AI agents (pedestrians, vehicles) are showcased in one of those micro-demo segments. “Here, we wrote a new kind of high-performance scalable AI system,” Jeff Farris adds. “In this demonstration, 35,000 people are wandering around, 18,000 vehicles are simulated, and 40,000 cars are parked. And it’s not a bubble surrounding the character in the traditional sense, is it? We’re not excluding people based on their view frustum or anything. There are some optimizations in there, but we keep track of everything. We just simulate the city to exhibit the AI system’s performance potential.”
One of the main reasons The Matrix Awakens can handle such a dense city is that the Nanite system provides detail that is as close to ‘free’ as you can get. One concern about the system is that it is limited to static geometry, but the addition of traffic demonstrates that it is more dynamic than that. Deformable meshes, on the other hand, are still on the ‘to-do’ list, which means that the solution for vehicle crash damage is sub-optimal and can cause performance dips – in essence, affected vehicles become standard rasterized objects. “This is due to Nanite’s current limitations, which restrict it to rigid objects. As a result, this was a very clever solution. The future will be Nanite, and that is where we are headed, but at this time slice, we aren’t there yet,” Michal Valient agrees.
“The process was interesting because [in a movie] you go to New York or San Francisco, and then you scout and you’re like, ‘oh, we can do a scene here… and then we can do a shot there, and then you go somewhere else.’ [With the open world] it’s pretty close to the way a film crew would go and start to film the city. You have tons of content, so you just find the right angle for that shot.”
The Matrix Awakens is essentially three demos in one: a stunning character rendering display, a high-octane set-piece, and a bold achievement in open-world modeling and rendering. The biggest surprise, though, is that all of these systems coexist within the same engine and are inextricably intertwined – but keep in mind that this is a demo, not a shipping game. There are issues, particularly with performance. Perhaps the most obvious one is a creative decision.
Cutscenes are rendered at 24 frames per second, which does not divide evenly into a 60Hz display refresh, and even at 120Hz there is still judder due to inconsistent frame-pacing. Frame-time spikes of up to 100ms can occur on camera cuts: the action is physically jumping around the open world, posing significant streaming challenges – a kind of micro-level ‘fast travel’, if you will. Meanwhile, as previously stated, fast traversal and car crashes cause frame rates to drop; combine them and you’re down to 20 frames per second. You must accept that we are still at the proof-of-concept stage at this point.
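The cadence arithmetic behind that judder is easy to demonstrate. Here is a minimal sketch (the `cadence` helper is hypothetical, not Epic’s code): each source frame must be held for a whole number of display refreshes, so 24fps in a 60Hz container forces an uneven 2:3 pattern, while 120Hz divides evenly.

```python
# Sketch: why 24fps content judders at 60Hz but not (in principle) at 120Hz.
# Each content frame is displayed for a whole number of refreshes, so we
# accumulate refreshes-per-frame and round down at each step.
from fractions import Fraction

def cadence(content_fps: int, refresh_hz: int, frames: int = 8) -> list[int]:
    """How many display refreshes each of the first `frames` content frames occupies."""
    per_frame = Fraction(refresh_hz, content_fps)  # e.g. 60/24 = 5/2 refreshes
    shown, acc, consumed = [], Fraction(0), 0
    for _ in range(frames):
        acc += per_frame
        shown.append(int(acc) - consumed)  # whole refreshes this frame gets
        consumed = int(acc)
    return shown

print(cadence(24, 60))   # alternating 2 and 3 refreshes: 33.3ms vs 50ms hold times = judder
print(cadence(24, 120))  # an even 5 refreshes per frame: no cadence judder
```

The uneven 2:3 pattern at 60Hz means consecutive frames sit on screen for different lengths of time, which the eye reads as judder; at 120Hz the cadence itself is even, so – as the article notes – any remaining judder there comes from frame-pacing inconsistency rather than the refresh-rate mismatch.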