How Quantic Dream creates the characters of its games

A few days ago, sportsgaming.win was invited to Paris to visit the offices of Quantic Dream, the French developer behind games like Heavy Rain and Detroit: Become Human, now at work on Star Wars: Eclipse. The visit was an opportunity to learn more about the development team and its 25 years of history, covered in another article on these pages that traces the past, present and future of the company. Here I want to focus specifically on the tools Quantic Dream uses to develop its games, since its offices house some of the most advanced technologies in Europe for creating and animating 3D characters.

Engine and development tools

Quantic Dream has two offices, one in Paris and one in Montreal, which share the same technologies and tools. Since its first game, Omikron: The Nomad Soul, the company has developed its own proprietary graphics engine, gradually expanding and updating it to keep it technically current. For games like Fahrenheit: Indigo Prophecy and Detroit: Become Human the engine was turned inside out and rebuilt from scratch, and in the run-up to Star Wars: Eclipse Quantic Dream's engineers are continuing to update it to support more modern technologies such as ray tracing.

More targeted development tools are also created according to the needs of the various teams. For example, Popcorn is the tool used by the cinematics team, with which they can change camera lenses, frame rates, framing and lighting effects on the fly. Another proprietary tool is called Feather: it is the interface that lets narrative designers manage the dialogues and the various branches of the story. With it you can test pieces of the game as if they were text adventures, quickly editing the lines and the consequences that choices and answers have on the characters.
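
As a rough idea of what testing a branch "as a text adventure" might look like, here is a minimal sketch. This is a generic illustration, not Quantic Dream's actual Feather tool: the dialogue nodes, the lines and the software_instability variable are all invented.

```python
# Minimal sketch of a branching-dialogue tester (not the real Feather
# tool): each node holds a line of dialogue, the choices offered, and
# the effect each choice has on a tracked character variable.

DIALOGUE = {
    "start": {
        "text": "Hank: 'Why did you let it go, Connor?'",
        "choices": {
            "I felt it was the right thing": ("empathic", "software_instability"),
            "It was a tactical decision":    ("machine",  None),
        },
    },
    "empathic": {"text": "Hank nods slowly.", "choices": {}},
    "machine":  {"text": "Hank turns away.",  "choices": {}},
}

def play(node_id, picks, state=None):
    """Walk the tree following scripted picks; track side effects."""
    state = state or {"software_instability": 0}
    log = [DIALOGUE[node_id]["text"]]
    for pick in picks:
        next_id, effect = DIALOGUE[node_id]["choices"][pick]
        if effect:
            state[effect] = state.get(effect, 0) + 1
        node_id = next_id
        log.append(DIALOGUE[node_id]["text"])
    return log, state

log, state = play("start", ["I felt it was the right thing"])
print(state)  # the empathic choice increments software_instability
```

Replaying the same node with different picks makes it easy to verify every branch and its side effects without launching the game, which is the point of a tool like this.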

Animated puppets

That time a 3D David Cage appeared in the bonus content of Fahrenheit: Indigo Prophecy

Something similar applies to the tools that Quantic Dream uses for motion capture, to bring the actors into its games: the technologies involved have evolved and changed over the years. In the days of Fahrenheit: Indigo Prophecy, the facial animations of the characters were controlled by a puppeteer who wore gloves fitted with sensors. Each finger of the glove was assigned one facial animation, so by moving the fingers it was possible to switch in real time from one expression to another while recording the cinematics. This clearly had limitations, first of all the number of fingers: with two gloves, only a limited number of expressions could be handled at a time.

The glove used to control facial expressions

The animations were also like on/off switches: it was not possible to have more or less exaggerated variations, or to move from one expression to another in a way that looked natural, although the result was still above the standards of the time. Another practical limitation was the need for an extra person dedicated solely to facial expressions, on top of the actor doing the voice work and all the performers and stunt people involved in recording the action scenes and movements.
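
The contrast between those on/off switches and the continuous blending used in modern facial animation can be sketched in a few lines. The vertex data and expression names below are invented for illustration and have nothing to do with Quantic Dream's actual assets.

```python
# Sketch of on/off expression switches (the old glove setup) versus
# continuous blend-shape weights. A face is reduced to one vertex and
# two made-up expressions for brevity.

NEUTRAL = [0.0, 0.0, 0.0]             # a single face vertex
OFFSETS = {                            # per-expression vertex deltas
    "smile": [0.0, 1.0, 0.0],
    "frown": [0.0, -1.0, 0.0],
}

def glove_switch(expression):
    """Old style: one finger = one expression, fully on or fully off."""
    return [n + d for n, d in zip(NEUTRAL, OFFSETS[expression])]

def blend(weights):
    """Modern style: weighted mix of several expressions at once."""
    out = list(NEUTRAL)
    for name, w in weights.items():
        out = [o + w * d for o, d in zip(out, OFFSETS[name])]
    return out

print(glove_switch("smile"))   # full smile: [0.0, 1.0, 0.0]
print(blend({"smile": 0.3}))   # gentle smile: [0.0, 0.3, 0.0]
```

With weights instead of switches, a face can hold a half-smile or cross-fade naturally between expressions, which is exactly what the glove could not do.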

The small markers used to trace the movements of the face, carefully applied one by one

Today a different method is used to capture expressions: many small markers are attached to the actors' faces, tiny metal spheres that reflect light back to the cameras and allow the facial muscles to be tracked.

The motion capture room

The space that Quantic Dream uses for motion capture has expanded over the years

Quantic Dream built its motion capture room in 2000, precisely for the development of Fahrenheit. Today this 11 x 6 meter space holds about eighty cameras capable of capturing the actors' performances at 360°. Over the years the equipment has changed, but above all the size of the area in which motion can be tracked has grown: in the days of Heavy Rain and Beyond: Two Souls, the space in which the actors moved was much smaller than the one available for Detroit: Become Human.
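
The way an array of cameras recovers a marker's 3D position can be sketched with classic two-ray triangulation: each calibrated camera sees the marker along a ray, and the point closest to all the rays is taken as the marker position. The camera poses below are invented for illustration; a real system combines dozens of views and a full calibration pipeline.

```python
# Minimal sketch of optical marker triangulation: two cameras each see
# a reflective marker along a ray, and we recover the 3D point as the
# midpoint of the shortest segment between the two rays.
import math

def sub(a, b): return [x - y for x, y in zip(a, b)]
def dot(a, b): return sum(x * y for x, y in zip(a, b))
def norm(v):
    n = math.sqrt(dot(v, v))
    return [x / n for x in v]

def triangulate(p1, d1, p2, d2):
    """Midpoint of the shortest segment between two camera rays."""
    w0 = sub(p1, p2)
    a, b, c = dot(d1, d1), dot(d1, d2), dot(d2, d2)
    d, e = dot(d1, w0), dot(d2, w0)
    denom = a * c - b * b                # ~0 if the rays are parallel
    t1 = (b * e - c * d) / denom
    t2 = (a * e - b * d) / denom
    q1 = [p + t1 * x for p, x in zip(p1, d1)]
    q2 = [p + t2 * x for p, x in zip(p2, d2)]
    return [(x + y) / 2 for x, y in zip(q1, q2)]

marker = [1.0, 2.0, 3.0]                  # ground-truth position
cam1, cam2 = [0.0, 0.0, 0.0], [10.0, 0.0, 0.0]
ray1 = norm(sub(marker, cam1))            # each camera sees the marker
ray2 = norm(sub(marker, cam2))
print(triangulate(cam1, ray1, cam2, ray2))  # ≈ [1.0, 2.0, 3.0]
```

With eighty cameras instead of two, the same idea becomes a least-squares problem over many rays, which is what makes the tracking robust when some markers are occluded.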

These boundaries are marked with colored adhesive tape: the colors guide the actors, who must not step outside certain areas, and are obviously also used to calibrate the cameras. At the start of each new project the old, worn tape is replaced, and here Quantic Dream has a strange ritual of its own: instead of throwing the tape away, the developers stick it onto a ball made from the tape of past projects. Today this sphere is huge; it looks like a beach volleyball but probably weighs more than a bowling ball, and even if it is not particularly nice to look at, it is an heirloom that somehow carries the whole history of the projects made in that room.

In Quantic Dream no one really knows who started this ritual, but all the adhesive tape used over the years during motion capture is glued to this huge ball

Quantic Dream's motion capture room is split in two. One half is the stage, where the cameras and actors are; the other is the area with desks and computers where members of the development team supervise the shoot to make sure everything is perfect. This second area also serves as a storage room, since all the props are kept there. Some of these objects can be used in different ways: we were shown a mug-shaped prop which, depending on how it is oriented, can turn into completely different objects. In Detroit it was used, for example, as a cup, a flashlight and a megaphone.

Depending on how it is oriented, this object can be a coffee maker, a megaphone, a flashlight and who knows what else

One corner of the room is dedicated to prop weapons: guns, swords, sticks and katanas. The weapons are of course fake and are covered in colored markers and tape so that the cameras can track them. Here too we were told a curious anecdote, concerning the weight of the weapons. Some guns have a realistic weight, as heavy as a real gun, while others are identical in shape but much lighter, like toy guns. For Detroit it was important to have both versions, because while the heavier gun makes the actor's performance and movements more believable, the androids in the game do not feel the weight and do not struggle to lift objects, so it was just as important that the actors' movements stayed fluid and light even after hours of filming.

The corner with the weapons used for motion capture, including guns, swords and rifles

A curiosity concerns the motion capture of animals, as in the scenes featuring horses or dogs: in these cases Quantic Dream prefers to move the necessary equipment to where the animals are, both to avoid mistreating them and, obviously, to avoid the risk of the room being wrecked. Ingrid Sanassee, head of character animation, told us that from time to time the animation team enjoys experimenting with the captured data in the most absurd ways: during the development of Detroit, out of curiosity, they applied to Connor's 3D model the motion capture data created for the game's pigeons, amusing themselves watching the protagonist flutter about in absurd poses. "This stuff hardly ever ends up in the games," she told us. "But it's always fun to see what happens, and every now and then you learn something new about your job."

This puppet is the basis of character rigging, a fundamental procedure for animating the 3D models of the characters

Looking around Quantic Dream's motion capture room gives you a strange feeling. On the one hand it is a completely empty space ringed by cameras, an impersonal place with nothing special about it. On the other, thinking back to Quantic Dream's games, you realize that over the years that same space has been transformed into countless different settings: among a thousand scenarios it has been a hotel, a prison, a shopping center, a station, a street in an American city.

Collaborations

Quantic Dream's motion capture room is one of the largest and most advanced in Europe, and for this reason the company also makes its equipment and spaces available to other companies in the entertainment industry. In this room Asobo Studio did the motion capture for A Plague Tale, Square Enix shot cinematics for Final Fantasy, and Bandai Namco created the first trailer for Elden Ring. Netflix also used these same spaces for some episodes of Love, Death & Robots.

Creating the 3D character

Director Benjamin Diebling explained to us that one of the current limitations of motion capture is the impossibility of seeing in real time how the scene being shot will look in the game: "When you shoot a scene for a movie you can immediately review the shot, with the actors inside the set. With video games this is not the case: after you have shot in motion capture, your work passes into the hands of the artists. An important step forward for the industry would be exactly that: seeing the final result immediately, between takes."

For the actors involved, acting in motion capture is much more like theater: it takes a lot of imagination

If it was not clear by now, what motion capture records are, as the name implies, movements. Whether body movements or facial expressions, this information alone does not produce the finished character: it feeds what is known in the jargon as "character rigging", the procedure that produces an articulated skeleton, a kind of puppet onto which the 3D model you want to animate is then applied.
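
The idea of a rig as an articulated puppet can be sketched with a tiny forward-kinematics example: joint angles on a skeleton hierarchy produce world positions, which would then drive the vertices of the skinned 3D model. The two-bone arm below is a made-up illustration, not Quantic Dream's actual rigging setup.

```python
# Minimal 2D sketch of a rig: a skeleton of joints, each with a local
# rotation relative to its parent, from which forward kinematics
# computes world positions for every joint.
import math

SKELETON = [                      # (name, parent index, bone length)
    ("shoulder", None, 0.0),
    ("elbow",    0,    1.0),
    ("wrist",    1,    1.0),
]

def pose(angles):
    """Forward kinematics: local joint angles -> world joint positions."""
    world = []
    for (name, parent, length), ang in zip(SKELETON, angles):
        if parent is None:
            world.append(((0.0, 0.0), ang))       # root at the origin
        else:
            (px, py), pang = world[parent]
            a = pang + ang                        # accumulate rotation
            world.append(((px + length * math.cos(a),
                           py + length * math.sin(a)), a))
    return [p for p, _ in world]

print(pose([0.0, 0.0, 0.0]))          # arm straight along the x axis
print(pose([0.0, math.pi / 2, 0.0]))  # elbow bent 90 degrees upward
```

Motion capture essentially supplies the stream of joint angles per frame; the rig turns them into a moving skeleton, and skinning makes the character's mesh follow it.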

Until now, the scanning of actors and models took place using portable 3D scanners like this one

The 3D model is created separately and, for several years now, instead of modeling the characters from scratch, developers like Quantic Dream have been 3D-scanning the actors' faces and bodies directly. For a long time the team used portable 3D scanners, which look like a kind of clothes iron with a system of lights and cameras. The cameras detect the surface of an object based on how the light deforms over it, and from that data derive 3D coordinates in space. Handheld scanners have worked very well so far, but since 2021 Quantic Dream has gone a step further by setting up a photogrammetry cage.

Quantic Dream's photogrammetry cage

This is a small space fitted with 149 cameras, many of them with long lenses to capture the smallest details. You step into the cage and stay motionless while a technician takes the photos from his workstation; the images are then fed to software that generates a point cloud, which in turn is the starting point of the 3D model. Conceptually the operation is similar to that of the scanners used until now, but besides being more precise, the new system has a practical advantage: since the photos are taken simultaneously from every angle, no one has to keep moving a scanner around the actor.

For body scanning, the 149 cameras are placed all around the photogrammetry cage

There are, however, a whole series of details to take into account before proceeding. First of all, the subject needs to be at a precise height (unless you want to recalibrate the cameras every time), so wooden platforms of different sizes are kept in a corner of the room to raise the actors when necessary. It is also essential that the room is dark and that no glasses, necklaces or other objects that reflect light are worn, to avoid artifacts in the scan. The raw, untouched result is a model that Quantic Dream says runs between 40 and 50 million polygons, capturing every tiny detail. Before it can go into a game, this model obviously has to be cleaned up, producing a much lighter 3D mesh with a far lower polygon count. To give you an idea, the Kratos model in the latest God of War had 80,000 polygons for the body, 30,000 of which for the face, while a car in Gran Turismo 7 reaches 500,000 polygons.
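
The jump from a 40-50 million polygon scan to a game-ready mesh is done with professional retopology and cleanup tools, but the basic idea of reducing polygon count can be sketched with vertex clustering, one of the simplest decimation schemes: vertices are snapped to a coarse grid, merged, and the triangles that collapse are discarded. The toy mesh below is invented for illustration.

```python
# Toy sketch of vertex-clustering decimation: snap vertices to a grid,
# merge the ones that share a cell, and drop degenerate triangles.

def decimate(vertices, triangles, cell=1.0):
    """Merge vertices that fall into the same grid cell."""
    cell_of = lambda v: tuple(round(c / cell) for c in v)
    remap, merged = {}, []
    for v in vertices:
        key = cell_of(v)
        if key not in remap:                  # first vertex in this cell
            remap[key] = len(merged)          # becomes the representative
            merged.append([k * cell for k in key])
    new_tris = []
    for a, b, c in triangles:
        ia, ib, ic = (remap[cell_of(vertices[i])] for i in (a, b, c))
        if len({ia, ib, ic}) == 3:            # drop collapsed triangles
            new_tris.append((ia, ib, ic))
    return merged, new_tris

# Two nearly coincident vertices get merged, collapsing one triangle.
verts = [[0, 0, 0], [2, 0, 0], [0, 2, 0], [0.1, 0.1, 0]]
tris = [(0, 1, 2), (0, 3, 1)]
v2, t2 = decimate(verts, tris)
print(len(verts), "->", len(v2), "vertices;",
      len(tris), "->", len(t2), "triangles")
```

Production pipelines use far smarter schemes that preserve silhouettes and UVs, but the principle is the same: trade millions of scanned polygons for a budget a game engine can actually render.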

For a practical demonstration, Quantic Dream allowed me to try the photogrammetry cage, sending me the result of the scan afterwards.

Did you get scared? Good. Nightmares aside, the result is interesting because it highlights a couple of things. Overall, the scan shows an impressive level of detail in the shape of the face, the imperfections of the skin, and the texture and folds of a shirt that, come to think of it, could have used an iron. Where everything falls apart is the hair and beard, elements that in many cases are not even scanned in photogrammetry sessions. Actors are often asked to shave and to wear a cap or headband to pull their hair up, because what matters is getting a precise scan of the face and its features; hair, mustache and beard are usually added to the character model later by the team's artists.

A detail of Connor in Detroit: Become Human

There are other obvious reasons why this result is far from the quality of the characters of Detroit or of more recent productions such as Death Stranding and Horizon Forbidden West. As mentioned, the actors' 3D scans go through a cleanup phase in which the artists rework the 3D mesh and adjust textures and details. This scan, on the other hand, was made as a simple demonstration with a single shot, while photogrammetry sessions with the actors can last much longer.

These technologies will clearly be used for Quantic Dream's next projects, starting with Star Wars: Eclipse. Given the constant evolution of the team's tools, it will be interesting to see whether between Detroit and Eclipse there will be the same enormous technical leap we saw going from Fahrenheit to Heavy Rain, or from Heavy Rain to Beyond.
