The Lost Concept Series explores production designs by Johnathan Banta that never saw the light of day — until now.
There is always so much work that goes into an entertainment property that no one ever sees. Were it not for the books that chronicle the preproduction of Star Wars, the world would never have seen many of the concepts that led to the designs we love (designs which now grace the prequel universe of Star Wars Rebels). Such is the case with some work I did for the Bad Robot television series Almost Human.
MastersFX was contracted to do the practical makeup for the series, and I had the opportunity to add to the early design process by suggesting some digital makeup. I had been working on changing skin reflectance to make human skin look more like the silicone used for practical makeup. The makeup field has spent years refining this material for its desirable properties, such as its subsurface scattering and its flexibility. It is not a direct equivalent to human skin, but it is close, and in many ways it has an engaging character all its own. Stan Winston was one of the early pioneers with silicone, and had even used gelatin appliances on the movie Heartbeeps to create an android look with human-like skin for Andy Kaufman.
Until recently this kind of additive makeup was the norm for depicting robots, which usually meant an actor in pancake makeup, contact lenses, or large foam appliances (see the creation of Kryten on Red Dwarf for a side trip). Despite the artistry, they always appear to be exactly what they are: appliances glued to a human. These methods limit the actor if too heavily applied, so my concept was to keep the design as light and flexible as possible, and finish the makeup with digital tools.
For the Almost Human design, on-set prosthetics were to be the starting point, giving the actors something to react to and the camera something to film that looked like a manufactured skin. I wanted to put my research to work and push the entire skin to look like a form of advanced silicone, increasing the subsurface-scatter look through compositing. Furthermore, since this was the design phase, I proposed a barcode retina for each replicant, and hard points on the outer jaw and bridge of the nose. This was technology, after all: instead of building a machine that does everything, you would build something that other technology could be attached to in order to improve its abilities. Add to that a retiming of the side-to-side eye movement and blink response, and the suggestion of its automaton nature is reinforced. The prosthetics would provide the real-world reference that the digital treatment would extend.
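The post doesn't describe the actual comp setup, but diffusing the red channel (where scattered light dominates in skin) is a common 2D trick for approximating a subsurface-scatter boost. A minimal NumPy sketch of the idea; the function names, blur, and blend parameters here are my own assumptions, not the production pipeline:

```python
import numpy as np

def box_blur(ch, radius):
    """Naive box blur; a cheap stand-in for the Gaussian a compositor would use."""
    pad = np.pad(ch, radius, mode="edge")
    out = np.zeros(ch.shape, dtype=np.float64)
    k = 2 * radius + 1
    h, w = ch.shape
    for dy in range(k):
        for dx in range(k):
            out += pad[dy:dy + h, dx:dx + w]
    return out / (k * k)

def silicone_look(rgb, radius=4, mix=0.6):
    """Fake a subsurface-scatter boost: diffuse the red channel and
    blend the softened result back over the original image."""
    out = rgb.astype(np.float64)
    out[..., 0] = (1 - mix) * out[..., 0] + mix * box_blur(out[..., 0], radius)
    return np.clip(out, 0, 255).astype(np.uint8)
```

In a real comp this would run on linear-light footage with a soft, wide falloff rather than a box kernel, but the principle is the same: red light appears to bleed under the surface, which reads as silicone rather than skin.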
Similar digital makeup work was done several years ago on the film Surrogates, but that work focused on removing all blemishes from the actors, giving each individual a smoother appearance. The approach is obviously similar, but I wanted to turn the actor's skin into a different material altogether (fairly successful in these early designs, though it needed more research), while still differentiating the look from that film. I also determined that we could enhance detail in some form of bio-scanner view to tell android from human: humans would stand out as every mole and pimple leapt off the screen (the actors would have hated it). In this world, from my design point of view, being too human was the crime, which is why I wanted to imply a more technical visual with the contact plates and code numbers. When designing you often work from scant descriptions and let your imagination go, hoping the work will inspire the other creative teams. Some of the concepts I introduced appeared in the final series, but since I had no access to the full script, some of them were likely already being explored (the creatives in this case are very clever).
My final contribution to the digital makeup design of the androids was what we dubbed the “inner-face,” a visual display through which the user could access data, perform simple maintenance, and control certain functions of the machine. We described it as an iPad interface, which could have been hilarious to see the actors perform (or at least a good way to spread germs).
In the end, a more traditional approach was used in the series: hire a great actor, and put a good flat base on his skin. Applying this digital makeup treatment to a lead actor in every shot would likely have proved expensive, as there are easily more shots in a television series than in one two-hour film. However, a version of the inner-face continued into the series, executed by another facility, showing lights under the skin when the android used its analytical processes. It was good to see the design survive in some form, even if it possibly came from another designer; at least it was a good idea (see my never-written rants on the vagaries of television VFX and production in a link that I will never provide below).
An internet reaction that shows off the inner-face as it eventually appeared:
“Dorian can upload human blood…wait, did his face just glow?”
The Show Does Not Always Go On
Almost Human remains one of my favorite series of the last few years. Its theme of technology unbounded reminded me of the first season of FRINGE, which unfortunately stepped away from that theme toward more fantastical, multi-dimensional explanations of what happened onscreen. The performances by the Almost Human cast, the subject matter of its discussions, and the racial/technological analogy were marvelous narrative devices. Unfortunately the show did not survive the ratings knife, and went into the dustbin of television with other great series like Farscape.
So I present these designs for your enjoyment, with thanks to MastersFX for allowing me to post them, and to the makeup artists who put together the initial video sources that I was able to process.
Digital makeup, or dMFX, is a growing field. It is time makeup artists used the tools on purpose.