I've made some great progress since the previous post regarding real-time hair. I've been working closely with a shader developer, and under some strict artistic guidance we've developed our own hair shading solution and import pipeline for Unity.
We've taken techniques from research papers from the mid-2000s and injected some art direction to add more control over the look of the hair: things like glitter, ambient occlusion, image-based lighting for the diffuse and specular terms, two-lobe colour control, back-scattering and absorption.
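To give a feel for one of those terms: back-scattering is essentially light wrapping through the strands from behind, tinted by an absorption-style colour. Here's a minimal Cg-style sketch of the idea, with hypothetical property names; this is an illustration of the general technique, not our actual shader code:

    // Back-scattering ("transmission") sketch. V = view direction,
    // L = light direction, both normalized. Property names are hypothetical.
    half4 _BackScatterColor;   // absorption-style tint
    half  _BackScatterSize;    // exponent controlling the width of the halo

    half3 BackScatter(half3 V, half3 L, half3 lightColor)
    {
        // Approaches 1 when the light sits directly behind the strands
        // relative to the eye.
        half towards = saturate(dot(V, -L));

        // Sharpen or widen the transmission halo.
        half falloff = pow(towards, _BackScatterSize);

        // Tint by the absorption-style colour so back-lit hair glows warm.
        return _BackScatterColor.rgb * lightColor * falloff;
    }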
We do, however, face the limitation of Unity's shadow system, which is atrocious, but Unity makes up for this in other ways. Fingers crossed we'll see improved shadows with Unity 5 and Enlighten, or at least be given access to the shadows so we can process them!!!
It’s great to be able to revisit digital hair. I spent a while back in 2007 trying to solve this problem with polygon hair and LightWave, to no avail. Then came HairFarm, which solved creating hair, but we were restricted to composited scanline rendering. Now we can do it in real-time and in VR!!!!
Incredible times we live in. VR is here.
I used to enjoy tinkering with off-line rendering in Modo, LightWave, Softimage (Arnold) and 3ds Max (V-Ray). It’s fairly trivial these days to get good, realistic human renders thanks to sophisticated scanning technology and rendering applications like Arnold. But nothing beats seeing your work up close in 3D, in VR, with depth perception. It’s certainly better than waiting minutes, hours or days per frame, and much better than progressive rendering, even if real-time technology has its limits.
There are some great artists doing fantastic character rendering out there, like Dan Platt, Dan Roarty and Brett Sinclair. I really hope to see some of their work in VR some day!! It will be amazing to experience. I’m certain that soon many artists and developers will be building content not only for VR but inside VR, with their bare hands.
Currently, I find it much more satisfying to view 3D models in real-time in virtual reality (using the Oculus Rift DK2), but the content creation path is MUCH harder: there are strict limitations to overcome because the graphical technology doesn't quite meet the standard of off-line rendering, so you have to find hacks.
However, the gap between real-time and off-line is closing, very rapidly. The trick with real-time AND off-line hair has always come down to two things: 1) the hair model input, and 2) the shader and rendering output.
The shader is integral to realistic results, mainly due to the two-lobe control and the anisotropic term used.
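For context, the anisotropic term in most of the mid-2000s literature is the Kajiya-Kay strand highlight, with Scheuermann-style tangent shifting used to separate the two lobes. Here's a rough Cg sketch of that general technique, with hypothetical property names rather than our actual implementation:

    // Kajiya-Kay strand specular: the highlight follows the hair tangent T
    // rather than a surface normal, which is what makes it anisotropic.
    half StrandSpecular(half3 T, half3 V, half3 L, half exponent)
    {
        half3 H = normalize(L + V);
        half dotTH = dot(T, H);
        half sinTH = sqrt(saturate(1.0 - dotTH * dotTH));
        return pow(sinTH, exponent);
    }

    // Shift the tangent along the normal so each lobe's highlight sits at a
    // different height along the strand (Scheuermann-style).
    half3 ShiftTangent(half3 T, half3 N, half shift)
    {
        return normalize(T + shift * N);
    }

    half4 _Lobe01Color, _Lobe02Color;
    half  _Lobe01Shift, _Lobe02Shift;
    // For brevity, "roughness" here is mapped straight to a Blinn-style exponent.
    half  _Lobe01Exponent, _Lobe02Exponent;

    // Two-lobe combination: a soft primary lobe plus a sharper, shifted
    // secondary lobe, each with its own colour, shift and roughness.
    half3 TwoLobeSpecular(half3 T, half3 N, half3 V, half3 L)
    {
        half3 t1 = ShiftTangent(T, N, _Lobe01Shift);
        half3 t2 = ShiftTangent(T, N, _Lobe02Shift);
        return _Lobe01Color.rgb * StrandSpecular(t1, V, L, _Lobe01Exponent)
             + _Lobe02Color.rgb * StrandSpecular(t2, V, L, _Lobe02Exponent);
    }

Shifting each lobe's tangent independently is what lets the primary and secondary highlights sit at different positions along the strand, which is a big part of what makes hair read as hair.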
Real-time hair has been the privileged domain of a few developers who have built their own solutions. A good example would be the most recent Tomb Raider game, which featured a fantastic TressFX integration and looked incredible.
Here's some older-generation hair, advanced even by today's standards, but notice how old these demonstrations are!!
Another, more modern example would be the “Luminous Studio – Hair technology demo”, a great reference and inspiration.
Our solution doesn’t feature dynamic simulation (yet), but we are working with a few other developers who have already ported their own TressFX version with simulation. Just over the weekend, Kennux successfully managed to read our hair models' splines in through the .ase format and convert them to TressFX strands with simulation. We just need to merge in the new shader solution.
I recommend heading over to Kennux’s blog; he’s doing some fantastic TressFX research.
Here are some example screenshots from Unity’s viewport. These render at about 100-160fps in real-time in mono, and at about 75fps in stereo with VSync on.
You really have to experience the hair in stereo, in VR, to believe it. The shimmer and the glints of each strand are really mesmerizing and add so much character to human scan data. Extra life.
500,000 polygons, 60MB: blonde and brown hair
300,000 polygons, 40MB: brown, red and black hair
250,000 polygons, 30MB: brown and red hair
400,000 polygons, 40MB: blonde and brown hair
300,000 polygons, 40MB: real-time shadows, with IBL diffuse and specular only
250,000 polygons, 24MB: black, white and blonde hair, also featuring blue hair with full Marmoset Skyshop integration!
Here are some examples of the shader controls we’ve implemented:
Base Colour; Base RGB with alpha support (order-independent transparency); Alpha Cutoff; Diffuse Aniso; Specular Fresnel; Anisotropic Direction; Lobe 01 Spec Colour, Shift and Roughness; Lobe 02 Spec Colour, Shift and Roughness; Lobe 03 Spec Colour and Roughness; Glitter Scale; Glitter 01 Colour, Contrast and Amount; Glitter 02 Colour, Contrast and Amount; IBL Specular Shift, Roughness, Colour, Aniso and Amount; IBL Diffuse; a jitter slot for GGX noise (glitter map); Back Scattering Colour and Size; Occlusion (baked vertex AO) Colour, Specular and Amount.
That’s a ton of control, all of which can be accessed in real-time while running an executable.
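To give a flavour of how controls like these end up organized in Unity, here's a trimmed, hypothetical excerpt of what a ShaderLab Properties block for them might look like (illustrative names and defaults, not our actual shader):

    Properties
    {
        _BaseColor         ("Base Colour", Color)               = (1, 1, 1, 1)
        _MainTex           ("Base RGB (A = strand alpha)", 2D)  = "white" {}
        _Cutoff            ("Alpha Cutoff", Range(0, 1))        = 0.5
        _AnisoDir          ("Anisotropic Direction", Vector)    = (0, 1, 0, 0)

        _Lobe01Color       ("Lobe 01 Spec Colour", Color)       = (1, 1, 1, 1)
        _Lobe01Shift       ("Lobe 01 Shift", Range(-1, 1))      = -0.1
        _Lobe01Roughness   ("Lobe 01 Roughness", Range(1, 512)) = 64

        _Lobe02Color       ("Lobe 02 Spec Colour", Color)       = (1, 1, 1, 1)
        _Lobe02Shift       ("Lobe 02 Shift", Range(-1, 1))      = 0.1
        _Lobe02Roughness   ("Lobe 02 Roughness", Range(1, 512)) = 128

        _GlitterScale      ("Glitter Scale", Float)             = 8.0
        _Glitter01Color    ("Glitter 01 Colour", Color)         = (1, 1, 1, 1)
        _Glitter01Contrast ("Glitter 01 Contrast", Range(0, 8)) = 2.0
        _Glitter01Amount   ("Glitter 01 Amount", Range(0, 1))   = 0.5

        _JitterTex         ("Jitter (GGX noise / glitter map)", 2D) = "gray" {}
        _BackScatterColor  ("Back Scattering Colour", Color)    = (1, 0.5, 0.3, 1)
        _BackScatterSize   ("Back Scattering Size", Range(0, 32)) = 8.0
        // The lobe 03, glitter 02, IBL and occlusion controls follow the same pattern.
    }

Because everything is exposed as a material property, it can all be tweaked live in a running build via Unity's standard Material.SetFloat / SetColor calls.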
Below are some example screen grabs of what the controls can do; a video will be posted soon.
Alpha cutoff control:
Anisotropic direction:
Lobe01 colour control:
Lobe02 colour control:
Lobe01 specular roughness:
Lobe01 specular shift:
Lobe02 specular shift:
Lobe02 specular roughness:
Lobe03 colour control:
Lobe03 specular roughness:
Glitter01 colour:
Glitter01 contrast:
Glitter01 amount:
Glitter02 contrast:
Glitter02 amount:
Occlusion (Vertex AO) specular:
Occlusion (Vertex AO) amount:
Occlusion (Vertex AO) overall impact:
Hair in Tuscany with OVR 0.4.0 support:
The hair styles were created by the very talented Dani Garcia (aka Woody).
For the conversion we have to go through a very strict process: convert the HairFarm style into polygon strips, then export as .FBX with the ambient occlusion baked using xNormal. Once we have that data, we can use the AO to fake more depth in the hair once it's fully lit in Unity, as well as use it for strand variation and extra specular amount variation.
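Once the bake is stored in the mesh's vertex colours, using it at runtime is cheap. Here's a hypothetical Cg-style sketch of the kind of thing AO controls like the ones above can do (illustrative names, not our actual shader):

    // Baked per-vertex AO arrives in the mesh's vertex colour channel.
    half4 _OcclusionColor;   // tint for occluded regions
    half  _OcclusionAmount;  // overall AO strength on diffuse
    half  _OcclusionSpec;    // how strongly AO darkens specular

    // ao = baked vertex AO; 0 = fully occluded, 1 = fully open.
    half3 ApplyHairAO(half3 diffuse, half3 specular, half ao)
    {
        // Tint towards the occlusion colour instead of plain black so the
        // inner hair mass keeps some warmth and reads as depth, not dirt.
        half3 occ = lerp(_OcclusionColor.rgb, half3(1, 1, 1), ao);
        diffuse *= lerp(half3(1, 1, 1), occ, _OcclusionAmount);

        // Use AO to vary specular too: buried strands shouldn't glint
        // as strongly as the outer shell.
        specular *= lerp(1.0, ao, _OcclusionSpec);
        return diffuse + specular;
    }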
Example screen shots of the baked AO:
Example of the transparency mapping to the polygon strip UVs:
Early peach fuzz setup (all of the above can be used to further detail the characters while keeping a good frame rate):
We are still trying to improve the shader further and optimize to boost frame rates. When combined with IR’s skin shader and further details, the frame rate starts to creep below 60fps, so we are finding ways to optimize, mainly because we are rendering about 1,000,000 polygons in hair alone, including the skin fuzz.
We hope to integrate multi-light control (at the moment we only support one light plus IBL from Skyshop), simulation dynamics with full TressFX or HairWorks integration, SSAO or HBAO for real-time skin and hair ambient occlusion, and full PBR integration.
The development community is still a long way from bringing truly realistic off-line humans into real-time, but I believe the above content creation process is the way forward for hair. Whether you use ZBrush FiberMesh, Hair&Fur or HairFarm, all you need is access to the spline data and a well-thought-out hair shader. Once you have this, you can do anything!
The next post will be about IR’s real-time custom skin shader, called “Flynn”.
INFINITE
13 Comments
Joe Saltzman
Now THIS is something to get excited about. Well done, this is pretty darn awesome... Kudos!!!
August 11, 2014

Infinite
Thanks Joe!!
August 11, 2014

yolao
Hey Lee
Fantastic work, it looks really amazing!!
I was wondering if Unity could be used as a renderer for character animation? Does it import high-resolution meshes with animation through Alembic, MDD, or any other format?
Thanks
August 17, 2014

Infinite
Hey Yolao, thanks, yes I should think that's possible. Not sure about Alembic though.
August 17, 2014

yolao
Hey Lee
Great, thanks for the info.
Cheers
August 17, 2014

ScanLab
Great work Lee, as always.
With your Trinity example, you have that hair strand sample to the left of the image. Would it be possible to source a few different types of strands based on the idea and cycle through them randomly? Even splitting a sample in two, so the hair looks thinner.
It's very inspiring to see all this new real-time stuff.
September 12, 2014

Infinite
Thanks Ruslan most appreciated. Not sure what you mean about "that hair strand sample to the left of the image"?
September 12, 2014

ScanLab
Oh, this one:
https://ir-ltd.net/wp-content/uploads/2014/08/Hair-Cards.jpg
I took another look, and realized that what I said was beside the point.
It looks like the polycount is still a bottleneck. I wish there were another way to do hair. Especially if the ultimate goal is to have hyper-realistic surfaces, with progressive peach fuzz that stretches from 0% at the nose to 100% at the hairline. :)
Next on the list: FuzzFarm - Hair follicle scanner for humans!
September 15, 2014

Infinite
Understood. Well, we have peach fuzz and body fuzz hair in, and it all works really well in mono: 200fps, no problem. The issue is going stereo at 2K at 75Hz. GPUs aren't quite there yet.
September 15, 2014

Fernando Angelo
Wow, those are really impressive results, thanks for sharing. Will the Trinity shader be available in the asset store soon? Are there any plans for a beta? Thanks!
October 16, 2014

Infinite
Thanks. Not any time soon but maybe in the future.
November 16, 2014

FancyKiddo
Have you guys taken a look at the strand capture system demonstrated in this video?
https://www.youtube.com/watch?v=QCgWMIYGbV8
It seems like it would be fantastic with your scanning methods, and you have an old library with which to test and debug. I'd love to see something like this come to commercial products, or maybe even to see more people just explaining the technique!
December 28, 2015

Infinite
Yes it certainly has potential.
March 29, 2016