I was back in London on Tuesday working on more tests with Merry and his custom-built LED Light Array, trying to figure out the workflow and how we can interpret the data and fold it into a production pipeline. Already the captured GI colour data is fantastic, and each lighting direction serves as an invaluable reference source when sculpting and texturing, unmatched by any other scanning or capture method on the market. The trick is being able to convert that data and use it to deform and displace a low-frequency mesh.
Not easy but possible.
Just for fun
So I wanted to share some of the HDR shots we took on Tuesday. I used HDRShop to combine a bunch of bracketed exposures into single HDR images, split into probe HDRs and latitude-longitude HDRs. They aren’t perfect, as we used a garden mirror ball (which has dents and smudges on it), but they still look cool, and it is possible to render with them. I recommend HDRShop for unwrapping the probes into latitude-longitude spherical map textures. Apply the map to an inverted sphere and hey presto!
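For anyone curious what the probe-to-lat-long unwrap actually does, here is a rough numpy sketch of the same resampling HDRShop performs. It assumes the camera looked straight at the ball down -Z and that the ball fills the square probe image; the function and parameter names are mine, not HDRShop’s.

```python
import numpy as np

def unwrap_mirror_ball(probe, width=512, height=256):
    """Resample a square mirror-ball probe image into a
    latitude-longitude (equirectangular) environment map.
    Nearest-neighbour sampling, for clarity over quality."""
    # Build a world direction for every output pixel.
    lon = (np.arange(width) + 0.5) / width * 2.0 * np.pi - np.pi
    lat = np.pi / 2.0 - (np.arange(height) + 0.5) / height * np.pi
    lon, lat = np.meshgrid(lon, lat)
    dx = np.cos(lat) * np.sin(lon)
    dy = np.sin(lat)
    dz = -np.cos(lat) * np.cos(lon)   # lon = 0 faces away from camera

    # Mirror-ball mapping: the sphere normal at the reflection point
    # is halfway between the reflected direction and the view
    # direction (0, 0, 1), so the probe pixel is just that normal's XY.
    nx, ny, nz = dx, dy, dz + 1.0
    norm = np.sqrt(nx * nx + ny * ny + nz * nz)
    norm = np.where(norm == 0, 1.0, norm)  # guard the degenerate back pole
    u = 0.5 + 0.5 * nx / norm
    v = 0.5 - 0.5 * ny / norm

    # Look up the nearest probe pixel for each output pixel.
    h, w = probe.shape[:2]
    px = np.clip((u * (w - 1)).astype(int), 0, w - 1)
    py = np.clip((v * (h - 1)).astype(int), 0, h - 1)
    return probe[py, px]
```

The dents and smudges on a garden mirror ball show up as local distortions in that normal lookup, which is why a proper chrome ball (or two angled probe shots blended together) gives cleaner results.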
HDR Downloads (you can render with these as Probe Images)
CLICK IMAGES TO DOWNLOAD (20 MB each)
Here is an early demonstration of using the normal maps to deform and displace a low-resolution scan. C4D is very good at this. Due to limited customer support from my usual capture system, I turned to Breeze Software to capture and Scanner-Killer to process the 3D captures.
I’m not sure whether it is possible in Max, Maya or Softimage to use a world-space normal map to deform a mesh directly, rather than only at render time, which is how normal maps are usually used. If anyone has a workflow for this, I would love to hear about it.
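One package-agnostic way to get there is to integrate the normal map back into a height field and feed that to an ordinary displacement modifier, rather than relying on the app to deform from normals directly. Below is a rough numpy sketch of that idea using an FFT-based Poisson solve. It assumes a tangent-space map with Z up and periodic boundaries, and the function name is mine; this is an illustration of the general technique, not the workflow from the tests above.

```python
import numpy as np

def normals_to_height(normal_map):
    """Recover a relative height (displacement) field from an
    HxWx3 normal map with components in [-1, 1], by integrating
    the surface gradients the normals imply."""
    nx, ny, nz = normal_map[..., 0], normal_map[..., 1], normal_map[..., 2]
    nz = np.where(np.abs(nz) < 1e-4, 1e-4, nz)  # avoid divide-by-zero
    gx = -nx / nz          # dh/dx implied by the normals
    gy = -ny / nz          # dh/dy

    # Solve the Poisson equation laplacian(h) = div(g) in frequency space.
    h, w = gx.shape
    fx = np.fft.fftfreq(w) * 2.0 * np.pi
    fy = np.fft.fftfreq(h) * 2.0 * np.pi
    fx, fy = np.meshgrid(fx, fy)
    denom = fx ** 2 + fy ** 2
    denom[0, 0] = 1.0      # dodge the zero-frequency division
    F = (-1j * fx * np.fft.fft2(gx) - 1j * fy * np.fft.fft2(gy)) / denom
    F[0, 0] = 0.0          # zero-mean height: the absolute offset is lost
    return np.real(np.fft.ifft2(F))
```

The recovered height is only defined up to an overall offset and scale, so in practice you would remap it and dial the displacement strength by eye against the reference photography.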