
I was back in London on Tuesday working on more tests with Merry and his custom-built LED light array, trying to figure out the workflow and how we can interpret the data and feed it into a production pipeline. Already the captured GI colour data is fantastic, and each lighting direction serves as an invaluable reference source when sculpting and texturing, unmatched by any other scanning or capture method on the market. The trick is being able to convert that data and use it to deform and displace a low-frequency mesh.

Not easy but possible.

Just for fun

So I wanted to share some of the HDR shots we took on Tuesday. I used HDRShop to combine a bunch of bracketed exposures into one HDR image, split into probe HDRs and latitude-longitude HDRs. They aren't perfect, as we used a garden mirror ball (which has dents and smudges on it), but they still look cool, and it is possible to render with them. I recommend HDRShop for unwrapping the probes into longitude-latitude spherical map textures. Apply the map to an inverted sphere and hey presto!
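For anyone curious what that unwrap involves, here is a rough numpy sketch of converting a mirror-ball probe into a latitude-longitude map. The camera model (orthographic, looking down -Z at the ball), the +Y-up lat-long convention, the file names and the imageio loader are all my assumptions, and the nearest-neighbour lookup is far cruder than HDRShop's resampling:

```python
import numpy as np
import imageio.v3 as iio  # assumed I/O; any HDR-capable loader works

def probe_to_latlong(probe, width=2048, height=1024):
    # Build a world direction for every output pixel (+Y up, azimuth along X)
    j, i = np.meshgrid(np.arange(height), np.arange(width), indexing="ij")
    theta = np.pi * (j + 0.5) / height             # polar angle from +Y
    phi = 2.0 * np.pi * (i + 0.5) / width - np.pi  # azimuth in [-pi, pi)
    dx = np.sin(theta) * np.sin(phi)
    dy = np.cos(theta)
    dz = -np.sin(theta) * np.cos(phi)
    # The ball normal that reflects the view ray (0,0,-1) into (dx,dy,dz)
    # is normalize(d + (0,0,1)); its x,y are the ball's image coordinates.
    nz = dz + 1.0
    norm = np.sqrt(dx**2 + dy**2 + nz**2) + 1e-8   # 1e-8 dodges the one blind direction
    u, v = dx / norm, dy / norm                    # ball coords in [-1, 1]
    # Nearest-neighbour lookup into the square probe image
    ph, pw = probe.shape[:2]
    px = np.clip(((u + 1.0) * 0.5 * pw).astype(int), 0, pw - 1)
    py = np.clip(((1.0 - v) * 0.5 * ph).astype(int), 0, ph - 1)
    return probe[py, px]

probe = iio.imread("probe.hdr")                    # hypothetical file name
iio.imwrite("latlong.hdr", probe_to_latlong(probe).astype(np.float32))
```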

HDR Downloads (you can render with these as Probe Images)

CLICK IMAGES TO DOWNLOAD (approx. 20 MB each)

[Image grid: mirror-ball probe and latitude-longitude HDR downloads]
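If HDRShop gives you trouble (see the comments below), you don't strictly need it to render with these; looking up a lat-long HDR is just a spherical-to-pixel mapping. A minimal sketch, using the same direction convention as the unwrap above:

```python
import numpy as np

def sample_latlong(env, d):
    # env: lat-long HDR array (H, W, 3); d: unit direction, +Y up
    theta = np.arccos(np.clip(d[1], -1.0, 1.0))    # angle from +Y
    phi = np.arctan2(d[0], -d[2])                  # azimuth in [-pi, pi]
    H, W = env.shape[:2]
    j = min(int(theta / np.pi * H), H - 1)
    i = min(int((phi + np.pi) / (2.0 * np.pi) * W), W - 1)
    return env[j, i]                               # RGB radiance
```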

Here is an early demonstration of using the normal maps to deform and displace a low-resolution scan. C4D is very good at this. Due to limited customer support from my usual capture system, I turned to Breeze Software to capture and Scanner-Killer to process the 3D captures.

I'm not sure whether it is possible in Max, Maya or Softimage to use a world-space normal map to deform a mesh directly, not just at render time as normal maps are usually used. If anyone has any workflows for this I would love to hear about them; one rough idea is sketched below.
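In the meantime, one package-agnostic route (a sketch of an idea, not a tested pipeline) is to integrate the normal map into a scalar height field and feed that to any renderer's ordinary displacement. A world-space map would first need re-expressing in each texel's tangent frame; assuming that's done, the classic Frankot-Chellappa FFT integrator recovers the least-squares surface from the decoded normals:

```python
import numpy as np

def normals_to_height(normal_map):
    # Decode XYZ = RGB: channels in [0,1] map to components in [-1,1]
    n = normal_map * 2.0 - 1.0
    nx, ny = n[..., 0], n[..., 1]
    nz = np.maximum(n[..., 2], 1e-3)               # assumes front-facing normals
    p, q = -nx / nz, -ny / nz                      # surface slopes dz/dx, dz/dy
    H, W = p.shape
    wx, wy = np.meshgrid(np.fft.fftfreq(W) * 2.0 * np.pi,
                         np.fft.fftfreq(H) * 2.0 * np.pi)
    denom = wx**2 + wy**2
    denom[0, 0] = 1.0                              # dodge divide-by-zero at DC
    # Least-squares integrable surface from the gradient field, solved in FFT space
    Z = np.real(np.fft.ifft2(
        (-1j * wx * np.fft.fft2(p) - 1j * wy * np.fft.fft2(q)) / denom))
    return Z - Z.mean()                            # zero-mean displacement map
```

The resulting height field can then be applied as a standard displacement map, so the mesh is physically deformed rather than only shaded to look that way at render time.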

 

XYZ=RGB


8 Comments
  • lino

    Hello
    You can create a custom shader to input these normal maps. You can try that in mental ray. I know studios do that with Lightstage input information to plug into RenderMan. I like your work on these and would like to see possibilities close to Lightstage at an affordable cost. Keep it up.
    --Lino

    August 30, 2011
    • Infinite

      Lino, thanks for the info. It sounds quite encouraging. It might be slightly out of my remit though. Also I have heard that Lightstage capture can be rather costly. I don't really see why that should be the case, though; it can be done cheaply and can even be achieved using off-the-shelf lighting (studio flashes and a trigger box) and cameras.

      September 5, 2011
  • John Mather

    You can get a Maya shader that implements the hybrid normal technique created by Debevec et al. here: http://www.cmlab.csie.ntu.edu.tw/~liubiti/HybridNormal/index.html

    September 5, 2011
    • Infinite

      Many thanks John. Does this actually displace the mesh, or is it just used at render time? Ideally I am looking for a solution (other than Cinema 4D) to solve this. A physically displaced high-resolution mesh is more desirable than a shader at render time.

      September 5, 2011
  • James

    Hey Guys,

    Any chance of putting up the combined HDRI map you built for download? HDRShop has always caused me issues. Cheers and keep up the great work.

    September 13, 2011
    • Infinite

      Hi James, these are the only files I have uploaded. You should be able to use them as HDR probes; they aren't really production-useful, just there to show what it looks like inside the rig. You don't really need HDRShop to use them.

      September 13, 2011
  • Srinivas Mohan

    Check this out:
    http://www.jupiter-jazz.com/going-skin-deep

    March 22, 2012
    • Infinite

      Interesting, thanks. It would be great to see some example renders of Jupiter SGI Skin in action.

      March 22, 2012