•March 1, 2017 • 4 Comments

I was trying out a bit of Substance Painter this week, in combination with 3DSMax and Marmoset Toolbag 3. A few months back I had shot a muddy embankment near a construction area with my phone (about 16 photos), but really didn’t think anything decent could be recovered through photogrammetry. The results show, however, that it’s always worth a go even if you don’t have the proper equipment with you. Some cloning work in Painter and then an export into Toolbag 3 for rendering yielded a decent mud texture; the next step is to make it tileable.
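For the tiling step, a common starting point is to offset the texture by half its width and height with wraparound (like Photoshop’s Offset filter), so the seams land in the middle of the image where they can be cloned out. A minimal sketch of that wrap offset in plain Python — nested lists stand in for a real bitmap, and this isn’t my actual tooling, just the idea:

```python
def wrap_offset(image, dx, dy):
    """Offset a 2D image (list of rows) by (dx, dy) with wraparound,
    so the original edges meet in the interior where seams become visible."""
    height = len(image)
    width = len(image[0])
    return [
        [image[(y - dy) % height][(x - dx) % width] for x in range(width)]
        for y in range(height)
    ]

# A tiny 4x4 "texture": each pixel labelled with its original coordinates.
tex = [[(x, y) for x in range(4)] for y in range(4)]

# Offset by half the width and height: the corners move to the centre.
shifted = wrap_offset(tex, 2, 2)
print(shifted[2][2])  # the original top-left pixel: (0, 0)
```

In practice you’d run the same offset on the exported albedo, normal and roughness maps together, then paint out the exposed seams in Painter before offsetting back.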





A Stylised Tree…

•January 18, 2017 • 1 Comment

Hello 2017! Here are some WIP renders of a little look development scene I’ve been working on, again playing with the idea of stylised photogrammetry. I wanted a general-use shader that could be applied to arbitrary objects and shade nicely, with soft internal tones and hard external outlines. The tree model itself was generated from photos taken with my phone, decimated and then brought into 3DSMax. I wanted to apply an effect to mimic 2D illustration, and was inspired by backgrounds from Disney’s Winnie The Pooh (2011). I’ve long been a fan of V-Ray’s SSS material system; it softens shading beautifully when used right, but it can be a complex beast to tame. To mitigate this, I scaled the scene geometry to be very small, which gave some extreme softness from the SSS shader right out of the box when lit by the V-Ray Sun and Sky. I used the VRayToon environment effect to create rough outlines on the geometry, plus a hand-drawn line/crosshatch texture map which was wrapped around the tree trunk cylindrically in a new UV channel. It’s not where I want it to be yet, but I’ll keep at it; below are some work-in-progress images.


Stylised photogrammetry…

•November 5, 2016 • Leave a Comment

I recently attended a workshop where the subject of stylised photogrammetry arose: in other words, 3D scanning things for purposes other than photorealism. Would it even be practical to scan objects for a stylised project when realism wasn’t the end goal? I’ve done some experiments on this before (I blogged about it here). I’ve recently blogged about scanning a human head, and I thought I’d try a non-photorealistic application for it by voxelising the scan and rendering it with SSS materials in V-Ray. I then took it into Prisma on my phone and ran some filters on it, see below!

Original scan work; the head scarf was scanned and provided by 3DScanLA


And the stylised version!


After Prisma:


A header III…

•October 30, 2016 • Leave a Comment

Well here’s a frightening visage for Hallowe’en straight from the depths of the lab! Further research on my head scan below: I started to patch up missing areas of the mesh and did some work on the albedo, roughness and SSS maps. It’s coming together slowly; there’s more to do on finalising the low poly topology and the other textures, cleaning up the hairlines, etc. Rendered here in real time with GI in Marmoset Toolbag 3, which is being released in November. Textures are currently at 2k and are holding up OK close up.

Temporary low poly geometry wireframe

Normal map details


Albedo, SSS, roughness added


Portrait view


A header II…

•October 18, 2016 • Leave a Comment

Some additional cleanup work in Zbrush here, where I smoothed out some of the noise, added more definition to the skin creases and created a displacement map from the original scanned diffuse texture for micro detail such as pores and stubble. It’s getting closer to something usable, but there’s still more work to do on the model and on the diffuse and displacement maps.



A header…

•October 4, 2016 • Leave a Comment

I scan rocks. Lots of rocks. Rocks are good. Rocks don’t move, aren’t squishy, aren’t affected by wind and stay rooted in place if you have to leave the scene and return; rocks just make great scan subjects. Humans are different for obvious reasons: a stifled smile, a muscle twitch, a weight shift or a rumple in clothing caused by a draft are all enough to throw off a 3D scan when using photogrammetry. It’d be great if the same cheap 3D scanning technique that works so well for large static objects worked for humans, but unfortunately it doesn’t; the best human scans come from expensive multi-camera rigs that folks like James Busby use to get scans like these. Maybe a detailed mannequin or waxwork would do the job, but those are hard to access. People are everywhere, but it really takes a patient subject to be a good model for single-camera photogrammetry. They’re gonna have to sit or stand still while you orbit around them clicking away with the camera, fix their gaze at a point in space, control their breathing, and be prepared to start the shoot from scratch if they have to move for whatever reason. Then they have to be OK with you using their likeness. It’s all a big ask if you’re not paying someone, and that’s where family and friends come in!

For the past few weekends my kids and I have been taking a swing at handheld human scanning. First we tried my eldest son as the model with me taking the photographs, but after a few tries he decided he just couldn’t keep the laughing in and quit on me! So we swapped roles: I set the camera up on a tripod, gave him a remote shutter release, showed him how to frame and focus, and took a seat in the garden. I closed my eyes to make it easier to hold still; scans like this often have trouble resolving reflections in the eyes and can leave the area very messy. My son took about 22 photographs in a 180-degree orbit, in a single row from right to left then back again. When I took them back to the PC for raw processing I saw that fewer than 14 images were in proper focus; the daylight had faded by then and it was too late to reshoot. We had tried hard, and rather than see the session wasted I quickly masked the images and threw them into Photoscan anyway. Now, “garbage in, garbage out”, as the saying goes in VFX, and here the results were predictably less than stellar, but actually not a bad starting point: definitely salvageable with a bit of patch-up work and some sculpting in Mudbox/Zbrush. It’s all practice for me right now anyway. This is what the raw mesh looked like out of the box, rendered here in V-Ray using the Fast SSS shader:


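With hindsight, the out-of-focus frames could have been culled automatically before masking: variance of the Laplacian is a common sharpness metric, and blurry images score low. A rough sketch in plain Python (purely illustrative, this wasn’t part of my actual workflow):

```python
def laplacian_variance(image):
    """Sharpness metric: variance of the discrete Laplacian of a
    greyscale image (list of rows of floats). Blurry images score low."""
    h, w = len(image), len(image[0])
    values = []
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            lap = (image[y - 1][x] + image[y + 1][x]
                   + image[y][x - 1] + image[y][x + 1]
                   - 4 * image[y][x])
            values.append(lap)
    mean = sum(values) / len(values)
    return sum((v - mean) ** 2 for v in values) / len(values)

def box_blur(image):
    """Crude 3x3 box blur (edge-clamped), to simulate a soft-focus frame."""
    h, w = len(image), len(image[0])
    return [
        [sum(image[min(max(y + j, 0), h - 1)][min(max(x + i, 0), w - 1)]
             for j in (-1, 0, 1) for i in (-1, 0, 1)) / 9.0
         for x in range(w)]
        for y in range(h)
    ]

# A synthetic "sharp" image with hard edges, and a blurred copy of it.
sharp = [[255.0 if (x + y) % 2 else 0.0 for x in range(16)] for y in range(16)]
blurry = box_blur(sharp)
print(laplacian_variance(sharp) > laplacian_variance(blurry))  # True
```

For a real shoot you’d decode the raws to greyscale, score every frame, and drop anything below a threshold before masking, saving the time spent masking shots that alignment will reject anyway.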
There was more daylight on one side of the head, so detail on the right was more accurately resolved. On the side with less light you’ll notice the ear wasn’t captured, and there is more noise present around the beard and nose; this is also due to involuntary movement. The noise is more visible from the surface normals (click the image for a higher-res version):


So, although some micro detail was captured, like the forehead wrinkles and some skin blemishes, the beard and hair are a mess of wobbly noise. This should be fixable in Zbrush (or with, you know, a comb in real life), and I’ll also mirror the ear to the other side and try to patch that up. I managed to get on the beta of Marmoset Toolbag 3 a while ago; here’s a very quick low poly normal-mapped version using the skin shader and some secret sauce:


And lastly for now, a cover for my forthcoming* album made from a decimated low poly copy:


In the next post, I’ll cover some of the cleanup work. Thanks for reading!


*Not forthcoming.



Quigley’s Rock…

•September 1, 2016 • Leave a Comment

Recently I spent a week in Inishowen, Donegal where I took a tripod, monopod, laser measuring tape, Macbeth card and a Nikon DSLR with the express purpose of taking a holiday but also doing some proper 3D scans when I could 🙂 Long photogrammetry sessions were impossible on some coastal trips up there, but I did manage to get quite a few partial scans, three or four decent ones and I learnt a truckload during the exercise. Right now I’m in the process of trying to mesh the scans and break them up into usable modular assets as part of a full “Irish Coast” biome, including LODs at 16k, 8k, 4k and 2k tris.
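Those LOD budgets follow a straightforward halving scheme from the 16k base; a throwaway helper to generate target triangle counts like these (the function name is mine, not from any scanning tool):

```python
def lod_chain(base_tris, levels):
    """Target triangle counts for a LOD chain, halving at each level
    (e.g. 16k -> 8k -> 4k -> 2k)."""
    return [base_tris // (2 ** i) for i in range(levels)]

print(lod_chain(16000, 4))  # [16000, 8000, 4000, 2000]
```

Each count then becomes the decimation target when baking the scan down, with the normal map carrying the detail lost between levels.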

I’ve refined my pipeline a bit too, incorporating Substance Painter 2 into the mix (its ability to paint multiple textures simultaneously has been a huge time saver in the mesh cleanup process), and I’ve had a look at Megascans too, which promises to be an essential part of working with PBR. Here is one of the scans that required the least amount of cleanup; click for large versions.






Here is a comparison of the LODs, from 16k tris (nearest) down to 2k at the back:




Finally, here is a link to a Marmoset Viewer version that you can navigate in realtime on my Artstation page. (You can move the light by holding shift and mousing around)