Did I mention my new book?

Over the last few weeks I’ve written a book. It’s about how to run web applications in the comfort of your own home.

Yesterday it went live on Amazon! I have an author page and everything!

Turns out that writing was the easy part: formatting it so that it looks good on Kindle devices was a bit tougher. It’s a tech book after all, which means there are several screenshots and code snippets that need to be formatted to stand out from the rest of the text.

Before I call myself “best selling author”, let me describe how this book came to be.

 


How to set up shortcuts to load figures in DAZ Studio

Even with Smart Content it can be tough to find a single figure in your library. If you find yourself loading a particular character a lot, you’ll be pleased to hear that there’s an easy way to set up shortcuts on the menu bar for such things.

Screen Shot 2015-04-29 at 11.07.20

In this article I’ll show you how to create one for Genesis 2 Female. The principles are the same for any other figure in your library.

First, find your figure in the Content Library. DAZ characters are generally located in My DAZ 3D Library – People.

Screen Shot 2015-04-29 at 10.53.46

Right-click on the figure and select Create Custom Action.

Screen Shot 2015-04-29 at 10.54.26

 

Note that this context menu doesn’t appear in the Smart Content tab – only in the Content Library. As soon as you create the custom action, a new menu item called Scripts will appear in your menu bar. Add any other figures you like and they’ll all make their way into the Scripts menu.

You can now go ahead and select Scripts – Your Figure without searching for your favourite characters. You can even customise this further and rename “Scripts” to something else: head over to Window – Workspace – Customise:

Screen Shot 2015-04-29 at 11.00.04

Then head over to Menus and expand the Main Menu. Double-click the Scripts item and rename it – just make sure the ampersand symbol is the first character. Feel free to drag the menu around to a different place in the hierarchy.

Screen Shot 2015-04-29 at 10.59.01

Thanks to Slosh for this marvellous tip.

How to avoid Figure Relocations when applying a Pose in DAZ Studio

Some poses are meant to relocate the figure to somewhere other than the scene centre. That’s nice if the pose is part of a set: imagine a wardrobe in a large room from which a figure is supposed to take something, and the figure is moved to said wardrobe.

But sometimes that can just be plain annoying because your figure is moved out of the viewport, and far away from where you want it to be. Besides, the pivot point to move the figure is no longer where the figure is, but somewhere in the next room. Consider this example after applying a pose:

Screen Shot 2015-04-29 at 09.42.30

Not necessarily what we want. Thankfully there’s an easy trick to avoid this: hold down CMD on the Mac or CTRL on Windows, then double-click to apply the pose. Now a friendly context menu comes up that allows us to choose which values can be overridden by the pose.

Screen Shot 2015-04-29 at 09.47.14

To avoid X/Z relocations, simply untick those boxes. Likewise, if you don’t want your figure to rotate with the pose, uncheck Y in the rotation section. Hit Accept and your pose will be applied without those shenanigans that have driven me insane for many years!

Screen Shot 2015-04-29 at 09.50.48

Much better: now the pivot point is still where the figure is, which makes much more sense.

The CMD/CTRL trick works in many parts of DAZ Studio – it’s worth remembering whenever you suspect a hidden context menu.

How to apply a Shader in DAZ Studio

Shaders are an important component in many 3D applications, but I never knew that DAZ Studio understood that concept too. I was under the impression that the relatively simple Surfaces Tab was the way to tweak what an object looks like.

Turns out, DAZ Studio has Shaders! And here’s how to apply them. Pay attention kids: this topic is going to become a lot more important as Iray waltzes into our 3D lives.

Select your object in the scene and head over to the Surfaces Tab. A Shader cannot be applied to the model as a whole; we must select individual surfaces of the model first. In this example I’m using the Genesis figure. I’ll select it so that all parts of the figure (and therefore all surfaces) are selected.

Screen Shot 2015-04-28 at 18.22.57

Now let’s find a Shader. I can’t find a way to display them in the Smart Content tab (huge surprise), but there are some we can find via the Content Library tab: under My DAZ 3D Library, find the Shader Presets folder. Explore the subfolders you find there – an interesting one is Shader Mixer.

Screen Shot 2015-04-28 at 18.28.59

With the desired surfaces selected, double-click any shader and do a render. Here’s Genesis rendered with the Flagstone Shader:

Genesis Flagstone

Notice how the Shader distorts the geometry of the figure. And here is the same figure with the Orange Toon shader:

Genesis Orange Toon

You can also combine one shader with another: this can be helpful to prevent the new shader from replacing any texture maps we may have. Hold down CMD (or CTRL on Windows), then double-click the Shader you’d like to mix. Mind you, this doesn’t always work, but it’s what they suggest in the manual.

 

Iray Shaders

DAZ Studio 4.8 comes with several Iray Uber Default Shaders, which can also be found under Shader Presets (indeed a weird name – but hey, I’m just the messenger).

Screen Shot 2015-04-28 at 18.56.45

It is recommended to apply the Iray Uber Base to any material that was originally set up to be rendered in 3Delight. This is an opportunity to mix shaders as described above.

The difference isn’t huge, but I guess they’ll eventually come out with Iray Skin Shaders and what have you. Here’s a quick comparison between a character with default materials applied (left), and the Iray Uber Base mixed in (right):

Compare

Getting started with UberEnvironment in DAZ Studio

UberEnvironment is a shader-based light, available exclusively in DAZ Studio. It brings image-based lighting to DAZ Studio when using the 3Delight render engine. It has been around for a while, but it has been a bit of a mystery to me for the last 6 years. I think I’ve finally grasped some of its basics – time to write them down before I forget.

Unlike traditional lights in DAZ Studio, UberEnvironment creates a sphere in the scene onto which an HDRI image is projected. The sphere itself then emits light, creating some very realistic looking ambient light. UberEnvironment can be used on its own or in combination with other lights for nicely balanced results.

Historically, the UberEnvironment product had to be purchased through DAZ from a vendor named omnifreaker. It has since been updated to UberEnvironment2 and is now included for free in DAZ Studio 4.x (as part of the “Default Light and Shaders” – make sure this is installed).

The product itself is difficult to find in the Smart Content tab because it appears mixed in with some other scary items. It’s much easier to access it via the Content Library tab: navigate to

  • DAZ Studio Formats – My DAZ 3D Library
  • Light Presets (not Lights!)
  • omnifreaker – UberEnvironment2

You’ll find the following icons:

Screen Shot 2015-04-28 at 16.00.47

The first two icons are the base product. They can be added by double-clicking or by merging into the scene (using a right-click). This will not replace traditional lights in the scene. I don’t know what the difference between those two is, only that the second one brings in a much smaller sphere than the first.

The third icon is a link to the “official” documentation, and the fourth icon is a conversion script to use standard HDRI images with the format required by UberEnvironment (I know nothing about it).

The following eight icons, resembling coloured tea pots, will load one of eight preset HDRI maps into the scene. By default no image is loaded, giving an “ambient occlusion only” look. Double-click one of these eight icons (or load your own image under Parameters – Light – Color) and you’ll see the image appear on the sphere.

And lastly, the bottom five icons represent render quality settings for the UberEnvironment light. Lower quality settings render faster but introduce a lot of dark blotches – good for getting an impression of the scene. Note that higher quality settings will take a while, especially if transparency and SSS are involved.

Once added to the scene, UberEnvironment can also be rotated to match any other lights. To better see this effect, select it and scale it down until it’s visible in the scene (don’t worry, it won’t render – it only shows up in the scene preview, and the scale of the sphere does not affect how the light is rendered).

Screen Shot 2015-04-28 at 16.37.02

The hotspot on the HDRI image is the location the image-based light comes from, casting shadows in the opposite direction. You can use the standard transform tools or the Parameters tab to rotate the sphere.

 

Let’s look at an example

Here’s our model Lilith sitting on an uncomfortable looking box. She’s lit with a single spotlight. Nothing else is in the scene:

Spot only

The shadows look very dark and the skin looks like plastic – that’s just how 3Delight renders things.

Now I’ll turn the spotlight off and add an UberEnvironment light to the scene, which can create a lot of light by itself.

 

Uber only

The skin looks less like plastic now, but much of the background has seemingly disappeared – so we’re using a little too much intensity. UberEnvironment is great when it’s mixed in with other lights. Let’s turn it down from the default 100% to 20% and turn our spotlight back on. This will mix both results together.

Spot+UberEnvironment

Not bad: softer shadows, less “crushed blacks” as we say in television. Both the spotlight and UberEnvironment have an intensity setting – adjust them through test renders for best results.

The above render took just under 10 minutes. As soon as it reaches skin or hair, DAZ Studio seems to get stuck – give it a moment and it will carry on. It’s all the sampling it has to do, I guess. To speed up the render, select one of the lower quality settings.

Low Quality

This render took just over one minute, thanks to the low quality preset. Don’t worry about those blotches for now, they will disappear with higher render presets.

 

What do the render presets actually change?

Those presets change the values for Shading Rate, Max Error and Maximum Trace Distance. You’ll find those when you select the UberEnvironment light in your scene, then select Parameters – Light – Advanced. Feel free to fiddle!

Screen Shot 2015-04-28 at 16.32.59

There are a lot of other Uber products available; most of them work on a very similar principle: UberArea, UberHair, UberSurface, UberEverything.

There are several helpful links around the web that explain more about UberEnvironment.

How to disable Image Grids in Carrara

You know those grids in Carrara that often get in the way? Those that only appear in the viewport and not in the final render. The ones that show you the outlines of your objects in yellow:

Screen Shot 2015-04-27 at 19.42.29

Sometimes you can’t see your scene for all the clutter. And I keep forgetting that there’s a super easy way to switch these grids off.

Cast your eye to the top of the viewport and find the following icons:

Screen Shot 2015-04-27 at 19.42.37

See those three little grid icons? Click any of them to make the corresponding grid disappear from the viewport. It’s that easy! Disabled grids turn dark, enabled ones are lighter in colour:

Screen Shot 2015-04-27 at 19.42.54

With the side grids switched off, the scene looks less cluttered.

Screen Shot 2015-04-27 at 19.43.02

Sometimes I forget how user-friendly Carrara can be 😉

How to reload image textures in Blender

We often have to tweak images in an external application while they’re already applied to a 3D object. To see our changes in action, it is necessary to reload the textures in Blender. Few applications detect such changes automatically (which is sad – because it’s not exactly rocket science to implement this).

To do this, change to the UV Editor in Blender and select Image – Reload Image.

Screen Shot 2015-04-26 at 09.56.19

You can also use the keyboard shortcut ALT+R without changing into the UV Editor.
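If a scene uses many textures, reloading them one by one gets tedious. As a small aside, the same thing can be done for all images at once from Blender’s Python console – a minimal sketch using the bpy API (the viewport redraw at the end is just a convenience):

```python
import bpy

# Reload every image datablock that was loaded from a file on disk
for image in bpy.data.images:
    if image.source == 'FILE':   # skip generated, movie and viewer images
        image.reload()

# Ask all open editors to redraw so the refreshed textures show up immediately
for area in bpy.context.screen.areas:
    area.tag_redraw()
```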

How to use reference images in Blender

Reference images are helpful for modelling objects or to add simple backgrounds to scenes. There are at least two ways in which we can add them in Blender.

 

Adding Background Images

One way to do it is via Background Images. On the tab next to the Properties Palette (expand it with the little plus icon on the top left), find the Background Images tick box.

Screen Shot 2015-04-26 at 09.30.47

Open an image of your choice and adjust the relevant settings, such as opacity, stretch/fit/crop, and select which axis you’d like this image to appear on. Blender does not allow you to select more than one image at a time, so you can’t add all views of an object in one go. But as you load more images, Blender remembers them in the image list, so you can pick one from there instead of loading it again.

Note that such images will only show up in orthographic views, not in perspective views. To change your current view, select View – View Persp/Ortho.

Screen Shot 2015-04-26 at 09.39.33
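As an aside, the Background Images panel can also be driven from Python. Here’s a minimal sketch against the Blender 2.7x API (the property names below match that version and are assumptions for anything newer; the file path is a placeholder):

```python
import bpy

# Find a 3D View and attach a background image to it
for area in bpy.context.screen.areas:
    if area.type == 'VIEW_3D':
        space = area.spaces.active
        space.show_background_images = True        # same as ticking the Background Images box
        bg = space.background_images.new()         # add a new image slot
        bg.image = bpy.data.images.load("/path/to/front_reference.png")
        bg.view_axis = 'FRONT'                     # only shown in the front orthographic view
        break
```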

 

Loading images as planes

Another option is to import an image directly as a plane. It’s a shortcut for creating a fully UV-mapped plane in your scene, which you can then position just the way you like, preserving the aspect ratio of your image. This is an add-on that’s installed by default with Blender, but it needs to be activated before it can be used.

Under File – User Preferences, head over to the Add-Ons tab. Select the Import-Export section and find Import Images as Planes. Tick the box to the right of this option and it’s activated.

Screen Shot 2015-04-26 at 09.42.32

Now head over to File – Import and select Images as Planes. Pick an image and you’ll see a new plane object in your scene.

Screen Shot 2015-04-26 at 09.45.44

View your scene in Textured, Material or Rendered modes to see your image.
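The add-on also registers an operator, so the same import can be scripted. A minimal sketch for the 2.7x add-on (paths and file names are placeholders):

```python
import bpy

# Enable the bundled add-on – equivalent to ticking it in User Preferences
bpy.ops.wm.addon_enable(module="io_import_images_as_planes")

# Import one image as a UV-mapped plane; directory and files mirror a file browser selection
bpy.ops.import_image.to_plane(
    directory="/path/to/references/",
    files=[{"name": "front_view.png"}],
)
```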

How to use Blender as a simple Render Farm for animations

Blender has a bafflingly simple way to let several computers render the same animation. Render farms are usually set up so that one machine is the “master”, and the others are declared “render slaves” that each render a single frame or even a single bucket of a frame. The master then assembles everything into a single video file.

But rendering directly to a video file isn’t always desirable because single frames cannot easily be re-rendered if there was a problem. While there is a way to set up Blender in such a master/slave configuration, there is a much easier way to render an image sequence: all we need are several computers that can see the same directory on the network. This could be a Dropbox folder, so the render nodes don’t even have to be on the same network.

Here’s how to let several computers render the same animation together, producing a directory full of images that can be assembled in a video editor.

Create your animation and head over to the Render settings in the Properties editor. At the bottom of the screen, deselect Overwrite and select Placeholders. For the Output path, navigate to your shared folder that can be seen by all networked computers.

Screen Shot 2015-04-18 at 09.27.41

Under Render, hit Animation to start rendering the image sequence. Do this on every computer on the network after loading the same animation file. Blender will go to work on all nodes, each of which will render the next frame in the sequence.

Screen Shot 2015-04-18 at 10.02.17
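For reference, the same settings can be applied with a few lines of Python, which is convenient when kicking off the nodes without touching the UI. A minimal sketch (the shared path is a placeholder, and PNG output is my assumption – any image format works):

```python
import bpy

render = bpy.context.scene.render

render.use_overwrite = False        # never re-render a frame that already exists...
render.use_placeholder = True       # ...and claim a frame by writing an empty placeholder first
render.filepath = "/mnt/shared/frames/"        # shared folder visible to every node
render.image_settings.file_format = 'PNG'      # image sequence rather than a video file

# Render the whole frame range; each node only picks up frames nobody has claimed yet
bpy.ops.render.render(animation=True)
```

If these settings are already saved in the .blend file, each node can instead be started headless with blender -b myanimation.blend -a.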

 

Minor Caveat

The way this works is that each node looks at the output folder, checks which frames are already there, then picks the next missing one and renders it. Because there is no communication between the nodes, you may experience the occasional frame being rendered twice – two nodes may both see that 23 frames have been rendered and each pick the 24th one next.

Such double renders are marked in the sequence though, so it’s easy to delete them before assembling the video. The more computers are involved, the more often this happens. In my tests, a 100-frame animation produced perhaps 10 additional renders; they showed up as “conflicted copy” files containing the same image, named after the computer that caused them.

In my opinion this is a very small price to pay, given that you do not have to deal with complete re-renders or network communication trouble. This way of rendering can even utilise remote machines with nothing more than a simple shared folder.

How to drape cloth in Blender

Cloth2

Blender has an excellent physics engine that can simulate cloth – among a great many other things. Blender does this using a modifier: all we have to do is declare one object as being “the cloth”, and other objects as the ones colliding with the cloth.

Let’s see how in this quick example.

Starting with the default scene, I’ve added a plane to the scene as the ground, and a grid with a resolution of 100×100. The difference between a plane and a grid is that a plane only has four vertices, while a grid has many more. The more vertices we have, the better our cloth is going to look.

I’ll position the cloth above the cube so that it can fall down during the simulation and drape itself around the cube. The scene looks something like this:

Screen Shot 2015-04-17 at 09.11.11

Now I’ll select the cloth and head over to the Properties Palette on the right-hand side. Click the little Physics icon on the far right – it resembles a bouncing ball. If you don’t see it, expand your palette.

Screen Shot 2015-04-17 at 09.12.55

Click Cloth to enable the simulation for this object. This will bring up some other interesting properties that allow us to modify how the cloth behaves when draped. Choose from a few presets or go crazy and build your own. I’ll stick with all the defaults for now.

Screen Shot 2015-04-17 at 10.05.03

Enabling cloth like this creates a Cloth modifier – if you examine the little wrench icon, you’ll find it in the list. It’s a good idea to head over there now and add a Subdivision Surface modifier to your cloth. This adds more geometry without the overhead of having to calculate it during the simulation.

Screen Shot 2015-04-17 at 10.08.12

Next we’ll select the objects we want our cloth to collide with – in my case that’s the ground plane and the cube. Then head over to the Physics Icon again and declare these objects as Collision Objects.

Screen Shot 2015-04-17 at 10.11.02

Now comes the fun part: asking Blender to calculate the physics. This happens simply by pressing the Play button on the Timeline. By default an animation is 250 frames, and if you press Play now, Blender will calculate all 250 frames. This is a lengthy operation and often not necessary for a still image.

To trim this down, set the end point of your animation to something like 30 (at the bottom of the screen). This will make Blender stop calculating after 30 frames. Depending on the size of your cloth you may need more frames, but you can always add them later. Blender will remember previous calculations and simply start at frame 31 if no other properties have changed.

Screen Shot 2015-04-17 at 10.13.38
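If you prefer scripting, the whole setup above can be reproduced through the Python API. A minimal sketch against the 2.7x API – the object names, the 100×100 grid and the 30-frame range mirror the example, while sizes and locations are assumptions:

```python
import bpy

scene = bpy.context.scene
scene.frame_end = 30                          # only simulate 30 frames

# The ground plane and the default cube act as collision objects
bpy.ops.mesh.primitive_plane_add(radius=5, location=(0, 0, 0))
ground = bpy.context.object
cube = bpy.data.objects["Cube"]               # the cube from the default scene
for obj in (ground, cube):
    obj.modifiers.new(name="Collision", type='COLLISION')

# A dense grid hovering above the cube becomes the cloth
bpy.ops.mesh.primitive_grid_add(x_subdivisions=100, y_subdivisions=100,
                                radius=2, location=(0, 0, 2))
cloth = bpy.context.object
cloth.modifiers.new(name="Cloth", type='CLOTH')
cloth.modifiers.new(name="Subsurf", type='SUBSURF')   # smoother cloth without extra sim cost

# Bake the simulation right away instead of pressing Play
bpy.ops.ptcache.bake_all(bake=True)
```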

Press Play (on tape) and Blender will start draping your cloth frame by frame. When it’s done it will present you with a fluid animation. Feel free to stop it and move the playhead to a desired point in time at which your cloth looks best.

You can of course render out the entire animation too and have your cloth interact with moving objects or other forces such as wind.

Screen Shot 2015-04-17 at 10.17.49

Notice that the cube pokes through the cloth at the corners and seemingly sinks into the floor. This can be fixed either by lifting the cloth object up a little bit, or by selecting the cloth object and tweaking some of its collision properties (click the Physics Icon in the Properties Palette and scroll down to find this option).

Take a look at the Distance value: the default is 0.015, and even a tiny increase to 0.04 can make a difference here. Don’t increase the distance too much, otherwise the cloth will appear to hover over the ground.

Screen Shot 2015-04-17 at 10.21.00

Every time you make a change to any of the physics properties, Blender must recalculate the animation. It does this automatically as soon as you hit Play, updating every single frame. Once you see fluid motion in the viewport again, all frames are up to date.

And what’s even better: this animation cache is saved with your .blend file! Open it at a later time and everything is pre-calculated without the need to run the simulation again.

Grouping and Parenting in Blender

Blender is different from other applications. If you’ve used grouping or parenting elsewhere, the way Blender thinks about those things may throw you off guard. A little explanation is in order to bring clarity to our cluttered 3D minds.

Usually we can group objects together so that when we select one, the others are selected at the same time. In other applications grouping and parenting are the same thing, but not in Blender: groups by themselves are only of limited use, while parenting is what other apps call grouping (from what I understand).

 

Grouping

To group several objects together, select them all by holding SHIFT and right-clicking each object, then select Object – Group – Create New Group.

Screen Shot 2015-04-26 at 10.05.50

On the right hand side you can name your new group. Note that if you miss this opportunity, there doesn’t seem to be a way to do this later in the interface: I haven’t managed to find a way to even display groups in the current scene.

Screen Shot 2015-04-26 at 10.08.55

The outline of all your selected objects will turn green, and any transform action will affect all selected objects at the same time.

In the scene hierarchy, however, the group is not displayed, and none of your objects are parented to one another. It’s as if nothing has been grouped at all.

We can however instantiate the group in another scene, where it will behave like a cohesive object. To do so, select Add – Group Instance and pick your (hopefully named) group. This inserts all your objects (and a null object) together as one into your current scene. It will even show up in your scene hierarchy now.

Screen Shot 2015-04-26 at 10.17.06
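Incidentally, groups are accessible through the Python API as well, which also provides a workaround for the renaming problem mentioned above. A minimal sketch for the 2.7x API (group names are placeholders):

```python
import bpy

# Create a group from the currently selected objects
# (same as Object – Group – Create New Group)
bpy.ops.group.create(name="Chairs")

# Groups can be renamed later via the data API, even though the Outliner hides them
bpy.data.groups["Chairs"].name = "DiningChairs"

# Instance the whole group as one cohesive object (same as Add – Group Instance)
bpy.ops.object.group_instance_add(group="DiningChairs")
```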

 

Parenting

Parenting is what other apps call grouping: move the parent and all children move with it, but you can still move a child individually without the parent moving. Besides, it tidies up the scene hierarchy with collapsible little icons.

To parent objects in Blender, select all objects, making sure that the last object you select is the one that will become the parent. If you’ve made a mistake in the selection order, hit A to deselect everything and start again. Click Object – Parent – Object, or use the keyboard shortcut CTRL+P.

Screen Shot 2015-04-26 at 10.22.02

When you do, a small pop-up window appears. Select Object again and you’ve built a relationship in the scene hierarchy.

Screen Shot 2015-04-26 at 10.22.18

The object selected last will become the parent. Feel free to rename it so the whole group makes more sense. Now the familiar plus/minus icons let you collapse large scenes into something more manageable.

Screen Shot 2015-04-26 at 10.28.49

Consider adding a null object to the scene and selecting it last so it becomes the parent – this way you can still move every real object independently of all the others. You can add one under Add – Empty.
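Parenting can also be done in Python, which helps when building larger hierarchies. A minimal sketch – the empty’s name and the child object names are placeholders:

```python
import bpy

# Add an empty to act as the parent (same as Add – Empty)
bpy.ops.object.empty_add(type='PLAIN_AXES', location=(0, 0, 0))
parent = bpy.context.object
parent.name = "Furniture_Root"

# Parent two existing objects to the empty without moving them in the viewport
for name in ("Cube", "Suzanne"):
    child = bpy.data.objects[name]
    child.parent = parent
    # Compensate for the parent's transform so the child stays where it is
    child.matrix_parent_inverse = parent.matrix_world.inverted()
```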

 

Unparenting

If you no longer want an object to be part of the parent group, select it and click Object – Parent – Clear Parent. This will return the object(s) to the scene root. Alternatively, use the keyboard shortcut ALT+P.