In this series I’m building an animated title sequence in Blender, using a set made for DAZ Studio. This requires lighting and material tweaks, as well as messing with textures. It’s not a tutorial, just some dude trying his best at Blender (without knowing much about it, but learning a lot in the process).
The end result is an intro for my game streams, as well as these new seasons of 3D Shenanigans. Once the set is built, I’ll replace a couple of key textures so that the cinema and screen show something different. It’s a lot of work!
I made an interesting discovery the other day about one of my render nodes: with identical GPUs, one appears to render faster than the other. I didn’t get it at first. But with a possible explanation in my head, I got to thinking and applied the same principle to my other node, and was able to increase its render speed by 24%!
How exciting is that?
It’s all about retro hardware, and how to make the most out of what you already have. Let me tell you what I discovered, and how I made use of an old AMD/ATI GPU in my setup that I never thought would work.
I was asking myself this very same question. The obvious answer is YES of course – but it really depends on the export settings. I had assumed that Premiere is clever enough to take the original resolution from whatever media is available, and do its rendering from that. Big mistake. Because it doesn’t do that!
While it is possible to pick a 4K or 4K UHD export preset, or even create your own, Premiere will up-scale your footage from 1080 to the desired resolution.
I’ve done some tests on this recently and can confirm that’s how Premiere works under the hood. If you want crisp 4K output from your edit, the timeline needs to be set to 4K or 4K UHD (depending on what aspect ratio you’re editing in).
I had an image sequence rendered on one of my nodes, and sadly my D-NOISE add-on did not kick in as expected. This was entirely my fault, and I thought I could perhaps just denoise the sequence rather than re-render it. Turns out it works, even though it does not match the results of a regular denoised render.
Be that as it may, let me show you how to use Blender’s mysterious Compositor to denoise a sequence of images automatically.
In this episode I’ll tell you much of what I know about Environment Lighting in DAZ Studio. This technique is also known as Global Illumination. I’ll explain the meanings of such cryptic abbreviations as IBL and HDRI, and how all these pieces fall together to make your scenes look handsome.
This is a continuation of the previous episode about Mesh Lights. If you haven’t already, you can watch it here.
I’ve recently built a little animation during a live stream, and Rod’s suggestion was to add NASA’s Curiosity Rover into the scene. It’s a freely available blend file, and I thought it was a great idea. It added a lovely character to the otherwise deserted alien landscape, and I quickly animated it into position.
Trouble was, the little guy was essentially an afterthought, and when I was watching the animation back, it became obvious that its wheels needed to be turning as it was driving around. While I was keen to do this, I had no idea what mechanism I should use for such an Endeavor (har har), or what Blender had to offer in this regard.
My first thought was to simply animate the wheels with keyframes, but this would be a lot of work, and if the rover’s speed were to change I’d probably have to animate those wheels again. With six of them and all, that’s no fun. Instead I discovered a better way to make the wheels turn, using something called a Driver.
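The maths behind such a driver is simply rolling without slipping: a wheel that travels a distance d must rotate by d divided by its radius, in radians. Here’s a minimal sketch of that relationship in plain Python (the function name and values are mine for illustration – in Blender you’d type a similar expression, dividing the rover’s forward travel by the wheel radius, into the driver itself):

```python
from math import pi

def wheel_rotation(distance: float, radius: float) -> float:
    """Angle (in radians) a wheel of `radius` turns while rolling `distance`."""
    return distance / radius

# Sanity check: rolling one full circumference gives one revolution (2 * pi).
circumference = 2 * pi * 0.5
print(wheel_rotation(circumference, 0.5))  # → 6.283185307179586
```

The nice part is that a driver re-evaluates this every frame, so if you later change the rover’s speed or path, the wheels follow along with no re-keyframing.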
A while ago I wrote an article about how to grow grass on a plane in Blender 2.79 using the Particle Emitter system. This process has changed since Blender 2.80, and since it was never intuitive to begin with, we’re all a little confused as to how it works in the new version. While I still remember, let me jot down a note for everyone’s benefit.
I’ve been wondering if there was a way to replace dummy objects I’ve placed in Blender with other meshes. Say we do a particle simulation, and during rehearsal it’s all about speed – but for the real render, we need higher resolution meshes that might take a while to render in the viewport.
Thankfully it’s super easy to do this in Blender – here’s how. Let’s replace the default cube with Suzanne:
select the object you’d like to replace
head over to the Object Data Properties (green triangle icon)
at the top of the tab, left of the name of your object, click the drop-down and choose Browse Mesh Data to be linked
This brings up a list of items in your scene. Pick the one you would like to use as a replacement – and that’s it.
Note that this will only link the geometry and materials; it will not take across any modifiers.
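For what it’s worth, the same swap can be scripted. Here’s a sketch using Blender’s Python API – it assumes the default Cube plus a Suzanne mesh data-block already in the file (add one via Add – Mesh – Monkey first), so treat the names as placeholders:

```python
# Run inside Blender's Scripting workspace - bpy only exists within Blender.
import bpy

cube = bpy.data.objects["Cube"]       # the dummy object we keep in place
suzanne = bpy.data.meshes["Suzanne"]  # the mesh data we want to link in

# Re-pointing the object's data block swaps geometry and materials;
# modifiers live on the object itself, so they are not taken across.
cube.data = suzanne
```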
While I was deeply engrossed looking for a feature in the Blender Preferences, I found something else I didn’t know about. It’s a way to always orbit around a selected object, rather than do that awkward thing where the viewport just goes off into oblivion when you least expect it.
I frequently use the NUMPAD + . (full stop) trick to focus on the selected object. This zooms in on the object, centres it on the screen, and as a result I can conveniently orbit around it. However, if an object is framed off centre, or even off screen, Blender does something else when you move the camera… and I must admit that I’ve not been able to figure out what it is exactly. I probably never will.
But that’s OK, because there’s a simple tick box under Edit – Preferences – Interface that’ll make Blender orbit around whatever is selected, no matter where it is in relation to the screen. It’s called Orbit Around Selection.
When enabled, it behaves more like I would intuitively expect. Another Blender Mystery solved, and it makes me appreciate this amazing work of art even more.
With any software demo (or with failing eyesight as we get older), it’s important to have some visual aids so that your audience knows what you’re talking about. I’ve been streaming some Blender sessions recently, and I usually have my excellent little cursor highlighter tool called PointerFocus active. That makes it easy for viewers to follow the cursor.
By fluke I found something nice that’s built right into Blender, and that’s the ability to make outlines of selected objects show up a little bolder. They call it Thick Outlines, and this is what it looks like.
I think it looks quite nice, and I’m sure I’ll forget where and how to set that up, so I thought I’d write a note to my future self (and you, dear reader) in the process. We enable this by heading over to Edit – Preferences; under Interface, there’s a drop-down named Line Width. Set it to Thick to get this effect.
There are a number of other good options here too that all improve the readability of Blender on your system. Enjoy fiddling!