Remember how I was so thrilled about that new Blue Yeti microphone in my previous post, and how this thing sold out so quickly? Well it arrived… and I’m less than pleased with the service I’ve received from online giant Amazon.
What they’ve sent me was the correct item, but it was not new: it showed heavy signs of use. Let me show you some pictures below, as well as that hilarious chat I had with their customer service agent.
Grab a coffee and read a funny story of how Amazon may have lost their edge in Customer Satisfaction.
I always forget how to rotate HDRIs in Blender. It’s really not that difficult, but somehow this information doesn’t seem to save in my brain. I’ve given up trying to understand why, so I thought I’d write it down for a future visit. At least I know where to look now 🙂
In the Shading Tab, switch over to World. Add your HDRI image as you usually would (with an Environment Texture).
To make this thing rotate, we need to make ourselves a Texture Coordinate node (under Input) and plug that into a Mapping node (under Vector). Connect the Generated output into the Vector input, then plug the Vector output into the Environment Texture, so that we can now control the various aspects of our HDRI.
We’re after the Z rotation, which will make our HDRI rotate horizontally. Here’s the complete node setup (click to enlarge):
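The same node setup can also be built with a quick Python sketch, run from Blender’s Scripting workspace (it needs `bpy`, so it only works inside Blender). The HDRI path is a placeholder, and the socket names assume Blender 2.81 or later, where the Mapping node’s rotation became an input socket:

```python
# Run inside Blender's Scripting workspace (bpy only exists inside Blender).
import bpy
import math

HDRI_PATH = "//my_hdri.hdr"  # placeholder: path to your own HDRI file

world = bpy.context.scene.world
world.use_nodes = True
nodes = world.node_tree.nodes
links = world.node_tree.links
nodes.clear()

# create the same nodes as in the screenshot
coord = nodes.new("ShaderNodeTexCoord")       # Texture Coordinate (Input)
mapping = nodes.new("ShaderNodeMapping")      # Mapping (Vector)
env = nodes.new("ShaderNodeTexEnvironment")   # Environment Texture
background = nodes.new("ShaderNodeBackground")
output = nodes.new("ShaderNodeOutputWorld")

env.image = bpy.data.images.load(HDRI_PATH)

# Generated -> Mapping -> Environment Texture -> Background -> World Output
links.new(coord.outputs["Generated"], mapping.inputs["Vector"])
links.new(mapping.outputs["Vector"], env.inputs["Vector"])
links.new(env.outputs["Color"], background.inputs["Color"])
links.new(background.outputs["Background"], output.inputs["Surface"])

# the Z rotation rotates the HDRI horizontally (value in radians)
mapping.inputs["Rotation"].default_value[2] = math.radians(45.0)
```

Tweak the last line to taste; 45 degrees is just an example value.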
Getting characters and scenes from DAZ Studio into Blender is one of the toughest things to get right. It’s an endlessly time-consuming, confusing and generally un-fun process. Several scripts exist to make this happen, yet many of them fail to make it a one-click solution. Jacques aka mCasual has been working for years on something called TeleBlender. Steve aka Backdoor 3D recently did a live stream on the process, and I finally had a chance to try it out myself.
In this article I’ll show you the workflow that I found worked best for me. You may know a better way, and perhaps it’s not the intended way of working, but I thought it might come in handy (since usage instructions on the download page of TeleBlender are literally non-existent).
I’m using the following versions, which will probably no longer exist by the time you read this article:
You may have seen the announcement about DAZ Central recently, DAZ’s new content management app. I had a look at it as soon as I heard about it, and I thought I’d give you my impressions and opinions about the new software. I’ll also try to answer the question of why it even exists, considering that we can already do what it does by other means.
Note that what I’m telling you in this article is based on observations, opinions and speculation rather than insider knowledge or hard facts. Call it “fan fiction” if you will. It’s more about sharing those thoughts and an attempt at explaining the often mysterious and unexplainable.
The other day I was stumped with what felt like an easy task: create a non-standard video in Premiere Pro, whose final output was supposed to be 1920×120. As wide as 1080p, but only a small strip in height. That should be simple, right?
Well technically it is, but as so often happens, the official documentation isn’t quite correct. Apparently we can change our video size right after creating a new sequence, with File – New Sequence, under the Settings Tab. Notice that the video size is greyed out though. Dang! They didn’t mention that, did they? For them it “just works”. Good for them!
Turns out that not every preset supports aspect ratio changes. So the issue really was that mega scary and ever so slightly excessive menu at the top, listing every camera manufacturer’s (obsolete and proprietary) presets. The only useful ones in here are DNx and Custom. The latter can be found at the very top of the menu, and if we pick that, we can change the video size of our sequence [insert applause].
Changing existing sequences
If you already have a sequence whose size you want to tweak, select the sequence in question, then head over to Sequence – Sequence Settings and bring up the menu from there.
I had some audio issues with my 5 year old Blue Yeti recently. It started as occasional small crackles when listening via the headphone output, some of which ended up on recordings as well. It was so occasional that I could edit it out, so I never thought much of it. For the last couple of weeks though, the crackles have been audible during my daily Stardew streams. Again they’re subtle, but I thought perhaps replacing the USB cable might do the trick.
Sadly it did not. Something else must be amiss, so much so that I lost audio completely during the stream today. I replaced the cable again, and it held up for the remainder of the stream, but it looks like I’m in the market for a new Blue Yeti microphone.
I’ve just been experimenting with uploading a Genesis 1 figure to Mixamo, and importing the animated figure into Blender. There are several trillion options what with the combinations of tick-boxes and values. Thankfully, nothing is documented, just the way I like it.
I thought I’d quickly post a screenshot of what actually works – for the next 10 minutes. We all know how quickly these things change:
Note that I’m doing this in the release version of Blender 2.82, in late May 2020. It’ll probably stop working by the time either of us reads this, but hey – at least I’ve tried.
My old Xbox 360 controller has been in use for nearly 10 years, and it’s still going strong, although it now suffers from very sensitive Dead Zones. Those are the areas around the untouched centre position of a game pad that can sometimes deliver erratic results, especially after years of use (although I’ve seen brand new ones suffering from the same phenomenon).
Unreal Engine lets you define the dead zones for a project, and I just found out how to do it. It’s a project wide setting that can be found under Edit – Project Settings – Input. There’s a big section called Bindings at the very top of this huge list, at the bottom of which is a small “advanced” triangle. It’ll open even more options. Scary indeed! However, this is where we find Axis Config, as well as sections for each Game Pad Axis. Open each axis to reveal a Dead Zone property.
The default is set to 0.25, which is very generous and works perfectly in most cases, yet at the same time can feel a little rough and abrupt at times. Don’t set it to 0 (that’ll be terrible and lead to drifting), but anything from 0.05 upwards might give good results. Try it out and see if it improves your game pad’s response.
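The editor writes these values into your project’s Config/DefaultInput.ini, so you can also tweak them there directly. Here’s roughly what the entries look like for the four standard game pad stick axes in UE4 syntax; DeadZone=0.1 is just an example value, not a recommendation:

```ini
[/Script/Engine.InputSettings]
; Axis Config entries as saved by Edit - Project Settings - Input.
; DeadZone=0.1 is an example value; the editor default is 0.25.
+AxisConfig=(AxisKeyName="Gamepad_LeftX",AxisProperties=(DeadZone=0.1,Exponent=1.f,Sensitivity=1.f))
+AxisConfig=(AxisKeyName="Gamepad_LeftY",AxisProperties=(DeadZone=0.1,Exponent=1.f,Sensitivity=1.f))
+AxisConfig=(AxisKeyName="Gamepad_RightX",AxisProperties=(DeadZone=0.1,Exponent=1.f,Sensitivity=1.f))
+AxisConfig=(AxisKeyName="Gamepad_RightY",AxisProperties=(DeadZone=0.1,Exponent=1.f,Sensitivity=1.f))
```

Either way, restart PIE after changing the values so the new dead zones take effect.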
I get this question regularly, in which new users ask me something along the lines of, “can I make my own clothes for Genesis, and if so, how do I do this?” Little do most people know what a huge undertaking this is, so I thought I’d outline the principle in basic strokes, to give y’all an overview what’s involved in the process.
Note that I’m not a clothing creator myself, so I’m not the right person to ask about details. If I knew the ins and outs as well as some of the PAs do, I’d sure share it with you as articles or videos, trust me.
Hence this is not a tutorial, but rather a very in-depth answer to a comment I frequently get, in the hopes that it will give readers an overview of the whole process, without getting lost in too many details.
I’ve had this question twice recently, and it’s another interesting nugget of information I thought I’d share with you: why do DAZ figures take so long to load? Especially the no-frills base figures? And why does this only happen for some users, and not for others?
The two guys who contacted me about this (Richard and Hans-Werner) both had large amounts of content installed on their systems, and the first logical question is, could a different organisation of the content speed up the figure loading process (i.e. move content to another drive, or split content into multiple folders). The answer is: sadly no.
Likewise, a faster drive won’t make much of a difference either, be that an SSD or an even faster M.2 drive. Those are great of course, and they will speed up content load times in general, but the root cause of excessive load times with DAZ figures is morph files.
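To get a feel for the scale of the problem, you can count how many morph files (DSF files) sit in your content library, since DAZ Studio has to scan every installed morph when a figure loads. This is just an illustrative sketch, not anything from DAZ Studio itself; `count_morph_files` is a hypothetical helper, and the library path is a placeholder you’d swap for your own:

```python
import os

def count_morph_files(library_root):
    """Count .dsf files under a DAZ content library folder.

    Illustration only: the number of morph files grows with every
    installed product, which is what drags figure load times down.
    """
    total = 0
    for _dirpath, _dirnames, filenames in os.walk(library_root):
        total += sum(1 for f in filenames if f.lower().endswith(".dsf"))
    return total

# placeholder path - point this at your own DAZ content library
print(count_morph_files(r"C:\Users\Public\Documents\My DAZ 3D Library"))
```

The bigger that number, the longer even a no-frills base figure takes to appear in your scene.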