I made an interesting discovery the other day about one of my render nodes: with identical GPUs, one appears to render faster than the other. I didn’t get it at first. But with a possible explanation in my head, I got to thinking, applied the same principle to my other node, and was able to increase its render speed by 24%!
How exciting is that?
It’s all about retro hardware, and how to make the most out of what you already have. Let me tell you what I discovered, and how I made use of an old AMD/ATI GPU in my setup that I never thought would work.
I have several Amazon accounts: one in the US, one in the UK, and one in Germany. Every now and again I de-register one of my Kindles from one account and register it with another one. It depends on what content I’d like to read and on which account it’s available.
The other day I switched my Kindle Fire from my German Amazon account back to my US account, my main account, containing all my English content. To my surprise, the device registered fine, identified itself as “Jay’s Kindle”, but none of my content was showing up. Likewise, the device was not showing as registered on my web interface.
What was going on? Where was all my content? This had worked not too long ago!
I tried installing the Kindle iOS app on my iPhone and registered it too – only to find it behaved exactly the same way: no content, and the device was not showing itself on my Amazon account.
After getting in touch with Customer Service, I can now tell you what happened – and a neat trick for avoiding it, should it happen again. Interested? Read on!
I was interested to hear about Sony’s plans for the future of gaming: Turns out that they’ve bought Gaikai, a company specialising in rendering games in a data centre and streaming the results back to you. All your machine transmits is your gamepad’s input. Therefore there’s nothing to install locally, no updates or disks to deal …
I finally found out what the difference is between Machine Language, Assembly and Assembler – and how it fits in with Interpreters and Compilers. For those of you game enough, let me explain what these cryptic terms mean – and how they span computers from the early C64 to today’s high-end laptops.
Something that plagued the early computers was the speed at which they executed things in BASIC – or rather the lack thereof. As nice as BASIC is, sifting through an array of variables and comparing them with a known value does take some time.
That’s not BASIC’s fault though – it’s rather the way it is executed. You see, BASIC (on the C64 and its comrades) is an interpreted language. This means that while the computer is working, it’s translating the BASIC statements into something it can actually understand – which is of course not BASIC. All a computer really knows is if something’s ON or OFF. Computers are truly binary machines – no matter how old or how new they are. So if you tell them to PRINT “HELLO” then some translation work needs to happen for HELLO to appear on the screen – and that takes time.
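To picture that on-the-fly translation, here’s a toy sketch in modern Python rather than anything C64-specific – the mini PRINT-only language and the `run` function are made up for illustration. The key point it demonstrates: an interpreter re-reads and re-translates each line every single time it executes, even inside a loop.

```python
# Toy model of a line-by-line interpreter for a tiny PRINT-only
# language. The point: the parsing/translation step happens on EVERY
# execution of a line - which is where interpreted BASIC lost its time.

def run(program):
    """Interpret the programme one line at a time, returning its output."""
    output = []
    for line in program.splitlines():
        line = line.strip()
        if not line:
            continue
        # Translation happens here, again and again, on every run.
        if line.startswith('PRINT '):
            output.append(line[len('PRINT '):].strip('"'))
        else:
            raise SyntaxError('?SYNTAX ERROR: ' + line)
    return output

print(run('PRINT "HELLO"'))  # ['HELLO']
```

A real BASIC interpreter did far more (tokenising, variable lookup, line numbers), but the shape is the same: translate, execute, forget, repeat.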
That’s what an interpreter does: translate one language into another on the fly – much like people can listen in Spanish, and speak the same thing in English, for the benefit of an audience (usually not for their own pleasure).
The great thing about interpreted languages is that the source code always remains readable. As you can imagine, ultimately the interpreter will throw some ones and zeros at the computer. There’s no way you could make a change to that as it bears no resemblance to your source code.
One alternative for speeding up the programme in question would be to have something like the interpreter go to work BEFORE the programme is executed. Ahead of time, and in its own time. Then we could present the translated result to the computer right away, taking away the “on-the-fly” translation and saving some CPU power. I guess it won’t come as a big surprise that this is done frequently too: it’s called compiling, and a Compiler does such a job.
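To get a feel for the difference, here’s a small sketch using Python’s built-in compile() and eval() – modern tools, not a C64 BASIC compiler, but the same principle: translate the source once ahead of time and then merely execute the result, instead of re-translating on every run.

```python
# Compare "translate on every run" with "translate once, run many times".
import timeit

source = "sum(i * i for i in range(100))"

# Interpreted-style: eval() on a string parses + translates it each time.
on_the_fly = timeit.timeit(lambda: eval(source), number=2000)

# Compiled-style: compile() translates the source into bytecode once;
# eval() on the code object just executes it.
code = compile(source, "<expr>", "eval")
precompiled = timeit.timeit(lambda: eval(code), number=2000)

print(f"on the fly:  {on_the_fly:.4f}s")
print(f"precompiled: {precompiled:.4f}s")
```

On any machine the precompiled version skips the translation step entirely – exactly the saving a BASIC compiler offered the C64.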
While most iOS Developers around the globe are busy learning Apple’s new programming language Swift or playing with early versions of iOS 8 and Yosemite, I’m deeply involved in something much less cutting edge. In fact it’s from over 30 years ago, and it’s courtesy of Microsoft:
I’m having fun getting back into BASIC 2.0 as featured on the legendary Commodore 64 (or C64 or CBM 64).
This was my first computer, and I’ll never forget it. German computer magazine “64er” dubbed it the VC-64, or “Volks Computer” (because Commodore’s previous machine was called the VC-20 or VIC-20). It was huge everywhere, but particularly in Germany it was just THE machine to have.
Sure, there was the Amstrad CPC664 and 464 (which were re-branded as Schneider) or the ZX-81 and Spectrum, but they were somewhere in that 5% category of “other home computers”. We never had the BBC Micro – for obvious reasons, and none of my friends could afford an Apple II.
I no longer own the hardware, but some of that early day knowledge is still in me, together with many burning questions that have never been answered. There’s so much I always wanted to know about the C64, and so much I wanted to do with it: write programmes, learn machine language, and generally use it for development. I had no idea that there was such a thing as a Programmer’s Reference or developer tools. Time to get back into it!
Today we have wonderful emulators such as VICE (the Versatile Commodore Emulator) and it’s just like sitting down with my old computer again, on modern day hardware. I’m even doing it on a plastic Windows laptop for a touch of antiqueness (if I don’t get too annoyed with that).
Don’t ask me why this piece of computer history has become such an obsession with me over the last couple of weeks. I feel that for some reason it fits in with all this high-end cutting edge development I’m doing, and it reconnects me with how all this super technology started: with cheap plastic that was to change all our lives forever.
I remember the questions from members of my family who had not jumped on the computer bandwagon: “So what do you actually DO with a computer?” – and I guess today as much as back then you would answer, “What am I NOT doing with a computer anymore?”
The 8 bit “home computer” revolution started all that, including the stuff we use every day and half-heartedly take for granted – like downloading a PDF on the beach at 100Mbps, while sending videos to loved ones across the globe in half a second.
Before I get too old to remember, let me see if I can piece the story of “Me and The Machine” together (before my brain inevitably turns into that of a retired old gentleman yelling at the neighbour’s dog in a foreign accent).