Category Archives: Me and the Machine

If you can’t see your Kindle content on another device

I have several Amazon accounts: one in the US, one in the UK, and one in Germany. Every now and again I de-register one of my Kindles from one account and register it with another, depending on what content I’d like to read and which account it’s available on.

The other day I switched my Kindle Fire from my German Amazon account back to my US account – my main account, containing all my English content. To my surprise, the device registered fine and identified itself as “Jay’s Kindle”, but none of my content was showing up. Likewise, the device was not showing as registered in the web interface.

What was going on? Where was all my content? This had worked not too long ago!

I installed the Kindle iOS app on my iPhone and registered it too – only to find it behaved exactly the same way: no content, and the device did not show up on my Amazon account.

After getting in touch with Customer Service, I can now tell you what happened – and a neat trick for avoiding it, should it happen again. Interested? Read on! Continue reading If you can’t see your Kindle content on another device





How to cure Kindle Fire sync issues

Back in 2011 I bought a first-generation Kindle Fire in the US. It hadn’t been released anywhere else, and this device started Amazon’s whole Kindle tablet business.

It’s still working, and I’m still using it as a “bedside” Kindle (my Kindle 3, or Kindle Keyboard, doesn’t have a backlight, so the Fire is my “reading in the dark” companion).

Trouble is, the Kindle Fire doesn’t always sync my books with other Kindle devices. Sometimes it does, but sometimes it does not – and I never really knew what to do about it.

Until some online research gave me the solution that I’d like to share with you. Just in case this happens to your device.

This fix may work with other (Android-based) Kindle Fire devices too, but I’ve only tested it with a first-generation Fire (serial starts with D01E, firmware 6.4.3).

Continue reading How to cure Kindle Fire sync issues





Computers coming full circle

[Image: Sony acquires cloud gaming platform Gaikai]

I was interested to hear about Sony’s plans for the future of gaming:

It turns out they’ve bought Gaikai, a company specialising in rendering games in a data centre and streaming the results back to you. All your device does is transmit your gamepad’s input.

That means there’s nothing to install locally and no updates or discs to deal with – and more importantly, we don’t need super high-tech hardware at home that has to be upgraded every 3-5 years. Technology upgrades happen in the data centre, and all we pay for is access to the game itself.

Sony say they want to bring this service to the Playstation 4 (it’s currently in beta), the Playstation 3, as well as Bravia TV sets. This could mean a massive back catalogue of 10-year-old games from the PSOne and PS2 era, as well as top titles from the PSP, PS Vita, PS3 and PS4.

It’s right up there with “cloud based office” solutions like Office 365 and the iWork suite – not to mention Dropbox, Flickr, Vimeo and whatever else we use as an external hard disk replacement. I like the cloud movement: it means I’ll have to visit Computer Fixperts for data retrieval much less often, since I tend to have butterfingers with my fragile tech.

Looking back at the beginnings of computing in the sixties and early seventies, we’re now experiencing exactly what was commonplace back then: computer time sharing.

 

The Sixties

Back in the day, computers were so large and expensive that there was no way the likes of you and me could have one at home. But universities and companies had them, housed in something like a control room, with terminals in various other rooms to access the computer. That’s why Unix – and hence Linux – is such a capable multi-user environment: one machine, lots of users logging in and submitting jobs to The Machine.

At first those terminals were local, connected via a thick, heavy cable to The Machine in the same building. Later you could have a terminal at home, dial in via a local (free) phone call, and use the computer. Terminals were little keyboard-and-monitor affairs with practically no computing power of their own.

[Image: DEC VT100 terminal]

Then in the mid-to-late seventies the MOS 6502 processor came out and started the home computer revolution. Over the next few decades the likes of you and me bought computers and ran them at home, and it didn’t take long for technology to become so cheap and ubiquitous that our machines at home (and in our pockets) were better than what was sitting in those custom data centres. Those were the nineties and noughties.

 

Virtual Machines

Remote computers are great for “always on” services such as websites and email – you can rent a full computer in a data centre and manage it yourself if you like. Over the last decade or so, though, it has become more economical to maximise hardware utilisation by creating virtual instances.

Those are “units” that emulate a full machine and react just the same, but in reality they’re just containers running on larger clusters of hardware. Rather than a CPU sitting idle 99% of the time, my machine’s idle time can be put to good use elsewhere, “pretending” to be someone else’s fully fledged machine. The added benefit is that if one physical machine in the cluster crashes, the others can absorb the failure until it’s fixed (much like a redundant disk array).

 

Meanwhile, on your desktop

We’ve reached the point in the home computer revolution where a faster processor, a shinier display with more colour depth, or more RAM aren’t going to make a difference anymore. Neither will faster data lines to the outside world. We have all that and more.

We’re at the end of what the MOS 6502 started in the seventies. Your desktop can no longer be made meaningfully better than it already is. It’s an interesting thing to recognise.

Which leaves the question: what’s going to happen next?

Sure, we can shift everything into The Cloud (THE shittest description for this phenomenon, bar none) and access the same services we already have, but from slower machines with inferior hardware. Those machines could be good at other things instead: they can be small, battery-powered and cheap, like our smartphones and tablets – yet they’d appear as powerful as a fully fledged laptop, because the computing is done in The Cloud. Amazon’s Silk web browser on the Kindle Fire is a good example: running on relatively slow hardware, it pre-renders web pages in Amazon’s data centre and is supposed to deliver a better user experience.

 

So what’s next?

Just like back in the early eighties, when ordinary humans got their hands on the first home computers, all we can think of doing with “The Cloud” is replicating what we can already do without it. That’s not innovation though, is it?

That’s why I think Playstation Now is such a cool idea: have real-time graphics rendered off-site and watch the results – we’ve not seen that before.

I remember when the iPad first came out, and we all thought “this is great for emails and web browsing” – but we could do that on laptops already. Shortly afterwards it became a revolution: all these innovative apps started coming out, turning the iPad into something else and changing our lives. Ray said back then, “Currently the iPad is a placeholder” – meaning society hadn’t yet decided where this was going.

Perhaps it’s the same with The Cloud. It may take another decade to really appreciate where this is going and what the next real innovation is (it’s not 3D or 4K TV, by the way).

Personally I’d like to see a “less is more” approach. What’s happening online is quickly becoming more important to society than what’s actually around us. We need to get out more and care less about who’s writing what on The Internet, regardless of whether it’s some website or some social network. We have other senses that need to be fed too.

I hope both current and future generations (me included) will be able to remember that there are things other than The Cloud, and there are other places in our world than Online.





Machine Language, Assembly and Assembler, Interpreters and Compilers

I finally found out what the difference is between Machine Language, Assembly and Assembler – and how it all fits in with Interpreters and Compilers. For those of you game enough, let me explain what these cryptic terms mean – and how they span computers from the early C64 to today’s high-end laptops.

Interpreters

Something that plagued the early computers was the speed at which they executed things in BASIC – or rather the lack thereof. As nice as BASIC is, sifting through an array of variables and comparing them with a known value does take some time.

That’s not BASIC’s fault though – it’s rather the way it is executed. You see, BASIC (on the C64 and its contemporaries) is an interpreted language. This means that while the computer is working, it’s translating the BASIC statements into something it can actually understand – which is of course not BASIC. All a computer really knows is whether something’s ON or OFF. Computers are truly binary machines – no matter how old or how new they are. So if you tell one to PRINT “HELLO”, some translation work needs to happen for HELLO to appear on the screen – and that takes time.
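To make that concrete, here’s a minimal sketch of the kind of array-sifting loop I mentioned above, written in C64 BASIC 2.0 (the line numbers are part of the language). The interpreter has to work through lines 50 to 70 afresh on every single pass:

10 REM FILL AN ARRAY WITH RANDOM DIGITS
20 DIM A(100)
30 FOR I=0 TO 100:A(I)=INT(RND(1)*10):NEXT I
40 REM SIFT THROUGH IT, COMPARING EACH ENTRY
50 FOR I=0 TO 100
60 IF A(I)=7 THEN PRINT "FOUND A 7 AT";I
70 NEXT I

Nothing in this loop is expensive in itself – it’s the repeated on-the-fly translation of each statement that eats the CPU time.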

That’s what an interpreter does: translate one language into another on the fly – much like people can listen in Spanish, and speak the same thing in English, for the benefit of an audience (usually not for their own pleasure).

The great thing about interpreted languages is that the source code always remains readable. As you can imagine, ultimately the interpreter will throw some ones and zeros at the computer. There’s no way you could make a change to that as it bears no resemblance to your source code.

One alternative for speeding up the programme in question would be to have something like the interpreter go to work BEFORE the programme is executed – ahead of time, and in its own time. Then we could present the translated result to the computer right away, doing away with the on-the-fly translation and saving some CPU power. I guess it won’t come as a big surprise that this is done frequently too: it’s called compiling, and a Compiler does such a job.

Continue reading Machine Language, Assembly and Assembler, Interpreters and Compilers





Me and The Machine, Part 1: The 8-Bit-Age, ca. 1985

While most iOS Developers around the globe are busy learning Apple’s new programming language Swift or playing with early versions of iOS8 and Yosemite, I’m deeply involved in something much less cutting edge. In fact it’s from over 30 years ago, and it’s courtesy of Microsoft:

I’m having fun getting back into BASIC 2.0 as featured on the legendary Commodore 64 (or C64 or CBM 64).

[Image: Commodore 64 computer]

This was my first computer, and I’ll never forget it. The German computer magazine “64er” dubbed it the VC-64, or “Volks Computer” (because Commodore’s previous machine had been sold in Germany as the VC-20, elsewhere known as the VIC-20). It was huge everywhere, but in Germany in particular it was just THE machine to have.

Sure, there were the Amstrad CPC 664 and 464 (re-branded as Schneider in Germany) or the ZX81 and the Spectrum, but they were somewhere in that 5% category of “other home computers”. We never had the BBC Micro – for obvious reasons – and none of my friends could afford an Apple II.

I no longer own the hardware, but some of that early day knowledge is still in me, together with many burning questions that have never been answered. There’s so much I always wanted to know about the C64, and so much I wanted to do with it: write programmes, learn machine language, and generally use it for development. I had no idea that there was such a thing as a Programmer’s Reference or developer tools. Time to get back into it!

Today we have wonderful emulators such as VICE (the Versatile Commodore Emulator), and it’s just like sitting down with my old computer again, on modern-day hardware. I’m even doing it on a plastic Windows laptop for a touch of antiqueness (if I don’t get too annoyed with that).
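If you’d like to play along, here’s the sort of harmless little programme you can type straight into VICE (or a real C64). It POKEs the VIC-II’s border and background colour registers at 53280 and 53281 – the classic first trick every C64 owner learned:

10 REM CYCLE THE BORDER THROUGH ALL 16 COLOURS
20 POKE 53281,6 : REM BACKGROUND TO BLUE
30 FOR C=0 TO 15
40 POKE 53280,C : REM SET BORDER COLOUR
50 FOR D=1 TO 200:NEXT : REM CRUDE DELAY LOOP
60 NEXT C
70 GOTO 30

Type RUN to start it, and press RUN/STOP to break out again.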

Don’t ask me why this piece of computer history has become such an obsession of mine over the last couple of weeks. For some reason it fits in with all the high-end, cutting-edge development I’m doing, and it reminds me how all this super technology started: with cheap plastic that was to change all our lives forever.

I remember the questions from members of my family who had not jumped on the computer bandwagon: “So what do you actually DO with a computer?” – and I guess today as much as back then you would answer, “What am I NOT doing with a computer anymore?”

The 8-bit “home computer” revolution started all that, including the stuff we use every day and half-heartedly take for granted – like downloading a PDF on the beach at 100Mbps, while sending videos to loved ones across the globe in half a second.

Before I get too old to remember, let me see if I can piece the story of “Me and The Machine” together (before my brain inevitably turns into that of a retired old gentleman yelling at the neighbour’s dog in a foreign accent).

Continue reading Me and The Machine, Part 1: The 8-Bit-Age, ca. 1985