AI is developing fast. It seems like only a few months ago that the internet really started talking about ChatGPT and how we might actually use it. Already we’ve moved from GPT-3.5 to GPT-4, which OpenAI claims is far more powerful and capable of more complex answers.

One joke hit critical mass quickly:

OpenAI (the company behind the project) is hiring a Killswitch Engineer.

According to the job description, this individual would spend most of their time twiddling their thumbs, but would stay ready, just in case, to “throw a bucket of water on the servers” if needed.

Seems like it could be easy money for somebody, and presumably not much experience is needed.

Of course, as the jokes have already postulated, if the AI really did become sentient this person would likely be its first target.

Even though it’s a joke (for now), here’s why it’s still relevant.

It wouldn’t be shocking to see a real position like this crop up. Many figures in the tech industry, not least of them Elon Musk, have warned of the dangers of letting AI loose on our way of life.

The sheer speed at which an AI can process massive amounts of data, make decisions, adapt, etc. would make it difficult to control past a certain point.

Are we there yet? No, and not even close, as many would affirm. But the speed at which just this one project, ChatGPT, is advancing has rightly stirred conversation about how we want to handle AI going forward.

A “wait and see” attitude is certainly the wrong one where AI is concerned. While I’m excited about where these projects can go, and it’s not my intention to throw doom and gloom at the category, it seems to me a bit like having a positive experience with a wolf. It’s cool; it’s fun. But never forget that it’s a wild animal with instincts you don’t totally understand.

Handle with care, you might say.

But of all the emerging tech I’ve followed in a long career, this has been the most interesting.