Katrina Payne
4 min read · Dec 13, 2016


> Nobody smart enough is afraid of killing robots in a terminator-like scenario,

That… kind of runs contrary to what your opening article was about. That whole “we need to stop being scared of them” bullshit that I am annoyed with.

Also, your last message kind of indicated that you are scared that this might be possible.

If you weren’t scared, you’d not say I shouldn’t get my hands on such technology. You would have stated that “yeah — you are not going to succeed at it.”

So… yeah… you are just sounding more ridiculous in this interaction.

But then… you also didn’t really read any of this interaction all that closely. Apparently not even the stuff you yourself are writing.

> How long in the AI creation process have you got, again?

I am going to reiterate for you, since you’ve not been able to understand me talking to you, what with how ridiculous you truly are.

Far enough to realise that making AIs that kill all humans is not a likely end-case scenario, even if you are specifically attempting to do this.

> Unless you’re more of a genius than any other human in existence, you haven’t gotten that far.

What is to say that I am not?

Oh right… your ego, now that you are realising how silly your original article truly was.

Oh… and my avatar image of me putting various delicious video game related plastic in my mouth because it is tasty as all fuck. I suppose that might have me appear questionable.

> And based on your comments and extremely short time and range of thought, you haven’t got far.

What are you basing this notion of my short time and range of thought on? My comments? The ones you are clearly showing an inability to read? Every reply further shows your lack of knowledge and general silliness on this matter, along with your tendency to contradict yourself.

Look, here is the thing… the fear of killer AI is just THAT ridiculous. You could literally attempt to purposely build such an AI and have it not succeed at the task, regardless of whether it is able to learn or not. I know — I’ve looked into figuring out how you’d do that.

It isn’t something anybody has to worry about happening accidentally, as it isn’t even something people could pull off in a full-on, serious attempt to get it done.

And considering you are now jumping around in your statements… talking with you has gotten annoying.

>Read “Super intelligence” and then we can talk.

Eh, I cannot afford it. Maybe if my local library has a copy I might look into it.

Otherwise I’ll have to stick to reading stuff that isn’t total plain nonsense made for people to wank over. In that I usually prefer to be much more honest about my wank material.

> Nobody smart enough is afraid of killing robots in a terminator-like scenario, we’re concerned of inhumanly smart minds capable of rendering us, slow biological creatures, unnecessary.

Wait… that MAKES EVEN LESS SENSE. Unless you are talking about a time scale of less than twenty years, that is even more nonsense.

That is the crazy stuff you’re worried about? Seriously? That?

First off: humans have ALWAYS been unnecessary.

Second off: look at the issue of having your id and ego clash with each other on this matter.

You’d have had a better argument back when they were destroying looms, since those machines did, to some extent, leave people without a way to get food.

You are scaring yourself over total and utter nonsense.

I assumed killer robots, as those seem to be much more of a real threat.

Hell, the internet is still in jeopardy of SQUIRRELS taking it out.

Are you literally just spending this time operating on pure Luddite wank material?

Look, here is the thing: after automated computers made people’s jobs easier to accomplish… have people been spending more time working, or less?

Humans will just find something else to work on, once robots have an area covered.

> As they say, humans don’t hate ant, but they don’t mind killing every single one of them when they’re making a house on top of a nest.

There is a difference between “killing a single city of humans” compared to “killing all the humans”… like a huge difference.

First off… killing off a single city because you are putting down your own house is something that would piss off humans — but not result in the complete destruction of humans.

Just like it hasn’t stopped the ants from continuing to exist.

Second off: humans have done this to ourselves many many times in history — and we have still survived.

You pretty much are turning Technological Sodom and Gomorrah into an “end of the world scenario” when it really truly isn’t.

And remember: all Sodom and Gomorrah had to do to avoid their fate was be charitable. (Yeah… it turns out it was their not giving to charity that truly ticked off the Bible’s protagonist, who took them out.)

I have no reason to doubt that Technological Sodom and Gomorrah could be averted via that same way out of “being charitable”, if we look at it through the standard Luddite nonsense you are now peddling.

>I don’t even know why I argue anymore zzz

Mostly to save face… as you know that the Luddite angle is silly for various reasons that you could come up with if you just thought about it…

And the fact that you are not paying attention to what you are blogging has you looking ridiculous.

I mean… you could just post a blog entry of you smashing some looms — you know, to get it out of your system — and just play it up on the ridiculous level. I mean, looms are what is causing the new generation to be lazy and entitled and not as good as the one before it. Damned kids these days.
