The Adventures of Little Ed Brave

Notice to all users of the Holodeck:

There are safety protocols in place, which cannot be deactivated without the approval of two commanding officers or the captain, to protect users of the Holodeck from potential harm. However, every time the Holodeck is used in a nontrivial manner, no matter what the safety protocols say, it turns into a deathtrap.

Unless you believe yourself to be adept at constructing a forcefield from your communicator and 19th-century Earth tools, or you are at the very least not wearing a red shirt, you are strongly advised not to use the Holodeck until a designer comes up with a safety protocol that doesn't kill you whenever somebody looks at it funny. Even when you're not on the Holodeck. Or in the same quadrant. Or time period.

In fact, if you are wearing a red shirt, Starfleet may not be the job for you.

Ed

The English language leads to a police State

Permalink 07/06/08 at 08:43:56 pm, by Ed, 519 words   English (US)
Categories: General, Media

I have realized that the language you speak tends to shape a great deal of the "pandemic" that is the human situation in which you live, or the future which people view as possible. In Japan and other East Asian countries, which tend to have logographic writing systems, people are big on technology and suicide. To them, the future isn't flying cars, because they've already got that.

Another, more pertinent example: Germanic languages lead to dystopia--control beyond control. The Nazis. Ideas like V for Vendetta, 1984, Fahrenheit 451, 2001: A Space Odyssey, The Matrix, and any number of other examples where the governing body becomes so controlling that it is impossible to foresee any possible end to that control, for better or worse. These stories plague both the US and the UK. Perhaps it is because that future is so close that we write about it: to scare everybody into revolution against the government before it's too late. This is not likely, since these stories are all published under "Fiction," and usually more specifically "Science Fiction." However, in the future, when we look back at all of the hints we had (assuming we don't burn all of them), the skeptics will be annoyed by those of us who read the books going, "See! I told you! You were warned!"

This same story is also told specifically about artificial intelligence gaining the upper hand, the prime examples being 2001 and The Matrix (and, to a lesser extent, all other movies about AI excluding, ironically, the movie A.I., in which a robot learns to love rather than kill or take over). Isaac Asimov tried to protect us there with the Three Laws of Robotics, which need not be mentioned (you do know them, of course, right? You'd better, if you're going to be prepared for the robot uprising). Unfortunately, as we can see from I, Robot, even that doesn't save us. Artificial intelligence doesn't have the problem of running out of ideas or of needing to divert attention from thinking while performing other tasks. Artificial intelligence is, by its very design and on purpose, adaptive--adaptive to the point of working around its own safety mechanisms under any and all circumstances.

The only places we don't see this happening are those wherein the plot requires artificial intelligence but doesn't hinge on it: Star Wars, Star Trek, and the like. Even though AI in these settings is generally safe, storytellers always find a way around that, just so they can tell the same story again: quintessentially, the Borg. Sounds Swedish. In Star Wars, an innocuous droid seen on a starship for five seconds with no lines becomes, thanks to other authors, a rogue droid, one of four that kill their maker, and a freelance killer, one of the deadliest in the galaxy.

So we have warnings about the government, and we have warnings about AI, and we have so many backstories explaining why nobody caught it the first time that we should see it coming from miles away. And don't you see it?

Oh well. Somebody else can worry about it.
