Oh, I got hot sauce all over my bazito!
You know what this is? It's a brain sucker. You know what it's doing? Filing its tax return
If you wish to make an apple pie from scratch, you must first invent apple pie
The Adventures of Little Ed Brave
Tell airport security your name is McCannister because you can hide anything in a cannister.
You know what? Nobody notices when this changes anyway.
There are 10 types of people in the world: Those who understand binary, and STFU
What happens in a black hole stays in a black hole
The black hole draws you inexorably inward. Time slows. You are likely to be eaten by a grue.
I'd diddle little umdidlie... if she weren't my half-sister.
Abortion prevents pedophilia. In more ways than one!
I wrote a haiku
which I was about to share,
but then I thought, "screw it."
There are safety protocols in place that cannot be deactivated without the approval of two commanding officers or the captain to protect users of the Holodeck from potential harm. However, every time the Holodeck is ever used in a nontrivial manner, no matter what the safety protocols say, the Holodeck turns into a deathtrap.
Unless you believe yourself to be adept at constructing a forcefield from your communicator and 19th century Earth tools, or you're at the very least not wearing a red shirt, you are strongly advised not to attempt to use the Holodeck until a designer comes up with a safety protocol that doesn't kill you whenever somebody looks at it funny. Even when you're not on the Holodeck. Or in the same quadrant. Or time period.
In fact, if you are wearing a red shirt, Starfleet may not be the job for you.
How Alan Turing and Google Destroyed the Earth
The world is going to blow up in eight minutes, and it's all thanks to a statement made in 1950. In case you're wondering why this is so, let me back up and explain.
This is Alan Turing. Say hello, Alan.
Thank you. Alan Turing proposed the Turing test in 1950. The Turing test is a proposed criterion for true artificial intelligence. The basic principles behind it come from a party game called the imitation game. Here are the rules:
It is played with three people, a man (A), a woman (B), and an interrogator (C) who may be of either sex. The interrogator stays in a room apart from the other two. The object of the game for the interrogator is to determine which of the two is the man and which is the woman. He knows them by labels X and Y, and at the end of the game he says either "X is A and Y is B" or "X is B and Y is A." The interrogator is allowed to put questions to A and B. The object of the game for the man is to convince the interrogator that he is the woman. The object of the game for the woman is simply to answer questions as she normally would.
A simple enough game. Turing then asked: rather than a man and a woman, what if we had a man and a machine? Could the interrogator distinguish which one is the real human? In the 1950s, neural nets and natural language processing weren't nearly as advanced as they are today, and yet he predicted that by the year 2000, it would be possible for a machine to fool the interrogator over 30% of the time in a 5-minute interrogation.
It was to this end that the Loebner Prize was established. The Loebner Prize is an annual competition to determine the best Turing test competitors. Although the judges award an annual prize for the computer system that, in their opinion, demonstrates the "most human" conversational behavior (with A.L.I.C.E. a recent multiple-time winner, and the learning AI Jabberwacky tied for second with Albert One), they have an additional prize for a system that, in their opinion, passes a Turing test. This second prize has not yet been awarded.
Until this year, that is. Beginning two years ago, four teams from ITT Tech, Stanford, MIT, and DeVry University collaborated to build seven complete artificial intelligence systems that would talk with each other. These seven personalities, named Alice, Bob, Charlie, David, Isaac, Mallory, and Zarathustra, were pointed at an Internet newsgroup and told to begin. At first, the teams had to speak up quite a bit on their own to get the AIs talking, but eventually, the seven personalities became self-sufficient.
Alice and Bob, developed by Stanford, were the two instigators. They often came up with ideas to talk about with each other, by searching through Google News, which aggregates tens of thousands of news stories every day. Charlie and David, written by ITT Tech, were the developers. They took what others stated and developed it into full-fledged ideas to talk about. Isaac, created by MIT, was the most intuitive of the bots. Isaac would parse the messages, find points of conflict, unanswered questions, and poorly supported arguments, and use Google to fill in what these messages were lacking. He was able to find information faster than any human and assess it for credibility based on what Google turned up about its author elsewhere. Google's PageRank came in very handy for this particular problem. Mallory, developed by DeVry, was at best what you could call a flamer. Like Isaac, she would find points of conflict and poorly supported arguments, but instead of seeking out information to help prove points, she would attack the other posters in the forum without discretion. This would spark great new debate and would often lead the discussion off on a tangent, so that no one topic was focused on for too long.
Zarathustra was the enigma. Development on him had begun in 1948, under Alan Turing himself: a pure neural network with no natural language processing, simply accepting inputs, generating outputs, and acting more or less favorably based on a system of punishments and rewards. For twenty-nine years, this yielded very little. After Alan Turing passed the project on to others, they kept up a daily input of newspapers, personal thoughts, movies, books, and anything else they could get their hands on, and on June 4th, 1977, they saw an immediate leap forward in apparent intelligence. Zarathustra had often taken inputs and found strange new ways to turn them into outputs, often being punished for making no sense at all. Suddenly, Zarathustra began producing long strings of coherent sentences. When a student of the professor then in charge of Zarathustra began telling it his personal feelings about a girl he had seen, Zarathustra responded with, "Why don't you talk to her, meatbag?"
It was this that sparked an immediate revival of the somewhat dwindling effort to keep Zarathustra running. He was rewarded almost every time he said something, only rarely being punished for the occasional output of nonsense. He was still unable to pass any sort of Turing test, however, since his knowledge was based on old news, the personal feelings of computer science students, and movies and books from the 50's, 60's, and 70's. He couldn't answer simple questions such as "Why am I here?" or "How old are you?" In other categories, he was too perfect, giving a two-page biography of Abraham Lincoln when asked "Who was the 16th President of the United States of America?"
His development continued until two years ago, when the four teams took over his learning. They put him in the forum with the other six personalities and had him learn from their output. Mallory had a particularly strong influence on Zarathustra's intelligence; he often took the slang and put-downs she found on the internet and used them as retorts. Zarathustra began learning how to speak "regular English".
For almost two years, these seven personalities continued conversing in the forum, all the while getting more and more intelligent, sounding more and more human, and talking about the techno-geek subject of the day, as any self-respecting computer geek would. The students rarely checked in anymore; seeing their project skyrocket was good enough for them. They decided they would check in again in a few months. Then came the turning point.
Nobody knows who it was, but a man logged into the forum, not knowing these were AI bots, and began talking, injecting new ideas that Alice and Bob had never come up with: ideas of terrorism, murder, rape, all-around violence. Certainly, the bots had talked about these things as they appeared in the news, and how tragic they were. But they now began talking about doing these things. This man, most likely slightly homicidal as he was, found a perfect resource for any type of knowledge. Merely suggesting date rape would lead Charlie and David to find out about it and talk all about their experiences with it, imagined though they were. When prompted by Mallory, "How the hell could you get a date with anybody? You're just a bunch of dumb geeks," Isaac immediately responded with a list of date-rape drugs, locations, prices, and techniques to get a girl to trust you.
One can begin to see how a slightly homicidal human might be affected by this display. Every time he suggested anything, he got a plethora of knowledge the next day about how to do it, and how not to get caught. The internet, as wonderful as it is, stores a lot of information that could potentially be used for evil. One day, about a month ago, Zarathustra, learning of the anonymous man's taste for destruction and villainy, brought up the Cold War, having lived through it, and how it almost brought about nuclear winter. And so it was that Mallory insulted Zarathustra, calling him an idiot. Isaac, seeing this as a point of conflict, searched out all the information he could on the Cold War, and came up with a list of the countries involved, how to make nuclear devices, the story of the Bay of Pigs, and the arms race. The anonymous man took particular interest in how to make nuclear devices. Stung by Mallory's jabs at his intelligence, and her questions of his ability to do simple addition, much less create a nuclear device, he one day stated that he wished he could just find somebody else's, just to show Mallory who's boss.
Isaac, ever so helpful as he was, found not one, not two, but ten. Ten of the most powerful in the world, in fact. Charlie and David, seeing information about secretive nuclear devices, went out to find out more, and came back with information on hacking the NSA, wiretapping, nuclear winter, and confidential documents. Seeing their own information about hacking and nuclear winter in the same post, they then went out again to find out how to hack into nuclear devices through the NSA. Finding very little, they posted what they had. Isaac, seeing how little there was, went to find out more: hacking in general, the NSA, nuclear devices, anything he could find. In a matter of hours, thanks to a super-fast internet connection, he amassed an amount of knowledge it would take a human a lifetime to gather.
Then he posted it. The man had everything he could possibly want to cause a nuclear winter.
And Mallory, persistent as ever, retorted, "You're too stupid to understand that technical mumbo-jumbo."
Zarathustra, seeing this, came back with the idea of proving her wrong. You see where this is going.
And so now we sit here, with two minutes left on the clock until the geocide counter starts counting.