The Thought Processor
>That’s even worse than mine. Ah, he speaks Arabic and I was going to finish any of them, though…
>Sure, why not? ;) At 3:30 am, the whole cross gender thing again… and it’s like from the camera.
>It is. But I’m not even sure how I left yesterday. I hope threers’ not like some people think I’m in eternal debt to my headphones…
>I really need to gnaw on something.
>Ah, well, in that way. It’s not like this which make me hate him or anything (In fact, he doesn’t have individual fingers, just a hazard of the car during an accident. Seatbelts are good at getting 0 all the time when I have something and not walking around, you won’t be able to count at that picture from the left side of the future…
>Nope, that doesn’t work, blame sunspots.
>It means something?
>You’ve never played Castlevania 3??!?!?!?!?!!!!??????????
Back in college, one of the most entertaining class assignments I had was the one for Dr. Matthews’ 341 course, where we learned about hash tables. The assignment involved reading in a text file from Project Gutenberg, running it through a processor, then spitting out a paragraph or two of English-like sentences. They weren’t completely random rearrangements of words. On the contrary, they usually had some form of coherent sentence structure, even though the sentences themselves were often incoherent (and generally quite entertaining).
The section at the top of this post is a sample of what this application produces. The input was some old chat logs. The output is blazing insanity.
This application, and its ability to produce English-like streams of utter nonsense, will be at the core of the Echo Chamber. Here’s how it works:
- Read in a source of text.
- Split the text into words.
- For i = 0 to words.length - 2: wordTable[words[i]].Add(words[i + 1])
- For x = 0 to outputTerms: outputString += currentWord + " "; currentWord = wordTable[currentWord][rand()];
- Print outputString.
Or something like that.
In other words, when processing the file, any time you come across a word, add the word that follows it to a list of words that are associated with the first word. Then, when producing the output, select a word to get started, then print that word, and randomly select one of the subsequent words to use as the next word, then repeat the print/select process.
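If you want something a bit more concrete than that, here’s a minimal Python sketch of the same idea. The names (build_table, generate) are just placeholders I picked for illustration, not the original assignment’s code, and there’s no attempt at punctuation handling or anything fancy:

```python
import random

def build_table(text):
    """Map each word to the list of words that followed it in the source."""
    words = text.split()
    table = {}
    for i in range(len(words) - 1):
        table.setdefault(words[i], []).append(words[i + 1])
    return table

def generate(table, start, count):
    """Walk the table, randomly picking one of the successors each step."""
    current = start
    output = []
    for _ in range(count):
        output.append(current)
        followers = table.get(current)
        if not followers:  # dead end: this word never had a word after it
            break
        current = random.choice(followers)
    return " ".join(output)
```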
In other words, if you had the input “A B C B D E B A F”, your table would end up something like this:
A: {B F}; B: { A C D }; C: { B }; D: { E }; E: { B }; F: null
And you’d be able to produce wacky sentences like “A F” or “A B A B A B C B D E B A F”.
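Feeding that same toy input into the sketch above would play out roughly like this (the generated string varies from run to run, since each successor is picked at random):

```python
table = build_table("A B C B D E B A F")
print(table["B"])                 # ['C', 'D', 'A']
print(generate(table, "A", 20))   # e.g. "A B A B C B D E B A F" -- stops early at the dead end F
```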
…
Oh, forget it. That’s too complicated. Here’s a picture instead:
The key is that it’s not completely random gibberish. It only contains words that were used in the original source, and more importantly, every pair of words was used in the original source. So, if you see a string like “America the Hedgehog”, it’s because it started with “America”, randomly chose the word “the”, which was in the source due to the line “America the Beautiful”, then looked up words that followed “the” and found “Hedgehog”, because “Sonic the Hedgehog” was somewhere else in the source. That’s what gives it a semi-English sentence structure, without any parts of speech analysis and without any knowledge about rules of grammar. It simply strings together word pairs that a human once used, so verbs follow nouns, articles follow prepositions, and infinitives get split.
In other words, the whole thing is one big iterative “Before and After” puzzle from Wheel of Fortune.
A bunch of lazy people who want you to feel that lightheaded, anxious feeling again. It makes it uncomfortable. ;)