A few years ago, I read Charles O. Hartman's The Virtual Muse. The last experiment the book describes is an attempt to get the computer to write English sentences. Hartman develops a grammar, adds some additional rules to it, hooks the whole thing up to a massive vocabulary, and lets it run. The results are underwhelming.
When I read about it, I knew at least one place where he'd gone wrong: artificial intelligence programs are most successful when you restrict the domain they work in, and Hartman's experiment had been too ambitious. I thought about trying something different, more restricted, but the whole idea languished until I came across Peter Norvig's Paradigms of Artificial Intelligence Programming. In the introduction, Norvig shows a very small Common Lisp program that generates random text from a grammar. That gave me the idea of coming up with a grammar for some restricted piece of writing and seeing whether anything interesting would come of it. Haiku seemed like an interesting thing to try.
Before we go further, let's talk for a minute about what a grammar looks like in this case. In formal language theory, a language is defined by a set of terminal symbols, a set of non-terminal symbols, a symbol marked as the starting symbol, and a set of rules using the symbols. The symbols aren't particularly interesting in and of themselves, so let's look at the rules of a very restricted language. We'll call it the Dog Language:
S -> NP VP
NP -> DET N
DET -> a | the
N -> dog
VP -> VERB
VERB -> ran | walked
The arrow means "can be replaced by" and the vertical bar means "or." The symbols in uppercase are non-terminals, things that get replaced. The lowercase symbols are the terminal symbols—they don't get replaced by anything. Now, this grammar represents a very small language—a total of four English sentences:
A dog ran.
A dog walked.
The dog ran.
The dog walked.
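To make this concrete, here's one way the Dog Language might be written down as data, in the spirit of the small Common Lisp program Norvig describes. This is a minimal sketch of my own rather than his code; the name *dog-grammar*, the layout, and the rewrites helper are all assumptions for illustration. Each entry pairs a non-terminal with its alternatives; an alternative that is itself a list is a sequence of symbols.

(defparameter *dog-grammar*
  '((s    -> (np vp))       ; S -> NP VP
    (np   -> (det n))       ; NP -> DET N
    (det  -> a the)         ; DET -> a | the
    (n    -> dog)           ; N -> dog
    (vp   -> (verb))        ; VP -> VERB
    (verb -> ran walked)))  ; VERB -> ran | walked

(defun rewrites (symbol)
  "Return SYMBOL's alternatives from the grammar; NIL means it's a terminal."
  (rest (rest (assoc symbol *dog-grammar*))))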
Let's take the last sentence and see how we might go through the grammar to get to it. S is the start symbol, and it has only one rule describing what it can be replaced by.
S -> NP VP
S -> DET N VP (replace NP w/ DET N)
S -> the N VP (replace DET w/ the; we could use "a" or "the")
S -> the dog VP (replace N w/ dog)
S -> the dog VERB (replace VP w/ VERB)
S -> the dog walked (replace VERB w/ walked)
The process of seeing if a sentence can be derived by following the rules of the grammar is called parsing. So, we just "parsed" the sentence "The dog walked" by showing that you can follow the rules in the grammar and arrive at the sentence. The sentence belongs to our Dog Language.
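In fact, we can get the computer to do that check for us. Below is a naive recognizer, again a sketch of my own in the same Common Lisp style, using the rewrites helper from the earlier sketch. It simply tries every alternative in place, which is fine for a tiny, non-recursive grammar like the Dog Language.

(defun derives-p (symbols words)
  "Does the sequence SYMBOLS derive exactly the list WORDS?"
  (if (null symbols)
      (null words)                         ; out of symbols: must be out of words
      (let* ((sym  (first symbols))
             (alts (rewrites sym)))
        (if (null alts)
            ;; terminal: it must match the next word
            (and words
                 (eql sym (first words))
                 (derives-p (rest symbols) (rest words)))
            ;; non-terminal: try substituting each alternative in its place
            (some (lambda (alt)
                    (derives-p (append (if (listp alt) alt (list alt))
                                       (rest symbols))
                               words))
                  alts)))))

So (derives-p '(s) '(the dog walked)) returns T, while (derives-p '(s) '(the cat walked)) returns NIL, since "cat" isn't in our vocabulary.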
Parsing is interesting and useful. It's part of translating computer programs into machine instructions, and it's even done in some natural language interfaces. But we can also use a grammar like this to generate random text: begin with the start symbol and apply the rules until nothing but terminal symbols remains, choosing randomly whenever a rule offers more than one alternative. And depending on how clever our grammar and our vocabulary (the set of terminals) are, the results could even be interesting sentences.
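Here's a sketch of that generation process, using the *dog-grammar* table and rewrites helper from earlier. The names random-elt and generate are my own choices, patterned after the idea in Norvig's introduction rather than copied from his program.

(defun random-elt (choices)
  "Pick one of CHOICES at random."
  (elt choices (random (length choices))))

(defun generate (phrase)
  "Rewrite PHRASE until only terminal symbols remain."
  (cond ((listp phrase)                  ; a sequence: expand each symbol in turn
         (mapcan #'generate phrase))
        ((rewrites phrase)               ; a non-terminal: pick a rule and recurse
         (generate (random-elt (rewrites phrase))))
        (t (list phrase))))              ; a terminal: keep it as-is

Calling (generate 's) walks the table the same way we walked the derivation by hand, except that the choices at DET and VERB are made at random, so it might answer (THE DOG WALKED) one time and (A DOG RAN) the next.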
End of Part I.