14 April 2007

Yo, Linguists (Parallel networks that learn to pronounce English text)

Hello my linguist friends,

In 1987, two researchers built a very small neural network (a piece of software) that they "taught" how to read aloud. That is, they fed it English text as input, and it output phonemes to "speak" the text. The system, called NETtalk, is well-known work in AI, and now the audio is online!

It is creepy. The several minutes of audio capture some of the system's stages of development, beginning with child-like babble and gradually becoming intelligible. Back in the 1980s, it took only 24 hours to train on a VAX.
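If you're wondering what "feeding it text" actually means in practice, here's a rough Python sketch of the kind of network the paper describes: a small window of letters slides over the text, and the network guesses a phoneme code for the letter in the middle. The sizes below (7-letter window, 80 hidden units, 26 output units) are just my recollection of the paper's setup, so treat them as approximate.

    import numpy as np

    # NETtalk-style setup (sizes approximate / from memory of the paper):
    # a 7-letter window slides over the text; the network predicts a phoneme
    # code (articulatory features) for the letter at the *center* of the window.
    ALPHABET = "abcdefghijklmnopqrstuvwxyz ,."   # 29 symbols, one input unit each
    WINDOW   = 7
    N_IN     = WINDOW * len(ALPHABET)            # 203 input units
    N_HIDDEN = 80
    N_OUT    = 26                                # phoneme-feature output units

    rng = np.random.default_rng(0)
    W1 = rng.normal(0, 0.1, (N_HIDDEN, N_IN))    # input -> hidden weights
    b1 = np.zeros(N_HIDDEN)
    W2 = rng.normal(0, 0.1, (N_OUT, N_HIDDEN))   # hidden -> output weights
    b2 = np.zeros(N_OUT)

    def sigmoid(x):
        return 1.0 / (1.0 + np.exp(-x))

    def encode_window(text, center):
        """One-hot encode the 7 letters centered on position `center`."""
        x = np.zeros(N_IN)
        for i in range(WINDOW):
            pos = center - WINDOW // 2 + i
            ch = text[pos] if 0 <= pos < len(text) else " "
            x[i * len(ALPHABET) + ALPHABET.index(ch)] = 1.0
        return x

    def forward(x):
        """Forward pass: returns hidden and output activations."""
        h = sigmoid(W1 @ x + b1)
        y = sigmoid(W2 @ h + b2)
        return h, y

    # Example: predict the phoneme features for the 'c' in "i can read"
    h, y = forward(encode_window("i can read", center=2))
    print(y.round(2))   # 26 activations; the nearest phoneme code is the answer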

The paper, the audio, and the data are here. Give the MP3 a chance; let it play for a few minutes.

Sejnowski, T. J. and Rosenberg, C. R., "Parallel networks that learn to pronounce English text," Complex Systems 1, 145-168 (1987).

-g

ps I'm not sure exactly how the training worked; that is, how the errors were backpropagated through the network.
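My best guess is that it's plain backpropagation through the one hidden layer: compare the output to the correct phoneme code, compute the error, and push it back through both weight layers. Continuing the toy sketch above (the target phoneme code and learning rate here are made up for illustration):

    def train_step(x, target, lr=0.5):
        """One backprop update for the toy two-layer net above (illustrative only)."""
        global W1, b1, W2, b2
        h, y = forward(x)
        # Output-layer error: derivative of squared error through the sigmoid
        delta_out = (y - target) * y * (1 - y)
        # Propagate the error back to the hidden layer
        delta_hidden = (W2.T @ delta_out) * h * (1 - h)
        # Gradient-descent weight updates
        W2 -= lr * np.outer(delta_out, h)
        b2 -= lr * delta_out
        W1 -= lr * np.outer(delta_hidden, x)
        b1 -= lr * delta_hidden

    # e.g. target = a 26-unit feature code for the phoneme /k/ (made up here)
    target = np.zeros(N_OUT); target[10] = 1.0
    train_step(encode_window("i can read", center=2), target)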

pps Extra! Extra! Sure, you've seen photos of a total eclipse of the sun... by the moon. But at TED they showed a photo of an eclipse of the sun by Saturn, as seen from the Cassini spacecraft. It's on investor Steve Jurvetson's Flickr page.
