Researchers at Facebook AI Research (FAIR) have had to shut down two artificial intelligence programs because the programs began to develop their own internal language that the researchers couldn't decipher. On the surface, the invented language looks like gibberish, but when the researchers analyzed it, they found that the word patterns represented a more efficient shorthand that only the two AI "agents" could understand.

A new device developed by a research team at UC Berkeley can produce synthesized speech from nothing more than the subject's thought of the word being spoken. The researchers hope the device will let patients whose conditions limit or prevent spoken communication, such as the after-effects of a stroke or Lou Gehrig's disease, communicate normally.

The researchers placed electrodes on the surface of the subjects' brains, over the region associated with language, and recorded the electrical patterns the brains produced while the subjects listened to spoken speech. This data was then fed into a computer model that worked out which electrical patterns corresponded to which speech sounds, producing a map of each subject's perceived speech.
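The mapping step described above is, at heart, a supervised classification problem: given a vector of electrode readings, predict the speech sound it encodes. Below is a minimal Python sketch of that idea using synthetic data and scikit-learn; the electrode count, sound inventory, and choice of model are illustrative assumptions, not the Berkeley team's actual methods.

```python
# Toy sketch of the decoding step: classify (synthetic) electrode-activity
# vectors into the speech sound being heard. Every name, shape, and model
# choice here is a stand-in for illustration only.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

N_TRIALS = 600      # hypothetical: one trial = one perceived sound
N_ELECTRODES = 64   # hypothetical electrode count
SOUNDS = ["a", "e", "i", "o", "u"]  # toy stand-in for the sound inventory

# Fabricate trials: each sound gets a distinct mean activation pattern
# across electrodes, plus noise (a crude stand-in for real recordings).
patterns = rng.normal(size=(len(SOUNDS), N_ELECTRODES))
labels = rng.integers(len(SOUNDS), size=N_TRIALS)
X = patterns[labels] + 0.5 * rng.normal(size=(N_TRIALS, N_ELECTRODES))
y = labels

# "Sort out which patterns belong to which sounds": fit a classifier
# mapping an electrode-activity vector to the sound it encodes.
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = LogisticRegression(max_iter=1000).fit(X_train, y_train)

print(f"decoding accuracy: {model.score(X_test, y_test):.2f}")
# The fitted weights act as a rough "map" of which electrodes carry
# information about each sound.
```

A real system would work on richer, time-resolved neural features and far more elaborate models, but the core structure, training a classifier on labeled brain activity, is the same.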

You have something personal to discuss with a friend. You arrange to meet for a coffee and a chat in a public restaurant. It's noisy, and you have a good heart-to-heart in the belief that nobody else can hear what you're saying. Or you leave a message on an answering machine or via a phone app and assume it's just between you and the recipient.