Researchers at Purdue University have recently shown that artificial intelligence can take a small step toward reading the human mind. By decoding human thoughts, AI of this kind might eventually be used to interface our brains with computers.
To achieve this, the researchers first built a model of how the human brain takes in and processes visual information. Three female volunteers watched hours of short videos while a functional MRI (fMRI) machine measured activity in areas of their brains such as the visual cortex. A convolutional neural network, a type of artificial neural network commonly used for image processing, then learned to associate images in the videos with the activity they evoked in the participants' brains. As the women watched more videos, the network became able to predict what brain activity would occur.
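The "encoding" idea described above can be sketched, very loosely, as learning a mapping from stimulus features to measured voxel activity. The following is a minimal illustration using synthetic data and simple ridge regression; the article does not describe the team's actual model, so every name and number here is an assumption for demonstration only.

```python
import numpy as np

# Hypothetical encoding-model sketch (NOT the Purdue team's method):
# learn a linear map from video-frame features to voxel activity,
# then predict the brain response to new, unseen frames.
rng = np.random.default_rng(0)

n_frames, n_features, n_voxels = 200, 50, 30
X = rng.normal(size=(n_frames, n_features))          # frame features (synthetic)
W_true = rng.normal(size=(n_features, n_voxels))     # unknown feature->voxel mapping
Y = X @ W_true + 0.1 * rng.normal(size=(n_frames, n_voxels))  # fMRI-like responses

# Ridge regression: W_hat = (X^T X + lam*I)^-1 X^T Y
lam = 1.0
W_hat = np.linalg.solve(X.T @ X + lam * np.eye(n_features), X.T @ Y)

# Predict responses to held-out frames and compare to ground truth
X_new = rng.normal(size=(20, n_features))
Y_pred = X_new @ W_hat
Y_actual = X_new @ W_true
corr = np.corrcoef(Y_pred.ravel(), Y_actual.ravel())[0, 1]
print(f"prediction correlation: {corr:.2f}")
```

In practice the features would come from a deep network's internal layers rather than raw pixels, but the core step, fitting a predictive map from stimulus to brain response, is the same.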
The scientists were also able to use a second neural network to run the process in reverse, predicting what the participants were watching based solely on their brain activity at a given moment. With roughly 50% accuracy, the algorithm could tell, for example, whether the women were watching a short clip of a bird or of an airplane. It could even guess what other people were watching about 25% of the time, despite never having been trained on those people's brain data.
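The "decoding" direction, guessing the stimulus category from a voxel pattern, can likewise be sketched with a toy classifier. This is a deliberately simplified nearest-template illustration on synthetic data; the article does not specify the actual decoder, so the category names and noise levels below are invented for demonstration.

```python
import numpy as np

# Hypothetical decoding sketch (NOT the study's actual decoder):
# given a voxel activity pattern, guess which category of clip
# (bird vs. airplane) the viewer was watching, by comparing the
# pattern to a stored template for each category.
rng = np.random.default_rng(1)

n_voxels = 40
templates = {"bird": rng.normal(size=n_voxels),
             "airplane": rng.normal(size=n_voxels)}

def simulate_scan(label, noise=0.5):
    """Fake fMRI pattern: the category's template plus measurement noise."""
    return templates[label] + noise * rng.normal(size=n_voxels)

def decode(pattern):
    """Return the category whose template is closest to the pattern."""
    return min(templates, key=lambda c: np.linalg.norm(pattern - templates[c]))

labels = ["bird", "airplane"] * 25
correct = sum(decode(simulate_scan(lab)) == lab for lab in labels)
print(f"decoding accuracy: {correct / len(labels):.0%}")
```

Real decoders face far noisier, higher-dimensional data, which is why accuracies like the reported 50% (within-subject) and 25% (across subjects) are notable rather than trivial.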
While all this may sound somewhat unsettling, it opens up a number of possibilities for the future. Technology like this could allow people to record and translate their dreams, or to interact with a computer using thought alone. It might even allow those who have suffered strokes, or who lie in comatose states, to communicate with us.
Images: Pixabay, Oxford