Why is Meta designing its new AI process?
Meta's AI advances are getting creepier, with its latest project claiming to translate how the human brain perceives visual inputs in order to simulate human-like thinking.
What has Meta outlined in its new research paper?
Meta’s new AI research paper outlines its initial “Brain Decoding” process. It aims to simulate neuron activity and understand how humans think.
What does Meta say about it?
Meta says that its AI system can be deployed in real time to reconstruct, from brain activity, the images the brain perceives and processes at each instant.
This opens up an important avenue to help the scientific community understand how images are represented in the brain and then used as the foundations of human intelligence.
What else does Meta say?
That is unsettling in itself, but Meta goes further, saying that the image encoder builds a rich set of representations of the image independently of the brain.
The brain encoder then learns to align MEG signals to these image embeddings, and the artificial neurons in the algorithm tend to be activated similarly to the brain's physical neurons in response to the same image.
So, the system is designed to think about how humans think in order to produce more human-like responses. That makes sense, as it is the ultimate aim of these more advanced AI systems.
But reading how Meta sets this out is a little disconcerting, especially with respect to how it may be able to simulate human-like brain activity.
Overall, Meta's results show that MEG can be used to decipher, with millisecond precision, the rise of complex representations generated in the brain.
What is the research at Meta trying to strengthen?
More generally, this research strengthens Meta’s long-term research initiative to understand the foundations of human intelligence.
What is the end game of AI research at Meta?
The end game of AI research at Meta is to recreate the human brain in digital form, enabling more lifelike and engaging experiences that replicate human response and activity.
It feels too sci-fi, like we are moving into Terminator territory, with computers that increasingly interact with you as humans do.
Which, of course, we already are, through conversational AI tools that can chat with you and “understand” added context. But further aligning computer chips with neurons is another big step.
What does Meta say further about this project?
Meta says that the project could have implications for brain injury patients and people who have lost the ability to speak. It could provide new ways to interact with people who are otherwise locked inside their bodies.
That would be a fantastic outcome, especially as Meta is also developing other technologies that enable brain responses to drive digital interaction.
What is MEG or Magnetoencephalography?
MEG (magnetoencephalography) measures the tiny magnetic fields produced by neural activity, offering millisecond-level temporal resolution without implants. This project has been in discussion since 2017, and while Meta has stepped back from its initial brain-implant approach, it has used the same MEG tracking to map brain activity in recent mind-reading projects.
So, Meta has a long history of misusing, or facilitating the misuse of, user data, and now it is reading your mind. All for a good purpose, no doubt.
The implications are significant, but again, it is a little unnerving to see terms like “brain encoder” in a research paper.
But again, that is the logical conclusion of advanced AI research, and it seems inevitable that we will soon see even more AI applications that more closely replicate human response and engagement.
It is weird, but the technology is advancing quickly.
Do you want to know more about Meta’s AI research paper?
You can read Meta’s latest AI research paper here.
Do you want to know more?
Click here to learn more about all the new updates on Meta and other social media platforms.