I wish this worked but I'm too jaded to believe it will. I'm losing arm function so I have to type by voice and use an eye tracker. voice is great, mostly, except everyone knows what I'm doing. I lose so much privacy because I'm announcing everything I type. also, talking for hours straight is hard on the vocal cords, and I'll never be able to work in an office if the pandemic ever ends.
it looks like this thing works by EMG of the muscles used for articulating speech, btw. it's a "neural interface" only in the loose sense that it's picking up neuromuscular electrical activity at the skin — it's not, like, reading your mind. I'm skeptical they can capture all phonemes without drilling holes. paper here: https://dam-prod.media.mit.edu/x/2018/03/23/p43-kapur_BRjFwE...
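To make the EMG idea concrete, here's a minimal sketch of what such a pipeline might look like: windowed electrode signals get reduced to per-channel features (RMS energy is a common EMG feature) and fed to a classifier. Everything here is illustrative — the channel count, sampling rate, feature choice, and nearest-centroid classifier are my assumptions, not the paper's actual method, and the "EMG" is synthetic noise.

```python
# Hypothetical sketch of an EMG silent-speech classifier.
# Assumptions (not from the paper): 7 channels, 250 Hz sampling,
# RMS features, nearest-centroid classification, synthetic signals.
import numpy as np

rng = np.random.default_rng(0)
n_channels = 7          # assumed electrode count
fs = 250                # assumed sampling rate, Hz
n_samples = int(0.2 * fs)  # 200 ms analysis window

def rms_features(window):
    """Root-mean-square energy per channel: one common EMG feature."""
    return np.sqrt(np.mean(window ** 2, axis=0))

def fake_emg(gains):
    """Stand-in for a recording: noise scaled per-channel by muscle activation."""
    return rng.normal(0.0, 1.0, (n_samples, n_channels)) * gains

# Two imagined subvocalized words with different activation patterns.
gains_a = np.array([1.0, 0.2, 0.2, 1.0, 0.2, 0.2, 1.0])
gains_b = np.array([0.2, 1.0, 1.0, 0.2, 1.0, 1.0, 0.2])

X = np.array([rms_features(fake_emg(g)) for g in [gains_a] * 20 + [gains_b] * 20])
y = np.array([0] * 20 + [1] * 20)

# Nearest-centroid classifier: enough to show the shape of the problem.
centroids = np.array([X[y == c].mean(axis=0) for c in (0, 1)])

def predict(window):
    dists = np.linalg.norm(centroids - rms_features(window), axis=1)
    return int(np.argmin(dists))
```

Real systems are harder than this toy in every way (electrode placement drift, overlapping phoneme articulations, sequence modeling instead of isolated windows), which is roughly where my skepticism about capturing all phonemes comes from.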
The Media Lab, true to its name, is very good at producing attractive demos, just like media outlets. The style is just not familiar to computer scientists or software engineers.
> Beyond that, the system has the potential to seamlessly integrate humans and computers—such that computing, the Internet, and AI would weave into our daily life as a "second self" and augment our cognition and abilities.
Am I the only one to whom this proposition is horrifying even in its most benign version?
I doubt it, given the absolute plethora of dystopian fiction and the real-world pushback against technology that we've seen for centuries.
Personally, I find it exciting. As the site mentions, this could be absolutely massive for people whose interaction with the world is non-default, i.e., those with disabilities. The physical and virtual interfaces we have are often inflexible and designed with 'the norm' in mind. Having an adaptive interface to help those people bridge the gap seems huge.
I recall a spoon that would counteract shaking, allowing someone with (as I recall) Parkinson's to eat without spilling their food. I would love to see that same sort of intermediary interface for all sorts of things.
Personally, I have pretty bad eyesight. I can't drive because of it. Self-driving cars, or even just street signs I could read at night, would be immensely helpful, and I'm pretty close to 'the norm'.
There are maybe some valid concerns. Technology is power and power can be abused. We see that today. But ideally we'll adapt and regulate that power for societal good, as we have done with power historically.
AGI is coming. In a contest, the only chance humans have is to work in tandem with AI.
Horror in the face of these changes seems to presuppose that humans have some inherent essence to lose. We're material. Nothing of consequence to our consciousnesses is beyond or beneath that. The more quickly individuals can shed any sense of preciousness with regards to being human, the more quickly they can integrate with the tools we'll need to survive.
All human knowledge is accessible through language. Gopher and GPT-3 emerged within months of each other. Within the next few iterations, one of these models will be able to change and reproduce itself.
The tide is coming in. Better to grab a surfboard than stand at the shore shaking your fist at the ocean.