I think you're missing the point here, because I'm not saying anything along those lines.
What I'm saying is that this is a dual-use technology, dangerous beyond a certain point, so at the end of the day it might need to be regulated or banned.
In an ideal world we shouldn't need this, but we don't live in that ideal world.
For example, I can't independently develop an amateur rocket that can land in an area of my choosing by actively steering itself, beyond a certain accuracy and precision, because that would be a homing missile. By the same token, I can say that this technology can be used to harm other people.
Or, I can't get some enriched uranium to build myself a small, teapot-sized reactor to power my house during outages.
Can we say we're censoring research in these areas too, just because they're low-security matters?
The same applies to the latest A.I. developments. However, I'm a bit too busy to open those cans of worms today.
Nuclear technology is indeed low security, but it's a technical problem, and uranium isn't exactly an abundant element. Untrusted data is, by comparison, a problem of stupidity. But stupidity causes problems with any technology. It's a "spoons can harm people" tier problem.
It's possible to buy anything at the right price when it's not regulated or banned under dual-use technology restrictions. I'm sure that, while expensive, I'd be able to get the required equipment for the right money from the usual suspects (i.e., I'm sure there would be microcontroller boards for controlling reactors up to 200 kW, or servo kits for 12-fuel-rod, 12-control-rod configurations, from Adafruit for example).
> Untrusted data is a problem of stupidity in comparison.
In the past, wrong data revealed itself through its lack of coherence. With today's advanced misinformation operations, it has almost become an alternate reality game. A.I. now lets us generate convincing lies at the push of a button. I can only imagine what kind of misinformation bubbles can be built with technology like that.
These technologies attack the lowest-level instincts of humans, the very ones we have deemed utterly reliable for thousands of years. In my mind, they're on the same level as manipulative recommendation algorithms. I put them in the dangerous and harmful category.
This is not a case of stupidity. This is plain old manipulation, of a very dangerous kind.
It's like banning the Iliad because it describes the Trojan Horse.