But have the scans got cheaper?
It’s possible that acceleration techniques have kept the cost from rising further, but key parts of the cost are not getting cheaper. Staffing costs are just so high.
Pricing a scan based on scanner time doesn’t really work.
That’s a non-sequitur. You don’t need to defend AI; the parent comment isn’t attacking it, simply making an observation.
> doesn't mean you ban hammers
They didn’t suggest banning anything.
> You can kill with hammer
Not if you don’t have a hammer available. Which is the point. Ready access to a tool makes misusing the tool easy. And some tools are more conducive to misuse than others. You can kill maybe a couple of people in a crowd with a hammer, a few more with a handgun, a ton more with a machine gun or a bomb. The tool itself matters, and you should regulate each according to its capacity and likelihood of harm. For example, plenty of countries restrict gun use significantly more than the US. Those countries have far fewer gun-related deaths and much less gun violence. This isn’t (shouldn’t be, in an honest discussion) hard to understand.
The tools we use are not neutral. A sword can be made to work like an axe, but we use axes for chopping wood because a sword makes a shitty axe. A sword is designed to kill people. The handle, the mass, the weight distribution, and every other aspect I am not qualified to get into, mean swords are designed to kill. They are a tool, and their use is not neutral.
This is a clear example, but I don't believe any tools are neutral. Your immediate fallback was to a hammer, not a mouse, with the obvious corollary being to bludgeon, but the same logic applies. Tools are not neutral, and that's why, when you looked for something that causes harm, you grabbed something that's objectively been serving a dual purpose for hundreds of years. Nobody's using a computer mouse to bludgeon someone to death; it makes a shitty bludgeon, and the design of the tool reflects that.
That's also why these comparisons always fall back to knives, or hammers, or the AK-47: they are dangerous tools designed to make killing easier. Nobody is making these comparisons to more benign tools, like desk lamps, coffee cups, or car stereos, because tools are not neutral, and none of my examples are designed to make direct bodily harm easier.
The fact that you had to find an article from three decades ago for an instance of killing with a keyboard is telling. The other examples aren’t exactly recent either and are mostly isolated cases. Meanwhile, for gun-related deaths, there are entire Wikipedia pages:
Meanwhile, pages of deaths perpetrated with household items are curiosities. Your parent comment stands: tools are designed for specific purposes and are used for those purposes.
Thank you for the laugh, wasn’t expecting that. Though I have a modern external Apple keyboard, which is not that weighty, but it is metal and fairly thin with sharp corners. It could do some damage.
My larger point is that nobody - nobody - defaults to telling us the coffee mug is unregulated, as AI allegedly ought to be. They always compare it to something much more commonly used as a weapon; something that, when asked to name a household object likely to be used as a weapon, the average person would guess.
Instead of comparing AI to any other tool, especially one closer to "useful with a computer", the common comparison is always a weapon of some kind.
If the design of tools were neutral, one tool should do as well as another in this common comparison. But the useful application of a tool is inherent in its design.
If tools were neutral, as so many on this site claim, why is AI only ever compared to knives and hammers?
The parent has lots of links to other common objects causing harm; why are those never used as the example when tools are allegedly neutral? That would be a stronger argument opposing AI regulation: Ethernet has fewer regulations than knives, but can still be used as a murder weapon.
Hammers are kind of just the prototypical tool, but I've definitely also seen comparisons to keyboards, paintbrushes, and traditional digital tools.
> why are they never used as the example when tools are allegedly neutral? That would be a stronger argument opposing AI regulation
The argument is strongest if pointing to tools that have larger potential impact yet are still widely considered neutral and not/loosely regulated.
"We should consider AI a neutral tool and not heavily regulate it because we do the same for drink coasters" is not convincing, because there's not all that much you can do with a coaster.
Could you ever buy it?