Most organizations don’t have the resources to let people travel without an actual event on the other end.
I could be wrong, but I suspect visas may be affected as well.
The organizers and attendees will try to coordinate something else. However, throwing something together on Zoom at such short notice is going to be a crap show.
The schedules wouldn’t survive unscathed, since speakers will start dropping out at this point.
If they do manage to get something together, they run the real risk of it being a rickety and frustrating event.
The article itself is ambiguous on this point: "At the time of the publication, Meta admitted subcontracted workers might sometimes review content filmed on its smart glasses when people shared it with Meta AI."
That could be moderation, or it could be labelling new examples for training/validation.
This feels like an instance of weasel words. One can scarcely imagine any reason to do content moderation over people’s own private and personally consumed data.
I’ve worked in trust and safety - for me this is stupid, but well below the threshold of impossible.
Hell, I know of a major firm that decided QA was not needed for their trust and safety process.
Another common issue will be Southeast Asian (SEA) Arabic speakers tasked with labelling Middle Eastern Arabic content, as if accents and regional dialects weren't a thing.
I’ve had people at FAANG firms cry on my shoulder, because they couldn’t get access to engineering resources at their own firms.
There was the famous case of Meta executives overriding T&S policy and telling them what content was newsworthy during the Boston bombing. In a separate incident, they told their team that cartel violence was not newsworthy, when friends in London complained about it.
When you say this is fantasy, what do you mean precisely?
What I mean is: I'm not sure what they base their statement that it's "a common practice among other companies" on. It's unlikely they're talking about their peer companies. I suppose if you read the sentence literally, there surely exist one or more "other companies" in the broad universe of "other companies" that routinely do this kind of stuff. But nowhere serious, I'd think.
I mean, given this happened and the footage was sent to Sama, it seems pretty clear that the images generated from this were being fed into a labelling pipeline somewhere.
There’s probably an opt out / opt in clause somewhere in the terms and conditions, which makes it feasible for Meta (and other firms) to use this data.
Meta could at least pretend that they don't intend to capture people in their most intimate and vulnerable moments instead of slobbering on the sideline like "mm... Data..."
Imagine someone pulling up a smartphone and then recording everything that happens around them. Contrast that with someone wearing smart glasses and doing that exact same thing.
On a separate note (and this is a genuine question): are you by any chance aware of the term non-consensual intimate imagery (NCII)?
I am beginning to suspect that the average HN goer isn’t aware of the scope and scale of the Trust and Safety problem.
Someone pulling up a smartphone on me would feel hostile because it's violating a social contract. Maybe I'd feel betrayed and attacked if it turned out someone was recording me using glasses, but I don't know, I don't care about dashcams and this is not that much different. I imagine it feels bad and scary for women when someone takes creepshots of them, and this tech does open opportunities for that. Maybe that would be enough for me to hate glasshats if I had a bit of empathy. But isn't the genie already out of the bottle with 'deep nude' models available for everyone forever?
No, I don't think I've heard of NCII before, and "Trust and Safety" sounds like some corporate PR whitewashing term to me.
They don't care. Or they refuse to realize that tech isn't the solution to it, but an amplifier of its scale.
Can tell you that my urge to take photos/record drastically dips around other people. Particularly if it were meant for any sort of commercial exploitation. Stephenson called people wired for max indiscriminate data collection/processing "gargoyles". Personally I prefer glassholes.
Safety and user pain is a part of tech which seems largely ignored, even on sites like HN.
I really have no idea why this ignorance prevails; commenters seem to genuinely be unaware of what goes on in Trust and Safety processes.
I mean, most users would complain about content moderation, but their experience would be miles ahead of what most of humanity enjoys when it comes to responsiveness.
I believe this lack of knowledge, examples, and case history is causing a blind spot in tech centric conversations when it comes to the causes of the Techlash.
Unfortunately this backlash is also the perfect cover for authoritarian government action - such governments come across as responsive to voters while also reining in firms that are more responsive to American citizens and government officers than to their own.