
These are two different but slightly related topics, which are being conflated with a third.

Google is not censoring on moral grounds here. It's purely financial. If they are caught hosting "how to circumvent DRM" material, then a number of the licensing agreements they have with major IP owners, which allow them to profit from music, video and other IP, disappear. Most of the takedown activity is either keyword-based or automatic, depending on who is reporting.

The Online Safety Act is utterly flawed, to the point that even Ofcom really don't know how to implement it. They are reliant on consultants from Deloitte or whoever, who also have no fucking clue. The guidelines are designed for large players with a good few million in the bank, because in reality that's who Ofcom are going to take to court.

There are a number of things the act asks to happen. Most of them are common sense, but they require named people to implement them (i.e. moderate, provide a way to report posts, allow transparent arbitration, etc.) along with defined policies. In the same way that charities are allowed to have a "reasonable" GDPR policy, it seems fair that smaller sites should get the same latitude, but this would go down badly with the noise makers.

As for age protection, they also really don't know how to do it practically. This means that instead of providing a private (as in curtains, no peeking) age assurance API, they are relying on websites to buy in a commercial service, which will be full of telemetry for advertising snooping.
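To make the "curtains, no peeking" idea concrete, here is a minimal sketch of what a private age-assurance flow could look like: the site only ever receives a signed over-18 yes/no attestation, never the user's identity or date of birth. Everything here is hypothetical (the provider, the shared key, the token format are all invented for illustration); a real scheme would use public-key signatures, not a shared secret.

```python
import hmac, hashlib, json, base64

# Demo-only shared secret; a real scheme would verify a provider's
# public-key signature instead.
PROVIDER_KEY = b"shared-secret-demo-only"

def issue_token(over_18: bool) -> str:
    """Hypothetical assurance provider: returns only the claim plus a MAC.
    No name, no date of birth, no tracking identifier."""
    claim = json.dumps({"over_18": over_18}).encode()
    mac = hmac.new(PROVIDER_KEY, claim, hashlib.sha256).hexdigest()
    return base64.b64encode(claim).decode() + "." + mac

def verify_token(token: str) -> bool:
    """What the website checks: the MAC is valid and the claim says over 18.
    The site learns nothing else about the user."""
    claim_b64, mac = token.split(".")
    claim = base64.b64decode(claim_b64)
    expected = hmac.new(PROVIDER_KEY, claim, hashlib.sha256).hexdigest()
    return hmac.compare_digest(mac, expected) and json.loads(claim)["over_18"]
```

The point of the design is that the trust boundary sits at the provider: the website delegates the identity check entirely and only consumes a verifiable boolean, so there is nothing for it (or an embedded ad network) to snoop on.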

Then there is moral/editorial censorship, which is what you go to a media platform for. Like it or not, you choose a platform because the stuff you see is what you expect to see there, even if you don't like it. YouTube is totally optimising for views, even if it means long-term decline. (Same with Facebook, Instagram and TikTok.)


