I would love to be a fly on the wall at some of their meetings when someone says "So we are going to do data collection on minors without parental consent now" and all the developers just say "Great, I'll code that up straight away." Like, is there no dissent at all? Is there no one at all who says "Guys, this is not OK"?
There's some powerful dissonance going on. I once interviewed an engineer. We spent a good 15 minutes of the hour-long interview talking about the moral implications of social networking, the potential for powerful technology to cause harm, and what a developer's imperatives were. He was completely on the "side of right": must use powers for good, data harvesting is bad, Facebook is an immoral business, etc.
He turned down our offer and instead took a job building a data analytics platform at Facebook. His explanation? "Oh, it was just too exciting tech not to work on."
Facebook officially prohibits users younger than 13, mainly because US law imposes a lot of special requirements for children under that age ( https://en.wikipedia.org/wiki/Children%27s_Online_Privacy_Pr... ). I'm sure the app doesn't have any special age-based code, because anyone with a Facebook account has already told Facebook they're 13 or older.
Amazonians reading this: I'd like to know how often it causes someone to rethink their proposal when you use the magic words "disagree and commit". Does it ever happen?
I mean, consider this scenario: A, B, C, D, and E are in a meeting to discuss a new project that A has planned. D and E like the idea. B is not committed either way. C says they can't see the project being a good thing, but uses the magic words "disagree and commit". How often does that cause A to go back and say, "OK, maybe my idea was not good"?
In reality it's more like "agree and submit" to the manager/PM making the proposal. Don't want to do the work because it goes against your morals or personal beliefs? That's fine; they'll pivot you out of the company real quick.
There is an assumption that developers "knew" how the code they were implementing was going to be used. In a big company like FB, the context under which an IC or a team develops code may be far removed from how it gets used. I can imagine a scenario where someone developed this code for testing the FB app on devices, another engineer had a similar need and morphed it into a different product, and so on.
The solution to these problems is oversight from security, compliance, and privacy teams over all systems dealing with consumer information, plus privacy education for all employees on a regular basis. GDPR is a step in that direction.
Turns out a lot of people are willing to be unethical for a lot of money. If you can convince people to murder other people and render them down to soap, you can probably convince people to build spyware. You can even pretend that such data will never be used in a prejudicial fashion, say, to deny people employment in the future based on some social score.
Come on guys, we just want to build a better ad. What could possibly go wrong!