> You're telling me cloudflare has to store something on my computer to let them know I passed a captcha?
Yes?
HTTP is stateless. It always has been and it always will be. If you want to pass state between page visits (like "I am logged in to account ..." or "My shopping cart contains ..." or "I solved a CAPTCHA at ..."), the server has to give you cookies, which you return on subsequent requests, that either encapsulate that information directly or encapsulate an identifier the server can associate with that information on its end.
This is nothing new. As gruez said in a sibling comment, this is what session cookies do. Almost every website you ever visit will be giving you some form of session cookie.
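To make the mechanism concrete, here's a minimal sketch of the round-trip. The names (`SESSIONS`, `issue_session`, `lookup_session`) are illustrative, not any real framework's API; the point is just that the client holds an opaque identifier and the server holds the actual state.

```python
# Sketch of the session-cookie pattern: the server stores the state,
# the client stores (and replays) only an opaque reference to it.
import secrets

SESSIONS = {}  # server-side store: session id -> state


def issue_session(state):
    """Server side: record some state, hand back only an identifier.

    In a real server this value would go out in a Set-Cookie header.
    """
    sid = secrets.token_hex(16)
    SESSIONS[sid] = state
    return f"session={sid}"


def lookup_session(cookie_value):
    """Server side: on a later request, recover the state from the cookie."""
    _, _, sid = cookie_value.partition("=")
    return SESSIONS.get(sid)


# The client keeps the cookie and sends it with every subsequent request;
# that replay is what makes a stateless protocol behave statefully.
cookie = issue_session({"captcha_passed": True})
print(lookup_session(cookie))  # the server recovers {"captcha_passed": True}
```

Note that the cookie itself carries no personal data here; it's just a random token, which is exactly what a CAPTCHA-clearance cookie typically is.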
Then don't visit the site. Cloudflare is in the loop because the owner of the site wanted to buy, not build, a solution to the problems that Cloudflare solves. This is well within their rights and a perfectly understandable reason for Cloudflare to be there. Just as you are perfectly within your rights to object and avoid the site.
What is not within your rights is to require the site owner to build their own solution to your specs to solve those problems or to require the site owner to just live with those problems because you want to view the content.
That would be a much stronger line of argument if cloudflare wasn't used by everyone and their consultant, including on a bunch of sites I very much don't have an option of not using.
When a solution is widely adopted or adopted by essential services it becomes reasonable to place constraints on it. This has happened repeatedly throughout history, often in the form of government regulations.
It usually becomes reasonable to object to the status quo long before the legislature is compelled to move to fix things.
Why? This isn't a contrarian complaint, but the problems that Cloudflare solves for an essential service require verifying certain things about the client, which places a burden on the client. The problems exist in many cases because the service is essential, which makes it a higher-profile target. Expecting the client to bear some of that burden in order to protect the service it's interacting with is not, in my mind, problematic.
I do think that it's reasonable for the service to provide alternative methods of interacting with it when possible. Phone lines, mail, and email could all be potential escape hatches. But if a site is on the internet it is going to need protecting eventually.
That's a fair point, but it doesn't follow that the current status quo is necessarily reasonable. You had earlier suggested that the fact that it broadly meets the needs of service operators somehow invalidates objections to it, which clearly isn't the case.
I don't know that "3rd party session cookies" or "JS" are reasonable objections, but I definitely have privacy concerns. And I have encountered situations where I wasn't presented with a captcha but was instead unconditionally blocked. That's frustrating but legally acceptable if it's a small-time operator. But when it's a contracted tech giant I think it's deserving of scrutiny. Their practices have an outsized footprint.
> service to provide alternative methods of interacting with it when possible
One of the most obvious alternative methods is logging in with an existing account, but on many websites I've found the login portal barricaded behind a screening measure which entirely defeats that.
> if a site is on the internet it is going to need protecting eventually
Ah yes, it needs "protection" from "bots" to ensure that your page visit is "secure". Preventing DoS is understandable, but many operators simply don't want their content scraped for reasons entirely unrelated to service uptime. Yet they try to mislead the visitor regarding the reason for the inconvenience.
Or worse, the government operations that don't care but are blindly implementing a compliance checklist. They sometimes stick captchas in the most nonsensical places.
This sounds like "we only save hashed minutiae of your biometrics"