- cross-posted to:
- fuck_ai@lemmy.world
that s3kr1t method: https://www.jwz.org/blog/2025/05/user-agent-blocking/#comment-259266
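(Not reproducing the s3kr1t method from the linked comment here, but plain old user-agent blocking looks roughly like this sketch. The bot names are real, commonly seen AI-crawler user agents; the middleware itself is just an illustrative example, not anyone's actual setup.)

```python
# Substrings of User-Agent headers used by well-known AI crawlers.
AI_BOT_SUBSTRINGS = [
    "GPTBot",       # OpenAI
    "ClaudeBot",    # Anthropic
    "CCBot",        # Common Crawl
    "Bytespider",   # ByteDance
    "Amazonbot",    # Amazon
]

def is_blocked(user_agent: str) -> bool:
    """Return True if the User-Agent matches a known AI crawler."""
    ua = user_agent.lower()
    return any(bot.lower() in ua for bot in AI_BOT_SUBSTRINGS)

# Example: a tiny WSGI middleware that answers matching requests with 403.
def block_ai_bots(app):
    def middleware(environ, start_response):
        if is_blocked(environ.get("HTTP_USER_AGENT", "")):
            start_response("403 Forbidden", [("Content-Type", "text/plain")])
            return [b"Forbidden\n"]
        return app(environ, start_response)
    return middleware
```

Of course, the whole point of the linked post is that plenty of scrapers lie about their user agent, which is why the less obvious tricks exist in the first place.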
It’s so ridiculous: supposedly these people have access to a super-smart AI (which is supposedly going to take all our jobs soon), but that AI can’t even tell them which pages are worth scraping multiple times per second and which are not. Instead, they regularly kill their hosts like maladapted parasites. It’s probably not surprising, but still absurd.
Edit: Of course, I strongly assume that the scrapers don’t use the AI in this context (I guess they only used it to write their scraping code based on old Stack Overflow posts). Doesn’t make it any less ridiculous though.