this post was submitted on 19 Aug 2025
853 points (99.3% liked)

Technology

[–] r00ty@kbin.life 5 points 2 days ago (1 children)

Yes, but my point is that I can't tell the difference. If they can convince Cloudflare that they deserve special treatment and an exemption, then they can probably get it.

I would argue that whether there's a difference "depends", though. There are two problems as I see it, and they're only potentially not guilty of one of them.

The first problem is that AI crawlers are effectively a DDoS, and I think this is the main reason most people (including myself) don't want them. They cause performance issues by essentially speedrunning the collection of every unique piece of data from your site. If they're dynamic, as the article says, then they're potentially not doing this; I can't say for sure.
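To put the load in perspective: a crawler making even a modest 20 requests a second against a site with 100,000 unique pages pulls the entire thing in under an hour and a half, while a human reader might request a few dozen pages in the same time. (Rough numbers for illustration, not measurements from my instance.)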

The second problem is that many sites are monetized through ad revenue or otherwise rely on actual organic traffic. In this case, I would bet some money that this company is taking the data from these sites, providing neither ad revenue nor organic traffic, and serving it to the querying user with its own ads included. In which case, this is also very, very bad.

So their beef is only potentially partially valid. Like I say, if they can convince Cloudflare, and people like me, to add exceptions for them, then great. So far, though, I'm not convinced. AI scrapers have a bad reputation in general, and it's deserved. They need to do a LOT to escape that stigma.

[–] FauxLiving@lemmy.world -1 points 2 days ago (1 children)

This isn’t about AI crawlers. This is about users using AI tools.

There’s a massive difference in server load between a user summarizing one page from your site and a bot trying to hit every page simultaneously.

> The second problem is that many sites are monetized through ad revenue or otherwise rely on actual organic traffic.

Should Cloudflare block users who use ad block extensions in their browser now?

The point of the article is that Cloudflare is blocking legitimate traffic, created by individual humans, by classifying that traffic as bot traffic.

Bot traffic is blocked because it creates outsized server load, which is something that user-created traffic doesn't do.

People use Cloudflare to protect their sites against bot traffic so that human users can access the site without it being DDoS'd. By classifying user-generated traffic and scraper-generated traffic as the same thing, Cloudflare is misclassifying traffic and blocking human users from accessing websites.

Website owners are not able to opt out of this classification. If they want to use Cloudflare for bot protection, they also have to accept that users using AI tools can't access their sites, even if the owner wants to allow it. Cloudflare is blocking legitimate traffic and not letting its customers opt out.

It should be pretty easy to understand how a website owner would be upset if their users couldn’t access their website.

[–] r00ty@kbin.life 3 points 2 days ago (1 children)

And their "AI tool" looks just like the hundreds of AI scraping bots. As I've already said, the answer is easy: they need to differentiate themselves enough to convince Cloudflare to make an exception for them.

Until then, they're "just another AI company scraping data".

[–] FauxLiving@lemmy.world 2 points 2 days ago (1 children)

Well, Cloudflare is adding the ability to whitelist Perplexity and other AI sources to the control panel (default: on).

Looks like they differentiated themselves enough.

[–] r00ty@kbin.life 1 points 2 days ago

That option is likely only for paid accounts. Freebie users like me have to write our own anti-bot WAF rules, or, as I do, just put every page I expect a user to be hitting behind a managed challenge. Adding exceptions uses up precious space in those rules, which I've already used for exceptions for genuine instance-to-instance traffic.
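To give a rough idea of what I mean, a free-tier custom WAF rule along those lines, set to Managed Challenge, would look something like this (a sketch only, not my literal rule; the paths and user agent are illustrative and depend on what your instance software actually uses for federation):

    Action: Managed Challenge
    Expression (challenge everything except federation endpoints):
      not (
        http.request.uri.path contains "/inbox" or
        starts_with(http.request.uri.path, "/.well-known/") or
        http.request.uri.path eq "/nodeinfo/2.0" or
        http.user_agent contains "Lemmy"
      )

Anything that doesn't match one of those exceptions gets the challenge, which real browsers pass and most scrapers fail. The catch is that the free plan only gives you a handful of custom rules, so every exception like that eats into the budget.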

But I am glad they were able to convince Cloudflare. Good for them.