• Resolved nealumphred

    (@nealumphred)


    My websites are blogs. Needless to say, I do not want any AI anywhere having uninvited access to “scrape” my blogs for data.

    I have installed and activated Dark Visitors and am on Step 4. Which options should I check to stop AI scraping, etc.?

    • Block AI Assistants
    • Block AI Data Scrapers
    • Block AI Search Crawlers
    • Block Undocumented AI Agents

Viewing 8 replies - 1 through 8 (of 8 total)
  • Plugin Author gavindarkvisitors

    (@gavindarkvisitors)

    For model training data, that would be “AI Data Scrapers”.

    On each agent detail page (https://darkvisitors.com/agents), there’s a “Should I Block … ?” section that goes into more detail about each type.

    Hope that helps.

    Thread Starter nealumphred

    (@nealumphred)

    GAVIN DARKVISITORS

    Thanks for responding.

    • As per your suggestion, I have ticked the box for AI Data Scrapers.

    • There is no “Should I Block … ?” section on the page https://darkvisitors.com/agents on my computer.

    NEAL

    Plugin Author gavindarkvisitors

    (@gavindarkvisitors)

    I mean on each individual agent detail page, not the page I linked to. You need to click on an agent in that list to see the agent detail page. On that page, you will see that header. Let me know if that makes sense. Sorry for the confusion.

    Thread Starter nealumphred

    (@nealumphred)

    That is far too complicated. There are hundreds—perhaps thousands—of them.

    Dark Visitors should have a default setting or recommendation for non-tech users like myself.

    Please tell me which options you think I should activate to protect a blog.

    Plugin Author gavindarkvisitors

    (@gavindarkvisitors)

    Ticking the “AI Data Scrapers” box will do what you want. You already did that, so you are all set.

    Thread Starter nealumphred

    (@nealumphred)

    Good enough.

    Next question: Does Dark Visitors complement or conflict with Cloudflare’s “Block AI Scrapers and Crawlers”?

    Plugin Author gavindarkvisitors

    (@gavindarkvisitors)

    They will complement each other. Dark Visitors acts as a first line of defense by setting robots.txt rules. Cloudflare then stops any agents that try to ignore those robots.txt rules.
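    For anyone curious what the robots.txt side of this looks like: the plugin maintains these rules automatically, but the output is ordinary robots.txt directives, one block per blocked agent. A minimal hand-written sketch (GPTBot and CCBot are two real, documented AI data scrapers; the plugin's actual generated list is longer and kept up to date for you):

    ```
    # Example robots.txt rules blocking known AI data scrapers.
    # Dark Visitors generates and updates entries like these automatically.

    User-agent: GPTBot
    Disallow: /

    User-agent: CCBot
    Disallow: /

    # Regular search engines remain unaffected:
    User-agent: *
    Allow: /
    ```

    robots.txt is advisory, which is exactly why the Cloudflare layer is a useful complement: it can block agents that ignore these directives at the network level.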

    Thread Starter nealumphred

    (@nealumphred)

    Excellent!

    Thanks and have a groovy weekend …


The topic ‘Which options should I check to stop AI scraping?’ is closed to new replies.