Hi @wenlu,
The screenshot shows that URLs containing URL parameters (from the search filter) are being reported as duplicate content. To stop Google from reporting them, adding a rule to your robots.txt file is a good solution. This guide explains how to edit robots.txt: https://yoast.com/help/how-to-edit-robots-txt-through-yoast-seo/. And this guide can help you work out which disallow rules you may need to create: https://searchengineland.com/a-deeper-look-at-robotstxt-17573
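For example, if the filtered URLs look like https://example.com/shop?filter=red (a hypothetical domain and parameter name; substitute the actual parameter shown in your screenshot), a rule along these lines would block crawlers from those URLs:

```
User-agent: *
# Block any URL where the hypothetical "filter" parameter appears in the query string
Disallow: /*?filter=
Disallow: /*&filter=
```

The `*` wildcard pattern shown here is supported by Google's crawler; the second rule covers cases where the parameter is not the first one in the query string.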
Once I've made the modification, will the output be reflected immediately?
Hi,
If you make the modifications to your robots.txt file, the change will be visible immediately when you check the file itself. However, Google and other bots will only see the new rules the next time they crawl your site. Only once they have re-crawled and learned that they can no longer access the URLs with the search-filter parameters will they stop flagging those URLs as duplicate content.
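As a quick sanity check, you can fetch the live file directly to confirm your new rules are being served (replace example.com with your own domain):

```
curl https://example.com/robots.txt
```

If the new Disallow lines appear in the output, the change is live; any remaining delay is just how long the bots take to re-crawl your site.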
@wenlu You’re welcome. We are going ahead and marking this issue as resolved, but please let us know if you require any further assistance.