Search spiders see what your browser sees when it requests a page from your site (assuming your PHP code doesn't play games by delivering different content to search spiders). If you view the page source in your browser and see any links, even in comments or with display: none; styling, then search spiders see them too. If not, then search spiders don't either.
If spiders have already recorded links to those pages, they will come back looking for updated versions. To keep the pages from being indexed, you would have to redirect spiders elsewhere. Or you could try adding a robots noindex meta tag to the page’s head.
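For example, a noindex meta tag placed in the page's head would look something like this (using "follow" so spiders still crawl any links on the page; the exact value is up to you):

```html
<meta name="robots" content="noindex, follow">
```

Keep in mind that this only takes effect the next time a spider re-fetches the page, and that the page must not also be blocked in robots.txt, or the spider will never see the tag.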
To remove undesired links, either edit your theme’s template or create a child theme where your edited template will replace the parent’s version.
Alternatively, you can try adding the rel="nofollow" attribute to your link tags, but there's no guarantee search spiders will honor it. And/or you could disallow the page destinations in robots.txt, but again, no guarantee.
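As an illustration, a robots.txt entry asking all crawlers to skip paginated URLs might look like this (assuming your paginated pages live under a /page/ path segment, which is the usual WordPress permalink pattern; major crawlers like Google also support the * wildcard, though it is not part of the original standard):

```
User-agent: *
Disallow: /page/
Disallow: /*/page/
```

Remember this only discourages crawling; well-behaved spiders will skip these URLs, but the URLs themselves can still appear in search results if linked from elsewhere.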
Thread Starter
Dezio
(@dezio)
Hello,
I understand what you're saying, but if someone tries to access http://www.website.com/page/2 directly in a browser, they can still see the content.
Is there a way to remove pagination entirely, or can I only "hide" it?