• Resolved seb06

    (@seb06)


    Hello,

    On one of my pages, I checked the “Disable on this page/post” option, which should disable SEO for that page.

    That page/URL is already in Google’s index, and after several days/weeks I have not seen any change; Google has even ranked it higher!

    Why not write a Disallow rule for that page in the robots.txt file, so that Google knows it should no longer index that URL?

    Thanks for your answer.
    Best

    https://wordpress.org/plugins/all-in-one-seo-pack/
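
    (For reference, a per-URL Disallow rule in robots.txt would look roughly like the example below; the path is only a placeholder. Note that Disallow only blocks crawling, so on its own it does not remove a URL that is already in the index.)

        User-agent: *
        Disallow: /example-page/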

  • Plugin Support Steve M

    (@wpsmort)

    Thank you for the suggestion; we will consider it for a future release.

    Thread Starter seb06

    (@seb06)

    OK, do you know what I have to do to tell Google not to index a page that is already indexed?

    Best

    Plugin Support Steve M

    (@wpsmort)

    There is a Noindex checkbox in the All in One SEO Pack section of the Edit Page screen – http://semperplugins.com/documentation/page-settings/
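
    (For reference, a noindex setting like this normally outputs a robots meta tag in the page’s <head>, roughly like the line below; the exact directives depend on the options chosen.)

        <meta name="robots" content="noindex" />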

    Thread Starter seb06

    (@seb06)

    Yes, that is the option I checked, but it doesn’t change anything; the page is still indexed.

    Anyway, do you think the “noindex” tag is sufficient to get Google to deindex the page, or must a Disallow rule also be added to robots.txt?

    Thanks

    Plugin Support Steve M

    (@wpsmort)

    You have to give Google time; it could be days, weeks, or even months depending on how often they crawl your site. You can always use Google Webmaster Tools to submit a Remove URL request.

    Thread Starter seb06

    (@seb06)

    Yes, sorry, I forgot to mention that I already submitted the removal request in Google Webmaster Tools, and in general Google is quick to update positions in its index.

    Thanks for your help and answers,

    Best

    Thread Starter seb06

    (@seb06)

    I have another question:

    Why doesn’t the robots.txt file include the following line?

    Sitemap: http://mydomain.com/sitemap.xml

    Best
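
    (For context, a Sitemap reference is just an extra line in robots.txt alongside whatever rules are already there, for example as below; the Disallow line shown is only the usual WordPress default.)

        User-agent: *
        Disallow: /wp-admin/
        Sitemap: http://mydomain.com/sitemap.xml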

    Thread Starter seb06

    (@seb06)

    For your information, I checked the “Link From Virtual Robots.txt” option, but it has no effect on the robots.txt file.

    Plugin Support Steve M

    (@wpsmort)

    It depends on whether you are using the robots.txt file generated by WordPress or a different robots.txt file. We can currently only insert our XML Sitemap link into the WordPress robots.txt file, so if that option is not working then you know that you’re using a different robots.txt file.
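
    (One way to check which robots.txt is actually being served, assuming the site is reachable at mydomain.com, is to fetch it and compare it with the file on disk. On a typical setup, a physical robots.txt file in the web root is returned by the web server directly, so the WordPress-generated virtual robots.txt that the plugin modifies is never used.)

        curl http://mydomain.com/robots.txt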

    Thread Starter seb06

    (@seb06)

    I am using your plugin and I did not set any other robots.txt file.

    Any idea?

    Plugin Support Steve M

    (@wpsmort)

    Sorry, we cannot help you with that. You could be using a different plugin for this or it could be just a file on your server.

    Thread Starter seb06

    (@seb06)

    I do not use any other plugin.

    Before installing your plugin, I had no robots.txt file.

    I can see the robots.txt file on my FTP server and it is the one generated by AIO SEO, but the “Sitemap” line is not included in it…

    Any idea?

  • The topic ‘Disabling SEO form each page doesn't block google index’ is closed to new replies.