Hi Mudassar,
That’s correct, it has to do with that limit. Since version 3.1.0, you can increase that number; the option is found in the Sitemap Settings meta box.
Hi,
Thank you for the response. Alright, I have increased the value to 5000, as shown in the screenshot here: http://prntscr.com/l6k8ye
But won’t increasing the value put pressure on the server resources? You have also mentioned this underneath the option: “Consider lowering this value when the sitemap shows a white screen or notifies you of memory exhaustion”
Nevertheless, I have increased the value. How long will it take to show all the URLs in the sitemap and to submit all of them to the webmaster tools? It hasn’t updated 20 minutes after the change.
Hi Mudassar,
I see you have, great! It’s working as intended 🙂
It does put extra strain on the server, that’s why it was always limited at 1200; small hosting packages would falter under higher pressure.
However, with the transient caching option (found under General -> Performance), this strain is negligible. Only the initial generation might fail; thereafter, it’s just a string fetch from the database.
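To illustrate the transient-caching idea, here’s a minimal Python sketch of the generate-once-then-serve-a-string pattern. This is not TSF’s actual PHP code; the function names and the in-memory store are hypothetical stand-ins for WordPress transients.

```python
import time

# Hypothetical in-memory stand-in for WordPress transients:
# the generated sitemap string is stored once and re-served until it expires.
_cache = {}  # name -> (value, expiry timestamp)

def set_transient(name, value, ttl_seconds):
    _cache[name] = (value, time.time() + ttl_seconds)

def get_transient(name):
    entry = _cache.get(name)
    if entry is None or time.time() >= entry[1]:
        return None  # expired, or never generated
    return entry[0]

def generate_sitemap(urls):
    # The expensive step: building the XML for thousands of URLs.
    items = "".join(f"<url><loc>{u}</loc></url>" for u in urls)
    return f"<urlset>{items}</urlset>"

def get_sitemap(urls):
    cached = get_transient("sitemap")
    if cached is None:
        # Heavy work happens only once per cache period...
        cached = generate_sitemap(urls)
        set_transient("sitemap", cached, 7 * 24 * 3600)
    return cached  # ...every later request is a cheap string fetch
```

The key point is that only the first request after an update pays the generation cost; every request after that just reads a stored string.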
The sitemap’s updated whenever you update a page, term, permalink setting, site setting, SEO settings, or when 7 days have passed. Search engines crawl the sitemap occasionally, and they’ll process all the URLs therein afterward.
Note that the sitemap isn’t needed for discovery anymore. Search engines now have intelligent spiders that crawl your pages and follow the links therein automatically, without the need for a sitemap.
Since you’re sharing the search engine services with billions of other websites, you’ll have to wait in their queue. When you’re next in line, they’ll crawl and process your sitemap, which will add the newly found/updated URLs into their queue. This process is repeated indefinitely.
Note that sites have a “crawling budget”. This budget is determined by the popularity and classification of your website. Small local-business sites tend to get crawled slower and less often than large news corporations’ sites.
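The queue-and-budget process described above could be sketched like this. All numbers and names are illustrative only; no real search engine works exactly this way.

```python
from collections import deque

def crawl_cycle(queue, budget, discovered_links):
    """Process at most `budget` URLs from the queue; newly found
    URLs are appended, so the cycle can repeat indefinitely."""
    crawled = []
    for _ in range(min(budget, len(queue))):
        url = queue.popleft()
        crawled.append(url)
        for link in discovered_links.get(url, []):
            if link not in queue and link not in crawled:
                queue.append(link)
    return crawled

# A small site gets a modest per-cycle budget; crawling the sitemap
# merely queues its URLs for a later pass.
queue = deque(["https://example.com/sitemap.xml"])
links = {
    "https://example.com/sitemap.xml": [
        "https://example.com/a",
        "https://example.com/b",
    ],
}
first_pass = crawl_cycle(queue, budget=1, discovered_links=links)
# first_pass == ["https://example.com/sitemap.xml"];
# /a and /b now wait in the queue for the next cycle.
```

This is why a sitemap update doesn’t show up in the webmaster tools immediately: the newly found URLs still have to wait their turn in the queue.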
Hi,
Great. It has started to submit double the links. I was still wondering about the pressure it puts on the server, though. I have enabled the “search query alteration” (in the database), and I can also see an option in my cache plugin (WP Rocket) to cache the sitemap.
http://prntscr.com/l6xzcf
Do you think it would be a good idea to cache the sitemap?
Hi Mudassar,
With the performance options, it’s up to you to find out what works best for your website. With an infinite number of plugin, theme, and server configurations, I can’t predict what works best; however, I do know that the default options will yield expected and intended behavior of the plugin.
Since The SEO Framework can cache the sitemap, there’s no need for a second layer of cache. I’m not sure if WP Rocket’s cache will block real-time updates. However, I am sure TSF handles this properly.
TSF can ping search engines after the sitemap’s updated, and the search engines won’t find anything new if the second layer of cache imposed by WP Rocket obstructs the changes.
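The ping step amounts to requesting a notification URL with the sitemap’s address as a parameter. Here’s a rough Python illustration; the endpoints shown are the classic Google and Bing ping URLs, and the wrapper functions are hypothetical, not TSF’s actual implementation.

```python
from urllib.parse import urlencode
from urllib.request import urlopen

# Classic sitemap-ping endpoints (as they have historically existed).
PING_ENDPOINTS = [
    "https://www.google.com/ping",
    "https://www.bing.com/ping",
]

def build_ping_url(endpoint, sitemap_url):
    # e.g. https://www.google.com/ping?sitemap=<urlencoded sitemap URL>
    return endpoint + "?" + urlencode({"sitemap": sitemap_url})

def ping_search_engines(sitemap_url):
    """Notify search engines that the sitemap changed. Note: pinging is
    pointless if a stale cached copy is still being served, because the
    crawler would fetch the sitemap and see nothing new."""
    results = {}
    for endpoint in PING_ENDPOINTS:
        try:
            with urlopen(build_ping_url(endpoint, sitemap_url), timeout=10) as resp:
                results[endpoint] = resp.status
        except OSError:
            results[endpoint] = None  # network failure; retry later
    return results
```

As the docstring notes, this is exactly why a second cache layer in front of the sitemap can defeat the ping: the notification goes out, but the crawler is handed the old file.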
So, I don’t recommend using that setting or enabling sitemap caching in WP Rocket; but it’s not very harmful either.
Hi,
Thank you very much. I took a lot of your time, and it was really helpful.
No problem Mudassar, don’t worry about it! 🙂