The way QMT handles URLs appears to be problematic from a search-bot perspective, especially when there are many taxonomies. With five URL parameters, the average search bot sees an essentially infinite tree of “pages” to traverse, which wastes a lot of server resources. I’ve tried to mitigate this with robots.txt and IP blocking, but bot traffic still maxes out the MySQL query limit on my shared host. Needless to say, when WordPress hits a query limit it complains by unceremoniously redirecting to the install page. Can anyone suggest possible solutions?
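For what it’s worth, one robots.txt approach (just a sketch, assuming the taxonomy filters only ever appear as query-string parameters) is to disallow any URL containing a `?`, since the major crawlers support the `*` wildcard in `Disallow` rules:

```
User-agent: *
# Block crawling of any URL that carries a query string,
# which covers every multi-parameter taxonomy combination.
Disallow: /*?
```

Note that the `*` wildcard is an extension honored by Googlebot and Bingbot but not required by the original robots.txt convention, so poorly behaved bots may ignore it; for those, blocking by User-Agent or IP at the server level remains the usual fallback.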
- The topic ‘Search bots and multi-parameter urls’ is closed to new replies.