google and other robots problem (1 post)

  1. nfa1218
    Member
    Posted 1 year ago #

    Hello everybody. My name is Nico.

    I have a site set up for a client of mine using the latest WordPress with a child theme I built, and it's hosted on a server they already had...

    I've been working with this client for 12 months, and in 4 of those 12 months I've had problems with exceeding the bandwidth limit...

    This has caused a really bad relationship between my client and me, because a lot of blame gets thrown around at every meeting we have, and I really feel disrespected by him.

    I think I've been doing a really good job for the low pay, but the fact of the matter is that in 4 of those 12 months the site went down about 10 days before month's end because of bandwidth limit problems. That's not good for business, and it caused them to think badly of me...

    The site is very simple and uses standards-based code: HTML/CSS and basic WordPress PHP...

    The site is like a phone directory... it basically only has a search box, and I set it up with about 2,000 entries, each one a separate client of theirs. Each entry has a title (the client's name), and the content is a paragraph with their phone number, address, and a link to their Facebook, email, and/or personal website... Nothing else, nothing fancy. We average 1,500-3,000 visits per month, about 13,000 pages viewed, 90,000 requests, and 3-6 GB of traffic...

    Overall my WordPress database is 11 MB.

    I have been using a plugin for a calendar which at first I thought was causing the bandwidth problem, but then I realized it wasn't.

    My client blames me for bad coding, and I defended myself by saying I had to do some research to come up with the correct diagnosis...

    So I did...

    And I noticed that each month the site averages those 2,000 visits at around 3-6 GB of traffic... which I think is normal, or maybe a little high, but now that I compare it to their stats from before I took over the site, it's pretty much on par... except that I basically increased their visits by 100% from 2011 to 2012...

    The bandwidth limit was set at 5 GB... after the first overage they raised it to 8 GB... and now 16 GB!!! Their hosting company tells us that's ridiculously high compared to other clients that average even 3 times more visits than us! I kind of agree, but I have no comparison site to draw an adequate conclusion from...

    But anyway, it didn't make sense that on the 20th of March I'm seeing traffic at 4.3 GB when our limit is now 16 GB and the site is down!

    And I see that 12 GB of traffic is coming from unseen traffic, which is generated by bad coding, worms, and/or robots!

    At first I blamed my own bad coding, which my client did too! This caused a heated argument which did nothing but anger me, because my client didn't trust my judgment and instead jumped to uninformed conclusions...

    Then I saw that Googlebot used 9 GB and MSN bots another 2 GB... and about 7 other unidentified bots used the rest, adding up to those 11-12 GB! The rest of the bots are fine, with traffic of less than 1 MB... but Google and MSN are at the GB level!!
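    In case it helps anyone check this on their own site, per-bot traffic can be roughly measured from the raw server logs instead of relying only on the host's stats panel. This is just a sketch: it assumes an Apache "combined" log format, and the log path is an example -- your host's path will differ.

    ```shell
    # Sum the bytes served to Googlebot from an Apache "combined" access log.
    # In that format, field $10 is the response size in bytes.
    # The log path below is an assumption -- adjust it for your host.
    awk '/Googlebot/ { bytes += $10 } END { printf "Googlebot: %.2f MB\n", bytes / 1024 / 1024 }' /var/log/apache2/access.log
    ```

    The same one-liner with a different user-agent pattern (e.g. msnbot) shows how much each crawler is actually pulling.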

    February and March were critical months in which I redesigned their whole database! I basically erased EVERYTHING and started over! That means I was re-adding all 2,000 clients, the blog news, and the pharmacy dates in that calendar plugin from zero... a fresh start.

    So now my hypothesis is that it really wasn't bad coding or anything wrong with my workflow! I'm thinking the problem is that Google and other search engines are bombarding my site with robots trying to index all 2,000 NEW entries!!

    So now the site is down, but I'm guessing so is this indexing process by the bots, right? Once they've indexed everything, they won't bombard my site with 9 GB of traffic anymore, right? I'm not sure...

    Any thoughts or suggestions?

    I don't want to disable bots, because it's a great benefit to my client that THEIR clients are indexed by Google and others... But we can't afford 6 GB of real traffic with 20 GB of bot traffic!
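    One compromise I'm considering is a Crawl-delay in robots.txt, which slows well-behaved bots down without blocking them from indexing. As far as I know, msnbot/bingbot honor Crawl-delay but Googlebot ignores it, so Google's crawl rate would have to be lowered separately in Google Webmaster Tools. The 10-second value here is just a starting guess to tune:

    ```
    # robots.txt -- slow down well-behaved bots without blocking them.
    # Note: Googlebot ignores Crawl-delay; its rate is set in Webmaster Tools.
    User-agent: msnbot
    Crawl-delay: 10

    User-agent: bingbot
    Crawl-delay: 10
    ```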

    Anybody have any suggestions, or had this problem before?

    Thanks for your time and sorry for the long post...

Topic Closed

This topic has been closed to new replies.