Googlebot Has a Large Impact on the Performance of SteemYY.com



In the last 24 hours, I have observed several alarms about high CPU load average on steemyy.com. I checked the logs and found huge spikes of traffic from Google's crawler (the User-Agent header says Googlebot).

Most of the visits were to the STEEM Blockchain Explorer, which has a very large number of pages: each block has its own page, and every one of them looks unique to Googlebot.

There used to be a setting in Google Webmaster Tools to slow down crawler access, but it is no longer exposed in the new version (Google Search Console).

Anyway, a few solutions:

  1. Do nothing - the site experiences a short slowdown and will be back to normal once the bots are gone.
  2. Tell Googlebot to slow down by responding with 429 Too Many Requests, e.g. via a rate limiter (see the sketch after this list).
  3. Buy a much more powerful server (e.g. a dedicated server).
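
For option 2, a rate limiter can be as small as a User-Agent check in front of the app. Below is a minimal sketch assuming a Flask front end; the window size, request cap and Retry-After value are made-up numbers, and the global in-memory list is not shared across worker processes - it only shows the idea of answering crawlers with 429.

```python
import time

from flask import Flask, request

app = Flask(__name__)

# Illustrative numbers, not tuned values.
WINDOW_SECONDS = 60       # size of the sliding window
MAX_BOT_REQUESTS = 30     # crawler requests allowed per window

bot_hits = []  # timestamps of recent crawler requests (per-process only)

@app.before_request
def throttle_crawlers():
    ua = request.headers.get("User-Agent", "")
    if "Googlebot" not in ua:
        return None  # normal visitors are untouched

    now = time.time()
    # Forget hits that have fallen out of the window.
    while bot_hits and now - bot_hits[0] > WINDOW_SECONDS:
        bot_hits.pop(0)

    if len(bot_hits) >= MAX_BOT_REQUESTS:
        # 429 tells well-behaved crawlers to back off and retry later.
        return "Too Many Requests", 429, {"Retry-After": "120"}

    bot_hits.append(now)
    return None

@app.route("/block/<int:num>")
def block_page(num):
    # Stand-in for one of the explorer's per-block pages.
    return f"block {num}"
```

Googlebot is documented to reduce its crawl rate when it starts receiving 429 (or 503) responses, so the explorer stays responsive for human visitors while the crawler backs off.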

^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ Thank you for reading ^^^^^^^^^^^^^^^

NEW! Following my Trail (Upvote and/or Downvote)

Follow me for topics on Algorithms, Blockchain and Cloud.
I am @justyy - a Steem Witness
https://steemyy.com

My contributions

Steem/SBD to USDT Swap

I also made this Super Easy/Handy Service to Convert your STEEM or SBD to USDT (TRC-20)

Delegation Service

Voting Power Considered in Voting Schema and Important Update of Delegation Service!

  • Delegate 1000 to justyy: Link
  • Delegate 5000 to justyy: Link
  • Delegate 10000 to justyy: Link

Support me

If you like my work, please:

  1. Delegate SP: https://steemyy.com/sp-delegate-form/?delegatee=justyy
  2. Vote @justyy as Witness: https://steemyy.com/witness-voting/?witness=justyy&action=approve
  3. Set @justyy as Proxy: https://steemyy.com/witness-voting/?witness=justyy&action=proxy
    Alternatively, you can vote for witnesses or set a proxy here: https://steemit.com/~witnesses


I had the same issue! Googlebot and also Bingbot impacted my site.
I changed robots.txt to limit the pages they can crawl, and it seems to be working... not 100%, though, because some bots ignore robots.txt.

For me, I don't want to ban the bots entirely. Is there a way to rate-limit bots in robots.txt?

I am not a robots.txt expert... but I think so...
At least you can tell each bot which pages it may crawl, and ask some of them to delay crawling... or something like that (see the sketch below).
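
For the record, robots.txt has a Crawl-delay directive, but Googlebot ignores it; Bing and some other crawlers honor it. So robots.txt can keep bots out of whole sections and slow down some crawlers, but it cannot rate-limit Google. A sketch (the /block/ path is hypothetical, standing in for the explorer's per-block pages):

```
# Honored by Bing, ignored by Google: about one request every 10 seconds.
User-agent: bingbot
Crawl-delay: 10

# Disallow blocks a path outright rather than slowing it down.
User-agent: *
Disallow: /block/
```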
