Got a warning that my blog went over 100GB in bandwidth this month… which sounded incredibly unusual. My blog is text and a couple of images, and I haven’t posted anything to it in ages… like, how would that even be possible?

Turns out it’s possible when you have crawlers going apeshit on your server. Am I even reading this right? 12,181 with 181 zeros at the end for ‘Unknown robot’? This is actually bonkers.

Edit: As Thunraz points out below, there’s a footnote that reads “Numbers after + are successful hits on ‘robots.txt’ files”, so that figure is a hit count, not scientific notation.

Edit 2: After doing more digging, the culprit is a post where I shared a few wallpapers for download. Bots have been downloading those wallpapers over and over, burning through 100GB of bandwidth in the first 12 days of November. That’s when my account was suspended for exceeding bandwidth (an artificial limit I put on there a while back and forgot about…), which is also why the ‘last visit’ for all the bots is November 12th.
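For anyone who wants to check their own server for the same thing, something like this is all it takes: a minimal sketch that sums bytes served per request path from a combined-format Apache/nginx access log, so the heavy files stand out. The log path is a placeholder, and it assumes the standard combined format; it’s not exactly what I ran.

```python
#!/usr/bin/env python3
# Tally bytes served per request path to spot which files eat the bandwidth.
import re
from collections import Counter

# Combined log format: ip - - [date] "METHOD /path HTTP/x.y" status bytes "ref" "ua"
LINE_RE = re.compile(r'"[A-Z]+ (?P<path>\S+) HTTP/[\d.]+" \d{3} (?P<bytes>\d+)')

bytes_by_path = Counter()
with open("/var/log/apache2/access.log") as log:  # placeholder path
    for line in log:
        m = LINE_RE.search(line)
        if m:  # lines with "-" as the size field simply don't match and are skipped
            bytes_by_path[m.group("path")] += int(m.group("bytes"))

# Ten heaviest paths, in GB
for path, total in bytes_by_path.most_common(10):
    print(f"{total / 1e9:7.2f} GB  {path}")
```

Run against a month of logs, repeated downloads like those wallpapers would float straight to the top of a list like this.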

  • dual_sport_dork 🐧🗡️@lemmy.world · 26 days ago

    Negative. Our solution is completely home grown. All artisanal-like, from scratch. I can’t imagine it reveals anything anyone would care much about except product specs, and our inventory and pricing really don’t change very frequently.

    Even so, you’d think someone bothering to run a botnet to hound our site would distribute page loads across all of our products, right? Not just one. It’s nonsensical.

    • panda_abyss@lemmy.ca · 26 days ago

      Yeah, that’s the kind of weird shit I don’t understand. Someone on the other end is paying for servers and residential proxies to send that traffic. Why?