The Very Hungry Robot

Yesterday evening, one of our own websites broke through the monthly bandwidth limit that applies to its web hosting account. Once the hosting system recognised the issue, an alert was sent to us and the website was automatically taken offline.
The instant remedy would have been simply to buy more monthly bandwidth, but the smarter solution was to investigate properly how this had happened before switching the website back on.
This morning, the cause was found to be one piece of software (a robot), operating from a single IP address in the USA, making over 220,000 page requests of the website over a 15-hour period. This rather dogged assault doesn't bear the hallmarks of what we recognise as a DDoS attack; coming from one address rather than many, it isn't distributed at all, and so the remedy was quite straightforward.
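Spotting a culprit like this in an Apache access log usually comes down to counting requests per client IP. A short shell pipeline along the following lines will do it; the log path and the sample entries here are illustrative, not the real logs from the incident:

```shell
# Build a small sample access log (hypothetical entries in Apache combined-log style)
cat > /tmp/access.log <<'EOF'
203.0.113.9 - - [01/Jan/2025:00:00:01 +0000] "GET / HTTP/1.1" 200 512
203.0.113.9 - - [01/Jan/2025:00:00:02 +0000] "GET /about HTTP/1.1" 200 512
198.51.100.4 - - [01/Jan/2025:00:00:03 +0000] "GET / HTTP/1.1" 200 512
EOF

# Count requests per client IP (field 1 of each log line), busiest first
awk '{print $1}' /tmp/access.log | sort | uniq -c | sort -rn | head
```

An address responsible for hundreds of thousands of requests will sit at the top of that list, far ahead of ordinary visitors.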
A block on the IP address from which the requests originated was placed in the Apache web server's .htaccess file, and then we bought more bandwidth for the month, safe in the knowledge that an identical attack couldn't knock the website offline again.
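For anyone wanting to do the same, a block of this kind can be written with Apache 2.4's Require directives in .htaccess (the address shown is a documentation example, not the actual offender's IP):

```apache
# Allow everyone except the offending address (mod_authz_core, Apache 2.4+)
<RequireAll>
    Require all granted
    Require not ip 203.0.113.9
</RequireAll>
```

Requests from the blocked address then receive a 403 Forbidden response, which costs far less bandwidth than serving full pages.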
This pattern of increased bandwidth consumption is something we've begun to notice across many of our Customers' websites. And, perhaps, it's something we're all going to have to get used to. With the emergence and growth of AI, and the omnipresence of cybercrime, there are simply more things out there crawling the web: looking for data, looking for vulnerabilities to exploit.
The good news is that, firstly, because responsive Sub@omic websites are built with less code and are typically 7x smaller than comparable websites, they consume significantly less bandwidth and are therefore cheaper to host.
Secondly, because we don't build sites with a common CMS (such as Wix, Squarespace or WordPress), the flaws and vulnerabilities that 'bots continually trawl the web in search of will never be found in one of our websites, keeping them, and you, safe.
Choose a website that's faster, safer and 7x smaller.
[Published: ]