Hackaday covered a report claiming that nearly 50% of the Web’s traffic comes from bots:
A recent report by cybersecurity services company Imperva pins the level of automated traffic (‘bots’) at roughly fifty percent of total traffic, with about 32% of all traffic attributed to ‘bad bots’: automated traffic that crawls and scrapes content to, for example, train large language models (LLMs) and generate automated content, as well as perform automated attacks on the countless APIs accessible on the internet.
Bots are not new—hell, I use and work with plenty of them in my job. But it’s the increase in “bad” bots that is disheartening. I’m starting to see the Web like that scene in The Matrix with the harvester machines collecting all the fetus pods to fuel themselves. Or maybe the Sentinels seeking to destroy all human life. I tell you, that film gains extra layers as the years go on and for once, I will not apologise for the digression. Block the bots!
Filed under: Hackaday, language models, security