However, I got a bit of a nasty surprise when I looked at how much traffic this had consumed: a single ~3 KB POST to Mastodon caused servers to pull a bit of HTML and… fuck, an image. In total, 114.7 MB of data was requested from my site in just under five minutes, making for a traffic amplification of 36704:1.
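For what it's worth, those numbers are internally consistent. A quick sanity check (decimal megabytes assumed):

```python
total_bytes = 114.7e6   # 114.7 MB requested, decimal MB assumed
ratio = 36_704          # the reported amplification factor

post_bytes = total_bytes / ratio
avg_mbps = total_bytes * 8 / (5 * 60) / 1e6   # averaged over five minutes

print(f"implied POST size: {post_bytes:.0f} bytes")  # → 3125 bytes, i.e. ~3 KB
print(f"average rate: {avg_mbps:.1f} Mbit/s")        # → 3.1 Mbit/s
```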
That’s peak activity of about 30 Mbps for five minutes. On a gigabit connection, the whole 114.7 MB should take about a second of data transmission at full speed. Of course, there’s TCP slow start to deal with, and I doubt many Fediverse servers make their requests over HTTP/3 by default, but this doesn’t seem all that high? I don’t know what the normal “background” traffic of random clients visiting looks like, but mathematically this seems like it shouldn’t take more than a second or two to serve from a RAM cache.
If this were some random independent website that avoids services like Cloudflare because of Cloudflare’s status as a gatekeeper of the internet, I would sympathise, but they already use Cloudflare. Their website, like many on the internet, just isn’t ready for bursts of visitors, it seems.
This could also be a bug in Ghost CMS, of course.
In theory, content like this could be federated directly; a Fediverse Article could be offered to the wider Fediverse, and servers would distribute the content itself rather than a link with a preview. However, this would also prevent ads from showing up and trackers from collecting visitor information, and Mastodon has chosen not to implement anything beyond microblogging objects anyway. I also don’t think Lemmy supports that kind of post, but it’d be a solution in theory.
thanks for saying this! i really don’t want to victim-blame itsfoss for getting traffic spikes, but if you can’t handle ~20 MB in one minute (~3 Mbit/s) of traffic, you’re doing something really really wrong and you really should look into it, especially if you want to distribute content. crying “don’t share our links on mastodon” also sounds like tilting at windmills; block the mastodon UA and be done with it, or stop putting images in your link previews for mastodon, or drop link previews completely. a “100 MB DDoS” is laughable at best, nice amplification calculation but that’s still 100 megs
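The UA-blocking suggestion could look something like this, assuming nginx in front of the site and relying on the fact that Mastodon’s preview fetcher identifies itself with a User-Agent along the lines of `http.rb/5.x (Mastodon/4.x; +https://instance.example/)` — a sketch, not a drop-in config:

```nginx
# Refuse link-preview fetches from Mastodon servers.
# (UA match is an assumption; adjust for other Fediverse software as needed.)
if ($http_user_agent ~* "Mastodon") {
    return 403;
}
```

A gentler variant would serve a stripped-down page (no header image) instead of a 403, which keeps previews working without the image amplification.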
I doubt they actually want people to stop sharing their content on Mastodon, as they share the content on Mastodon themselves. I think they want to get more attention for this issue.
Nobody seems to have done so yet, but it’d be trivial to use ActivityPub as an amplification vector for attacking small publications. Just register free accounts with a couple hundred servers, post links to articles (with unique garbage appended to the URL to bust basic server-side caching), and tag a couple dozen random users from other servers. Every server, as well as every server whose user was tagged, will fetch the page and, if present, a header image. You can easily send out dozens of links per second to thousands of servers, enough to overwhelm any site that doesn’t have its content gatekept by internet giants like Cloudflare.
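On the cache-busting trick specifically: a reverse proxy can blunt it by normalizing URLs before using them as cache keys, e.g. stripping query strings on paths that don’t need them. A minimal sketch (`cache_key` is a hypothetical helper, and it assumes the site doesn’t serve distinct content per query parameter):

```python
from urllib.parse import urlsplit, urlunsplit

def cache_key(url: str) -> str:
    """Strip query string and fragment so 'garbage' suffixes
    all map to the same cache entry."""
    s = urlsplit(url)
    return urlunsplit((s.scheme, s.netloc, s.path, "", ""))

print(cache_key("https://example.com/article?cb=8fa3d1"))
# → https://example.com/article
```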
If the website is hosted somewhere with expensive egress fees (“serverless” platforms, Amazon, GCloud, Azure, or hosters that don’t disconnect your server when you hit your bandwidth limit), you can run up a bill of tens of thousands of dollars. If the hoster does enforce an egress cap, you can shut down a website for a couple of days at the very least.
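A back-of-the-envelope version of that bill, using the 114.7 MB-per-share figure from above and an assumed $0.09/GB egress rate (the attack volume is hypothetical):

```python
price_per_gb = 0.09          # USD/GB, assumed cloud egress rate
bytes_per_share = 114.7e6    # traffic triggered by one shared link (from above)
shares = 50 * 86_400 * 2     # 50 links/s sustained for two days (hypothetical)

total_gb = bytes_per_share * shares / 1e9
cost = total_gb * price_per_gb
print(f"{total_gb:,.0f} GB egress -> ${cost:,.0f}")  # → roughly $90k
```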
I don’t have a workable solution to this problem, but the Fediverse seems to be built on the rather naïve assumption that every request that passes the signature requirement is made in good faith, and that has major implications for the wider internet. If we don’t find a solution, I expect websites to start blocking Fediverse user agents once the first DDoS waves hit.