Pirated media (images, movies, ebooks, ROMs) uses binary posts, not text. There are different limits and retention policies for binary versus text articles, and most Usenet servers, particularly cheap or free ones, don’t carry many of the groups a pirate would want at all.
Please don’t imply that all Usenet providers facilitate piracy.
Thanks for the pointer! I took the opportunity to learn a bit about more recent NNTP by reading the standard: RFC 3977. It looks like NNTP v2, circa 2006, added MIME encoding, so I would guess that may be how a service provider would differentiate.
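For anyone who wants to poke at it, the protocol is simple enough to speak by hand. Here’s a rough sketch of a raw session in Python (news.example.com is just a placeholder; a real provider will usually want authentication and TLS on port 563):

    import socket

    # Placeholder host; real providers usually require auth and TLS (port 563).
    HOST, PORT = "news.example.com", 119

    with socket.create_connection((HOST, PORT), timeout=10) as s:
        f = s.makefile("rwb")
        print(f.readline().decode().rstrip())   # server greeting, e.g. "200 ..."

        f.write(b"CAPABILITIES\r\n")            # RFC 3977 capability discovery
        f.flush()
        print(f.readline().decode().rstrip())   # "101 Capability list follows"
        for line in f:                          # multi-line response ends with "."
            if line.strip() == b".":
                break
            print(line.decode().rstrip())

        f.write(b"QUIT\r\n")
        f.flush()
        print(f.readline().decode().rstrip())   # "205 ..." and the server hangs up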
I haven’t used Usenet since the turn of the century. Back then it was all text (including every article under alt.binaries), and even pirated media needed to be split into a multi-part archive (often RAR) and then each part uuencoded so it could be included in an article.
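Roughly, that split-then-uuencode step looked like the sketch below (the filename and 500 KB part size are made up, purely to illustrate the workflow):

    import binascii

    def uuencode_part(data: bytes, name: str) -> str:
        """Roughly what one uuencoded article body looked like."""
        lines = [f"begin 644 {name}"]
        for i in range(0, len(data), 45):        # uuencode works on 45-byte chunks
            lines.append(binascii.b2a_uu(data[i:i + 45]).decode().rstrip("\n"))
        lines.append("`")                        # zero-length line marking the end
        lines.append("end")
        return "\n".join(lines)

    PART_SIZE = 500 * 1024                       # 500 KB parts, purely illustrative
    with open("movie.avi.rar", "rb") as f:       # hypothetical filename
        part_num = 0
        while chunk := f.read(PART_SIZE):
            part_num += 1
            body = uuencode_part(chunk, f"movie.avi.r{part_num:02d}")
            # `body` is plain ASCII now, so it could be posted as one article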
What do you consider large files? Isn’t the article size usually limited to something like 1 MB (it’s been a while since I used Usenet)?
So it would technically be about the number of articles rather than the eventual size of the combined archive? At its core it’s all still text, right?
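Rough napkin math, taking the ~1 MB figure above at face value (all numbers illustrative):

    import math

    # Back-of-the-envelope numbers, all illustrative.
    file_size   = 700 * 1024 * 1024   # a 700 MB combined archive
    uu_overhead = 62 / 45             # uuencode: 45 raw bytes become one 62-byte text line
    article_cap = 1 * 1024 * 1024     # the ~1 MB-per-article limit mentioned above

    print(math.ceil(file_size * uu_overhead / article_cap))   # -> 965 articles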