As a teenager in this era, this sort of thing was awful, especially when downloading certain, ahhhh, videos…
There was a program I found that really helped, called NetAnt.
You see back then you could download a small file quickly, but for larger files the distant server would throttle your connection so you wouldn’t hog all their bandwidth. Enter NetAnt.
NetAnt would send several different ‘ants’ to the server and request different parts of the same file. It would download the file in chunks and assemble them into a single file when it was done. You could also queue up different URLs to pull from; you just had to know the file structure of the website’s content, and back then that was rarely obscured and was often logical and predictable. This ended up being one of my first applications of my C++ classes to solve a problem I had: I made a little program that generated strings of URLs based on what I thought the file structure was, then dumped a thousand file requests into NetAnt to download while I was in school.
It worked! And I ended up with gigs and gigs of full porn videos from one of the nastiest websites back then by scraping downloads from the sample pages of 30-second clips.
I don’t think I actually watched much of it; the acquisition was the important part.
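For the curious, the URL-generating trick described above can be sketched in a few lines. The original was a C++ program; this is a Python sketch of the same idea, and the path layout, file names, and numbering scheme here are all invented for illustration — real sites varied:

```python
def generate_urls(base, start, end, width=3):
    """Generate candidate URLs assuming files are numbered sequentially
    with zero-padded indices (a guessed pattern, e.g. clip001.mpg)."""
    return [f"{base}/clip{n:0{width}d}.mpg" for n in range(start, end + 1)]

# Dump a big batch of guessed URLs, ready to paste into a download queue.
for url in generate_urls("http://example.com/videos", 1, 1000):
    print(url)
```

The whole trick only worked because the numbering was predictable; one wrong guess about padding width and every request 404s.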
There were a bunch of download managers like that! I personally used GoZilla; like many other apps, it would download in multiple pieces, and it had retries (so you wouldn’t lose your progress if you disconnected for a second, or something like that).
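Downloading “in multiple pieces” boils down to splitting the file’s byte range into chunks and issuing one ranged request per chunk. A minimal sketch of the range math (not GoZilla’s actual code; HTTP Range offsets are inclusive on both ends):

```python
def split_ranges(size, parts):
    """Split a file of `size` bytes into `parts` contiguous (start, end)
    byte ranges, end inclusive, as used in HTTP Range requests.
    The last range absorbs any remainder from integer division."""
    chunk = size // parts
    ranges = []
    for i in range(parts):
        start = i * chunk
        end = size - 1 if i == parts - 1 else start + chunk - 1
        ranges.append((start, end))
    return ranges

print(split_ranges(1000, 4))  # [(0, 249), (250, 499), (500, 749), (750, 999)]
```

Each tuple then becomes a request like `Range: bytes=250-499`, and the pieces are written into the file at their own offsets.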
But I did download a pirated version of Ultima Online, to play on a pirate server, through the standard Internet Explorer download dialog. The file was 314 MB; it took a couple of days while I was feeding my Internet provider with time codes and asking my mom not to even think about touching the phone.
I had GetRight for that. The great thing about it was that it could also work with multiple mirrors, so you would download the same file from multiple servers at once.
GetRight was the bomb.
Then I got fast internet (we’re talking 25 Mbit DSL here), servers stopped being so stingy, and download managers slowly became a thing of the past.
And then, a year or so ago, I had to work on a company VM that would randomly reset the connection to GitHub, from which I needed a rather large file. So I wrote a super cheap download manager and named it GetWrong in honor of the hero of my ISDN days.
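The resume-on-reset idea behind a “super cheap download manager” can be sketched like this. This is a hypothetical Python sketch, not the actual GetWrong: the flaky connection is simulated with a function that randomly fails, standing in for the VM’s resetting link, and in a real client the re-request would be a `Range: bytes=<offset>-` header:

```python
import random

def flaky_read(data, offset, max_bytes, fail_rate=0.5):
    """Simulated connection: returns a chunk starting at `offset`,
    or raises ConnectionError, like a link that randomly resets."""
    if random.random() < fail_rate:
        raise ConnectionError("connection reset")
    return data[offset:offset + max_bytes]

def download_with_resume(data, chunk_size=4, max_retries=1000):
    """Keep the bytes already received and re-request from the current
    offset after each failure, instead of restarting from zero."""
    got = b""
    retries = 0
    while len(got) < len(data):
        try:
            got += flaky_read(data, len(got), chunk_size)
        except ConnectionError:
            retries += 1
            if retries > max_retries:
                raise  # give up; the link is too broken even for retries
    return got
```

The entire value over a plain browser download is that one loop: a reset costs you one chunk, not the whole file.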
It’s people like you who send all their web queries chunked as millions of different DNS queries to be assembled later, just so they don’t have to sign up for the free airport wifi.
EDIT: YOU TOOK DOWN CLOUDFLARE, DIDN’T YOU!?
Yes, I wanted to download the demo for Alpha Centauri
It was called a download manager; there were lots of different options.