“I’ll be there in a Microsoft minute.”
I have no idea when I’ll be there and there’s a good chance I’ll take a nap.
It feels weird that someone felt the need to explain this.
I love that they’re using IE to download Netscape. Some things never change…
real Gs used download managers to be able to resume later.
GetRight ftw. I just googled it and it’s still a thing.
Now, burning CDs at 2x speed on Windows 98 was a chore. Converting a whole disc's worth of MP3s and not breathing on your computer for 40 minutes so it didn't freeze and ruin an expensive CD-R sucked.
And concurrency. Pulling a mean 30k/s on that V.90 modem.
When you came back in the room it wouldn’t be saying 3 minutes or 52 years. It would be saying ‘are you sure you want to move file xxxxx.xxx’ and it would have been saying that since you left the room.
Isn’t that how Windows still works?
Showing off with their 56.6kbps modem.
13.3k clan here to represent.

I came to the comments hoping someone posted this. It's a classic, and many young'uns will never know the pain.
The person writing this post also doesn’t sell it.
This wasn’t just for a download. This was for moving files generally.
See how it's an FTP transfer? As far as the system cares, it's like moving a file from one folder to another. And moving large files was a big deal back when the OS capped you at a maximum drive size of 60 gigs, so you had to split your large hard drives into like a dozen partitions and regularly move huge chunks of data around to manage storage.
Don’t most torrent clients do these types of estimates? (Of course you can resume a torrent though.)
Yeah, but the estimates are not as terrible. You have to average the download speed over some time period, so that the estimate doesn’t shoot towards infinity when the download speed falters for a moment.
Not awfully complex to implement, but for whatever reason Microsoft didn't bother for quite a while, even after they already had a reputation for terrible estimates.
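For what it's worth, the averaging trick described above really is simple to implement. Here's a minimal, hypothetical sketch of an exponentially smoothed ETA (the class name and the alpha value are made up for illustration):

```python
class SmoothedEta:
    """Estimate time remaining using an exponential moving average of the
    transfer speed, so a momentary stall doesn't make the estimate shoot
    toward '52 years'."""

    def __init__(self, total_bytes, alpha=0.1):
        self.total = total_bytes
        self.alpha = alpha    # smoothing factor: lower = smoother, slower to react
        self.speed = None     # EMA of bytes per second
        self.done = 0

    def update(self, new_bytes, elapsed):
        """Record `new_bytes` transferred over `elapsed` seconds."""
        self.done += new_bytes
        instant = new_bytes / elapsed if elapsed > 0 else 0.0
        if self.speed is None:
            self.speed = instant
        else:
            self.speed = self.alpha * instant + (1 - self.alpha) * self.speed

    def remaining(self):
        """Seconds left, or None until we have any speed data."""
        if not self.speed:
            return None
        return (self.total - self.done) / self.speed
```

The point of the EMA: a one-second stall only drags the estimate up a little, instead of dividing by an instantaneous speed of zero.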
Using the Transmission torrent client, I’ve seen the estimates go into years.
Try downloading peer hosted torrents. If there’s only one person hosting and they stop, your estimated time will creep to 40+ years before it shows it as not downloading.
I had some media that I really wanted but couldn’t find anywhere (the Latin American Spanish dub of a movie). I could find unseeded torrents though.
I put the most promising one in my server and waited. Nothing for a month or so, then one day I check in and see it’s at 5%. And it would sporadically continue for 3 more months until it completed.
As far as I could tell, it was basically just some random person who happened to have this file and was both a casual torrenter and a leech. But unlike most leeches they didn't bother stopping the seeding of completed files; they just shut down the program altogether when they weren't downloading.
So they (with either throttled upload in their client or shitty upload speeds in general) would go on, open their torrent program, download something, and in doing so keep their program up long enough to seed a few MB before their download completed and they turned it off.
That file went on to have my highest seed ratio by far afterwards, lol.
As a teenager in this era this sort of thing was awful especially when downloading certain ahhhhh videos…
There was a program I found that really helped called NetAnt.
You see back then you could download a small file quickly, but for larger files the distant server would throttle your connection so you wouldn’t hog all their bandwidth. Enter NetAnt.
NetAnt would send several different 'ants' to the server and request different parts of the same file. Then it would download it in chunks and assemble the file when it was done. You could also queue up different URLs to pull from; you just had to know the file structure of the website's content, and back then that was rarely obscured and was often logical and predictable.

This ended up being one of my first applications of my C++ classes to solve a problem I had. I made a little program that generated strings of URLs based on what I thought the file structure was, and then I dumped a thousand file requests into NetAnt to download while I was in school.
It worked! And I ended up with gigs and gigs of full porn videos from one of the nastiest websites back then by scraping downloads from the sample pages of 30 second clips.
I don’t think I actually watched much of it, the acquisition was the important part.
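NetAnt-style parallel chunking still works anywhere the server supports HTTP Range requests. A minimal sketch (the function names and the chunk-splitting scheme are mine, not NetAnt's, and it assumes the server advertises `Content-Length` and honors `Range`):

```python
import urllib.request
from concurrent.futures import ThreadPoolExecutor

def split_ranges(size, ants):
    """Split `size` bytes into `ants` contiguous (start, end) byte ranges."""
    chunk = size // ants
    return [(i * chunk, size - 1 if i == ants - 1 else (i + 1) * chunk - 1)
            for i in range(ants)]

def fetch_chunk(url, start, end):
    """Download bytes start..end (inclusive) with an HTTP Range request."""
    req = urllib.request.Request(url, headers={"Range": f"bytes={start}-{end}"})
    with urllib.request.urlopen(req) as resp:
        return resp.read()

def download(url, out_path, ants=4):
    """Fetch `url` in `ants` parallel chunks, then stitch them together."""
    head = urllib.request.Request(url, method="HEAD")
    with urllib.request.urlopen(head) as resp:
        size = int(resp.headers["Content-Length"])
    with ThreadPoolExecutor(max_workers=ants) as pool:
        parts = pool.map(lambda r: fetch_chunk(url, *r), split_ranges(size, ants))
    with open(out_path, "wb") as f:
        for part in parts:
            f.write(part)
```

Same idea as the 'ants': several connections each grab a slice, which also sidesteps per-connection throttling.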
There were a bunch of download managers like that! I personally used GoZilla, and like many other apps, it would download in multiple pieces, and there were retries (so you wouldn't lose your progress if you disconnected for a second, or something like that).
But I did download a pirated version of Ultima Online, to play on a pirate server, through the standard Internet Explorer dialog. The file was 314MB, and it took a couple of days while I was feeding my Internet Provider with time codes and asking my mom not to even think of touching the phone.

I had GetRight to do that. The great thing about it was that it could also work with multiple mirrors. So you would download the same file from multiple servers at once.
GetRight was the bomb.
Then I got fast internet (we’re talking 25 MBit DSL here) and servers stopped being so stingy and download managers slowly became a thing of the past.
And then, a year or so ago, I had to work on a company VM that would randomly reset the connection to GitHub, from which I needed a rather large file. So I wrote a super cheap download manager and named it GetWrong in honor of the hero of my ISDN days.
It’s people like you who send all their web queries chunked as millions of different DNS queries to be assembled later, just so they don’t have to sign up for the free airport wifi.
EDIT: YOU TOOK DOWN CLOUDFLARE, DIDN’T YOU!?
Yes, I wanted to download the demo for Alpha Centauri
Called a download manager, lots of different options.
Download? This happened even while copying files or installing games/software off a CD.
Even other adults that lived through this with me have forgotten
NTFS is still so slow compared to any file operation I do on Linux.
Linux: “I’m gonna copy this GB file instantly, and hope you don’t call sync any time soon”
Absolutely awful when it's some flash drive that just slows down to nothing, at which point you'd rather cancel the transfer, but you can't, and you also can't unmount because it's busy, so you just accept the loss, unplug, and then re-format.
I only know how to add stuff to fstab with sync option, but what about other drives? I don’t want to manually mount everything.
I don’t want to manually mount everything.
Heh, I literally do – it’s become so automatic it’s not even a pain point anymore
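For reference, the sync option mentioned above goes in the fourth fstab field; an example line for a USB stick might look like this (the device node and mount point are placeholders):

```
# example /etc/fstab entry; /dev/sdb1 and /mnt/usb are placeholders
/dev/sdb1  /mnt/usb  vfat  sync,noauto,user  0  0
```

Fair warning: sync on flash media makes writes much slower and increases wear; the trade-off is that the progress bar stops lying to you.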
This is why I bought GetRight.
What a great program!