Naked Science Forum
Non Life Sciences => Geek Speak => Topic started by: neilep on 26/02/2010 20:43:31
-
arrrrrrrrrrrrrrrgggggggggggggggggggggghhhhhhhhhhhhhhhhhhhhhhhhhhhh !!!
why oh why oh why oh why oh why oh why oh why is it that when I download a BIG file....it ALWAYS (OK, 50% of the time) stops at the last friggin' single kb and then just does nothing forever !! ?
I HATE IT !!
ARGGGGHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHH !!!!
-
We share the same source of frustration. On several occasions I have tried to open huge PDF files, but the download stalls exactly one Kb before 100%.
As for the reasons, the net is a big place. If I was a kilobyte, I'd easily get lost.
-
'cos your neighbor is tapping your broadband connection?
-
I've noticed (by actually monitoring the network traffic) that the download progress indicators in many Windows programs (including Microsoft's own software) are not particularly accurate and will often show continuing progress when nothing is actually happening. It's quite likely that the download had actually stopped well before that last 1 kb / 1%.
What you could try is opening Task Manager (Ctrl-Alt-Del on XP) and selecting the Networking tab to monitor your actual network traffic before you start the download.
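Just to illustrate the point about misleading indicators, here's a toy Python sketch (hypothetical file size, not any real program's code) showing how a progress bar that rounds to whole percent can read "100%" while bytes are still outstanding:

```python
# Toy example: a naive progress display rounds the percentage,
# so a transfer stalled one byte short still shows as "100%".

def displayed_percent(received: int, total: int) -> int:
    """What a naive progress bar shows: the percentage rounded to an integer."""
    return round(100 * received / total)

total = 1_048_576          # a hypothetical 1 MiB file
received = total - 1       # stalled one byte short of completion

print(displayed_percent(received, total))  # the bar already reads 100
print(received == total)                   # ...but the transfer is not done
```

So the bar saying "100%" (or hanging at 99%) tells you very little; counting the actual bytes on the wire, as suggested above, is the only reliable signal.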
-
Could it be that the last 1K has the checksum or some such?
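Whether or not the tail of the stream carries a checksum, you can always verify a finished download yourself when the publisher lists a digest. A minimal Python sketch (the payload here is just a stand-in for the downloaded bytes):

```python
import hashlib

def sha256_of(data: bytes) -> str:
    """Hex digest of the payload -- compare it against the publisher's value."""
    return hashlib.sha256(data).hexdigest()

payload = b""  # stand-in for the downloaded file's bytes
print(sha256_of(payload))
```

If the digest doesn't match what the site publishes, the file is truncated or corrupted and needs re-downloading, however confident the progress bar looked.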
-
I sometimes think the computer dozes off after the strain of downloading a large file and can be awakened with a Ctrl-Alt-Del.
-
A bit too much anthropomorphism, methinks.
-
I suspect the cause has a lot in common with why no PC that I've ever owned/used has reliable power management capabilities. It's called "crappy software".
IMHO, most high-level software "programmers" are unencumbered by any understanding of how hardware actually works. Their standard approach to handling complex error situations is to assume that someone will hit the reset button!
(As I anticipate my comments will provoke an onslaught of hate mail, I'm sending this from my underground bunker.)
-
I think you're right Geezer. Fortunately I'm impervious to hate mail (for if someone is prepared to make the effort to send me a 'hate mail' then all they've achieved is to demonstrate their inferiority).
-
In fairness to software engineers, I think some of this is a consequence of training, but a lot of it is to do with the fact that corporations are under a lot of pressure to "just get it working". That tends to result in a lot of focus on the non-error paths and insufficient analysis of the possible error scenarios.
Consequently, much of the error handling is glued on because a "bug" showed up during test. Of course, the "bug" is really a design defect, but by then it's too late to go back and do a redesign, so the project enters a lengthy test/fix/test/fix cycle until it's deemed good enough when the test guys can't seem to break it any more.
-
I guess it is a bad download then?
-
I think the software at each end gets into a "deadly embrace" (a deadlock), where each end is waiting forever for the other to do something. So, yes, it's time to abort manually and try again.
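The usual defence against that kind of indefinite wait is a timeout on the blocking read, so a stalled peer raises an error instead of hanging forever. A minimal Python sketch (using a local socket pair as a stand-in for the two ends of a download):

```python
import socket

# Stand-ins for the two ends of a stalled transfer.
a, b = socket.socketpair()
a.settimeout(0.5)   # don't wait forever for the last kilobyte

try:
    a.recv(1024)    # the peer never sends, so without the timeout
    stalled = False # this call would block indefinitely
except socket.timeout:
    stalled = True  # timed out: abort and retry the download

a.close()
b.close()
print(stalled)
```

That's essentially what you do by hand when you cancel the download and start again; decent client software just does it for you after a deadline.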