[nSLUG] Re: wget & interrupted fetch
mspencer at tallships.ca
Wed Sep 28 02:42:53 ADT 2016
On Sat, Sep 24, 2016 at 04:10:35AM -0300, Mike Spencer wrote:
mds> Is there a technical reason why this doesn't work with
mds> --no-check-certificate and an HTTPS URL? [with restart from byte
mds> 0 after interrupted transfer]
Damien Robichaud <damienr74 at gmail.com> wrote:
> You can try completing the request with curl to see if it's a wget issue.
Hah! I forgot about curl. Must become familiar with that.
dr> You can also check to see if the specific site accepts range
dr> requests by doing `curl -I https://url.com/path-to/file` and if
dr> 'Accept-Ranges: bytes' is shown, then there shouldn't be a reason
dr> for the server not to resume. I have tried this on https and I
dr> have seen the Accept-Ranges attribute set.
Hah. Good. Something to experiment with next time it comes up.
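For the record, the range-resume mechanics dr> describes can be sketched in Python against a toy local server (the server, the file contents, and the resume point below are all made up for illustration). The idea is the same thing wget -c and curl do under the hood: if the server advertises "Accept-Ranges: bytes", the client re-requests with a "Range: bytes=N-" header and expects a 206 Partial Content reply containing only the missing tail.

```python
import http.server
import threading
import urllib.request

DATA = b"0123456789" * 100  # 1000 bytes served by the toy server


class RangeHandler(http.server.BaseHTTPRequestHandler):
    """Minimal handler that honours single 'bytes=N-' range requests."""

    def do_GET(self):
        rng = self.headers.get("Range")
        if rng and rng.startswith("bytes="):
            # Parse the start offset from 'bytes=N-' and serve the tail.
            start = int(rng.split("=")[1].split("-")[0])
            body = DATA[start:]
            self.send_response(206)  # Partial Content
            self.send_header(
                "Content-Range", f"bytes {start}-{len(DATA) - 1}/{len(DATA)}"
            )
        else:
            body = DATA
            self.send_response(200)
        self.send_header("Accept-Ranges", "bytes")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):
        pass  # keep the demo quiet


server = http.server.HTTPServer(("127.0.0.1", 0), RangeHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()
port = server.server_address[1]

# Simulate an interrupted transfer: pretend we already have the first
# 400 bytes, then resume from that offset with a Range header.
partial = DATA[:400]
req = urllib.request.Request(
    f"http://127.0.0.1:{port}/file",
    headers={"Range": f"bytes={len(partial)}-"},
)
with urllib.request.urlopen(req) as resp:
    assert resp.status == 206  # server honoured the range request
    partial += resp.read()

assert partial == DATA  # reassembled file matches the original
server.shutdown()
```

A server that ignores the Range header simply replies 200 with the whole body, which is why wget restarts from byte 0 on some sites: nothing broke, the server just never agreed to resume.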
Then "George N. White III" <gnwiii at gmail.com> wrote:
gnw> In my experience, "wget -c" works reliably with some sites and
gnw> not with others. Some sites that work do use https, and in fact
gnw> there are many more interrupted transfers using https so without
gnw> continuation it can be nearly impossible to download a large file
gnw> on a system with an unreliable network connection.
Okay, good. So it's probably not just me, my software, my (oldish)
system, or my dialup connection. Slashdot and maybe one or two other
sites seem to be a special case: after serving some of the data --
maybe nearly all of it -- they send RST packets and the browser aborts
the fetch. I can't figure that one out; maybe the server, or Akamai or
some other CDN, is optimizing something?
gnw> What version of wget?
gnw> Is there a firewall involved?
Not locally, only my amateur effort with iptables. Presumably not
remotely at a site with a publicly accessible web server.
I'll work on it.
Michael Spencer Nova Scotia, Canada .~.
mspencer at tallships.ca /( )\