[nSLUG] wget & interrupted fetch
damienr74 at gmail.com
Mon Sep 26 08:23:29 ADT 2016
On Sat, Sep 24, 2016 at 04:10:35AM -0300, Mike Spencer wrote:
> Is there a technical reason why this doesn't work with
> --no-check-certificate and an HTTPS URL?
I don't know of a technical reason. I just tried:
`curl -r 0-20000 -o 2000px-Tux.svg.png` (a large image)
and resumed with `wget -c --no-check-certificate ...`, and it picked up from
where I left off. So unless Wikipedia deviates from the standard, there
shouldn't be a technical reason for it not to work.
> Is there something I'm overlooking in the manpage or some hack for a
You can try completing the request with curl to see whether it's a wget issue.
To do so, specify the byte range explicitly: `curl -r resumefrom-end site -o file2`,
then `cat file1 file2 > actualfile`.
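The split-and-concatenate approach can be sanity-checked entirely offline. This sketch (hypothetical filenames, no network) uses head/tail to simulate what the two range requests would save, then verifies the stitched result:

```shell
# Offline simulation of resuming a byte-range download.
# 'original' stands in for the full remote file; all filenames are hypothetical.
head -c 100000 /dev/urandom > original   # pretend this is the remote file
head -c 20001  original > file1          # what `curl -r 0-20000 ... -o file1` saves
tail -c +20002 original > file2          # what `curl -r 20001- ... -o file2` would fetch
cat file1 file2 > actualfile             # stitch the pieces back together
cmp original actualfile && echo "resume OK"
```

Note that `-r 0-20000` is inclusive, so the second request must start at byte 20001 (`-r 20001-`), not 20000; an overlapping or gapped range will corrupt the reassembled file.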
> I'm aware the RFC 2616 (14.35.2 Range Retrieval Requests) says, "A
> server MAY ignore the Range header." Is it the case that serving data
> referenced by an HTTPS URL always ignores a range header that says,
> essentially, "Got >this-much<, send me the rest"?
You can also check whether a specific site accepts range requests by running
`curl -I https://url.com/path-to/file`; if 'Accept-Ranges: bytes' appears in
the response headers, the server should have no reason not to resume. I have
tried this over HTTPS and seen the Accept-Ranges header set.
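That header check is easy to script. A minimal sketch (the function name is made up, and sample headers are piped in here instead of a live `curl -sI` call):

```shell
# Report whether response headers advertise byte-range support.
# In real use you would pipe in `curl -sI https://url.com/path-to/file`;
# the printf below just supplies sample headers for illustration.
check_ranges() { grep -qi '^Accept-Ranges: *bytes' && echo yes || echo no; }
printf 'HTTP/1.1 200 OK\r\nAccept-Ranges: bytes\r\n\r\n' | check_ranges   # -> yes
```

Servers that omit the header (or send `Accept-Ranges: none`) may still honor a Range request, per the RFC, but its absence is a hint that resuming will likely fail.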
Hope this is useful information,