[nSLUG] Wikipedia changes/limits protocols?
mspencer at tallships.ca
Mon Jul 13 15:56:48 ADT 2015
me> Wikipedia recently went to HTTPS-only. That was a bother.
me> Now I can't access it at all except with "wget --no-check-certificate".
me> My "new" browser is an old version of Firefox. FF reports:
me> The connection was reset
me> The connection to the server was reset while the page was loading.
me> and Wireshark shows repeated instances of this transaction:
me> TCP 32998 > https [SYN]
me> TCP https > 32998 [SYN, ACK]
me> TCP 32998 > https [ACK]
me> SSLv2 Client Hello
me> TCP https > 32998 [ACK]
me> TCP https > 32998 [RST, ACK]
and Donald replied:
dt> I would guess your wget is out of date. Mine on Debian 7 or 8
dt> works fine with:
dt> wget https://en.wikipedia.org/wiki/Main_Page
I wasn't clear enough. From what you say, I suppose that more recent
versions of wget(1) support HTTPS properly, while older versions need
the --no-check-certificate switch to ignore or work around it.
That's not the problem. The problem is that the FF GUI browser
"supports" HTTPS but fails after offering to do SSLv2. I mentioned
wget just to show that I was not cut off from Wikipedia by a
misconfigured firewall, some unknown IP block, failed DNS, or whatever.
Then Johann Tienhaara wrote:
jt> It seems to me I've seen these types of error messages due to PKCS
jt> differences. For example maybe the server's key is encrypted
jt> using PKCS #12 but your FireFox only handles PKCS #7. Just a
Huh. Okay. Lessee... So does the 'S' in "PKCS" mean "Syntax" or
"Standards"? AFAICT, the TCP negotiation never gets past the line
where my host attempts to initiate SSLv2. But, also AFAICT, PKCS #12
is a spec for a *file format* for storing crypto certs. It's unclear
how FF could get as far as that when it gets an RST right after the
Client Hello.
Oddly, the problem with Wikipedia seems to have gone away today. 
But big players such as Google and Wikipedia make arcane changes on
the fly on the assumption that everybody is doing the Latest Thing and
no one will notice. (E.g. The IP address for www.google.ca changes
from time to time from addresses in their 184.108.40.206/16 block to
addresses in Eastlink's 24.x.x.x block.) So I have no idea whether I'm
good to go or can look forward to more hassles.
I'd still like to know if Wikipedia is/has been/will be rejecting
SSLv2 requests -- if that's what's actually happening.
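In case it helps to see concretely what the server would be rejecting: here is a rough sketch (mine, cobbled together from the old SSLv2 draft layout, not taken from any browser source) of the record Wireshark labels "SSLv2 Client Hello". Old clients sent this v2 *record format* even while advertising a newer protocol version inside it, so a server that drops v2-format hellos outright would RST at exactly the point my trace shows.

```python
# Sketch of an SSLv2-format ClientHello record (layout per the SSLv2
# draft spec). A v2-intolerant server would RST on seeing this record
# format, even though the version field inside advertises TLS 1.0.
import os
import struct

def build_v2_client_hello(version=(3, 1)):
    """Build a minimal SSLv2-format ClientHello advertising `version`
    ((3, 1) = TLS 1.0), with one v2 cipher spec and a 16-byte challenge."""
    cipher_specs = b"\x01\x00\x80"   # SSL2_RC4_128_WITH_MD5 (3-byte v2 id)
    session_id = b""                 # no resumption
    challenge = os.urandom(16)
    body = struct.pack(
        ">BBBHHH",
        0x01,                        # msg-type: CLIENT-HELLO
        version[0], version[1],      # advertised protocol version
        len(cipher_specs),
        len(session_id),
        len(challenge),
    ) + cipher_specs + session_id + challenge
    # v2 record header: 2-byte body length with the high bit set
    header = struct.pack(">H", 0x8000 | len(body))
    return header + body

def parse_v2_client_hello(record):
    """Pull the length, message type, and advertised version back out."""
    length = struct.unpack(">H", record[:2])[0] & 0x7FFF
    msg_type, vmaj, vmin = struct.unpack(">BBB", record[2:5])
    return length, msg_type, (vmaj, vmin)

hello = build_v2_client_hello()
length, msg_type, version = parse_v2_client_hello(hello)
```

The tell-tale is that high bit in the first byte: a modern TLS ClientHello instead starts with a content-type byte of 0x16, so a server can distinguish (and refuse) v2-format hellos from the very first byte.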
I've read Schneier's book -- of course I didn't by any means absorb
all of it -- but it didn't cover the actual implementations, for which
I suppose I have to read a whole lot of RFCs. I'm not really looking
forward to that.
Any further comments welcome.
But a new annoyance today: Google.ca has begun responding with
"403 Forbidden" to my HTTP connections when I send "User-Agent:
Walled-City" but seems happy with no User-Agent request header.
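For what it's worth, that's easy to poke at from a script without a browser in the way. A minimal sketch using Python's stdlib urllib ("Walled-City" is just the string from my own setup; substitute whatever you like):

```python
# Sketch: reproducing the User-Agent experiment with stdlib urllib.
# By default urllib would send "User-Agent: Python-urllib/x.y"; here
# the header is set explicitly, the way my setup sends "Walled-City".
import urllib.request

url = "http://www.google.ca/"  # any HTTP endpoint works for the header demo

# Request carrying the custom User-Agent that draws the 403:
req_custom = urllib.request.Request(url,
                                    headers={"User-Agent": "Walled-City"})

# Request with the header blanked (urllib otherwise adds its own):
req_bare = urllib.request.Request(url)
req_bare.add_header("User-Agent", "")

# urllib stores header names in Title-case internally:
ua = req_custom.get_header("User-agent")

# urllib.request.urlopen(req_custom) would then actually send it and,
# per the behaviour above, should raise HTTPError 403.
```

Comparing the two responses side by side would at least confirm it's the header alone, and not the IP or something else, that trips the 403.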
Michael Spencer Nova Scotia, Canada .~.
mspencer at tallships.ca /( )\