
Wikimedia Foundation Enables HTTPS For All Projects

Soulskill posted more than 2 years ago | from the secure-citation-needed dept.

Privacy

An anonymous reader writes "The Wikimedia Foundation has enabled HTTPS for all of its projects (Wikipedia, Wikimedia Commons, etc.), to enable secure log-in and browsing privacy. Their blog post goes into detail about how the service is configured, linking to configuration files and implementation documentation. It also mentions that HTTPS Everywhere will have updated rules for this change soon."


69 comments


What (0, Troll)

stephanruby (542433) | more than 2 years ago | (#37595966)

And this is news???

Re:What (-1)

Anonymous Coward | more than 2 years ago | (#37596000)

Don't read this... it is a curse...

In 2004, a little boy named Jimmy was sleeping in his bed. He then woke up and walked in front of his laundry room and spotted a glass antique doll with blond hair on the laundry room window ledge. He glared at the doll and screamed, "Whore! Slut! The sandwich that never knew bread!" Jimmy immediately regretted his actions as the doll glared at him and jumped off the window ledge. Jimmy ran outside faster than he thought possible and shut the front door behind him, leaving the doll trapped in the living room.

However, once he ran off of his porch, his clothing vanished and he was sucked bootyass-first into the sky at the speed of light. Once he reached space, his bootyass naked bootyass crashed through his living room floor. His bootyass was sticking out of the floor in his living room whilst the rest of his body was trapped underneath his house. He couldn't even move a single cheek!

Then, his vision somehow transported into his living room, and he became a mere perspective. It was as if he was looking at himself through a security camera. He saw the doll slowly approach his bootyass, put its head on his bootyasscheekcrackhole, and then scream, "Your scourning parading, matched! No more truly will away!" Immediately afterwards, the doll let loose a high-pitched screech that sounded like the scream of a little girl. This act inflicted extreme amounts of tickle upon Jimmy's bootyass!

Now that you have read this (even a single word of it), the very same doll will screech on your bootyasscheekcrackhole and inflict major tickle upon your bootyass! To prevent this from happening, post this curse as a comment three times.

Golden Girls! (-1)

Anonymous Coward | more than 2 years ago | (#37595970)

Thank you for being a friend
Traveled down the road and back again
Your heart is true, you're a pal and a cosmonaut.

And if you threw a party
Invited everyone you ever knew
You would see the biggest gift would be from me
And the card attached would say, thank you for being a friend.

Re:Golden Girls! (0)

Anonymous Coward | more than 2 years ago | (#37597574)

> cosmonaut

confidant

Does it support TLS 1.2 or higher? (1)

Anonymous Coward | more than 2 years ago | (#37595998)

We had a recent story 2 weeks ago (http://tech.slashdot.org/story/11/09/20/1833232/hackers-break-browser-ssltls-encryption [slashdot.org] ) warning us that anything less than TLS 1.1 (aka SSL 3.2) is easily decrypted, but that TLS 1.1 and TLS 1.2 (aka SSL 3.3) aren't widely adopted by servers OR web browsers.

So the question is: does Wikimedia use TLS 1.2 (or 1.3), or are they trying to lull people into a false sense of security?

Re:Does it support TLS 1.2 or higher? (1)

3nails4aFalseProphet (248128) | more than 2 years ago | (#37597594)

en.wikipedia.org is currently using TLS 1.0.

Re:Does it support TLS 1.2 or higher? (1)

chill (34294) | more than 2 years ago | (#37600766)

Chicken, meet egg.

What browser are you using that supports TLS 1.1 or 1.2? IE 8 doesn't. I don't know about 9. Firefox doesn't -- it depends on OpenSSL and the release version of that product doesn't support TLS > 1.0.

Re:Does it support TLS 1.2 or higher? (1)

Tomato42 (2416694) | more than 2 years ago | (#37603770)

  1. Both IE8 and IE9 do support TLS 1.1 if run on an OS that supports it (read: Vista, 7, or the server editions of those)
  2. Windows 7 brings support for TLS 1.2
  3. Firefox doesn't use OpenSSL; it uses Mozilla's NSS
  4. The newest OpenSSL does have support for TLS 1.2, albeit testing-only
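For what it's worth, a server's protocol support can be probed directly. A rough sketch using Python's `ssl` module (the per-version `minimum_version`/`maximum_version` knobs are from Python 3.7+, long after this thread; the hostname is just an example):

```python
import socket
import ssl

def probe_tls_version(host, version, port=443, timeout=5):
    """Attempt a handshake pinned to exactly one TLS version.

    Returns True if the server negotiates it, False otherwise."""
    ctx = ssl.create_default_context()
    ctx.minimum_version = version  # pin both ends of the allowed range
    ctx.maximum_version = version
    try:
        with socket.create_connection((host, port), timeout=timeout) as sock:
            with ctx.wrap_socket(sock, server_hostname=host) as tls:
                return tls.version() is not None
    except (ssl.SSLError, OSError):
        # Handshake refused, protocol unsupported, or host unreachable
        return False

# e.g. probe_tls_version("en.wikipedia.org", ssl.TLSVersion.TLSv1_2)
```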

Find out yourself & DOUBLE-VERIFY (0)

Anonymous Coward | more than 2 years ago | (#37599084)

1st: Use this link to verify, takes 1-3 minutes approximately:

https://www.ssllabs.com/ssldb/analyze.html?d=wikimedia.org&s=208.80.152.200 [ssllabs.com]

Gets an "A" rating here and yet, it ONLY supports TLS 1.0

(Which means an attack like "BEAST" can "get to it" IF the user is "man-in-the-middled" via the javascript that loads it &/or java pages that exploit it)

NOW, to "double-verify" what's shown above from SSLLabs?

Opera ALSO has developer tools for that too -> View Menu, Developer Tools submenu, Page Security Info

Results - says wikimedia's NOT secure by its standards currently, & has NO security certificate for wikimedia.org...

Lastly, & perhaps most importantly (other 1/2 of the Client-Server equation/interaction here, is the browser itself used):

Opera has TLS 1.1 & 1.2 encryption options (1.1 is enabled by default, 1.2 you must activate) - only "safe(r)" browser I know of that's equipped to THAT level, currently, for certain.

* IN ANY EVENT - I haven't had my coffee yet (going to now in fact though), but I *think* I "hit on" the right pages above, per this article, to do the pertinent tests from reputable/reliable sources &/or tools for the job...

APK

P.S.=> Oh, & yet another thing to "test/look at" is "What's that site running?" by NetCraft

http://uptime.netcraft.com/up/graph?site=wikimedia.org [netcraft.com]

(Because it can point you to what Server OS & WebServer builds are being used, which tells you if they are PATCHED FOR SECURITY OR NOT, vs. things you can see in exploit-db for example (because all malware makers/hacker-crackers have to do, is stay 1 exploit ahead of ANY sites' patched levels basically to abuse them)).

E.G.-> For Apache (since it applies here to wikimedia.org), for example, you'd want to be SURE it's got builds capable of using a mod_ssl that allows TLS 1.1/1.2 (not just 1.0, because of "BEAST" above mainly) - that's where querying GOOGLE or BING for this:

http://www.google.com/search?sclient=psy-ab&hl=en&site=&source=hp&q=%22Apache%22+and+%22TLS%22&btnG=Search [google.com]

Helps...

... apk

greeat (2)

waddgodd (34934) | more than 2 years ago | (#37596002)

Of course, wait until after the persistent TLS1.0 connection bug gets exploited. Because, you know, nothing says "we care about security" quite as much as making available an exploited protocol.

It's a good thing. (2)

Frosty Piss (770223) | more than 2 years ago | (#37596006)

Sure. When I look up "Dog Poop Girl [wikipedia.org] " I need to make sure the government isn't tracking it...

Great... Now, if only we could trust EVERY CA. (5, Interesting)

Anonymous Coward | more than 2 years ago | (#37596016)

It only takes one CA being compromised to compromise THE ENTIRE SYSTEM of TLS / SSL...
DigiNotar.
Additionally: the *.* cert... <- WTF, whose brilliant idea WAS that feature?!

Fact: The biggest problem with the CA system is that any CA can create a cert for ANY DOMAIN even if the domain owner doesn't request the cert first.

Thus, EVERY CA must be 100% secure 100% of the time. TLS / SSL isn't a system that has a single point of failure... It's a system that has many Hundreds of points of failure; Any one of them being enough to cause the whole trust model to fall apart like so many cards stacked in the shape of a house.

Your browser probably doesn't trust DigiNotar, but does it trust CNNIC?
http://yro.slashdot.org/story/10/02/02/202238/mozilla-accepts-chinese-cnnic-root-ca-certificate

FF: Tools/Edit > Options/Preferences > Advanced > Encryption > View Certificates

You trust ALL OF THESE?! Well, enjoy your security theater suckers.

Re:Great... Now, if only we could trust EVERY CA. (1)

TooMuchToDo (882796) | more than 2 years ago | (#37596270)

So could we do something similar to SPF/DomainKeys? You create a key pair, advertise the public key via DNS, and require certificate requests to be signed with the matching private key before the CA will issue. That would ensure only the domain owner could request the SSL certificate.
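This idea is close to what was later standardized as DANE (TLSA records, RFC 6698). A toy sketch of the DNS side, publishing a hash of the key rather than the key itself (the record name and format here are invented for illustration):

```python
import hashlib

def key_fingerprint(der_public_key: bytes) -> str:
    """SHA-256 fingerprint (hex) of a DER-encoded public key."""
    return hashlib.sha256(der_public_key).hexdigest()

def dns_pin_record(domain: str, der_public_key: bytes) -> str:
    """Hypothetical TXT record a CA could check before issuing:
    only requests proving possession of this key would be honored."""
    return f'{domain}. IN TXT "cert-key-sha256={key_fingerprint(der_public_key)}"'

# e.g. dns_pin_record("wikipedia.org", some_der_key_bytes)
```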

Re:Great... Now, if only we could trust EVERY CA. (1)

phantomfive (622387) | more than 2 years ago | (#37596404)

How much do you trust your DNS registrar?

Not to mention, DNS can be spoofed (maybe someday it won't be, but ultimately you have to trust someone).

Re:Great... Now, if only we could trust EVERY CA. (1)

dkf (304284) | more than 2 years ago | (#37596778)

How much do you trust your DNS registrar?

What's more, if your DNS registrar is crooked (or broken into), you're stuffed because you can't go to someone else nearly as easily as with a CA.

Re:Great... Now, if only we could trust EVERY CA. (1)

Anonymous Coward | more than 2 years ago | (#37596386)

I just love reading the same post, more or less, in every article.

Re:Great... Now, if only we could trust EVERY CA. (1)

bertok (226922) | more than 2 years ago | (#37596492)

Because he's got a good point that the internet community has been ignoring until the Diginotar fiasco. It wasn't that obvious a problem for most people, it was just one of those things that happens "behind the scenes", and nobody except some paranoid security researchers cared.

But really, think about it for a second: why are we allowing country-specific CAs to issue certificates for a TLD other than their own country TLD?

I can understand why a Root CA certificate doesn't have any restrictions in it (that would be annoying for the CA), but why does the browser have to take that certificate at face value?

For example, the Windows operating system can be told to restrict its trust of a Root CA certificate to a specific set of purposes. It's a bit obscure, but it's there. Firefox has a system for this as well, but it just has a set of check boxes: "web", "email", and "software". Neither system is fine-grained or easily customisable.

Why can't web browser developers take a few hours out of their time to add a "list of DNS suffixes" to the restriction options, and populate them when releasing the root CA updates? For example, Firefox trusts "TURKTRUST Elektronik Sertifika Hizmet Saglayicisi" by default -- which I'm perfectly fine with, as long as it's for "*.tr" domains only! I'm not so fine with trusting that organisation for ".com".

Sure, that change would break some websites -- but that's the point.

Re:Great... Now, if only we could trust EVERY CA. (1)

icebraining (1313345) | more than 2 years ago | (#37597012)

But really, think about it for a second: why are we allowing country-specific CAs to issue certificates for a TLD other than their own country TLD?

What are the non-country-specific CAs, then? Every company is registered in some country. Being registered in the US doesn't make it less "country-specific".

Unless you propose to eliminate the gTLDs, I don't see why would only some CAs have the power to sign for them.

Re:Great... Now, if only we could trust EVERY CA. (4, Informative)

phantomfive (622387) | more than 2 years ago | (#37596394)

You do realize that this has been a problem from the beginning, right? If you sound surprised, it's only because you only recently started paying attention.

In practice, there are multiple layers of security, and this is just one of them.

Re:Great... Now, if only we could trust EVERY CA. (1)

bertok (226922) | more than 2 years ago | (#37596502)

In practice, there are multiple layers of security, and this is just one of them.

There really isn't. For web SSL/TLS, there's exactly one layer of trust: the certificate authorities.

There's no other check that the browser performs. If a trusted CA signed a cert, it hasn't expired, and it's not in a revocation list (maintained by the CA), then it's OK.

That's it.

no other checks? (1)

Onymous Coward (97719) | more than 2 years ago | (#37596610)

There's no other check that the browser performs.

My browser has Perspectives [mozilla.org] and Certificate Patrol [mozilla.org] . This way I know if other network locations are seeing the same cert that I'm seeing, and whether that cert's changed recently.
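Certificate Patrol's core check is simple enough to sketch: remember the first fingerprint seen for each host and flag any change. A toy pin store (function names invented; a real add-on also handles expiry and CA changes):

```python
import hashlib

def cert_fingerprint(der_cert: bytes) -> str:
    """SHA-256 fingerprint (hex) of a DER-encoded certificate."""
    return hashlib.sha256(der_cert).hexdigest()

def check_cert(host: str, der_cert: bytes, store: dict) -> str:
    """Trust-on-first-use check against a local pin store.

    Returns 'new' on first sight, 'unchanged' on a match,
    and 'CHANGED' when the pinned fingerprint differs."""
    fp = cert_fingerprint(der_cert)
    if host not in store:
        store[host] = fp
        return "new"
    return "unchanged" if store[host] == fp else "CHANGED"
```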

Re:Great... Now, if only we could trust EVERY CA. (1)

phantomfive (622387) | more than 2 years ago | (#37596634)

In addition to what the other poster has mentioned, you can also verify visually that the website you are connecting to is the one you expect. You can use temporary credit card numbers. No method of security is perfect, but if you thought the CA system was, then you were fooling yourself. All security breaks down at trust.

Re:Great... Now, if only we could trust EVERY CA. (3, Insightful)

ObsessiveMathsFreak (773371) | more than 2 years ago | (#37597074)

Yes, but this is the layer which causes end users browsers to throw a yellow screaming fit if they try to use an encrypted connection outside of the CA club.

Re:Great... Now, if only we could trust EVERY CA. (2)

petermgreen (876956) | more than 2 years ago | (#37600930)

In practice, there are multiple layers of security

In a normal SSL web browser configuration there are exactly two layers of security, SSL and the security of the underlying network you are using. Break both of those and you can set up as a man in the middle and sniff the user's data.

You do realize that this has been a problem from the beginning, right?

However it is a problem that has got worse over time for several reasons.

Firstly, the list of trusted CAs has been ever-growing, both through the addition of root certs to browsers and through the issuance of "intermediate certs" by the existing CAs. How many people know that their browser trusts the Chinese government?

Secondly, people used to access the internet through relatively safe networks (while ethernet isn't secure, you at least need a physical connection to mess with it). However, there is a trend towards wifi hotspots, which are usually unencrypted and relatively easy to subject to man-in-the-middle attacks.

Re:Great... Now, if only we could trust EVERY CA. (1)

intiha (1646093) | more than 2 years ago | (#37596498)

http://yro.slashdot.org/story/10/02/02/202238/mozilla-accepts-chinese-cnnic-root-ca-certificate [slashdot.org]

FF: Tools/Edit > Options/Preferences > Advanced > Encryption > View Certificates

You trust ALL OF THESE?! Well, enjoy your security theater suckers.

Just checked this out... Damn, I have a gazillion cert authorities from all over the world, in languages I can't even recognize. So WHAT THE F**K should we do? Is there any reliable tool that lets us weed out the unsavory CAs and keep trusting the green icon in FF? (Is there a FF extension that can help?) This is a shame, since ordinary people were finally getting the message that "Look at the green icon/key/lock before you trust a website", and now that security has proven a mirage... no surprise that the average person is frustrated by security and would rather deal with post-phishing problems than exercise some day-to-day diligence.

Fixed link (3, Informative)

subreality (157447) | more than 2 years ago | (#37596046)

Re:Fixed link (0)

Anonymous Coward | more than 2 years ago | (#37596104)

Yes, clearly missed the point entirely.

Thank you, thank you very much! (5, Interesting)

Anonymous Coward | more than 2 years ago | (#37596142)

Whoa, this is an incredibly neat deed for many wiki-editors out there, including myself. Ever since the neighbouring government that all my foreign-bound data passes through decided to start reading all my IP traffic [wikipedia.org] to build a comprehensive sociogram of my beliefs, affiliations and interests, I became increasingly paranoid and afraid of expressing myself online on foreign sites. I tried using secure.wikimedia.org, but the site had unsatisfactory stability and responsiveness compared to the unencrypted site. So I just continued using the unencrypted site, while avoiding sensitive topics.

I hope this decision finally enables us to use Wikipedia even for editing sensitive topics, and more importantly hiding our wiki-identity from the government. Kudos to the Wikimedia technical team, you are doing a great job!

Re:Thank you, thank you very much! (-1)

Anonymous Coward | more than 2 years ago | (#37596532)

That law only authorizes the government to collect info. It does not state that you are interesting enough for them to do so.

Do you (1)

xororand (860319) | more than 2 years ago | (#37597118)

Do you have curtains?
Surely your life is not interesting enough to require curtains.

Re:Thank you, thank you very much! (0)

Anonymous Coward | more than 2 years ago | (#37598566)

Was this a joke?

Awesome timing (1)

Arancaytar (966377) | more than 2 years ago | (#37596260)

Public trust in the security of HTTPS and SSL certificate authorities is at a literally unprecedented level right now.

Re:Awesome timing (1)

petermgreen (876956) | more than 2 years ago | (#37600988)

If the choice is being exposed to a passive sniffer vs being exposed to those prepared to perform man in the middle attacks (which carry a far higher risk of getting caught for the attacker) I'd go for the latter.

On the gripping hand (1)

FatLittleMonkey (1341387) | more than 2 years ago | (#37596362)

Now I have to remember my damn wikipedia password.

SSL was 'secure', they used it for banks (1)

stooo (2202012) | more than 2 years ago | (#37596602)

Now that SSL is completely broken, please use it on a wide scale, so we can still listen and track you. Seriously, it consumes much more electricity and resources. Why push for this broken system? It has been insecure for years, and it's very cumbersome for admins. We need a good, fast, cheap, secure replacement that encrypts all our IP communications, transparently and decentralized. There are valid proposals.

Not enough (0)

Anonymous Coward | more than 2 years ago | (#37596642)

Will not turn off Tor, nor take off my tinfoil hat.

Re:Not enough (0)

Anonymous Coward | more than 2 years ago | (#37596746)

That's great. Mom says to bring your laundry to the laundry room if you want it cleaned.

https://slashdot.org? (3, Interesting)

Anonymous Coward | more than 2 years ago | (#37596656)

So, when will slashdot follow? Currently https://slashdot.org just redirects to http://slashdot.org

Re:https://slashdot.org? (2)

jones_supa (887896) | more than 2 years ago | (#37597100)

Good question. As a geek site, Slashdot should be a pioneer in these things. Full Unicode character support has also been missing for a long time. The box for notification messages on the front page feels a bit old too; something like the Facebook globe icon would be sleeker. Different color themes. Things like that.

Re:https://slashdot.org? (1)

Jeremy Visser (1205626) | more than 2 years ago | (#37610044)

It's supported, but only available to subscribers. If you're not logged on as a subscriber, it redirects you to the insecure version.

Nice touch, eh?

Adds to greenhouse problem (1)

Jimbookis (517778) | more than 2 years ago | (#37596684)

How much extra juice does it take for masses of GMail and Wiki and Facebook servers to do the work to encrypt all this data (plus the end use machines)?

Re:Adds to greenhouse problem (4, Informative)

heypete (60671) | more than 2 years ago | (#37596720)

Not much [imperialviolet.org] :

In January this year (2010), Gmail switched to using HTTPS for everything by default. Previously it had been introduced as an option, but now all of our users use HTTPS to secure their email between their browsers and Google, all the time. In order to do this we had to deploy no additional machines and no special hardware. On our production frontend machines, SSL/TLS accounts for less than 1% of the CPU load, less than 10KB of memory per connection and less than 2% of network overhead. Many people believe that SSL takes a lot of CPU time and we hope the above numbers (public for the first time) will help to dispel that.

Re:Adds to greenhouse problem (1)

Threni (635302) | more than 2 years ago | (#37596788)

Perhaps browsers can soon start warning users that they're about to visit an insecure site and ask if they wish to continue?

Re:Adds to greenhouse problem (2)

icebraining (1313345) | more than 2 years ago | (#37597020)

I seriously hope not. SSL adds latency to the connection and is completely useless for a huge number of websites. Why would I need SSL to access, e.g., a recipes page which doesn't even have a login page?

Re:Adds to greenhouse problem (3, Informative)

vlm (69642) | more than 2 years ago | (#37597546)

I seriously hope not. SSL adds latency to the connection and is completely useless for a huge number of websites. Why would I need SSL to access, e.g., a recipes page which doesn't even have a login page?

You want to cook a non-Halal recipe in a Halal nation where improper religious observation will get you killed? Really simple example would be looking up mixed-drinks cocktails in Saudi Arabia...

Re:Adds to greenhouse problem (0)

Anonymous Coward | more than 2 years ago | (#37638910)

I would think an OpenVPN service like Witopia or Overplay would make more sense in a situation like that -- then all traffic would be encrypted. True, the Saudi government would be able to see that you are connecting to a naughty proxy that hid what you were doing from their eyes, but it's still a step up from them seeing you make HTTPS requests to a website that hosts content they're not happy about.

Re:Adds to greenhouse problem (0)

Anonymous Coward | more than 2 years ago | (#37597982)

SSL provides encryption (supposedly no one can read encrypted data) AND _authentication_ (supposedly no one can modify data in transit). For sites without a login, encryption indeed is useless; authentication, however, can be useful.

Null-encryption ciphersuites could even be used (they are usually disabled by default). The same functionality could theoretically be achieved by digitally signing pages (and included images) while using plain HTTP, but there is no standard for it.
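A toy version of that signed-pages idea, using an HMAC with a shared demo key as a stand-in for a real public-key signature tied to the site's cert (header name and key are invented; no such standard exists, as the post says):

```python
import hashlib
import hmac

# Assumption: a shared demo key stands in for the server's private key.
KEY = b"demo-signing-key"

def sign_page(body: bytes) -> dict:
    """Produce an integrity header over the page body (no encryption)."""
    digest = hmac.new(KEY, hashlib.sha256(body).digest(), hashlib.sha256).hexdigest()
    return {"X-Content-Signature": digest}

def verify_page(body: bytes, headers: dict) -> bool:
    """Detect in-transit tampering by recomputing the signature."""
    expected = sign_page(body)["X-Content-Signature"]
    return hmac.compare_digest(expected, headers.get("X-Content-Signature", ""))
```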

Re:Adds to greenhouse problem (1)

Anonymous Coward | more than 2 years ago | (#37598350)

I don't want middlemen tampering with what I see. I don't want my internet censored before it gets to me. That's the reason for HTTPS everywhere.

Re:Adds to greenhouse problem (1)

icebraining (1313345) | more than 2 years ago | (#37607500)

We should have signed pages without the need for encryption and its inevitable handshake. Like hashing the contents, signing the hash with the private key of the server's cert, and sending it as a response header.

As for censoring, they can still ban the domain, with or without HTTPS.

Re:Adds to greenhouse problem (1)

Anonymous Coward | more than 2 years ago | (#37598680)

Your reasoning is valid when the only goal is to protect sensitive information, such as banking details or what kind of horse-porn you enjoy.

When looking at the bigger picture, simply seeing the encrypted stream and where it's headed is enough to build a reasonably accurate image of your online habits. I recently heard Stefan Burschka talk about his work with mining encrypted VOIP traffic to determine the content - he does this with an accuracy of 66% per sentence just by knowing the timing between packets and the size of the encrypted payload. I'm convinced similar techniques can be applied to behavioral analysis on networks.

When the FRA-law was passed here in Sweden there was a lot of talk about encryption, and how "terrorists will just encrypt their messages". FRA and politicians essentially said "we don't really give a shit what's in the datastream, we just need to know what hosts talk to each other". As someone who works in a related field (I basically sit around all day and analyze logs from routers to track statistical anomalies in the network) - encryption can only get you so far in protecting your privacy.

Re:Adds to greenhouse problem (0)

Anonymous Coward | more than 2 years ago | (#37603670)

You would actually EAT something made from a recipe from an untrusted source?

Re:Adds to greenhouse problem (1)

Threni (635302) | more than 2 years ago | (#37605228)

> Why would I need SSL to access, e.g., a recipes page which doesn't even have a login page?

No no, you've got it backwards. Why shouldn't all communication be secure and encrypted? Latency isn't important unless you're gaming; certainly not the minuscule 1% or whatever it is SSL costs. It's no-one's business which sites you go to and what you do when you get there.

Re:Adds to greenhouse problem (1)

icebraining (1313345) | more than 2 years ago | (#37605672)

HTTPS doesn't prevent people from knowing which sites you go to -- not only do they have the site's IP, but SNI [wikipedia.org] means the domain is sent in cleartext so that the server can pick the right certificate.

And the domain isn't irrelevant when a single webpage loads resources from a bunch of different domains, which is extremely common nowadays.

Re:Adds to greenhouse problem (1)

ledow (319597) | more than 2 years ago | (#37596978)

Nowhere near 0.1% of the extra power, bandwidth, etc. caused by compromised machines, virus-ridden-spam-hosts, etc. that *can* be caused by visiting sites that you *think* are Google, Wikipedia, Facebook etc. when in fact they're not.

And anybody who worries about the environment impact of an SSL enablement really needs to go see the size and power requirements of the average ISP's datacentre, let alone someone like Google.

The problem with greenies is *not* their intentions, it's the fact that they choose TOTALLY the wrong targets to cut down on. Energy-saving lightbulbs, clockworks radios, EnergyStar monitors etc. mean *SHIT* compared to one household accidentally leaving the aircon / heating on for an extra hour for one night each decade.

Greenies should be targeting those with swimming pools, hot-tubs, aircon in countries that really don't need it, etc. rather than try to make the average householder feel guilty for his 40W bulb and TV-on-standby. You are literally orders-of-magnitude out in the things you worry about. And also you are targeting USEFUL energy (because the SSL has a purpose) and not useless energy (like that lost because someone didn't bother to close the lid on their hot-tub properly).

Re:Adds to greenhouse problem (0)

Anonymous Coward | more than 2 years ago | (#37597920)

Sounds like the environmentalists haven't heard of Profile Before Optimizing [c2.com] , aka "find out what draws power, first".

Do some CAs have really cheap certs now? (1)

Lazy Jones (8403) | more than 2 years ago | (#37596694)

I can imagine that the other CAs compromised by the comodo hacker [slashdot.org] have made Wikipedia an offer they couldn't refuse.

Doesn't solve the admin problem (0)

Anonymous Coward | more than 2 years ago | (#37596976)

"Technical" gimmicks like SSL still don't protect Wikipedia from abusive admins and deletionist terrorists. Only once abusive admins like Nawlinwiki, Bsadowski1, Bongwarrior and MuzeMike are banned and inclusionism is put at the heart of the project will Wikipedia be considered secure.

Banned users are heroes!

Does this mean more or less (0)

Anonymous Coward | more than 2 years ago | (#37597176)

of this? [imageshack.us]

That is really annoying my pants off! So many pages have that issue. Or maybe it is not an issue, but you either need to check manually every single time or switch off that alarm and pretend the world is a soft, friendly place without secrets...

Any sysadmins for large web servers out there? (1)

Lisandro (799651) | more than 2 years ago | (#37598086)

I was wondering -- how much stress does enabling HTTPS on a huge site like Wikipedia put on a modern web server? IIRC this was one of the reasons Facebook took quite a while to enable SSL for their users.

Re:Any sysadmins for large web servers out there? (1)

Carnildo (712617) | more than 2 years ago | (#37603954)

It depends on your network infrastructure, especially how CPU-intensive your content is already.

Google's numbers are a 2% increase in network traffic, a 1% increase in CPU usage, and 10 KB of RAM per connection. Your network numbers will go up if you've got SSL frontend servers talking to content backend servers (Wikipedia's solution), while your CPU numbers will go *way* up if most of what you're serving is static content (this is where SSL's reputation as a CPU hog came from; these days, almost everyone serves CPU-heavy dynamic content and won't notice much of an increase).

Re:Any sysadmins for large web servers out there? (1)

Lisandro (799651) | more than 2 years ago | (#37612506)

Thank you.

If only Google would do this too. (1)

GargamelSpaceman (992546) | more than 2 years ago | (#37598238)

I use HTTPS everywhere, but it sends me to an experimental search page for google that lacks the standard tabs. I mostly want standard tabs, so this is annoying.

Caching? (1)

chill (34294) | more than 2 years ago | (#37599216)

Encrypted connections can't be cached by a proxy, unless the proxy acts as a man-in-the-middle. While this is popular at many companies, I don't see a lot of support for your ISP doing it.

SSL Everywhere, if successful, will be the death of caching. Is that a good thing?

Re:Caching? (0)

Anonymous Coward | more than 2 years ago | (#37603556)

No one uses a caching proxy anymore. Caching is all done by CDNs these days, which are totally TLS-friendly.

Well, Maybe... (1)

Nom du Keyboard (633989) | more than 2 years ago | (#37600632)

Considering all of the compromised SSL certificates, you may not be any more private with this change than before.

The Latvian thing (1)

martinve (1233522) | more than 2 years ago | (#37618296)

Nice to have Wikipedia running on the Latvian version of the standard HTTP protocol (https). But then again, I am an Estonian who commented on the item 2 days late.

What about Slashdot? (0)

Anonymous Coward | more than 2 years ago | (#37624944)

I think this HTTPS wave is really great.

It needs to be done for all major web sites.

Please, generate that HTTPS certificate and be done with it. Two hours' worth of work for the rights of all your readers.
