OpenSSL bug CVE-2014-0160
A new OpenSSL vulnerability in versions 1.0.1 through 1.0.1f is out today, which can be used to reveal memory to a connected client or server.
If you're using an older OpenSSL version, you're safe.
Note that this bug affects way more programs than just Tor — expect everybody who runs an https webserver to be scrambling today. If you need strong anonymity or privacy on the Internet, you might want to stay away from the Internet entirely for the next few days while things settle.
Here are our first thoughts on what Tor components are affected:
- Clients: The browser part of Tor Browser shouldn't be affected, since it uses libnss rather than openssl. But the Tor client part is: Tor clients could possibly be induced to send sensitive information like "what sites you visited in this session" to your entry guards. If you're using TBB we'll have new bundles out shortly; if you're using your operating system's Tor package you should get a new OpenSSL package and then be sure to manually restart your Tor. [update: the bundles are out, and you should upgrade]
- Relays and bridges: Tor relays and bridges could maybe be made to leak their medium-term onion keys (rotated once a week), or their long-term relay identity keys. An attacker who has your relay identity key can publish a new relay descriptor indicating that you're at a new location (not a particularly useful attack). An attacker who has your relay identity key, has your onion key, and can intercept traffic flows to your IP address can impersonate your relay (but remember that Tor's multi-hop design means that attacking just one relay in the client's path is not very useful). In any case, best practice would be to update your OpenSSL package, discard all the files in keys/ in your DataDirectory, and restart your Tor to generate new keys. (You will need to update your MyFamily torrc lines if you run multiple relays.) [update: we've cut the vulnerable relays out of the network]
- Hidden services: Tor hidden services might leak their long-term hidden service identity keys to their guard relays. Like the last big OpenSSL bug, this shouldn't allow an attacker to identify the location of the hidden service [edit: if it's your entry guard that extracted your key, they know where they got it from]. Also, an attacker who knows the hidden service identity key can impersonate the hidden service. Best practice would be to move to a new hidden-service address at your convenience.
- Directory authorities: In addition to the keys listed in the "relays and bridges" section above, Tor directory authorities might leak their medium-term authority signing keys. Once you've updated your OpenSSL package, you should generate a new signing key. Long-term directory authority identity keys are offline so should not be affected (whew). More tricky is that clients have your relay identity key hard-coded, so please don't rotate that yet. We'll see how this unfolds and try to think of a good solution there.
- Tails is still tracking Debian oldstable, so it should not be affected by this bug.
- Orbot looks vulnerable; they have some new packages available for testing.
- The webservers in the https://www.torproject.org/ rotation needed (and got) upgrades. Maybe we'll need to throw away our torproject SSL web cert and get a new one too.
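If you want a quick way to see when a site's currently served certificate was issued (a rough hint about whether it was swapped out after the disclosure), here is a minimal Python sketch. The hostname is just an example, and a recent notBefore date doesn't prove the old key was revoked:

```python
#!/usr/bin/env python3
# Minimal sketch: print the validity dates of the certificate a site is
# currently serving, as a rough hint about whether it was reissued after
# the disclosure. A recent notBefore date is only a hint, not proof that
# the old key was revoked.
import socket
import ssl
import sys

host = sys.argv[1] if len(sys.argv) > 1 else "www.torproject.org"

context = ssl.create_default_context()
with socket.create_connection((host, 443)) as sock:
    with context.wrap_socket(sock, server_hostname=host) as ssock:
        cert = ssock.getpeercert()
        print("subject:  ", cert.get("subject"))
        print("notBefore:", cert.get("notBefore"))
        print("notAfter: ", cert.get("notAfter"))
```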
Comments
Please note that the comment area below has been archived.
We have been speculating whether a Tor relay could be made to leak information about the IP address of the next hop in a circuit, since an arbitrary memory leak is possible. Then, in theory, one could walk all nodes in a circuit to eventually uncover the other end of the circuit... (if all nodes in the circuit are linked to vulnerable openssl). Is that somehow prevented by the implementation design?
Sounds doable in theory (no idea about practice, but we should assume so).
Arbitrary memory leaks are bad news.
Another fine reason to get relays to update (many of the big ones are in the process of updating right now).
Do heartbeat messages go both ways? If so, can a relay also theoretically read a tor client process's memory?
Is Tor working on removing http://opensslfoundation.com/ as a dependency yet? It seems riddled with bugdoors. Personally, I'd like to avoid using software from Maryland.
Yes, heartbeat messages can go both ways. See the "clients" section above.
Removing the openssl dependency, and replacing it with what? The world is missing an actually good crypto library.
What about NSS since you're already using it for the browser?
Yes, maybe. There aren't [m]any good crypto libraries out there to choose from. It's not clear to me that libnss is any better -- at least people *find* some of the bugs in openssl. :)
Since this attack relies on the entire chain being owned, a mix of libraries will prevent any single compromise from owning the system.
By "chain" I assume you mean "Tor circuit".
In that case check out the comment at the top:
https://blog.torproject.org/blog/openssl-bug-cve-2014-0160#comment-55451
And then imagine fetching the memory from the relay that turns out to be Alice's entry guard, and also fetching it from the relay that turns out to be Alice's exit relay.
So the "entire chain" isn't needed. Maybe.
Entry guards or exits could use different crypto than other relays. Would you consider having exits use lighter weight encryption to ease the load on their CPUs?
Yes, if you can find us some lighter-weight crypto that's still secure against all the attacks people are talking about this year.
I think we've gone a long way with the move to curve25519:
https://gitweb.torproject.org/tor.git/blob/tor-0.2.4.21:/ReleaseNotes#l…
"Replacing it with
"Replacing it with what?".
Isn't OpenSSL somewhat bloated? Tor does not need the many ciphers and operation-modes implemented in OpenSSL. Tor could operate just fine using a single cipher and mode of operation. That can be implemented without the baggage of a huge crypto library.
Yes, OpenSSL is a massive library, with several cipher suites. The protocol has seen many years of (from time to time, shaky) service with a huge install base.
The benefits of keeping the same code versus rewriting are well known. I don't like OpenSSL, and would love to see it fixed in so many ways. Better test suites, static analysis, real verification, additional cipher suites, fixes to the protocol design all spring to mind; some of these cannot be done in a backwards-compatible way. It can be rescued, it's just a herculean effort.
Having lots of ciphers is very useful. For instance, when BEAST, CRIME, etc. came along and exploited padding oracles that were only present in block ciphers, servers could switch to RC4. When RC4 was shown to be broken, but the previously mentioned attacks were fixed, we could switch back. This flexibility is crucial in being able to respond quickly to new attacks, and in providing a smooth migration path for users.
So in practice, what does this mean for Tor? Could an adversary like the NSA completely unmask the entire Tor network without anyone knowing, or could they unmask a user that connects to a compromised or honeypot website?
This is quite scary...
Completely unmask the entire Tor network? Not anymore, since many relays have upgraded. But before the vulnerability was announced? Who knows.
A compromised website won't be a good place to launch an attack, since the Tor Browser shouldn't be affected by the bug, and the website doesn't interact with the Tor client at the link encryption layer.
But an entry guard (the first Tor relay you connect to) can potentially read client-side memory. See the 'clients' section above.
So if I had done something "bad" in the past before the CVE was out, how much should I worry? By "bad" I mean things on the level of drug dealing, child porn, dissidence from nasty countries, etc. (not that I actually _do_ those specific things, but hypothetically if I did something on that level). Should I toss all my online pseudonyms out the window? I'm not quite sure what _practical_ steps I should take to ensure my safety.
By "bad" I mean things on the level of drug dealing, child porn, dissidence from nasty countries, etc.
If you deal in drugs, you should pack your bags immediately and head for Mexico, Colombia or Honduras. You will find sanctuary there with like-minded people.
If you indulge in child porn, you should head for Russia. I heard some of Putin's men are child porners.
If you are a political dissident, you are safe. NSA and GCHQ will never uncover your activities or reveal your identity to North Korea, Iran, China, Turkey, Pakistan, etc.
Please do not feed the trolls.
I cannot figure out which people in this thread are the trolls. ;)
hey, I don't even like drugs but i like Mexico beaches, not cool.
What if I am a human rights activist that is an enemy of the state to the US govt?
What if you tripped over a rock and fell to your death? That's how much you should be worrying.
"If you deal in drugs, you
"If you deal in drugs, you should pack your bags immediately and head for Mexico, Colombia or Honduras. You will find sanctuary there with like-minded people." -
Well, not necessarily. Mexico, Colombia, Honduras, and some other countries produce huge quantities of drugs, but the drugs are not for them; they're for the worldwide drug-hungry consumer countries like the US and the EU's members. In fact, the US is the winner on this subject: it's the greatest drug consumer in the world.
"if you are a political dissident, you are safe. NSA and GCHQ will never uncover your activities or reveal your identity to North Korea, Iran, China, Turkey, Pakistan, etc."
Well, not necessarily. If you are a political dissident toward US policy, or a journalist with a profound commitment to the US constitution's freedom guarantees and the law, who looks for the truth and nothing but the truth about the US government's illegal activities against American citizens and against other countries' governments that are not fond of US policy, then you should be careful: they may well call you a whistleblower and persecute you around the world, even though you know your duty and responsibility will always be to release those offensive government activities to public opinion. Hence, you may be in serious trouble, especially if you live on US soil.
If you believe NSA or GCHQ wouldn't shop your ass to the security services, I've got a bridge I want to sell you. Either of them will do anything to anyone precisely as it suits their perceived needs (which mutate constantly).
Has the bridge upgraded to 1.0.1g?
Is this the reason why onion pages cannot be viewed? Please help, thanks.
Probably not. Sounds like you screwed up your Tor installation somehow. Be sure to use the Tor Browser Bundle (not some other thing), and make sure your time and date and timezone are right. If you still have a problem, try the helpdesk or irc.
https://www.torproject.org/about/contact
Some folks knew about this bug for a while. Well, long enough to set up this nifty website talking about the bug.
http://heartbleed.com/
They knew about it long enough to patch it too. That's how vulnerability disclosure works.
I disagree. Vulnerability disclosure starts with the source and based on severity, escalates quickly to vendors. I know of at least two major OS vendors that were blind-sided by this. They did a great job of releasing patches quickly, but there will be serious fiscal impact, some of which could have been mitigated.
Yeah, the disclosure process sure didn't go smoothly on this one.
Agree. Both the NSA and GCHQ have been having a good time. Methinks this bug is the work of an infiltrator from NSA who works on the OpenSSL project.
Would the disclosure be limited to the memory that belonged to the OpenSSL process?
It would be limited to the memory that belonged to the process that linked the openssl library. So that's Tor, or apache, or whatever else you might use.
I made a tool to check the status of your SSL and see if heartbeat is enabled. If it is, you should run this command: openssl version -a
Ensure your version is NOT 1.0.1f, 1.0.1e, 1.0.1d, 1.0.1c, 1.0.1b, 1.0.1a, 1.0.1, 1.0.2-beta1
Tool at: http://rehmann.co/projects/heartbeat/
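(For reference, here's a small Python sketch of that version comparison; it only looks at the openssl binary on your PATH, which is not necessarily the library a given program was built against, and many distributions patch the bug without bumping the version string, so treat the result as a hint only:)

```python
#!/usr/bin/env python
# Sketch: compare the output of `openssl version` against the release range
# that shipped the Heartbleed bug. This only checks the openssl on your PATH,
# and distros often backport fixes without changing the version string.
import re
import subprocess

VULNERABLE = {"1.0.1", "1.0.1a", "1.0.1b", "1.0.1c", "1.0.1d",
              "1.0.1e", "1.0.1f", "1.0.2-beta1"}

def is_vulnerable(version_line):
    # version_line looks like "OpenSSL 1.0.1f 6 Jan 2014"
    match = re.search(r"OpenSSL\s+(\S+)", version_line)
    return match is not None and match.group(1) in VULNERABLE

if __name__ == "__main__":
    out = subprocess.check_output(["openssl", "version"]).decode()
    print(out.strip())
    print("looks VULNERABLE" if is_vulnerable(out) else "not in the affected range")
```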
"I made a tool to check the status of your SSL and see if heartbeat is enabled."
http://s3.jspenguin.org/ssltest.py seems to do the job nicely, and be easy to audit too.
It did a nice job of giving me an access denied error.
How sad! I have mirrored it at
http://freehaven.net/~arma/ssltest.py
You're asking the author of a tool if they put backdoors, Trojans or malware in their own code. If they did, they're not going to tell you.
another bad news... jesus doesn't exist also :(
: ]
DuckDuckGo.com appears vulnerable according to http://filippo.io/Heartbleed/#duckduckgo.com
Watch yourself out there.
I wonder which CAs will give out replacement SSL certs for free.
(I wonder which CAs will discard their CA keys and generate new ones. ;)
Well, anyone who runs a site and is affected by this should call their host and find out. :)
Discarding CA keys is unnecessary, as only SSL keys, not key-signing keys, are affected.
Fixed, as of 10.04.2014
So was the security bug introduced on purpose?
Here's a quote I heard today: "When it comes to deciding between maliciousness and incompetence in OpenSSL, there's a whole lot of incompetence to go around."
It's actually really hard to write a secure crypto library that implements all the things openssl implements.
So I guess that means my answer is "not necessarily".
In that case, if some guys have unlimited resources, imho they do know about this bug from day zero. Take just 100 programmers and ask them to watch openssl development; surely they would catch this bug at once.
Agreed. Another quote, by Napoleon (incorrectly ascribed to an American, who had just repeated it and became the "author"): "don't ascribe to malice what can plainly be explained by incompetence".
Out of curiosity: If the webserver uses Diffie-Hellman for the SSL key exchange, old and new traffic should still be secure, even if the cert was leaked, right?
(You obviously would want to replace your cert either way, but as I said, curiosity).
The vulnerability has been there for 2 years. No one guarantees you that it hasn't been exploited before to extract private keys, or that you didn't do your Diffie-Hellman key exchange with a man-in-the-middle who possessed the right key.
In case you're ruling a MITM out, then yeah, that's perfect forward secrecy and you should be good to go with that old traffic.
I don't see how new traffic could ever be safe if a MITM has your private key
It should still be safe against a passive attacker -- that's one of the nice features of PFS in handshakes. They have to actually mitm every connection, or they don't get to learn the session key that's computed for that connection.
If the server is using forward secrecy (DHE or ECDHE cipher suites) old traffic is secure, but if the certificate is leaked new traffic can be MITMed. The actual key exchange algorithm doesn't matter.
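(If you're curious which key exchange a given server negotiates with your own client, here's a minimal Python sketch; the hostname is just an example, and what a server offers your client may differ from what it offers other clients:)

```python
#!/usr/bin/env python3
# Sketch: connect to a server and print the negotiated cipher suite, to see
# whether a forward-secret key exchange (ECDHE/DHE in the suite name) is in
# use for *this* connection. Servers can negotiate differently per client.
import socket
import ssl
import sys

host = sys.argv[1] if len(sys.argv) > 1 else "www.torproject.org"

context = ssl.create_default_context()
with socket.create_connection((host, 443)) as sock:
    with context.wrap_socket(sock, server_hostname=host) as ssock:
        name, protocol, bits = ssock.cipher()
        print(protocol, name, bits)
        forward_secret = "ECDHE" in name or name.startswith("DHE")
        print("forward secret" if forward_secret else "no (EC)DHE in suite name")
```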
No, the old traffic isn't always secure. You're still screwed on old traffic if TLS session tickets are in use and the server hasn't restarted and cleared state. It's rare, but google for more.
Hi. What do you mean with
«best practice would be to update your OpenSSL package, discard all the files in keys/ in your DataDirectory, and restart your Tor to generate new keys»
?
1. aptitude update -> aptitude safe-upgrade
2. rm -rf /var/lib/tor/keys/*
3. /etc/init.d/tor restart
Is this correct? Or is the second step superfluous (or erroneous)?
Thanks in advance for your attention.
That's correct.
These steps will turn you into a new relay (with a new key).
Well, the capacity of the tor network would be harmed a lot now if everybody were doing this, right? As I understand it, the new relay would need to go through "unmeasured" and "measured" phases, during which the full capacity is not used.
Do we have some data on how many relays actually changed the keys during update?
Correct. Once they get measured by the bwauths, which should take just a couple of days, they should ramp up. Hopefully it won't be too bumpy.
As for data, sort of, but certainly nothing comprehensive yet.
Try adv-tor from the best 'asm man' in the world
Christian Albu
We got the advor person to rename it from advtor, since "advanced Tor" is a poor name when what you're doing is adding a bunch of un-audited patches to Tor. Maybe it is better, maybe it is worse, most likely it's a combination (which in sum is probably not a great result for its users).
So, feel free to use a program called advor if you find one on the internet, if you really want to, but know that it's not endorsed or checked or anything by the Tor people.
I wonder if this is related to an observation concerning Google's recaptcha service.
Various web services embed Google's captchas before they let you use their service.
I observed many times that after solving the captcha the SSL connection to Google on port 443 remains open for a long time.
This could be 30 minutes or longer.
Would it make sense regarding the heartbeat bug that Google keeps those connections open to read parts of the memory of connected Tor clients on a large scale?
Sounds unlikely. Remember that your browser isn't exploitable here -- so the website will have a tough time attacking you.
What about version 0.9.8y, is it safe?
It should be safe (from this vulnerability), yes.
My debian oldstable machine has 0.9.8o-4squeeze14 which should also be safe (from this vulnerability).
i've been to some really bad websites lately. i'm quite pc illiterate. please explain clearly where i must go and what i must do to ensure my safety. went thru tor to these sites. was that enough?
Are you one of the tiny fraction of Tor users who gives Tor a bad name by trying to reach child porn sites? If so, please go away and stop using Tor. That's not what Tor is for, and you're hurting us.
(If not, I'm sorry I yelled at you.)
LOL talk about jumping to conclusions. There are plenty of websites that are considered "bad" in certain countries that in the west we wouldn't even bat an eye at.
I don't like it when people assume that just because someone said they went on bad websites it has to be child porn, even from a Tor dev. Setting aside the fact that much of what people think of as child porn is not actually that (e.g. jailbait rather than rape or real children, though those can still be bad), there are so many other websites that various governments or societies consider bad, whether about drugs, various religious or atheist views, political views, or even freedom of speech. If anything we need to provide help to anyone who asks, not shun them or throw out so much suspicion just because they say "bad websites" and the first thing that comes to mind is the most emotional worst-case scenario.
Ok, I am going to cut this part of the thread off before it becomes a big flame war.
It's time to switch to 4096-bit RSA cryptosystems. I'm on a VPN using AES-256 for encryption, SHA-256 for data authentication, and 4096-bit RSA for the handshake, on a 1.7GHz processor with 2GB memory, and I have never EVER had any issues concerning speed, never experienced lag or hiccups; on the contrary, I couldn't tell the difference between switching the VPN on and off.
Bumping up key sizes is a fine idea. And it's for that reason that we switched to much stronger ECC for our circuit handshakes, and for link encryption when it's available:
https://gitweb.torproject.org/tor.git/blob/tor-0.2.4.21:/ReleaseNotes#l…
But switching to stronger cryptosystems is not what this vulnerability is about. Even if you switched to 4096 rsa, this vulnerability will be just as bad for you.
Which is better, ECC or RSA?
And I do second the OP: stronger cryptosystems will put the minds of laymen at rest, after fixing the current vulnerability of course!
"It depends" is the only
"It depends" is the only answer that can fit in a blog comment.
At this point they're both likely to be stronger than other components in the system (as we learned this week).
What do you think about this warning https://www.privateinternetaccess.com/pages/vpn-encryption#ecc_warning ?
Don't use the NIST curves if you can help it. (We use them for our link encryption, to blend in, but not for our circuit-level encryption.)
You might like http://safecurves.cr.yp.to/
On first glance it looks like the Safecurves folks understand the issues better than the ones from the site you link.
this bug exposes just how bad the NSA's circumventing of encryption really is. Watch everyone panic over this, yet the NSA and probably foreign intelligence have even better exploits. This is a reminder for myself that nothing is safe or secure online.
Yes.
And I'm not sure if I should feel happier or sadder to imagine a world where these are the *accidental* bugs that we, the security community, introduce.
Security sure is hard, even without government-level adversaries.
I travel a lot, like every week, and sometimes every 2-3 days I'm in a different country. Should I download TBB in every country I'm in, to be safe, or is there no problem with using the same TBB I downloaded many countries ago in different countries?
As long as you have the latest TBB, there should be no difference between fetching it from one country vs fetching it from another country.
But be sure to check the signatures on it, to make sure it really is the TBB we made for you.
Any idea when we can expect new Tor builds?
Real soon now I hear.
You can watch progress on
https://lists.torproject.org/pipermail/tor-qa/2014-April/thread.html
(And now the new builds are up.)
is there any way to check the first relay my client is connected to for the vulnerability?
You could read your guards out of your state file and then manually check them with one of the python scripts that's floating around.
The Tor client doesn't do it automatically at this point. I'm not sure it ever should. The CFAA sure is overbroad here:
http://en.wikipedia.org/wiki/Computer_Fraud_and_Abuse_Act
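(A minimal sketch of that first step, as a hedge: the default path and the "EntryGuard" line prefix below are assumptions about a typical Linux DataDirectory and the state file format of that era, so check your own setup:)

```python
#!/usr/bin/env python
# Sketch: list the entry-guard lines from a Tor client's state file so you
# can look those relays up and test them with a separate checker script.
# The default path and the "EntryGuard" prefix are assumptions; adjust them
# to match your own DataDirectory and state file format.
import sys

state_file = sys.argv[1] if len(sys.argv) > 1 else "/var/lib/tor/state"

with open(state_file) as f:
    for line in f:
        if line.startswith("EntryGuard"):
            print(line.rstrip())
```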
I can't believe how few people on this and other sites have even mentioned using a VPN. Running a VPN server on the inside IP of your -second- router with Gargoyle or OpenWrt on it would protect you against any vulnerabilities in SSL/TLS. Don't use Tor or surf the web without it. Configuration of the VPN server/client is quite simple on Gargoyle.
Wait, what? No, using a vpn would not protect you from "any vulnerabilities in SSL/TLS". For example, if you go to an https website with your browser, and it goes through your vpn, and there's a vulnerability in your browser's SSL library, the vpn does not help you.
For another example, if you use Pidgin to talk to an xmpp server over ssl, and it goes through your vpn, a vulnerability in openssl (like the one in this post) will be bad news for you.
It's about what applications you use, not about how you transport your traffic. Notice that that statement is true in the same way for Tor itself.
I see, you mean in terms of anonymity, of course.
I would think a vpn would still encrypt the content, though.
Or am I wrong?
I don't mean in terms of anonymity.
The vpn does encrypt the content, but the unencrypted content is still exposed on both ends. And that's where the vulnerability is.
Vexing (bug) and enlightening.
The key lies in memory.
I'm glad I updated my systems, but we'll have to wait for servers to do the same.
Looking forward to the latest TBB as per usual.
Keep up the good work.
And Tor encrypts already-encrypted content, right? What info can be collected from that? Endpoints? What else?
To prevent similar problems, never collect and keep unnecessary historical information.
Am I right that the entry guard potentially can see data for the exit relay? tor-tls(tor-tls(tor-tls(tls(plain))))
For the general encryption question, you might like
https://svn.torproject.org/svn/projects/articles/circumvention-features…
And no, you are mistaken that the entry guard gets to see data for the exit relay. The Tor client does, but that's the one that you run under your own control so it's ok that it does. (Otherwise there would be a point in the network that gets to both see you and learn where you're going, which is exactly what Tor's decentralized design aims to avoid.)
https://www.torproject.org/about/overview
I scanned all the nodes in consensus for Heartbleed,
with the following results:
at least 530 nodes are VULNERABLE,
at least 2995 are NOT VULNERABLE,
the rest I don't know because of network timeouts or something.
I did not check for rekeying.
To test for yourself, do this:
- get IP:ORPORT list of relays (grep the microdesc-consensus)
- get the script https://github.com/noxxi/p5-scripts/blob/master/check-ssl-heartbleed.pl
- run it against each IP:PORT
- count the results
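(Here's a rough Python sketch of the first step, assuming the consensus "r" lines end with address, ORPort, DirPort; double-check that against your own cached file before relying on it:)

```python
#!/usr/bin/env python
# Sketch: pull an IP:ORPort list out of a cached consensus document, for
# feeding to a separate Heartbleed checker script. Assumes each "r" line
# ends with <address> <ORPort> <DirPort>; verify against your own file.
import sys

consensus_path = sys.argv[1] if len(sys.argv) > 1 else "cached-microdesc-consensus"

with open(consensus_path) as f:
    for line in f:
        if line.startswith("r "):
            parts = line.split()
            if len(parts) >= 4:
                address, orport = parts[-3], parts[-2]
                print("%s:%s" % (address, orport))
```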
Yep. I think it's more like 1000 relays vulnerable at this point.
But counting relays is the wrong way to assess the vulnerability of the Tor network as a whole. You should ask what fraction of total consensus weight, or what fraction of advertised descriptor bandwidth, is vulnerable.
Otherwise you fall into the trap of various past researchers who see 500 relays on Windows, claim Windows is vulnerable, and conclude they can compromise 10% of the Tor network. This is true except that 10% of the Tor network doesn't see 10% of the users or traffic.
Now, in this case a big fraction of the network by weight *is* vulnerable. I mailed the top 400 relay operators last night to tell them about the bug. You can also follow along the tor-relays list:
https://lists.torproject.org/pipermail/tor-relays/2014-April/thread.htm…
Eventually we'll start taking away the Valid flag from fingerprints of relays who were known to be running with a vulnerable openssl.
It's going to be a bumpy couple of days / weeks.
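(To make the "count weight, not relays" point concrete, here is a purely illustrative Python sketch; the relay list and numbers are made up:)

```python
# Purely illustrative: why counting relays is the wrong metric. Each entry is
# (consensus weight, still vulnerable?) for an imaginary set of relays.
relays = [
    (50000, True),    # one big vulnerable exit
    (20000, False),   # a big patched relay
    (200, False),     # ...and a pile of small patched relays
    (150, False),
    (100, True),
]

total_weight = sum(weight for weight, _ in relays)
vuln_weight = sum(weight for weight, vuln in relays if vuln)
vuln_count = sum(1 for _, vuln in relays if vuln)

print("vulnerable relays: %d of %d" % (vuln_count, len(relays)))
print("vulnerable consensus weight: %.1f%%" % (100.0 * vuln_weight / total_weight))
```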
Hi! Sorry if I ask something stupid, I'm not an advanced user. I don't quite understand what these numbers mean for the average user. Can I assume it means that there is a good chance that a few of my connections weren't vulnerable at all?
If I understand correctly, someone could probably have connected my IP with my traffic. Was this an easy thing to do, or hard to achieve? What I'm trying to figure out is how probable it is that someone kept their anonymity.
keep up the good work tor and hopefully in a few days everyone above can carry on with all their "bad" stuff ;-) lel
What the hell is OpenSSL? I never installed anything called OpenSSL. Can't anyone write who is affected and what needs to be done in easy to understand words?
Every freaking time something happens everyone seems to start speaking binary...
You might find
https://www.google.com/search?q=openssl&tbm=nws
to be useful. Many many journalists have been trying to write about it, and explain it, over the past day.
OpenSSL is a library which many programs and websites (including but not limited to tor) use to do cryptography.
A critical security vulnerability was found in this library yesterday. Just about everyone who uses tor (and the whole Internet, in fact, not just tor) is affected in some way.
What needs to be done is for an updated Tor Browser Bundle (coming soon) to be released, and for all users to upgrade. The relay operators also need to update their relays, and generate new keys for them.
Unfortunately, there isn't a lot more a user can do than that. Nobody knows if anyone has actually been attacked using this vulnerability, and even if they were, it would be basically impossible to find out. That's why everyone is scrambling today.
How many directory authorities were vulnerable? Assuming more than half, how soon can we expect a new tor build with their new keys?
Two or three of the nine directory authorities were unaffected, and the rest were vulnerable.
The long-term authority identity keys are unaffected, since they're kept offline.
The medium-term authority signing keys were affected, and they've been rotated (except for two authorities, which are offline until they can be brought back safely). Rotating them makes things a lot better, but still not perfect.
And the relay identity keys might have been taken, but really that's more of a hassle than a security thing -- if we rotate them, existing clients will scream in their logs that they're being mitm'ed, and more importantly existing clients will refuse to proceed, even though the directory documents they fetch are signed with other keys.
It's not clear to me yet that this change is worth a flag day. Hopefully we can do it more smoothly.
Is there going to be a new OS Tor version coming out, or do I have to update OpenSSL? I have no idea how to do that. Any help??? Please
New releases of the Tor Browser Bundle which include the security upgrade are on the way.
If you use another version of Tor distributed with your operating system, you should ask the people who maintain that package for your operating system. You should also ask whoever maintains OpenSSL for your operating system.
Let me get this straight: an attacker can know which relays and clients are communicating with each other, and de-anonymize traffic?
OK, I'm not as tech-savvy as many on here. Am I understanding this correctly though?
The bug does not allow your computer to be directly compromised or identified? It can allow people to see what data is going backwards and forwards though. It can allow the first computer in Tor to see what other websites you have visited in this session and what cookies you have?
So if you follow Tor's advice and never enter personally-identifying information, you are still safe, and it is not like the last big problem then?
Thanks for any clarification; there must be lots like me who don't really understand the significance of this.
I would also really appreciate an answer to this. I am still pretty confused about whether or not the Tor Browser Bundle was compromised.
It can allow the first relay in your Tor circuit to see what other websites you have visited in this session, yes. That's because your Tor client might keep past destinations (e.g. websites you visited) in memory, and this bug allows the SSL server (in this case, the first relay in your Tor circuit) to basically paw through your Tor client's memory.
If you visit https websites using your browser, though, your Tor client will never have your web cookies in its memory, because they would be encrypted -- just as the exit relay can't see them because they're encrypted, so also your Tor client can't see them because they're encrypted:
https://svn.torproject.org/svn/projects/articles/circumvention-features…
https://www.eff.org/pages/tor-and-https
The Tor Browser, which is based on Firefox, is not affected. But your Tor client, which is the program called Tor that comes in the Tor Browser Bundle and that your Tor Browser proxies its traffic into, is affected.
I'm sorry this is complicated. If the above doesn't make sense, a little box on a blog comment isn't going to fix it for you. I recommend starting at
https://www.torproject.org/docs/documentation#UpToSpeed
Hidden services are encrypted end to end, right? Would they show in memory, exposing the user?
They could potentially show in memory yes. That's because one of the ends *is* your Tor client. And the other end is the hidden service itself. If the entry guard ran this attack in either case, it could potentially learn things.
It can also get passwords can't it? So we should all be visiting all of the places we have passwords for like Amazon, eBay and websites where we leave comments and checking to see if we need to change our passwords for any of them also?
Yes, maybe. But that's a question about those websites and their past security practices, and has nothing to do with Tor or the Tor Browser Bundle.
That is, the way it would have been a problem, if it is, is due to security at the webserver end, not the webbrowser end.
Would my privacy and anonymity be safe if I use tails to visit vulnerable clearnet/.onion sites and use vulnerable relays?
I was running a virtual machine and I used both the guest OS (Tails) and the host (Windows) to visit vulnerable sites. Could the attacker read and decipher the memory content of the virtual machine?
Yes. A virtual machine doesn't magically stop an attacker from reading memory using this bug.
As mentioned before: Tails was not exploitable because it is using the older OpenSSL that doesn't support the heartbeat feature. You got lucky there, pal. ;)
A client side question:
What exactly does it mean that "... Tor browser is not affected but your Tor client is affected ..."? I understand that when one operates the Tor Browser it connects to the Tor network through the Tor client. What kind of data could leak out from the Tor client?
You can think of the Tor client like your VPN provider. Or if that metaphor doesn't make sense, think of it like your network.
For example, when you're sitting at Starbucks just using the Internet directly, people next to you get to see your network traffic and learn things about what you do -- if you only go to https sites then they learn which sites you go to but not what you send since it's encrypted. And if you go to an http site then they learn not only where you went but also what you said.
The Tor client sees exactly this same information.
Or if that metaphor didn't make sense, but you know about the Tor exit relay issue: the Tor client gets to see everything that all your exit relays get to see.
How to check on Linux which OpenSSL version the TBB uses?
Open terminal and type "openssl version".
No, I think this is bad advice. TBB ships with its own libssl.
(The answer to the original question, if you're a normal user, is "you can read the TBB changelog to find out". If you're not a normal user, there are plenty of ways you can actually learn on your own, none of which fit into a little blog comment box.)
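(One such way, as a hedged sketch: scan the libssl that ships inside the bundle for its embedded version string. The default path below is only an assumption about where the bundle unpacked; point it at the real file on your system:)

```python
#!/usr/bin/env python
# Sketch: search a bundled libssl/libcrypto shared object for the embedded
# "OpenSSL x.y.z" version string. The default path is an assumption about
# where the bundle unpacked; pass the real path as an argument.
import re
import sys

path = sys.argv[1] if len(sys.argv) > 1 else "tor-browser_en-US/Tor/libssl.so.1.0.0"

with open(path, "rb") as f:
    data = f.read()

for version in sorted(set(re.findall(br"OpenSSL \d[0-9A-Za-z.\-]*", data))):
    print(version.decode("ascii", "replace"))
```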
I'm guessing those people using TBB 3.6 Beta-1 are affected but have nowhere to go for Pluggable Transports except the last stable TBB-PT, which has its own security problems, hence the update to a newer Firefox ESR.
Yes. Mike et al are working on building a newer TBB 3.6 beta currently.
Another out-of-bounds and pointer-arithmetic bug... shouldn't we just walk away from C and C++? These two languages make it simply too easy to make bad mistakes like these... I am quite sure that by just dropping them we could improve our security a lot. It is simply too hard for a human being to write correct programs in such languages.
Also... C was a good idea back in the 70s... but with the hardware of today it makes little sense to use it (except for OSes)... it doesn't even support multi-core processors natively (you need an external library to use threads, it is heavy and memory-hungry, and the language itself has no safe native way to do inter-process communication).
Most applications of today, Tor included, could be written in a modern language, without pointer arithmetic (which is useless) and with a garbage collector which frees the developer from having to remember to allocate or free memory. What about Google Go, for instance? It is fast, it is made to make it easy to write error-free software, it supports multi-core and multi-threaded programming natively in a lightweight way, it supports inter-process communication natively, it doesn't need pointer arithmetic and it has a garbage collector...
Really, it is time to evolve. Especially if we care about security.
You are jumping to conclusions, because in the background, behind your magic VM curtain, everything behaves like before, and there memory and addresses and "pointers" are the fabric every program runs on.
With VMs (which actually JIT-compile your bytecode into native code) you are just shifting your dependence onto layers of layers of lay... If your bottom layer has a vulnerability, your program might be untouched, but the outcome isn't different.
A VM is just a layer, and if one layer breaks your program is broken; replace "layer" with "sandbox" and the same holds.
What is really needed is a best-practice "book", with examples of vulnerable code and how to do things better.
You are wrong. I am not jumping at anything.
I have several decades of experience in programming, in more languages than I can remember, and I know perfectly well what I am saying.
And no, no "best practice book" or experience, no matter how long, can help you with this. That is the usual excuse that C and C++ fanboys throw at you whenever you tell them the pure and simple truth: the only thing that DOES help is a language that simply stops you from doing stupid things and that takes care of things such as memory management, which are better left to the machine itself.
Please also note that the people who work on complex projects such as openssl are usually very experienced programmers with a huge background in security and "best practices"... they try their best to avoid bugs... yet they DO make mistakes, as this (very serious) bug proves. Probably now you see my point. No matter how many books you read. No matter how much experience you have. You will make mistakes. More so if the language seems to be DESIGNED to work against the developer.
And yes, at the bottom of every VM there is low-level code. So what? I did not state that this solution would be perfect... I said that it would be BETTER.
In a few words... it is much harder to find and exploit bugs in a VM (which, in time, gets better and better) than to find and exploit bugs in the several thousands (or even millions) of lines of code in every piece of software ever made, because... well... you are human... and humans make mistakes. Humans forget things such as freeing a block of allocated memory, machines don't. Humans make math mistakes such as accessing an index outside an array of bytes, machines don't. Humans forget to free resources such as open files, machines don't. In simple words... machines are just better than humans at certain tasks... so we should just let them do those in our place. And unfortunately C and C++ fail at this.
We cannot completely avoid bugs, it is in our nature to make mistakes, but we can make the surface for a possible attack smaller. My proposal is exactly this: make the attack surface smaller by using better, newer languages that make it easy to write better software and harder to introduce disastrous bugs.
Well, I think that C and C++ are still the best languages out there for doing actual computation, or more generally for anything where speed is important...unless you want to break down and code in assembly language.
That being said, I agree that they should not be the medium for network-facing applications. Type-safe languages may be the only practical way to prevent buffer-overrun attacks.
"modern language" is just
"modern language" is just another layer for introducing bugs. really they are just about laziness not security.
It all makes sense now, poor silkroad & freedom hosting, fucking ***!
What makes sense now?
It's pretty clear that those were unrelated to this.
And they were likely unrelated to each other too.
See previous blog posts for details.
Well, to my understanding this bug has been there for one year or more.
If so... then the NSA could have found it and used it to deanonymize the whole Tor network...
This could explain how they found the Silk Road and Freedom Hosting servers.
Although I still believe that they used vulnerabilities on the servers themselves to gain root access and find out the real IP.
Could they? AFAIK they could have monitored only parts of it, because they don't have access to all the endpoints (am I correct?)
Roger, how much work would it be to make Tor use PolarSSL and GnuTLS?
I think it would be good if relay operators could run Tor on a mixture of operating systems and SSL libs.
I think if we're going to do that, and maintain them all, we should seriously consider switching to a link encryption that doesn't use the SSL protocol at all.
That said, it shouldn't be *too* hard technically. Check out src/common/crypto.[ch] and src/common/tortls.[ch].
The Tor expert bundle 0.2.4.21 now has OpenSSL 1.0.1g.
Does this fix the bug for the client?
Yes. See http://heartbleed.com
Isn't it kind of hilarious that human beings can both design cryptosystems that take multiple lifespans of the universe to break and yet also have them be undone by simple memory management bugs? When will the software development community adopt formal verification as a standard practice for critical programs? Save us BitC.
Um, for the record, in general, verifying the correctness of a computer program with another program is impossible... theoretically, you can't even tell whether the program will finish executing; this is in fact what motivated the Turing machine in the first place.
I should qualify this to point out that this works a little differently for quantum computers...that is why Lockheed bought the D-Wave machine they keep at USC, actually. But, that caveat notwithstanding, I think the answer to your question regarding adoption of code verification is, more or less, "not any time soon."
It does make sense.
I have updated to the newest TBB and when I go to https://www.howsmyssl.com/ It tells me that my browser is still vulnerable. That's due to the use of TLS 1.0 by Firefox 24.4. Does this affect Browsing through Tor in any shape or form?
FF is capable of using TLS 1.2 but it's not enabled in FF 24 or even 26 as far as I know. I can fix this by modifying security.tls.version.max to 3 and security.tls.version.min to 1 in about:config. Would this modification single me out among other Tor users in any shape or form?
Thanks.
"I can fix this by modifying security.tls.version.max to 3 and security.tls.version.min to 1 in about:config."
For the benefit of those who are not computer savvy, could you outline in greater detail on how to make those modifications to FF 24 and FF 26?
And what do Tor developers have to say about FF 24.4 using the very old version of TLS 1.0 instead of 1.2? Let us hear what they (through arma, probably) have to say.
See: https://trac.torproject.org/projects/tor/ticket/11253. It is unlikely that changing this pref has any effect in ESR 24.
You must also disable security.ssl3.rsa_fips_des_ede3_sha and enable security.enable_tls_session_tickets to get "probably okay" instead of "improvable" or "bad".
Source : https://blog.samwhited.com/2014/01/fixing-tls-in-firefox/
I have no idea if these changes make your client noticeable among the others.
I don't think so, but I don't know.
- type about:config in the URL bar, then click on "I'll be careful I promise"
- search for "security.tls.version.max", double-click on it and change the number from 1 to 3
- (optional) search for "security.tls.version.min", double-click on it and change from 0 to 1 or 2
- search for "security.ssl3.rsa_fips_des_ede3_sha" just double-click on it, it will set it to "false"
- search for "security.enable_tls_session_tickets", double-click on it, it will set it to "true".
(thanks to https://blog.dbrgn.ch/2014/1/8/improving_firefox_ssl_tls_security/ and https://blog.samwhited.com/2014/01/fixing-tls-in-firefox/)
You may have to re-start Tor browser to make sure the changes were effective.
You can now re-test https://www.howsmyssl.com/, it should now be on "probably okay".
Note: SamWhited says that these settings were disabled by default because "firefox is vulnerable to downgrade attacks"; I definitely don't know which one is better.
Note 2: you may want to check your browser fingerprint on https://panopticlick.eff.org/ before and after the changes. For me it was OK.
Please help me understand. Could the hacker see the entire memory of the victim or only the memory being used by the tor client?
Only the memory used by the Tor client. So not the Tor Browser memory, not the memory of other processes you might be running, etc.
Are you really, absolutely, positively sure of that?
Yes, technically it may be true that the memory which is used to send the (dummy) "heartbeat" data belongs to the Tor client process at the moment that data is being sent;
but if the Tor client had to request (allocate) the ~64 KB block of memory from the OS, that particular block of physical memory might happen to come from anywhere (depending on the OS and unpredictable conditions) and still hold data which formerly belonged to other processes, the system's or users'. Or does the OS zero out a block of memory it hands out in response to a process's request? Surely not your vanilla OS, including MS Windows!
Assuming Tor requests a 64 KB block for the stated uninitialised "heartbeat" operation, does it then release that block, or keep the same block for use in other heartbeat (or other) circumstances? If the block is allocated once and for all, it would mitigate the vulnerability, as an attacker could not dig in the client's memory again and again by repeating the attack. OTOH if the block is released and a new block allocated each time, then it's the worst possible scenario: depending on precisely how the OS kernel satisfies memory requests, a large part of the system's physical memory contents might be grabbed by a persistent attacker.
Opinions, please ?
--
Noino
I had a similar train of thought and I think this is the most important question on this page.
Many people are using Tor for online research and making notes while browsing. Their text may be stored encrypted on the hard disk but in memory it is clear text. In addition an auto save feature of the word processors or text editor may put multiple versions of the clear text in dynamically allocated memory locations.
Say such a person downloads a video while writing and during this download every few seconds OpenSSL happily sends 64KB chunks of memory into the internet.
I would like to see a table with the threat potential for memory disclosure for the prevalent Windows OSes from XP to 8.1, for administrator and user sessions, as well as for Linux, if necessary with a differentiation between 32-bit and 64-bit systems.
Every serious OS, including Windows, zeros the memory it hands out to programs. This is to prevent security issues like reading the memory of sensitive programs, including those that are a part of the OS themselves.
Right. If you find any that don't do this correctly, that in itself is a serious security flaw.
The problem is that OpenSSL has its own memory management; it does not use the memory management from the OS. It has been a known bug for I think 5 years that disabling the OpenSSL internal memory management when compiling results in a non-functional version of OpenSSL, because of all the memory-handling bugs that exist in the OpenSSL code. This is why the OpenBSD folks are forking the code base and attacking it with chain saws, in order to get it down to a code base that they can audit and fix to their satisfaction.
If being able to dump memory from a relay exposes users, wouldn't the admin that is running the relay be able to dump its memory (via, say, gdb) and expose the clients that are going through him?
Yes, a given relay operator can see whatever his relay can see. That's why Tor circuits are multiple hops, and no single relay gets to know both the client and also her destination.
https://www.torproject.org/about/overview
But if you can break into many relays, your odds go up of running across both the first hop in the user's path and also the last hop.
It would be pretty cool to have a design where the relay can't learn anything about the connections it's handling. But that solution would need to include somebody watching the traffic flows into / out of the relay, which has nothing to do with Tor process memory.
Makes complete sense. I don't see how that could deanonymize a user though? If I am a relay operator then, like you said, I can dump memory / use wireshark or whatever and see the data that is going through. The exploit does the same thing (dumps memory). The two are the same, so how can that be used as an attack vector?
Thanks for answering my questions.
Can tor use 3 processes through a pipe or the like?
client <--> [p3 <--> (p2 <--> (p1 <--> entry_guard) <--> inner) <--> exit] <--> inetsvr
Any leakage would be restricted to the corresponding process.
Btw, you are free to use distinct codecs/TLS versions/etc at each stage.
How many relays have upgraded to an acceptable SSL so far? When does the process reach 90% complete? What is the schedule for the directory authorities?
You'll do better following the answers to these questions on #tor-dev irc channel and on tor-dev mailing list. (Also, don't think of these things as number of relays, but rather percentage of capacity or consensus weights.)
So if I understand this correctly, for the past two years any malicious entry guard has been able to match up a user's real IP address (which it has) with a list of sites they have visited in TBB (which it obtains via heartbleed)?
If so, yikes! I wonder how many western agencies have been exploiting this little baby.
"Maybe, but possibly yes"
"Maybe, but possibly yes" and "good question, who knows" respectively.
Also, why limit your worry to western agencies? :/
For someone to connect my traffic with my IP, they have to be connected directly to me (so they are the relay I'm connected to), they have to know about this fact, and they have to know about the vulnerability.
Am I correct? If so, there is not a big chance that someone did this, and even if they did, only a small number of people are affected (at least not every Tor user). Am I wrong?
The countermeasures for this are obvious:
1: Always run Torbrowser from a newly-extracted, never-used directory or from a copy of that in a directory on a tmpfs in RAM.
2: When it really counts, do not log into anything or engage in any activity that would identify you. Boot, do your secure work, then shut down.
3: Any time security forces could be a danger, use Tor from public wifi hotspots, using that hotspot for nothing else. Use it at home only to avoid things like building up an unwanted Google search history.
This way, heartbleed and any similar attacks all fail. They get an empty history and the IP address of a public wifi hotspot, after working like hell to get it. Just like running a brute-force encryption cracking program for three months, only to find another encrypted tarball as the only contents...
If you're doing steps like this, you should look at using Tails.
Could duckduckgo.com be made to replace the google.com in the search space in the upper right corner of Firefox's browser?
Following on from this: if the user was using a VPN, then although the malicious entry guard would know the sites visited and whatever it could get out of memory, would the associated IP be the one of the VPN? Or is there a way of gaining the real IP through this bug?
If I'm using OpenSSL 1.0.0j (which is what is in Liberte) then I'm not affected by this bug correct?
I use Super VPN and I'm not having any issues on this matter. What is going on!!
Sorry for the dumb question, but reading news, this blog, the comments there is one thing I'm not sure of.
I know that there is no way to know whether someone actually exploited this vulnerability or not.
But could they listen to everyone, or is it just based on luck? So was it technically possible to monitor everyone, or just random members? Let's assume that in the past two years someone did actively exploit this vulnerability (let's assume the worst). Would that mean that everyone's traffic is affected, or just a few or a lot of people?
Is there a patch I can run to fix this problem? Will running "OpenSSL 1.0.1g" fix my computer? Will it ask me questions that I can't answer (as an intermediate computer user)?
I have Tor v0.2.3.25 (installed from expert bundle) running on Windows. I use it as client-only: no hidden service or relay. Is it affected by this vulnerability?
Yes.
This is not really about TOR, but please could someone knowledgeable help me as I can't find the answer via searches?
While logged in to Yahoo (when it was vulnerable) and logged in to eBay at the same time (which was not vulnerable), could the bug have revealed my eBay password and so I need to reset that as well as the Yahoo one?
"It depends." Not through
"It depends."
Not through the obvious version of the attack (since it's the server that's vulnerable, not your browser), but maybe through some non-obvious version of it.
Please keep up the good work. The TOR team is awesome. Thanks a lot.
Was the old version of Torchat (0.9.9.553) affected, or is the version of OpenSSL it uses too old?
You'd have to ask the Torchat people. Torchat has nothing to do with Tor and we haven't looked at it or evaluated it in any way. (In large part this is because they picked a confusing name for their program, so we spend energy teaching people that it's a confusing name rather than actually looking at it).
Torchat hasn't been updated in
Torchat hasn't been updated in ages, so you need to do this manually.
Upgrade Tor in TorChat
1. Close TorChat
2. Download the official Tor Browser Bundle from Tor Project
3. Extract Tor Browser Bundle to: c:\
4. Copy: C:\Tor Browser\Tor\tor.exe to c:\TorChat\bin\Tor\
5. Copy: C:\Tor Browser\Tor\libeay32.dll to c:\TorChat\bin\Tor\
6. Copy: C:\Tor Browser\Tor\libevent-2-0-5.dll to c:\TorChat\bin\Tor\
7. Copy: C:\Tor Browser\Tor\libssp-0.dll to c:\TorChat\bin\Tor\
8. Copy: C:\Tor Browser\Tor\ssleay32.dll to c:\TorChat\bin\Tor\
9. Copy: C:\Tor Browser\Tor\zlib1.dll to c:\TorChat\bin\Tor\
10. Start TorChat: c:\TorChat\bin\torchat.exe
Remember, TorChat runs a hidden service, so as mentioned in the post above you should update Tor and then switch IDs. (A scripted version of the copy steps above is sketched below.)
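For what it's worth, here is a small sketch that performs the same copies as steps 4-9; the paths are the ones from the list above, so adjust them if your install locations differ.

    import shutil
    from pathlib import Path

    # Source: a freshly extracted Tor Browser Bundle. Destination: TorChat's bundled Tor.
    SRC = Path(r"C:\Tor Browser\Tor")
    DST = Path(r"C:\TorChat\bin\Tor")

    FILES = [
        "tor.exe",
        "libeay32.dll",
        "libevent-2-0-5.dll",
        "libssp-0.dll",
        "ssleay32.dll",
        "zlib1.dll",
    ]

    for name in FILES:
        shutil.copy2(SRC / name, DST / name)  # overwrite TorChat's old copies
        print("copied", name)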
Wow
Wow
Since any active security
Since any active security agency had plenty of time to map all Tor users' IP addresses and more, what is the best practice for becoming anonymous from now on? They know all their targets and their signatures as far as how they use the internet. Does one need to restart with a new IP address, new persona, new hardware (computer, etc.), new software, new firmware, new VPN, new guards (relays), and essentially get rid of all things that could connect one to the old persona?
TOR is awesome... I had to
TOR is awesome...
I only wish I had known about it a little earlier.
Hi! I'm also using
Hi!
I'm also using 0.2.3.25. What do I have to do/change/check, please?
Regards,
Me.
Stop using the outdated
Stop using the outdated version of Tor and switch to the latest version.
(I bet there are a lot of other things wrong with your setup too, if that version is a part of it.)
After one changes the keys
After one changes the keys on a relay, Tor Weather jumps in with an announcement. Something should be done about that -- a rekeying API or something.
I'm just happy Tor weather
I'm just happy Tor Weather is still running at all. We've had nobody to maintain it or fix bugs or anything in it for years. Perhaps somebody wants to volunteer to help? See the tor-dev threads about it.
https://www.cloudflarechallen
https://www.cloudflarechallenge.com/heartbleed
Private keys have been obtained, but it took over 100k requests and this was outside of Tor. How long on the fastest Tor connection would 100k requests take to complete?
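Nobody seems to have published timings, but a back-of-the-envelope estimate is easy. The round-trip time and degree of pipelining below are assumptions, not measurements.

    # Rough estimate only: how long 100k heartbeat round trips might take over Tor.
    REQUESTS = 100_000
    RTT_SECONDS = 0.5   # assumed round-trip time through a circuit (often worse)
    IN_FLIGHT = 10      # assumed number of probes kept in flight / parallel circuits

    serial_hours = REQUESTS * RTT_SECONDS / 3600.0
    pipelined_hours = serial_hours / IN_FLIGHT

    print("fully serial:    ~%.1f hours" % serial_hours)     # ~13.9 hours
    print("with pipelining: ~%.1f hours" % pipelined_hours)  # ~1.4 hours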
Has anyone timed this yet?
Has anyone timed this yet? On average, how long does it take to get the key of a HS? How many times out of how many requests were you able to get it?
As far as I know, there are
As far as I know, there are zero cases where anybody has successfully extracted a hidden service private key from a Tor client. Or for that matter a relay identity key from a relay.
That doesn't mean you can't do it. But it means we're not near to answering your "how long, how often" questions.
Could there be a future
Could there be a future torrc option to restrict OpenSSL heartbeat to once every few minutes or shut it off altogether?
Somebody should indeed go
Somebody should indeed go through openssl and figure out all of its 'features' like this one. So far as I can tell, Tor doesn't need this heartbeat thing -- the Tor protocol has its own heartbeats built in.
The other question for each one will be whether an external observer can use any of the features we take out to distinguish us from 'real' SSL handshakes -- that's a major way that governments like Iran have been blocking Tor via DPI over the years.
I'm guessing at this point that focusing on just the heartbeat feature is like closing the barn door after the horses are extinct. But there are bound to be more issues remaining in other parts of openssl.
Tor Browser's anonymity
Tor Browser's anonymity rests on OpenSSL and NSS.
I think a proactive code review of NSS would be well advised.
https://developer.mozilla.org/en-US/docs/NSS_Sources_Building_Testing
You should give out a bounty if someone reports a deanonymizing bug in one of those libraries -- I'm thinking around $1000-2000. That would be a nice reward, without the need for shady dealings to sell such a bug on the black market.
I think this is the only realistic approach to get enough people to actually look through the code.
We talked a while ago about
We talked a while ago about doing bug bounties. Note that Mozilla itself does bounties for "security" problems, though you're right that our definition of security problem differs from theirs.
In the end we decided that we already know about plenty of important bugs that need fixing (see trac.torproject.org), and our Tor Browser money is better spent fixing as many of the known issues as we can than finding yet more issues but not fixing them.
That said, if anybody knows somebody who wants to fund Tor Browser bug bounties, we'd love to reconsider this plan.
In retrospect did the Tor
In retrospect, did the Tor client with an unpatched OpenSSL send the heartbeat over TCP only once per server session, or could it have been more often? Multiple heartbeats could pave the way for reading larger memory areas on the client side.
I understand from http://tools.ietf.org/html/rfc6520 that multiple heartbeats are only necessary over UDP.
This heartbeat implementation is so silly that one doesn't have to change anything to turn it into a joke: http://xkcd.com/1354/
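For anyone who hasn't read the RFC: the whole bug comes down to trusting the sender's declared payload length. The sketch below only builds the message layout from the RFC 6520 definitions to show how a tiny payload can claim to be huge; it does not talk to any server.

    import struct

    # RFC 6520 HeartbeatMessage: type (1 byte), payload_length (2 bytes),
    # then the payload and at least 16 bytes of random padding.
    HEARTBEAT_REQUEST = 0x01

    payload = b"bird"        # 4 bytes actually sent
    claimed_length = 0x4000  # 16384 bytes claimed

    # A correct implementation must discard a message whose claimed payload_length
    # does not fit in the record that carried it; the buggy OpenSSL instead echoed
    # back claimed_length bytes, pulling the rest from its own memory.
    message = struct.pack("!BH", HEARTBEAT_REQUEST, claimed_length) + payload
    print(message.hex())     # 014000 followed by 62697264 ("bird")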
Is anyone on the project
Is anyone on the project doing practical tests to see how effective the attacks would be? If malicious entry guards are able to see sites a user had visited, possibly for 2 years, that is quite worrying.
If it turned out to be quite hard in practice (like extracting private keys from web servers), it might be a bit more reassuring for Tor users.
I encourage people to work
I encourage people to work on this one.
I bet it will be quite tricky but not impossible.
hi i'm still confused (after
hi i'm still confused (after reading all these posts) about exactly how i go about sorting this problem out. i use the tor bundle 3.5.4 that i updated a couple of days ago; i have no idea how to "update my ssl package" and don't understand if that applies to me as i use the bundle. also, what's this about a tool to check to see if my ssl is compromised?
Tool at: http://rehmann.co/projects/heartbeat/
is this a good idea?
basically is there anything i personally can do to protect myself, and should i still use tor?
thanks
ps i recon the dude that posted that he looks at "very bad" websites is into kiddy porn and i hope he's sweating waiting for the feds to to make a "hard entry" on his front door and take him to live in the big house with "bad bubba and the shower sisters"
If you're just using Tor as
If you're just using Tor as a client, and only using TBB, then moving to TBB 3.5.4 should be all you need to do for Tor.
(I say "for Tor" because if you logged into some website using https over the past few years, it's possible that the website was vulnerable, completely separate from what browser you used to reach it -- people could attack the website to extract whatever personal information you might have given it.)
>ps i recon the dude that
>ps i recon the dude that posted that he looks at "very bad" websites is into kiddy porn
OK, that's disgusting. Not him, you. Let me guess, you're from America, Canada, or the UK, right? Either way, most of the world does not jump to the conclusion that "I look at bad websites" = "I look at kiddie porn". It's people with your views who try to get Tor banned or censored, because they assume the only reason people use Tor is for "bad things". Please, don't make completely and utterly unfounded assumptions, to the point where you actually wish great suffering upon a person. Honestly, I find what you're doing more disturbing than the slim chance that his version of "bad sites" is exactly the same as your view.
I'm not trying to be rude, but I'm really quite tired of this. Quite often I'm on various chats, or forums, and I mention I like anonymity and privacy, and the first thing people assume is drugs, kiddie porn, or terrorism, and refuse to help me, or just as you do, wish for pain and suffering.
How's this. I go on "bad sites". Do you hope I suffer now? Do you hope I'm terrified of being locked up for decades and raped? Well too bad for you, because the sites I go on that are o-so bad are websites about atheism.
Once we're all done with our moral panics, can we please show some compassion for others who are lumped into one category just because we live in a place where we might like things big brother doesn't approve of?
btw how about returning to
btw how about returning to rotating entry guards? the longer you're connected to a guard, the more leakage it can collect. new tor development locks you to a single entry guard -- is that a coincidence?
No, this doesn't make much
No, this doesn't make much sense to me. I think a single guard can do a lot of damage to you, and if you have three of them, then any of the three can.
never ever use shared
never ever use shared libraries! if your application was built in the openssl 1.0.0 era and you have updated your system to the "newest" 1.0.1...
https://www.ssllabs.com/sslte
https://www.ssllabs.com/ssltest/analyze.html?d=torproject.org&s=38.229…
https://www.ssllabs.com/ssltest/analyze.html?d=torproject.org&s=86.59.3…
https://www.ssllabs.com/ssltest/analyze.html?d=torproject.org&s=93.95.2…
https://www.ssllabs.com/ssltest/analyze.html?d=torproject.org&s=82.195…
https://www.ssllabs.com/ssltest/analyze.html?d=torproject.org&s=38.229…
"The server does not support Forward Secrecy with the reference browsers. Grade reduced to A-."
I think the answer is "take
I think the answer is "take it up with Debian".
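If you want to check what your own client negotiates with a given server, here is a small sketch using Python's ssl module. It only reports the one cipher suite your client ends up with, not the server's full configuration the way SSL Labs does; the hostname is just an example.

    import socket
    import ssl

    HOST = "www.torproject.org"  # example target

    context = ssl.create_default_context()
    with socket.create_connection((HOST, 443)) as sock:
        with context.wrap_socket(sock, server_hostname=HOST) as tls:
            name, protocol, bits = tls.cipher()
            # ECDHE/DHE suites (and all TLS 1.3 suites) use ephemeral keys,
            # i.e. they provide forward secrecy.
            fs = name.startswith(("ECDHE", "DHE", "TLS_"))
            print(name, protocol, bits, "-> forward secrecy:", fs)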
i think it's great the way
i think it's great the way the Tor project actively responds to a lot of user inquiries. The EFF linked to this thread and i just wanted to say, you guys sincerely care about your work and it's very admirable.
anyways, is it possible to incorporate PFS into Tor nodes?
I think Tor nodes already
I think Tor nodes already have PFS.
Right. I wonder what the
Right. I wonder what the question is actually about?
Tor uses PFS in its link encryption, and also the circuit handshake uses PFS.
Plus relays rotate their (medium-term) onion keys weekly:
https://www.torproject.org/docs/faq#KeyManagement
Does using remote (shared) Tor
Does using remote (shared) Tor through the SOCKS protocol seem preferable? No personal data leakage would be possible, because the process doesn't have access to it.
No, because that remote Tor
No, because that remote Tor client still knows everything that a local Tor client would, and it would still be vulnerable to this same sort of attack (if you haven't upgraded).
Plus, if you use a remote Tor client you add yet another point in the network that gets to know both you and everything you do.
And if that's not enough, you're still going to be running whatever application (e.g. browser) on your own computer, so if it has problems then you didn't deal with that.
Bad idea IMO.
If you are using Tor, STOP
If you are using Tor, STOP NOW.
I suggest EVERYONE GO BACK TO 2.3.5.
Client and server. Clients, enable NoScript!
2.3.5 is back from 2011 and is tried and true, as far as I know.
Any hidden sites using compromised versions of Tor/SSL are unsafe and can never be considered safe again, unless the owner can prove through use of an old PGP key/.bit address that they own the new .onion site. All .onion sites using newer versions of Tor ARE POTENTIALLY COMPROMISED.
Also, any clients who have used a version of Tor within the last 2-3 years should be considering all of their keyrings potentially compromised by the entry guard (first relay) and should be completely re-encrypting their systems and generating new PGP keys, etc, as the first relay could have been reading our RAM through HeartBleed.
* IF YOU CANNOT SAFELY THREATEN TO KILL POLITICIANS OR DOWNLOAD/POST CP, YOU ARE NOT ANONYMOUS. *
I knew that it was sketchy when Tor Project was telling everyone to update their browser bundles after the Firefox javascript exploit that was revealing IP addresses of pedos.
THIS REQUIRES FURTHER RESEARCH AND THE TOR PROJECT IS COMPLETELY INCAPABLE OF DOING IT, AS IT HAS BEEN OVERRUN BY NSA SHILLS. We need to go back and fork the project!
Be careful listening to this
Be careful listening to this person's advice.
For example, the 2.x TBBs have old obsolete insecure versions of Firefox in them, so that part is clearly bad advice.
Hidden services that used insecure versions of openssl are indeed unsafe and shouldn't be used again -- I agree. But this notion of proving something via PGP? Not enough details. And why blame the newer versions of Tor? Haven't you looked at the code? Or at least the changelog that shows all the bugs we fixed since the version you prefer to run?
Ha, and then we get to the end of your comment. I guess I'll let people judge that one for themselves.
Is it possible to check
Is it possible to check which SSL-version is installed by my TBB? (I'm only using the browser/client-only)
The version is 0.2.3.25(?). Do I have to look in Vidalia or in the browser options?
Wow. You are using an
Wow. You are using an obsolete insecure version of Tor Browser from years ago.
That browser you have probably has security holes by now. I recommend against running it.
I recommend you implement a
I recommend you implement a way to make Tor nodes refuse connections from older versions -- anything that isn't the latest TBB. Too many people have no idea what dangers they're putting themselves in when using a not-up-to-date TBB.
Part of the trouble is that
Part of the trouble is that Tor is an anonymity system, and our protocol is open and there are multiple implementations.
So there are no good ways of having the relays reach in and figure out what version the Tor client is.
I guess we could have the Tor client volunteer its version info, and then the relays can hang up if they don't like it. But if we're to do that, why not have the clients just opt to fail if they're out of date?
That sure would make some users upset, e.g. the ones who put Tor on their USB stick, go to the censored/surveilled place with really crappy Internet, and then can't use it because of the update that came out the day before -- even if all they planned to use it for was to fetch the newer version.
So in sum, "it's complicated; somebody should come up with a clear plan and then we can see if it would actually work. Most versions of such plans don't seem good."
PersonalWeb, the True Names
PersonalWeb, the True Names patent troll, claimed they could hack SSL to insert advertising with MITM attacks for ISP customers. Does anybody know if they were using this hack?
Seems extremely unlikely.
Seems extremely unlikely. Best guess is that they were bluffing. Second best guess is that they were referring to all the problems in the CA model.
But hey, who knows.
No question, but I just
No question, but I just wanted to thank you very much for answering so many questions on here, it must be hard with all the other work you are doing. I wish I knew as much as you do.
Thanks! Please learn more
Thanks!
Please learn more and then help answer questions. :)
Or something more concrete that you can do: take the best questions you find here, and make entries on https://tor.stackexchange.com/ and put the answers in, so they will be useful to other people in the future.
well, can Tor use PFS against
well, can Tor use PFS against the heartbeat bug?
It depends what you're
It depends what you're asking, but I think "no" is the likely answer.
so the bug is fixed?
so the bug is fixed?
Yes, the bug is fixed. But
Yes, the bug is fixed. But some of the fallout from the bug is still ongoing. For example, we'll be putting out a Tor update in the next while that blacklists the old (no longer in use) directory signing keys from the directory authorities -- not because we know they were compromised, but because we don't know they weren't. For another example, we cut 1000 relays out of the network because they were still vulnerable to the bug. And there are another 500-1000 that have upgraded (so the bug is fixed for them) but maybe their long-term identity key was extracted from them before they upgraded, and we'll be cutting those out of the network at some point.
It'll be a while yet until we can call it all resolved. I recommend following along on the tor-relays list and in other places, rather than these blog comments.
OpenSSL is forked:
OpenSSL is forked: http://www.theregister.co.uk/2014/04/22/openssl_fork_libressl/
It's going to be a good
It's going to be a good while before that fork is mature...