The Latest in IT Security

OpenSSL ‘Heartbleed’ Bug Leaks Sensitive Data


Website owners should move quickly to patch a critical vulnerability in the OpenSSL cryptographic software library.

The flaw, which was disclosed Monday, can be exploited to compromise the secret keys used to identify service providers and to encrypt traffic, usernames, passwords and content. Dubbed ‘Heartbleed’ because the bug lies in OpenSSL's implementation of the TLS/DTLS heartbeat extension (RFC 6520), the vulnerability was introduced in December 2011 and has been out in the wild since the release of OpenSSL 1.0.1 on March 14, 2012. OpenSSL versions 1.0.1 through 1.0.1f are vulnerable; the latest version, 1.0.1g, released Monday, is not.

“Looking only at web servers, it seems that OpenSSL 0.9.8 and 1.0.0 are still the most popular versions, which are not affected,” said Mark Schloesser, security researcher for Rapid7. “However, we count at least a few hundred thousand servers using affected library versions, so it poses a significant threat. As the same problem affects other protocols/services such as mail servers and databases, we assume that, overall we’re looking at millions of vulnerable systems connected to the public Internet.”

According to an advisory from the OpenSSL Project, the issue comes down to a missing bounds check in the handling of the TLS heartbeat extension, which can reveal up to 64K of memory to a connected client or server. However, the researchers who discovered the bug added that there is technically no 64K limit to the attack, as that limit applies only to a single heartbeat. According to researchers at security vendor Codenomicon, who discovered the bug along with Neel Mehta of Google, an attacker can either keep reconnecting or keep requesting arbitrary numbers of 64-kilobyte chunks of memory during an active TLS connection until enough secrets are revealed.

“We have tested some of our own services from [an] attacker’s perspective,” Codenomicon noted in an FAQ on the findings. “We attacked ourselves from outside, without leaving a trace. Without using any privileged information or credentials we were able [to] steal from ourselves the secret keys used for our X.509 certificates, user names and passwords, instant messages, emails and business critical documents and communication.”

TLS client certificate authentication does not mitigate the issue, nor does OpenSSL’s FIPS mode, according to Codenomicon. However, using Perfect Forward Secrecy (PFS) should protect past communications from retrospective decryption.
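Forward secrecy comes from key-exchange suites (such as ephemeral Diffie-Hellman) that never encrypt session keys under the server's long-term private key, so a stolen key cannot decrypt recorded past traffic. As one illustrative example, not drawn from the article, an Apache mod_ssl configuration fragment preferring forward-secret ECDHE suites might look like:

```apache
# Illustrative fragment only: prefer forward-secret (ECDHE) cipher
# suites so a later key compromise cannot decrypt recorded sessions.
SSLHonorCipherOrder on
SSLCipherSuite ECDHE-RSA-AES128-GCM-SHA256:ECDHE-RSA-AES256-SHA384:!aNULL:!MD5
```

Note that PFS limits the damage of a leaked key going forward; it does not remove the need to patch, revoke and reissue certificates.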

A proof-of-concept exploit for the vulnerability has already made its way online. According to Fox-IT, Yahoo is among the sites vulnerable to attack.

“It is possible to detect successful exploitation of this vulnerability by inspecting the network traffic,” blogged Joost Bijl of Fox-IT. “We have developed Snort signatures to detect successful exploitation of the ‘heartbleed bug’.”

“This bug,” he added, “affects both sides of the connection. Not only will client certificates not save you from having to update your server certificate, they can be read from the client (along with your username, password etc.) by any server you connect to. DNS poisoning, MitM etc. can be used to direct clients to a malicious server – it seems that this vulnerability can be exploited before the server has to authenticate itself.”


Brian Prince is a Contributing Writer for SecurityWeek.
