
Michael Gracie

A Public Key Infrastructure for Extensible Interoperability

An explanation for that title is in order. It is the culmination of what I call my “dark side”. Literally.

Eighteen months ago a colleague and I embarked on a brain-busting adventure: figure out a way to encrypt anything (or everything) on the web without installing any software – full-on security for the cloud. A few months later CryptML, an entirely new markup language whose sole purpose is hard encryption, was born. Since that time we’ve been developing a representative sample implementation, debugging more code than any human being should ever have to, and building an entire platform around what started as satisfaction of intellectual curiosity.

Could it be done? You betcha! We’re now working straight out of the National Security Agency’s playbook, using a collection of algorithmic components called Suite B. And we’re doing so without violating any patents.

While a few folks out there pass by these pages looking for help with mcrypt and Wireshark, most come by to hear about fly fishing. I’d like to keep it that way, so I won’t bore you with the extraneous details. Nevertheless, a paper I wrote recently outlining some of the history of modern encryption, and why I feel it is in the best interests of both the general public AND national security to adopt technology such as ours, was picked up by Military Information Technology Magazine and published in their April edition. Needless to say we’re pretty happy about that, and we greatly appreciate MITM’s consideration of what we’ve accomplished.

The introduction from that paper follows, and for those really interested a link to the entirety over at the MITM website has been included below too.

The existing public key infrastructure was developed in the late 70’s and early 80’s as part of research coming out of academia. The systems and methods were quickly perceived as a revolutionary way to satisfy the secure data exchange needs of the scientific community, and later the federal government. Since that seminal period, advances in microcomputer technology have pushed communication channels, protocols, and hardware to the point where the convergence of voice, video and data are the norm. Secure communication using encryption, however, is still based on standards developed in decades past, with advancement centering primarily on new algorithms to replace those for which mathematical weaknesses are found.

Unfortunately, it is the seemingly never-ending advancement of computing horsepower combined with its ever-falling prices that are enabling the discovery of the “flaws”. Unless computational speeds unexpectedly plateau, the costly cycle of adopting new platforms and devices to replace those built on ever weakening encryption schemes will continue unabated.

What is needed is a completely new paradigm for the process of encoding, exchanging, decoding, and validating data – one that can completely replace that built for the closed-loop, point-to-point communications existing before internet protocol (IP) use became pervasive.

The rest of the piece can be found here: http://www.military-information-technology.com/mit-archives/241-mit-2010-volume-14-issue-3-april/2793-public-key-infrastructure-for-interoperability.html – Public Key Infrastructure for Interoperability. If the link doesn’t work, just pick up a hard copy – available in the lobby of the Pentagon.

Special thanks go out to editorial staff at MITM, as well as other individuals who helped make this happen. If you are in the defense, healthcare information systems, or financial services fields and are interested in seeing our representative application, a hard-encrypted messaging tool, you can contact us here for an invitation.

UPDATE 10/12/16: The link above is no longer in service, so the original article has been reproduced in its entirety below.

——————————-

This article originally appeared in Military Information Technology 14.3.

Introduction

The existing public key infrastructure was developed in the late 70’s and early 80’s as part of research coming out of academia. The systems and methods were quickly perceived as a revolutionary way to satisfy the secure data exchange needs of the scientific community, and later the federal government. Since that seminal period, advances in microcomputer technology have pushed communication channels, protocols, and hardware to the point where the convergence of voice, video and data are the norm. Secure communication using encryption, however, is still based on standards developed in decades past, with advancement centering primarily on new algorithms to replace those for which mathematical weaknesses are found.

Unfortunately, it is the seemingly never-ending advancement of computing horsepower combined with its ever-falling prices that are enabling the discovery of the “flaws”. Unless computational speeds unexpectedly plateau, the costly cycle of adopting new platforms and devices to replace those built on ever weakening encryption schemes will continue unabated.

What is needed is a completely new paradigm for the process of encoding, exchanging, decoding, and validating data – one that can completely replace that built for the closed-loop, point-to-point communications existing before internet protocol (IP) use became pervasive.

A brief explanation of key-based encryption

Key-based encryption generally falls into two categories – symmetric and asymmetric – and both are actually fairly simple concepts when the mathematics is removed from the description. Under symmetric encryption, there exists a single key that both “locks” and “unlocks” the data – both sender and receiver share that key before the data is transferred. In asymmetric encryption, two keys are required: one is used to encrypt (lock) the data, and is readily shareable by all participants, while another key is used to decode (unlock) the data.

With symmetric encryption, if sender and receiver want to keep communications between themselves alone, the connection must remain secure, particularly during the key sharing process. For this reason, symmetric encryption is most suitable for point-to-point networks such as those historically deployed in the military theatre. Asymmetric encryption, however, doesn’t require this pre-determined, ongoing trust. A party wishing to receive, say, simple text messages simply generates a publicly available key for encrypting those incoming messages, and can do so far in advance of the discrete communication. That public key is then distributed to everyone the potential data recipient needs to receive communications from, and the recipient is ultimately responsible for keeping it up to date. The secret, or private, key is the only key that can decode any inbound communications, and is generally kept secure by that recipient. Asymmetric encryption also allows other users to validate those self-generated keys, as well as determine when they are legitimately changed.
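Both models can be sketched in a few lines of code. The following is a conceptual Python illustration with deliberately toy-sized numbers (a repeating-key XOR and textbook RSA with two-digit primes) – nothing here is remotely secure, it only demonstrates the single-key versus two-key distinction described above:

```python
# Toy illustration of symmetric vs. asymmetric encryption.
# INSECURE by design -- concept demo only.

def xor_cipher(data: bytes, key: bytes) -> bytes:
    """Symmetric: the same key both 'locks' and 'unlocks' the data."""
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

# --- symmetric round trip: one shared key, exchanged in advance ---
shared_key = b"secret"
ciphertext = xor_cipher(b"attack at dawn", shared_key)
plaintext = xor_cipher(ciphertext, shared_key)   # same key decrypts

# --- asymmetric round trip: textbook RSA with toy primes ---
p, q = 61, 53
n = p * q     # modulus, part of both keys
e = 17        # public exponent: anyone may encrypt with (n, e)
d = 2753      # private exponent: only the recipient can decrypt
# e * d == 1 (mod (p-1)*(q-1)), so decryption inverts encryption

message = 42
encrypted = pow(message, e, n)    # lock with the public key
decrypted = pow(encrypted, d, n)  # unlock with the private key
```

Note how the public pair (n, e) can be handed to anyone, while d never leaves the recipient – exactly the distribution model outlined above.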

Origin of today’s standards

In 1976 Whitfield Diffie and Martin Hellman published a paper entitled New Directions in Cryptography that specified an innovative method for key exchange. The process allowed communicators with no prior knowledge of each other to establish a shared key without the need for a secure network. While a number of cryptosystems have been developed since, Diffie-Hellman has become the de-facto standard for key exchange, and part of the foundation of the public-key encryption in wide use today.
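The exchange itself fits in a handful of lines. Here is a Python sketch with toy-sized numbers (real deployments use primes of 2048 bits or more, or the elliptic-curve variant discussed below):

```python
# Minimal Diffie-Hellman key exchange -- toy parameters, concept demo only.
p = 23   # public prime modulus
g = 5    # public generator

a = 6               # Alice's private value, never transmitted
A = pow(g, a, p)    # Alice's public value, sent in the clear

b = 15              # Bob's private value, never transmitted
B = pow(g, b, p)    # Bob's public value, sent in the clear

# Each side combines its own secret with the other's public value;
# both arrive at the same shared key without it ever crossing the wire.
alice_key = pow(B, a, p)
bob_key = pow(A, b, p)
```

An eavesdropper sees p, g, A and B, but recovering the shared key from those requires solving the discrete logarithm problem – the hard problem the whole scheme rests on.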

The National Security Agency adopted an improved derivative of Diffie-Hellman and married it with algorithms used for other parts of encrypted communications, forming a set of standards called Suite B. The openly published Suite B includes specifications for encrypting data, exchanging keys, signing messages, and validating received data.

Suite B’s base elements and the Federal Information Processing Standards they are derived from are as follows:

– Encryption: Advanced Encryption Standard (AES) – FIPS 197 (with key sizes of 128 and 256 bits)
– Key Exchange: Elliptic Curve Diffie-Hellman – Draft NIST Special Publication 800-56 (using the curves with 256 and 384-bit prime moduli)
– Digital Signatures: Elliptic Curve Digital Signature Algorithm – FIPS 186-2 (using the curves with 256 and 384-bit prime moduli)
– Hashing: Secure Hash Algorithm – FIPS 180-2 (using SHA-256 and SHA-384)
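Of the components listed above, the hash functions are the easiest to exercise directly – both Suite B variants ship in the Python standard library:

```python
# The two Suite B hash functions, via Python's stdlib.
import hashlib

data = b"message to validate"

sha256_digest = hashlib.sha256(data).hexdigest()   # 256-bit digest
sha384_digest = hashlib.sha384(data).hexdigest()   # 384-bit digest
```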

Announced in early 2005, Suite B complies with policy guidance set out by The Committee on National Security Systems, and can be used for encrypting information up to a level of Top Secret when the larger bit-sized keys are used.

The Suite B specification has been submitted to the Internet Engineering Task Force (IETF) for inclusion in Internet Protocol Security (IPSec), a budding framework for securing data at the packet layer. IPSec is designed to encrypt all traffic crossing the internet, either by securing the individual packet being sent (transport mode) or by securing the actual route the data is sent over (tunnel mode, or virtual private networking). The latter methodology is in widespread commercial and governmental use today, but it suffers from many of the same limitations the underlying encryption infrastructure does.

Inherent weaknesses of the existing model

The existing model has aged. It was created while the personal computer was still in its infancy, and when widespread access to networks did not exist. Development burgeoned when the environment was simple, and the user base sophisticated. The surrounding conditions are now infinitely more complex, and the average user less so.

Encryption technology was developed before 3MHz desktop CPUs were commonplace – 3GHz is now a norm – and internet connections, if available at all, were at best dialup speeds. Hence, the approach to implementing it meant minimizing the processing requirements as well as minimizing the amount of data that had to be transmitted. The tradeoff was that software was inflexible, provided minimal features, and was difficult to update. What might have seemed like fair compensation for performance then is insignificant, maybe even a nuisance, today. Concerns about encryption overhead should now be relegated to only the most demanding applications, particularly across networks whose capacity has expanded a billion-fold or more since the 1970s.

Variants of Suite B are already used in a variety of hardware and software applications, but most are fixed with respect to the platform. In other words, change is hard to come by. If a computer scientist (a.k.a. hacker) is able to find a weakness in just one portion of the mathematics that are part of Suite B, entire systems must be changed for future security to be ensured. For example, the MD5 hashing algorithm was “cracked” in 2004, yet many systems, including the Secure Sockets Layer (SSL) certificate exchange that is the foundation of internet commerce, are still using MD5 because of the widespread switching costs that would result.
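The switching-cost problem shrinks dramatically when the algorithm is a parameter rather than a hard-coded choice. Python's hashlib happens to support exactly this style, which makes for a convenient sketch of the idea:

```python
# Algorithm agility: the hash is one configuration value,
# not something baked into every call site.
import hashlib

def digest(data: bytes, algorithm: str = "md5") -> str:
    # swap the default when an algorithm is broken -- callers don't change
    return hashlib.new(algorithm, data).hexdigest()

legacy = digest(b"certificate data")            # MD5: broken since 2004
modern = digest(b"certificate data", "sha256")  # replacement, same interface
```

Contrast that with a system where "MD5" is welded into firmware: the former is a one-line policy change, the latter a hardware recall.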

Existing key-based encryption works within an infrastructure that is relatively transparent to the sophisticated user. For those who are not extensively trained, however, even installing and properly configuring a desktop email client encryption plug-in can be an impossible task. Users must learn how to generate random key data, must know how and/or where to obtain an encryption key pair, must know how to generate a revocation certificate to change those keys, and must quickly comprehend complex web-of-trust issues before they can safely communicate in a secure environment. Further, various commercial software products contain components of encryption functionality, but such features are often de-emphasized both in the product and in the documentation. Stand-alone encryption software is much the same, and usually requires support in the form of a full-fledged systems administrator to operate.

Network in transition

While development of the public-key infrastructure remained relatively stagnant, the growth of the internet as public medium pushed forth – and was pushed forth – by new protocols and languages. New communication standards were formed for the exchange of different data types, including:

– iLBC and G.711 for voice
– MPEG-4 and H.264 for video
– HTML and XML for text and data

These standards centered on usability and interoperability. Hardware and software manufacturers adopted them because the general public, the consumer market critical to their economic growth, could utilize them easily, and the protocols could be delivered with chosen measures of opacity. Newer, more sophisticated applications were developed on and around them.

The first email was sent in 1971 – less than a decade later public-key encryption entered its present stage. Meanwhile, in 1989 Tim Berners-Lee gave us the World Wide Web, and a few years later the browser was born. Core processes for delivering secure data may be relatively unchanged, but we now have streaming video on cellular phones to deal with. As data processing moves ever closer to the fully-distributed cloud computing model, leveraging the combination of open standards and tools built first for usability makes perfect sense both technologically and economically. Encryption schemes need to follow the same path, that of interoperability and extensibility.

Re-engineer the entire infrastructure

Adopting a new infrastructure for secure data exchange works for one simple reason: the rest of the network has already moved far past what exists today. Governmental bodies have already begun implementing Commercial Off-The-Shelf (COTS) hardware into newly deployed systems. In most cases this hardware is already fully capable of interfacing with web services.

Embracing the base technology behind the commercial internet provides the ultimate in interoperability. Browsers, for example, are virtually omnipresent – and they can run on almost any hardware. In fact, much of the COTS network equipment utilized by the US military already contains browser software for configuration and management. By extension all devices deployed in the theatre, whether it is baseband hardware, reachback equipment, or remote connections via handheld device, could be exchanging data via the same web services, and with little or no additional customization.

The World Wide Web was designed for constant change, hence the tools that interpret the data must be able to constantly adapt. Unlike the present encryption infrastructure, a new model is already available that can not only manage key exchange from point to point or amongst multiple users in disparate locations, but can also obtain keys from a variety of sources (including those under physical control) and switch those sources, in orderly or arbitrary fashion, as security procedures require. Further, applying web technologies to encryption services allows the user to update those services to keep up with changing mathematics. If an algorithm presently in use is deemed insecure, it can be replaced with another one immediately. Should the system user determine that interchanging algorithms in the middle of a conversation is necessary to comply with the classification of content being exchanged, it can be done on-the-quick-halt instead of after a new hardware requisition.
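The "replace it immediately" idea can be sketched as a registry of implementations keyed by name, with the active choice held as plain data that may change between messages. The names and structure below are hypothetical, and hash functions stand in for full cipher suites purely to keep the Python example self-contained:

```python
# Sketch: algorithm selection as runtime data, swappable mid-conversation.
import hashlib

REGISTRY = {
    "sha256": lambda d: hashlib.sha256(d).hexdigest(),
    "sha384": lambda d: hashlib.sha384(d).hexdigest(),
}
active = "sha256"

def seal(message: bytes) -> tuple:
    # tag each message with the algorithm used, so the receiver
    # can follow a switch without renegotiation
    return (active, REGISTRY[active](message))

first = seal(b"routine traffic")
active = "sha384"            # policy change, mid-conversation
second = seal(b"classified traffic")
```

Deploying a replacement algorithm is then a registry update pushed from the service side – no hardware requisition involved.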

Web services are also designed to be interoperable not only with hardware, but also with software. While internet browsers are the de-facto standard for accessing web services on the public internet, many application front-ends are ported to client-side software. This provides additional security in the form of source code auditability. In addition, users gain added control over access to virtual private networking services, certification, and other layers of security outside the realm of commercially available and/or open source software. A well-engineered service can be accessed via proprietary software without degrading either performance or the flexibility web services are known for.

Beneficial change

The free exchange of data across IP networks is not going away. Once it is ubiquitous – and it arguably already is, in all but developing countries – those connected will find ever-expanding ways to leverage it. Encryption technology must make a generational leap just to catch up.

Adopting a web services approach to a new encryption infrastructure has several distinct advantages:

1) Usability – the growth of the commercial internet is proof-of-concept, now a mainstay of day to day voice, video and data exchange, including commerce that is a measurable portion of gross domestic product;

2) Flexibility – as data exchange needs change, the platform is quickly modified to comply with those needs with virtually no additional effort or costs – changes can also be made on-the-quick-halt;

3) Interoperability – web-based technologies run on networking hardware, personal computers, cellular telephones and other devices very efficiently; web-services can operate seamlessly with new systems as well as hardware and software already deployed in the field; and most importantly…

4) Security – encryption as a web service means state of the art defense against intrusion; combining it with existing standards such as Secure Sockets Layer and Virtual Private Networking makes it even more so.

Weighed against these benefits, the cost of overhauling encryption schemes is insignificant. Training happens quickly, even at the individual non-technical personnel level, because the required tools are already familiar. All equipment deployed in the field utilizes the same technology, resulting in multiple-mission capability – overall hardware needs are significantly reduced. And finally, as the base mathematics behind encryption are deemed inferior for the data they are to secure, new algorithms can replace the old immediately, versus recalling all equipment for update or dispersing technical expertise into the field to perform the task.

It is time to implement the next generation of encryption delivery, to ensure data security network-wide, now and for the foreseeable future.

——————————-

I like my cookies with encryption on top

Quick and dirty mcrypt usage

I don’t know where I discovered the original idea, but in messing around with a PHP app I found the need to encrypt session cookies. Here’s how it was done, with the mcrypt library:

//encrypt session cookie
//(note: the mcrypt extension was deprecated in PHP 7.1 and removed in 7.2)
function encryptUserCookie($value)
{
    if (!$value) {
        return false;
    }
    $key = SESSION_SALT;
    // ECB mode never actually uses the IV, so these two lines are no-ops,
    // kept here to match the original snippet
    $iv_size = mcrypt_get_iv_size(MCRYPT_RIJNDAEL_256, MCRYPT_MODE_ECB);
    $iv = mcrypt_create_iv($iv_size, MCRYPT_RAND);
    $crypttext = mcrypt_encrypt(MCRYPT_RIJNDAEL_256, $key, $value, MCRYPT_MODE_ECB, $iv);
    return trim(base64_encode($crypttext)); //encode for safe storage in a cookie
}

Decoding the cookie was much the same…

//decrypt session cookie
function decryptUserCookie($value)
{
    if (!$value) {
        return false;
    }
    $key = SESSION_SALT;
    $crypttext = base64_decode($value); //decode cookie
    // again, ECB mode ignores the IV -- a fresh random one here is harmless
    $iv_size = mcrypt_get_iv_size(MCRYPT_RIJNDAEL_256, MCRYPT_MODE_ECB);
    $iv = mcrypt_create_iv($iv_size, MCRYPT_RAND);
    $decrypttext = mcrypt_decrypt(MCRYPT_RIJNDAEL_256, $key, $crypttext, MCRYPT_MODE_ECB, $iv);
    return rtrim($decrypttext, "\0"); //strip only mcrypt's NUL padding, not real whitespace
}

SESSION_SALT was of course something I called from a variables file.

These snippets were used in an online directory system, where I didn’t want attendees inspecting the cookies for the purpose of setting up multiple listings under the same login.

Simple stuff, but hope it is useful to someone.
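One caveat on the snippets above: mcrypt pads plaintext to the cipher's block size with NUL bytes, and stripping that padding after decryption is inherently ambiguous – padding can't be distinguished from data that legitimately ends in NUL bytes, and a careless trim can also eat real whitespace. The PKCS#7 convention removes the guesswork by encoding the padding length into the padding itself. A stdlib-only Python sketch of that convention:

```python
# PKCS#7 padding: each pad byte holds the pad length, so removal is exact.

def pkcs7_pad(data: bytes, block_size: int = 16) -> bytes:
    n = block_size - (len(data) % block_size)   # always pads, even full blocks
    return data + bytes([n]) * n

def pkcs7_unpad(data: bytes) -> bytes:
    n = data[-1]
    if n < 1 or n > len(data) or data[-n:] != bytes([n]) * n:
        raise ValueError("invalid padding")
    return data[:-n]

padded = pkcs7_pad(b"session value ")   # trailing space survives the round trip
original = pkcs7_unpad(padded)
```

Modern PHP (openssl_encrypt and friends) applies PKCS#7 for you, which is one more reason to migrate off mcrypt.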

How not to store your keys

Bruce Schneier, on learning that some health care workers stored encrypted information on a USB memory device, along with the key to unlocking the encryption itself:

It’s smart to encrypt USB memory devices, but it’s stupid to attach the encryption key to the device….I’m sure they were so proud that they chose a secure encryption algorithm.

For those just joining, the above described action is approximately equivalent to this…

[image: Hide-A-Key]

Editor’s note: the picture above is neither a good Photoshopping job nor an accurate depiction of the author’s rear bumper.

Crossing Borders with Laptops and PDAs

Bruce Schneier recommends a good cleaning and PGP (or TrueCrypt).

More on PGP here. I also use Cache Out X for clearing internet and system caches, as well as system logs.

Researchers Find Way to Steal Encrypted Data

Sadly, the headline is somewhat amiss.

Researchers have actually figured out a way to steal data from hard disks which are encrypted in full by operating systems’ resident protection schemes. In other words, I don’t believe this method would work on file/container encryption with passphrases (which happens to be my personal preference).

Nobody listens to the White House

After the Veterans Administration wrote the script for downplaying risk, when tens of millions of data records were stolen out of an employee’s home, the Bush Administration issued an edict – encrypt all data on government laptops.

Good idea, but nobody’s listening. Wonder what the TSA’s “100,000” number will grow to?

Data security experts…Ohio won’t be calling (any moment)

I wish I could say I am shocked and bewildered that the recent data theft out of the State of Ohio was more than 15 times worse than Ted Strickland & Co. made it out to be when the physical drive (?) was stolen out of an employee’s car, but alas I cannot. I wish I had a more sarcastic way to put it too, but Carlo over at Techdirt did a pretty good job of that. Meanwhile, I’ve recently heard that sarcasm is symptomatic of passive-aggressive behaviour, and since an old girlfriend once told me I was the only man she ever dated that wasn’t “PA,” I’m going to respect her opinion and refrain from sarcasm from this day forward.

Ok, maybe not…

It’s not as though Ohio didn’t see this coming – it’s been going on in the Buckeye state for some time. Then again, does anyone in bureaucracies ever know what is actually going on? If they did, would they even care? Or are they just so attuned to stretching the truth that they just don’t know how to shut up, even in the face of stone cold evidence waiting to rear its ugly head?

No matter. When the “powers that be” come out with statements like this:

“He’s actually in line with our conclusions that it would be very difficult for someone without special knowledge and understanding to actually access that piece of information.”

…you know someone is speaking for someone else right before they get handed their pink slip. “Very difficult?” “Special knowledge?” The spokesperson is either completely insane or oblivious to the fact that there are third world countries full of brilliant mathematicians, since cast into the shadows of unemployment and looking feverishly for work on internet message boards.

The same types of folks create stuff like this:

[image: algo1]

Add this:

[image: algo2]

And wind up with this:

[image: algo3]

And that’s for a few hundred bucks, based on some handwritten notes a moron like me scratches on the back of an envelope over three Blue Moon drafts, and faxes over to him at his office at the local community college. I use such strokes of amateurism to create graphs on a very stupid, highly unsuccessful website I built for a few thousand bucks more.

If I can rally such idiots to produce algorithms at a price equal to a steak dinner in New York proper, for something I will never see a return on my investment for, you can be assured that there is someone out there that can crack the encryption on a device left in the back of a government clerk’s car that contains social security and bank account numbers on a million people, just for throwing in a bottle of 1999 Chateau Pichon Lalande.

UPDATE: None of this matters anymore – a scapegoat has been caught, tried, and hanged. That’s how it works.

Full disk encryption nowhere close to foolproof

The talk is directed at Bitlocker, the full disk encryption in Windows Vista, but it applies to all similar methodologies.

It’s simple. Fools don’t have physically secure, unencrypted backups. Fools think everything should run like lightning, regardless of the strain on the system. And, of course, fools lose passwords.

Doesn’t sound foolproof.

Might I suggest using virtual disk encryption, like that offered by PGP. It is slightly more cumbersome but puts less strain on the system and the “product” is portable – better design for fools (like me).

Acrobat bug biggest of 2007!

Now that is saying something, since it is presently January 6th. No, I’m not the one saying it – some security researchers are, and those researchers are implying it could be the biggest bug of the whole year (but I think that is only because they know Acrobat Reader has a huge install base, and most people are too dumb to bother implementing a patch when it does arrive).

Adobe has been on a decent streak as of late, so no better time to try and kick them down. The bright side of this is that it is free software we’re dealing with, so at least you didn’t pay to have your computer screwed up.

Note: Spamroll wins, however – a new category has been started – Software Bugs! Report quirks at your leisure.

UPDATE: Speaking of free software in need of patching…OpenOffice. I need to do it too, OpenOffice being a great tool for parsing small database tables when readying for import – Excel for Mac does a crappy job at it.

UPDATE 2: Since I’m on a free software binge this morning (while the dog pesters me for a walk), Dr. Dobbs notes that the free TrueCrypt encryption software is a hell of a way to thwart phishers. Check it out.

The last day of the year – time for 2007 predictions

It is the last day of 2006. What better time for predictions…

From the experts:

Spamroll says:

  • Spam will not end in late January (and Bill Gates will remain mum thereafter)
  • Some spyware companies will be getting sued again by February, while the rest change their company name
  • The government will quit buying consumer data in March, after determining that who is buying TMX Elmo is in no way correlated with who has a tendency to be a terrorist
  • Everyone will be backing up their hard drives by April, but only if external hard drives are free
  • They’ll be encrypting them by May, because everyone will be running hacked versions of Vista
  • We’ll all take the summer off, since phishers already do
  • Back-to-school will piss off millions of children, and not much else
  • October will be much like September
  • Telcos will implement IPv6 for Thanksgiving, and everyone on the internet will know who everyone else is, once and for all (with the exception of MacBook Pro users, which are already being tracked via heatsink)
  • We’ll get a ton of self-serving predictions for 2008, a week early at Christmas

Happy New Year!

UPDATE: Sarcasm does work – someone is thinking about backup.