Today marks the 28th anniversary of the Morris Worm, which devastated large portions of the nascent Internet on November 2, 1988. Even though it was unleashed nearly three decades ago, it was more advanced than the Mirai worm that compromised hundreds of thousands of IoT devices in recent weeks.
I originally started drafting this post on January 14, 2012, but it has sat unpublished since then. It's fun to look back at one's journey, 1743 days ago. In 2012, I was relatively new to the Ruby on Rails platform after having worked in PHP and SQL for years, as well as a little .NET. The platform has been a good choice that I enjoy working with still to this day. I was working in Rails 3 at the time and had completed at least three client websites in Rails in 2011.
Anyway, let’s take a look at the little lesson that I had started to write about over 4 years ago.
How to Handle Maximum Lengths for User Supplied Input
Stop annoying users by appearing to allow more text in a field than supported!
The default length of a string column in an ActiveRecord model is 255 characters, but by default the text_field helper will allow the user to enter more. The user is incorrectly led to believe that he or she can enter more text than is allowed, and the web app silently truncates it. Stop it, seriously.
Fix it in two easy steps:
- Set the maxlength and size attributes on your one-line text fields.
- Validate the length of the text fields in your model.
In the View:
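The original code sample was lost along the way, so here is a minimal sketch, assuming a hypothetical `@user` form with a `name` field:

```erb
<%= form_for @user do |f| %>
  <%# maxlength stops the browser from accepting extra input;
      size keeps the visible field consistent with the limit %>
  <%= f.text_field :name, maxlength: 255, size: 40 %>
<% end %>
```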
In the Model:
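Again a sketch rather than the original listing, assuming the same hypothetical `User` model with a `name` column:

```ruby
class User < ActiveRecord::Base
  # Reject over-long input at the model layer instead of relying on
  # the browser, so the record is never silently truncated or
  # rejected by the database on save.
  validates :name, length: { maximum: 255 }
end
```

The client-side maxlength is a courtesy to the user; the model validation is the actual enforcement, since an attacker can trivially bypass anything in the view.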
Even today in 2016, many Rails developers leave maximum length validation out of their Rails models. This is a mistake. If you are using PostgreSQL, then validating that a string is no more than 255 characters is even more important, because a longer string will cause the model to be reported as valid and yet PostgreSQL will raise an exception on save. This will lead to data loss and the dreaded "Something Went Wrong" 500 error page for your users unless you handle the length validation properly.
The 2016 Verizon DBIR report is out and is available for download. Among the findings is the prevalence of data breaches that are attributable to stolen authorization credentials.
According to the report, "63% of confirmed data breaches involved weak, default or stolen passwords" (page 20). This is an increase from 2015, when the stat was that 51% of web application breaches were attributable to stolen credentials. If anything is clear, it's that lowly credential theft is a clear and present danger in information security. It is responsible for more incidents than all the other exotic, technically interesting attacks combined.
The continued calls for the U.S. Congress to ban effective encryption despite the current computer security crisis in which data breaches are regular news is dangerous, shortsighted, and destined to harm all Americans. The two most effective tools that we have capable of helping prevent data breaches are encryption and reducing the attack surface of computer systems that handle sensitive or private data. Under the proposed legal framework, both will be sacrificed for a false sense of safety.
The latest installment of Congressional hearings was held by the Energy and Commerce Committee on April 19, 2016, and was titled Deciphering the Debate Over Encryption: Industry and Law Enforcement Perspectives. The calls for Congress to ban effective encryption are repeated with little variance from the past. Some Members of Congress are expressing frustration that the debate is repeating itself without law enforcement suggesting any particular middle ground that would be workable for the tech community. But what is most chilling is that those in law enforcement continue to demand exceptional access despite years of back and forth and the parade of high profile data breaches both within government and the private sector. We’re losing the cybersecurity battle and the government is calling for a ban on one of the most effective tools that computer science has at its disposal.
The anticipated Feinstein-Burr Compliance with Court Orders Act, an anti-security bill, would require the provision of data in an intelligible format to a government pursuant to a court order (scribd.com). A draft copy was uploaded by The Hill reporter Cory Bennett, though whether it has been submitted officially within the Senate is not yet clear (vice.com).
This bill essentially says you cannot have any conversation or data exchange that the government cannot access if it wants to. It is the legal culmination of what the FBI has been lobbying Congress for over the past several years. If Feinstein-Burr becomes law, it will be illegal to deploy strong encryption without key escrow maintained by each company. Cryptographers and computer scientists near-unanimously assert that key backup systems are insecure at scale.
Crypto War II is in full swing. The first crypto war was fought in the 90s over the Clipper chip; hostilities resumed a few years ago when FBI Director James Comey and others began lobbying Congress and giving public speeches about how being unable to unlock some devices and communications makes it hard for law enforcement to do its job. It has been an unrelenting public relations assault on practical strong encryption.
Ultimately, FBI Director James Comey wants a future where it is illegal or impractical to deploy strong encryption without key escrow, a key backup system that the great consensus of cryptographers and computer scientists assert is insecure at scale. As a statesman he never comes out and says this directly, but it is the only conceivable outcome of what he is demanding of tech companies before Congress and the actions that the FBI has taken in court.
TL;DR: SHA1, SHA256, and SHA512 are all fast hashes and are bad for passwords. scrypt and bcrypt are both slow hashes and are good for passwords. Always use slow hashes, never fast hashes.
SANS’ Securing Web Application Technologies [SWAT] Checklist is offering a bit of bad security advice for the everyday web application developer, under the heading “Store User Passwords Using A Strong, Iterative, Salted Hash”:
User passwords must be stored using secure hashing techniques with a strong algorithm like SHA-256. Simply hashing the password a single time does not sufficiently protect the password. Use iterative hashing with a random salt to make the hash strong.
I’m super glad to see the word getting out that security has to be part of the development process. Oh by the way, I learned at the ISSA International conference this week that Microsoft has a version of their Secure Development Lifecycle tailored for Agile development. These development practices are universally applicable without respect to platform. I am going to be reviewing that and incorporating more of those practices into future talks.
I have been publicly speaking about how development teams and those who employ them should go about using user stories with security constraints, and abuser stories, as a security documentation tool. At this time there is no entry on Wikipedia about it, so I am going to take a stab at writing it up for you here.
What is an Abuser Story in Software Development?
In software development and product management, an abuser story is a user story from the point of view of a malicious adversary. Abuser stories are used with agile software development methodologies as the basis for defining the activities that should be actively blocked or mitigated by the software and proven by automated regression testing.
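As an illustration, consider a hypothetical abuser story: "As a malicious user, I try to view invoices that belong to someone else by guessing IDs." A sketch of the kind of regression test that proves the abuse is blocked (all class and method names here are invented for the example):

```ruby
require "minitest/autorun"

# Minimal stand-in domain objects for the sketch.
Invoice = Struct.new(:id, :owner_id)

class InvoiceAccessPolicy
  # Only the invoice's owner may view it.
  def self.can_view?(user_id, invoice)
    invoice.owner_id == user_id
  end
end

class AbuserStoryTest < Minitest::Test
  # Abuser story: a malicious user requests another user's invoice.
  def test_other_users_invoice_is_blocked
    invoice = Invoice.new(42, 1)
    refute InvoiceAccessPolicy.can_view?(2, invoice)
  end

  # The matching user story: the owner can still see their own invoice.
  def test_owner_can_view_own_invoice
    invoice = Invoice.new(42, 1)
    assert InvoiceAccessPolicy.can_view?(1, invoice)
  end
end
```

The point is that the abuser story produces a concrete, automated check that stays in the suite forever, so a later refactor cannot quietly reopen the hole.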
I’m back from Boulder, Colorado, having presented on application security to the Ruby developers at the Rocky Mountain Ruby Conference! It was a fantastic group and security is one of those topics that are just not talked about enough within the developer community.
I started off with a definition of application security:
Application Security is the subset of Information Security focused on protecting data and privacy from abuse by adversaries who have access to the software system as a whole. Its purpose is to make software resilient to attack, especially when network defenses alone are insufficient.
Then I proceeded to talk about the importance of writing user stories with security constraints and abuser stories, which are user stories from the point of view of a malicious adversary. It's all about clearly communicating among developers and the non-technical stakeholders about the threats so that these considerations can inform development decisions.
The Q&A was robust, with more questions than there was time to answer. I was able to give out two blue YubiKey FIDO U2F keys, thanks to Yubico.