Return Data Length Validation: A Bug We Missed


Validating Return Data Length

The Bug

I’ve distilled the vulnerable code down to a simple function. In the code below, assume that the IFetcher contract is supplied by a user. See if you can spot the bug:

pragma solidity 0.5.10;

interface IFetcher {
    function fetch() external view returns (uint256);
}

contract Vulnerable {
    function getNumber(IFetcher f) external view returns (uint256) {
        bytes32 selector = f.fetch.selector;

        uint256 ret;
        bool success;

        assembly {
            let ptr := mload(0x40)      // get free memory pointer
            mstore(0x40, add(ptr, 32))  // allocate 32 bytes
            mstore(ptr, selector)       // write the selector there
            success := staticcall(
                gas,  // forward all gas
                f,    // target
                ptr,  // start of call data
                4,    // call data length
                ptr,  // where to write return data
                32    // length of return data
            )
            ret := mload(ptr)           // copy return data into ret
        }

        require(success, "Call failed.");

        return ret;
    }
}
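The problem: staticcall writes at most the requested 32 bytes of return data, but it doesn’t guarantee that 32 bytes were actually returned, and it reports success even when the target is an externally owned account or returns nothing at all. In those cases, mload(ptr) just reads whatever was already sitting in memory, which here is the word holding the function selector. Below is a minimal sketch of how this surfaces; the NoFetch and Demo contracts are hypothetical, for illustration only:

contract NoFetch {
    // No fetch() function at all; the empty fallback accepts any call
    // and returns zero bytes of data.
    function() external {}
}

contract Demo {
    function run() external returns (uint256) {
        Vulnerable v = new Vulnerable();
        NoFetch target = new NoFetch();
        // This does NOT revert: the staticcall "succeeds" with no return
        // data, so getNumber returns the leftover selector word (a huge,
        // meaningless nonzero number) instead of failing.
        return v.getNumber(IFetcher(address(target)));
    }
}

Passing an externally owned account instead of NoFetch behaves the same way: calls to addresses with no code always succeed and return no data.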

Why This Matters in the 0x Exchange

This bug manifested in the 0x v2.0 Exchange in a particularly nasty way because of where it lived. The exchange allows trades to be constructed off chain and then validated and filled on chain. To know that a trade is authorized, the exchange must validate a trader’s signature, and for contract-based signature types that check was performed with a low-level call much like the one above. Without a return data length check, a "successful" call that returns no data can make an invalid signature look valid.

How It Was Introduced

This bug was introduced in a fix to a critical reentrancy issue identified in the initial phase of our audit. The recently introduced STATICCALL opcode was an ideal solution, but it wasn’t yet emitted by the compiler version in use at the time (Solidity 0.4.22), so the call had to be written by hand in inline assembly.
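To see why STATICCALL is attractive here, consider a hypothetical malicious callee (illustrative only, not code from the audit). Under a plain CALL its state change would go through; under STATICCALL any attempt to write state makes the inner call fail, which closes off reentrancy through a supposedly read-only call.

// Hypothetical malicious fetcher. The interface declares fetch() as view,
// but older compilers did not enforce that at runtime, so with a plain
// CALL this contract could modify state or reenter the caller.
contract EvilFetcher {
    uint256 public calls;

    function fetch() external returns (uint256) {
        calls += 1;  // under STATICCALL this SSTORE makes the call fail
        return 42;
    }
}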

The Fix

There are a few ways to fix this class of vulnerability.

The simplest is to avoid inline assembly entirely and let Solidity make the call. The compiler performs the return data length check for you:

contract SafeSolidity {
    function getNumber(IFetcher f) external view returns (uint256) {
        return f.fetch();
    }
}

If you do need inline assembly, check returndatasize after the call and treat anything shorter than expected as a failure:

success := staticcall(...)
if lt(returndatasize, 32) {
    revert(0, 0)
}
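Putting that check into the distilled example from above, a corrected assembly version might look like this (a sketch of the fix applied to the toy contract, not the 0x patch itself):

contract FixedAssembly {
    function getNumber(IFetcher f) external view returns (uint256) {
        bytes32 selector = f.fetch.selector;

        uint256 ret;
        bool success;

        assembly {
            let ptr := mload(0x40)      // get free memory pointer
            mstore(0x40, add(ptr, 32))  // allocate 32 bytes
            mstore(ptr, selector)       // write the selector there
            success := staticcall(gas, f, ptr, 4, ptr, 32)
            // New: anything shorter than a full word is treated as a
            // failure instead of being read from stale memory.
            if lt(returndatasize, 32) {
                revert(0, 0)
            }
            ret := mload(ptr)           // now safe to read
        }

        require(success, "Call failed.");

        return ret;
    }
}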
The 0x team took a sensible defense-in-depth approach in their patch:

  1. Before even making the call, the new code uses extcodesize to check whether the target address is a smart contract. If it isn’t, the transaction is immediately reverted.
  2. Instead of just checking for a boolean true, the new code requires a specific return value (a “magic salt” value). Anything else is treated as failure. (Both layers are sketched below.)
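Here is a minimal sketch of how those two layers can look for a boolean-style check such as signature validation. The contract name, the check(bytes) interface, and the VALID_MAGIC constant are all hypothetical, for illustration only; they are not taken from the 0x patch.

contract DefenseInDepthSketch {
    // Hypothetical magic value, used only for this illustration; the real
    // patch defines its own constant.
    bytes32 internal constant VALID_MAGIC = keccak256("VALID_MAGIC");

    function isAuthorized(address validator, bytes memory data)
        internal
        view
        returns (bool)
    {
        // Layer 1: refuse to treat an address with no code (e.g. an EOA)
        // as a validator; a staticcall to it would trivially "succeed".
        uint256 codeSize;
        assembly { codeSize := extcodesize(validator) }
        require(codeSize > 0, "Validator is not a contract.");

        // Hypothetical validator interface: check(bytes) returns (bytes32).
        (bool success, bytes memory returnData) = validator.staticcall(
            abi.encodeWithSignature("check(bytes)", data)
        );

        // Layer 2: only a full 32-byte word equal to the magic value counts
        // as approval; empty or unexpected return data is a failure.
        return success
            && returnData.length == 32
            && abi.decode(returnData, (bytes32)) == VALID_MAGIC;
    }
}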

Lessons Learned

Aside from the technical lesson about a gotcha of the Ethereum Virtual Machine (EVM), there are some other lessons our team has taken from this experience. Some of these lessons may be relevant to others in the Ethereum community, so I’d like to share those here.

Audited Code Can Still Have Bugs

Perhaps the most important lesson is that an audit isn’t a guarantee. All code is at risk of having bugs. A security audit can help to assess that risk and to find some of the worst bugs early, but it can’t promise to identify all bugs. Even formal verification, which provides stricter proofs around correctness, can only find bugs to the extent that the code is formally specified. A bug like this one can easily slip through such a process.

Have a Contingency Plan

When a bug is discovered after deployment, it’s important to have a plan for how to respond. Sometimes, it’s simply too late by the time you learn about the bug because it’s already been exploited in an irreparable way. But for cases where a security researcher discloses a bug privately or the bug has limited reach, you’ll want to know what to do.

Testing Isn’t the Answer

In retrospect, it seems like there’s a simple test that would have caught this issue: just try validating a signature with an externally owned account. A bug often looks easy to test for after the fact, but it’s another matter to come up with all the important test cases before you know exactly where a bug will be found.

Inline Assembly Is Risky

Although we didn’t call it out in this audit report (because the 0x team was well aware already), we tend to warn clients that inline assembly is riskier than sticking to Solidity.

Sharing and Encoding Knowledge

It’s impossible for everyone to know about every gotcha, every technique, and every vulnerability. One aspect I really enjoy about working within the security community is that everyone seems committed to sharing their knowledge and expertise. When a new gotcha like this one comes to light, we ask ourselves a few questions:

  1. Can we make the problem go away altogether? For example, uninitialized storage pointers used to be a common cause of vulnerabilities in Solidity code, but since Solidity 0.5.0, such vulnerable code simply won’t compile. Now this is one less thing developers and auditors need to know about. (A short example of that old gotcha appears after this list.)
  2. Can we address the problem with tools? Analysis tools in the Ethereum space have come a long way quite quickly. Our own team has MythX, a service for running a variety of analysis tools. When new classes of bugs appear, we can build knowledge of them into the tools. Reentrancy vulnerabilities are an example of an anti-pattern that tools can identify quite effectively.
  3. Can we address the problem with checklists? Checklists are a well-known way of sharing knowledge and preventing mistakes. Every audit our team does starts with a list of “TODO” issues in GitHub that remind us of things to look for in the code we’re auditing. After each audit, we perform a retrospective and look for things to add or remove from that list. To make sure the TODOs stay relevant, we plan to make more specific TODO lists for different types of code. This particular vulnerability, for example, might go in a list of things to look for in inline assembly. (This is, of course, only if the issue can’t be adequately addressed with tooling.)
  4. Can we document the problem? If nothing else, we can document and organize issues so others have an easier time learning about them. Our team maintains the Ethereum Smart Contract Security Best Practices and Smart Contract Weakness Classification Registry for this reason, but there’s never enough time in the day to document everything we’d like to. Contributions are welcome to both, so please share your expertise!
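For readers who never ran into it, here is a minimal sketch of the uninitialized storage pointer gotcha mentioned in the first item. Code along these lines compiled (with only a warning) before Solidity 0.5.0 and silently corrupted storage; since 0.5.0 the compiler rejects it outright.

pragma solidity ^0.4.24;

// Illustrative only: this no longer compiles under Solidity 0.5.0+.
contract StoragePointerGotcha {
    address owner;    // occupies storage slot 0

    struct Pair {
        uint256 a;
        uint256 b;
    }

    function set(uint256 x) public {
        Pair p;       // uninitialized storage pointer, defaults to slot 0
        p.a = x;      // silently overwrites owner
    }
}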

Summary

  • This particular bug has to do with a gotcha around the CALL and STATICCALL EVM opcodes. When using these low-level calls, it’s up to you to explicitly check the length of the return data.
  • If you want to avoid this class of bugs, avoid inline assembly and stick to Solidity. If you must use assembly, be sure to validate the return data length!
  • Writing code carefully, testing thoroughly, and getting a security audit are all great ways to improve the security of your code, but none are a panacea. Make sure you have thought through contingency plans.
  • We have to make it easier for everyone to write secure code, and this requires a holistic approach that takes into account improving the system itself, building tools, and sharing our expertise.
