Yes, We Can Have Two 500-Year Floods in Ten Years

There have been a lot of people suggesting that Harvey the Hurricane shows that "really and truly climate change is happening, see, in-your-face deniers!"

Of course, it's possible, even though the actual evidence (including the 12-year drought in major hurricanes) is against it. But hurricanes are a perfect opportunity for stupid math tricks, and they also provide a great opportunity to explain concepts people find confusing. So, let's consider the concept of a "500-year flood."

Most people hear this and think it means "one flood this size in 500 years." The real definition is subtly different: saying "a 500-year flood" actually means "there is one chance in 500 of a flood this size happening in any year."

It's called a "500-year flood" because statistically, over a long enough time, we would expect to have roughly one such flood on average every 500 years. So, if we had 100,000 years of weather data (and things otherwise stayed the same, which is an unrealistic assumption), then we'd expect to have seen 100,000/500, or 200, floods at that level.
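To make that concrete, here's a minimal sketch in Python (the seed and the record length are arbitrary choices of mine, not anything from the article) that simulates a long run of independent 1-in-500 events, counts how many occur, and checks whether any ten-year stretch contains two of them:

```python
import numpy as np

rng = np.random.default_rng(42)  # arbitrary seed for reproducibility

P_FLOOD = 1 / 500      # annual probability of a "500-year" flood
N_YEARS = 100_000      # length of the hypothetical record

# One Bernoulli trial per year: 1 means a 500-year flood happened that year.
floods = (rng.random(N_YEARS) < P_FLOOD).astype(int)

print("Total 500-year floods in", N_YEARS, "years:", floods.sum())
# The expected value is N_YEARS / 500 = 200, so the count lands near 200.

# Sliding sums over every ten-year window in the record.
window_counts = np.convolve(floods, np.ones(10, dtype=int), mode="valid")
print("Ten-year windows with 2+ floods:", int((window_counts >= 2).sum()))
```

The flood count lands near the expected 200, and while two such floods in any particular decade is a long shot, a long enough record (or enough separate places) turns up decades that get two.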

The trouble is, we've only got about 100 years of good weather data for the Houston area.

The estimate that a flood of this scale has a one-in-500 chance in any given year, then, is made using a statistical model of how often floods of various sizes happen. Using the data from Corpus Christi for illustration (because they have data for every year from 1895 to 2000), you get a distribution like this:

[Figure: Annual Precipitation in Corpus Christi]

The blue bars represent real measurements in inches of precipitation in a year; the blue line is the smoothed approximate distribution you get from the data.
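For what it's worth, here's a rough sketch of how a chart like that gets built, assuming the annual totals are already sitting in an array; the numbers below are placeholders I made up, not the actual Corpus Christi record:

```python
import numpy as np
import matplotlib.pyplot as plt
from scipy.stats import gaussian_kde

# Placeholder annual precipitation totals, in inches -- stand-in values,
# NOT the real Corpus Christi record.
annual_inches = np.array([28.1, 31.5, 22.7, 35.0, 40.2, 25.3, 30.8,
                          18.9, 45.6, 27.4, 33.1, 29.0, 24.5, 38.7])

fig, ax = plt.subplots()

# Bars: how many years fell into each rainfall range (normalized).
ax.hist(annual_inches, bins=8, density=True, alpha=0.5, label="observed years")

# Line: a smoothed approximation of the distribution (kernel density).
xs = np.linspace(annual_inches.min(), annual_inches.max(), 200)
ax.plot(xs, gaussian_kde(annual_inches)(xs), label="smoothed distribution")

ax.set_xlabel("Annual precipitation (inches)")
ax.set_ylabel("Relative frequency")
ax.legend()
plt.show()
```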

[Figure: Simulated precipitation in Corpus Christi (example)]

This is an example of the distribution you get if you simulate rainfall amounts, assuming it's a normal distribution. Notice it doesn't look like a bell curve either — it's only 105 samples. The more samples we take, the more it would look like a true normal distribution.
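Here's a sketch of that kind of simulation, assuming a normal distribution whose mean and standard deviation are placeholders (in practice you'd estimate them from the real record):

```python
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(0)  # arbitrary seed

# Placeholder parameters; in practice you'd use the sample mean and
# standard deviation of the observed annual-precipitation record.
MEAN_INCHES = 30.0
STD_INCHES = 8.0
N_YEARS = 105  # same number of samples as the observed record

simulated = rng.normal(MEAN_INCHES, STD_INCHES, size=N_YEARS)

# With only 105 draws the histogram is lumpy; it only approaches the
# smooth bell shape as the number of samples grows.
plt.hist(simulated, bins=12, density=True)
plt.xlabel("Simulated annual precipitation (inches)")
plt.ylabel("Relative frequency")
plt.show()
```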

[Figure: Example normal distribution]

And this is the computed normal distribution. When they estimate what a 500-year flood is like, they assume some distribution; my guess is a normal distribution because, as I said, everything has a normal distribution, and from the data and the simulation runs it looks like a good assumption.

In all of these graphs, the horizontal axis is the amount of rain. The vertical axis of the bar chart shows how many years fell into each range of rainfall, and that height is then interpreted as the probability of getting that much rain. So, to estimate what a 500-year flood will be, you draw a horizontal line at the 1-in-500 height on the vertical axis, see where it crosses the curve, and drop a vertical line down; that tells you how much rain a one-in-500 flood is.
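For completeness, here's a sketch of how that read-off can be done numerically instead of graphically, under the normality assumption discussed above: fit a normal distribution to the annual totals and take the rainfall amount whose chance of being exceeded in a year is 1 in 500. The data values are placeholders again, and this uses the standard exceedance-probability quantile rather than literally reading the height of the curve.

```python
import numpy as np
from scipy import stats

# Placeholder annual precipitation totals, in inches -- stand-ins for
# the real ~100-year record.
annual_inches = np.array([28.1, 31.5, 22.7, 35.0, 40.2, 25.3, 30.8,
                          18.9, 45.6, 27.4, 33.1, 29.0, 24.5, 38.7])

# Fit a normal distribution by estimating its mean and standard deviation.
mean = annual_inches.mean()
std = annual_inches.std(ddof=1)

# The "500-year" rainfall is the amount exceeded with probability 1/500
# in any single year, i.e. the 499/500 quantile of the fitted normal.
p_exceed = 1 / 500
flood_level = stats.norm.ppf(1 - p_exceed, loc=mean, scale=std)

print(f"Estimated 1-in-500 annual rainfall: {flood_level:.1f} inches")
```

With only about a century of observations behind it, that 1-in-500 tail is an extrapolation well beyond anything actually in the record, which is worth keeping in mind given that we only have about 100 years of good data.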