How long will my portfolio analysis take?

Clients often ask, "How long will my portfolio analysis take?", especially when deadlines are tight and they need to know when they will receive the results required to make important business decisions. In reality, probabilistic catastrophe models can take hours, or even days, to estimate losses for a portfolio of properties. But just what factors influence this run time?

We'll explore these factors in this blog, based on our own experience using our catastrophe modelling software, JCalf®, as well as our work to make our models available on Oasis and ELEMENTS. However, the principles will be similar for other models and platforms.

Data size and processing power

These are two of the more obvious factors. Catastrophe models crunch numbers; the more numbers there are to process and the slower the computer, the longer it takes. A client portfolio with more locations, more coverages or more perils (or sub-perils) will usually take longer to process. If the model allows you to select different event sets, picking a longer event set will prolong the analysis. In the same way, requesting a higher number of samples from a Monte Carlo-based model, for instance to better capture correlated losses, will increase the number of calculations required and slow the analysis down.
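As a rough back-of-the-envelope illustration (a toy sketch with invented figures, not how JCalf® actually schedules its work), the core number-crunching grows roughly with locations × events × samples, so a longer event set and a higher sample count multiply the work:

```python
# Toy scaling sketch, not JCalf's real scheduling logic: the core
# number-crunching grows roughly with locations x events x samples.

def estimated_calculations(n_locations: int, n_events: int, n_samples: int) -> int:
    """Approximate number of per-location, per-event, per-sample loss calculations."""
    return n_locations * n_events * n_samples

# Hypothetical portfolio run against a short and a long event set:
baseline = estimated_calculations(n_locations=50_000, n_events=100_000, n_samples=10)
heavier = estimated_calculations(n_locations=50_000, n_events=500_000, n_samples=100)

print(f"Roughly {heavier / baseline:.0f}x more calculations")  # ~50x
```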

Data characteristics

We often see portfolios of a similar size taking very different times to complete. This is often due to the number of locations effectively at risk. Our models will quickly identify locations that aren’t at risk and assign them a loss of zero without performing any further calculations. A portfolio that is concentrated in flood-prone areas, such as along the coast or near rivers, will therefore require more calculations and take longer to analyse than a more evenly distributed portfolio. The modelled perils will also affect the run time, although not necessarily in a predictable way. A peril such as wind will likely impact a greater number of properties over a wider area than a more localised peril such as flood, thus increasing run time. However, in order to be selective, the flood model will require higher-resolution hazard and exposure data, which may in turn slow it down.
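A minimal sketch of that zero-loss shortcut is shown below. It is illustrative only: the Location fields and the loss function are made up for the example and are not JBA's implementation.

```python
# Illustrative only: locations outside the hazard footprint are assigned a
# zero loss immediately, so only the at-risk part of the portfolio pays the
# full calculation cost.

from dataclasses import dataclass

@dataclass
class Location:
    loc_id: int
    max_flood_depth: float  # hypothetical pre-computed hazard value at this site

def full_loss_calculation(loc: Location) -> float:
    # Stand-in for the expensive hazard/vulnerability/financial chain.
    return loc.max_flood_depth * 1_000.0

def portfolio_losses(locations: list[Location]) -> dict[int, float]:
    losses = {}
    for loc in locations:
        if loc.max_flood_depth == 0.0:
            losses[loc.loc_id] = 0.0        # not at risk: no further work
        else:
            losses[loc.loc_id] = full_loss_calculation(loc)
    return losses

print(portfolio_losses([Location(1, 0.0), Location(2, 1.5)]))  # {1: 0.0, 2: 1500.0}
```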

Load time

The data contained in catastrophe models can be quite large, especially when it is high resolution and covers whole continents. The time required to read this data from disk into memory can represent a substantial proportion of the total run time. The biggest impact on run time will come from careful selection of the data format and ensuring that the software only reads the data it needs, although this is a decision made by the engineers who wrote the software and not usually configurable by the users. Using solid state drives (SSDs) to store model data instead of slower mechanical disks will also help boost performance, as will adding additional RAM. Characteristics of the portfolio can come into play here too, but these effects are harder to predict and depend on the implementation details of a particular model. If a portfolio has many similar locations or property types, the software or even the operating system might cache certain data to save on disk reads or repeated calculations.
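The caching idea can be sketched in a few lines. This is a generic illustration, not any particular platform's internals: repeated lookups for similar locations are served from an in-memory cache rather than going back to disk.

```python
# Generic caching sketch: repeated vulnerability lookups for similar
# locations are served from memory instead of being re-read from disk.

from functools import lru_cache

def _read_damage_ratio_from_disk(property_type: str, depth_band: int) -> float:
    # Stand-in for an expensive read of a vulnerability curve from disk.
    return min(1.0, 0.05 * depth_band)

@lru_cache(maxsize=None)
def damage_ratio(property_type: str, depth_band: int) -> float:
    return _read_damage_ratio_from_disk(property_type, depth_band)

damage_ratio("residential", 3)    # first call pays the "disk" cost
damage_ratio("residential", 3)    # identical location: served from the cache
print(damage_ratio.cache_info())  # CacheInfo(hits=1, misses=1, ...)
```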

Outputs

The same portfolio analysed against the same model with the same number of samples can take different amounts of time depending on the outputs requested. For instance, requesting location-level event loss tables may cause everything to grind to a halt while large amounts of data are written to disk or copied across the network, especially if the network isn’t designed to handle that volume of data. Insured loss calculations will also increase the amount of processing required, even though the ground-up loss calculations remain the same. The effect is even more pronounced when the terms and conditions are very detailed, for instance with many policies or with location-level limits and deductibles, or when applying complex reinsurance terms.
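As a simple illustration of why detailed terms add work (the figures and terms here are invented, and a real financial module handles far more conditions), even a single location-level deductible and limit is an extra calculation applied on top of every unchanged ground-up loss:

```python
# Invented terms for illustration: each layer of policy conditions adds a
# financial calculation on top of the unchanged ground-up losses.

def insured_loss(ground_up: float, deductible: float, limit: float) -> float:
    """Apply a simple location-level deductible and limit."""
    return min(max(ground_up - deductible, 0.0), limit)

ground_up_losses = [250_000.0, 40_000.0, 900_000.0]
insured = [insured_loss(gu, deductible=50_000.0, limit=500_000.0) for gu in ground_up_losses]
print(insured)  # [200000.0, 0.0, 500000.0]
```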

Operational overhead

Clicking buttons can be slow work, so at JBA, we automate as much of each process as we can and try to minimise the amount of clicking required. Even so, loading model or client data requires some amount of human intervention, as does examining the outputs. Batching up runs, if the software supports it, can save precious time by ensuring the processing machines remain busy overnight and at weekends. On a shared system though, you may have to wait for other jobs to complete before yours can run.

The Cloud

We’ve mentioned processing power, amount of RAM and type of hard drive, but if you’re running an analysis in the cloud, or on a processing cluster maintained by your IT department, you may not have any knowledge of the hardware. Nonetheless, running catastrophe models transparently across many machines can provide a convenient and cost-effective way to speed up analyses.

Quirks

Just like any piece of engineering, catastrophe modelling software can sometimes behave in unexpected or even illogical ways. Perhaps leaving unused fields blank in your input file makes things run faster or maybe you need to fill in those fields with zeros or sort your file in a particular way. This sort of quirky behaviour is often easy to fix once the developers are made aware of it.

At JBA, not only do we regularly update the data used in our probabilistic models, but we also improve the software used to run them, saving us and our clients time and ensuring they get the data they need in the most convenient format.

If you would like to find out more about our portfolio analysis services or our probabilistic catastrophe models, please get in touch.
