The complexity of tunnel design is well understood. Numerical modelling programs offer a means of considering the complicating factors (such as heterogeneity of the ground and complex geometries) explicitly in design. The use of numerical modelling has risen rapidly with the increase in computing power and user-friendliness of the programs. No matter how sophisticated the numerical model, it is still an approximation of the real case. Numerical models can be used to predict the behaviour of tunnels but only if sufficient information is available to define the model. Often this is not the case. Nevertheless, numerical models can play a valuable role in identifying mechanisms of behaviour. In either situation a great deal of care and engineering judgement is required in the interpretation of the results. There are significant risks in the use of such complex tools by inadequately trained personnel.

A bewildering array of programs is commercially available, and most have not been designed specifically for tunnelling. Even if not directly involved in numerical modelling, engineers may well have to assess the suitability of a program or evaluate its results. The first step in choosing a software package is to decide whether the ground is discontinuous or continuous. The term “discontinuous” refers to ground that consists of a collection of discrete bodies (e.g. a blocky rock mass); “continuous” refers to ground that can be considered to behave as a single body (e.g. sand, clay, fault gouge or massive rock). Different methods of modelling are available to simulate these two fundamentally different types of ground (see table 1). Other important criteria in evaluating software are its:

Capability – can the program handle the problems you need to analyse? For example, is it limited in its modelling of groundwater flow or three-dimensional geometries?

Flexibility – can the program be customised to overcome limitations? For example, can user-defined constitutive models be added to simulate a particular type of ground?

Speed – Boundary Element and Finite Element programs usually analyse models faster than Finite Difference programs.

Skill – what prior experience or training do the users have with the software? The learning curve for all programs is very steep.

Having chosen the software, the next challenge is to obtain meaningful results from it. Table 2 lists the sources of “error” (i.e. discrepancies or mistakes). These errors should be removed or quantified and the results corrected accordingly. In some cases there is established guidance; in others the numerical model, as a whole or in its elements, can be calibrated against analytical solutions or experimental and field data, as sketched below. Benchmarking exercises provide a check on the modelling process as a whole, and the normal process of checking calculations should be applied to counter human error. It is worth noting that even the task of presenting the results can be quite difficult.
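As an illustration of calibration against an analytical solution, the short sketch below (in Python; all names and numbers are illustrative) compares hypothetical finite element output for a deep circular tunnel under hydrostatic in-situ stress with the closed-form Kirsch solution for that axisymmetric case.

```python
import numpy as np

def kirsch_hydrostatic(p0, a, r):
    """Closed-form stresses around an unsupported circular opening of
    radius a in an infinite elastic medium under hydrostatic stress p0
    (the axisymmetric special case of the Kirsch solution).
    Returns (radial, tangential) stress at radius r >= a."""
    ratio = (a / r) ** 2
    sigma_r = p0 * (1.0 - ratio)   # radial stress: zero at the tunnel wall
    sigma_t = p0 * (1.0 + ratio)   # tangential stress: 2*p0 at the wall
    return sigma_r, sigma_t

# Hypothetical check: numerical-model output (stresses sampled along a
# radial line in an FE run) against the closed-form values.
p0, a = 2.0e6, 3.0                    # 2 MPa hydrostatic stress, 3 m radius
r = np.array([3.0, 4.5, 6.0, 12.0])   # sample radii (m)
fe_sigma_t = np.array([3.92e6, 2.85e6, 2.49e6, 2.12e6])  # made-up FE output

_, sigma_t = kirsch_hydrostatic(p0, a, r)
error = 100.0 * (fe_sigma_t - sigma_t) / sigma_t
for ri, ei in zip(r, error):
    print(f"r = {ri:5.1f} m  tangential stress discrepancy = {ei:+5.1f} %")
```

In this made-up example the discrepancy shrinks with distance from the opening, as would be expected of discretisation error; a growing or erratic discrepancy would prompt a closer look at the mesh, boundary distance or constitutive assumptions.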

Fire-life-safety

As recently as thirty years ago, tunnels were often designed with much less regard for the safety of their occupants. In road tunnels, for example, ventilation was usually based on the amount of air required to dilute vehicle exhaust gases to acceptable levels, and in rail tunnels on the amount required to provide fresh air for passengers. Emergency situations, such as fires and evacuation, were rarely considered in detail.

The first computer applications for safety in tunnels were probably the computational codes written in the 1960s to calculate the transient pressures caused by the passage of trains through tunnels. Their development was driven by the need to understand what speed restrictions were required for new, faster trains in old (i.e. small) tunnels, to ensure that the pressures exerted on passengers’ ears were acceptable. Since then, the same basic codes (Thermotun, MM aero, SES) have been augmented to model features of tunnel ventilation systems such as air shafts, cross-passages, fans and platform screen doors. They are now used to assess pressures and draught-induced air velocities in more complicated tunnel networks such as metro systems. By including heat sources (trains and electrical equipment) and heat conduction through the lining and surrounding ground, annual temperature variations in metros can be estimated.
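The sketch below gives a feel for the magnitudes involved. It is emphatically not how codes such as Thermotun or SES work (they solve the one-dimensional unsteady compressible-flow equations, resolving the pressure waves); it is only a quasi-steady Bernoulli estimate of the pressure change across the nose of a train in a tunnel, with all values assumed for illustration.

```python
# Toy order-of-magnitude estimate of the pressure change across a train nose
# in a tunnel: a quasi-steady, incompressible Bernoulli estimate in the train
# frame, ignoring friction, nose shape and wave effects. Illustrative only.

rho = 1.2        # air density (kg/m^3)
v_train = 45.0   # train speed (m/s), roughly 160 km/h (assumed)
blockage = 0.35  # train cross-section / tunnel cross-section (old, small bore)

# Continuity in the train frame: annulus air speed = V / (1 - B).
u_annulus = v_train / (1.0 - blockage)
# Bernoulli between the undisturbed air ahead of the nose and the annulus.
dp = 0.5 * rho * (u_annulus**2 - v_train**2)
print(f"annulus air speed ~ {u_annulus:.1f} m/s, "
      f"pressure change across nose ~ {dp/1000.0:.2f} kPa")
```

For these assumed figures the estimate is of order 1 to 2 kPa, which is why small-bore tunnels forced speed restrictions on faster trains until the full codes allowed the problem to be analysed properly.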

Similar programs have been written to estimate the behaviour of hot smoke in tunnels, based on one-dimensional flow models, two-zone (hot layer/cold layer) models or 3D finite-volume techniques. Examples include JASMINE, FASIT and Hotflow.
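As a flavour of the one-dimensional approach, the sketch below estimates the “critical velocity” needed to prevent smoke backlayering over a fire, using the iterative Froude-number correlation of Kennedy as adopted in SES. The tunnel and fire values are assumed for illustration; real designs rely on the full codes and standards.

```python
# Illustrative 1D smoke calculation: critical velocity to prevent smoke
# backlayering, via Kennedy's iterative correlation (as used in SES).
# All tunnel and fire inputs below are assumed values.

g = 9.81        # gravity (m/s^2)
rho = 1.2       # ambient air density (kg/m^3)
cp = 1005.0     # specific heat of air (J/kg.K)
T_amb = 293.0   # ambient temperature (K)
K1 = 0.606      # Froude-number factor from Kennedy's correlation
Kg = 1.0        # grade factor (1.0 for a level tunnel)

H = 6.0         # tunnel height (m), assumed geometry
A = 50.0        # tunnel cross-section (m^2), assumed geometry
Q = 20.0e6      # convective heat release rate of the fire (W), assumed

v = 2.0  # initial guess (m/s)
for _ in range(50):  # fixed-point iteration; converges in a few steps
    T_fire = Q / (rho * cp * A * v) + T_amb      # hot-gas temperature
    v_new = Kg * K1 * (g * H * Q / (rho * cp * A * T_fire)) ** (1.0 / 3.0)
    if abs(v_new - v) < 1e-6:
        break
    v = v_new
print(f"critical velocity ~ {v:.2f} m/s for a {Q/1e6:.0f} MW fire")
```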

Computer models of human behaviour have also been developed, based on experimental studies, for estimating how long it would take to evacuate enclosed spaces (e.g. Mott MacDonald’s STEPS program). These programs usually work by defining the geometry (tunnel, building etc.), restrictions such as doorways, and moving elements such as escalators and lifts. Then the people are defined – tangible factors such as size, start location and walking speeds on different terrain, and intangibles such as familiarity with the geometry, impatience and grouping (for families). The calculations then estimate how long evacuation will take. The results are not exact, but they can be used to assess the relative effect of adding or removing escape routes (i.e. better or worse). Most of these programs are proprietary software developed by individual companies or universities, though a few are in the public domain or available under licence.
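The sketch below is a crude hand-type estimate of the kind these agent-based tools refine; it is not the STEPS model. Evacuation time is taken as governed either by walking distance or by queuing at the exits, whichever is slower, with all values assumed for illustration.

```python
# Minimal flow-capacity evacuation estimate (not the STEPS model):
# the slower of walking to the exits and queuing to pass through them.
# All parameter values below are assumed for illustration.

n_people = 600          # occupants in the tunnel section
walk_distance = 250.0   # longest distance to an exit (m)
walk_speed = 1.0        # walking speed on a tunnel walkway (m/s), assumed
exit_width = 2.4        # total effective width of available exits (m)
specific_flow = 1.3     # exit flow capacity (persons per second per metre)

t_walk = walk_distance / walk_speed                 # last person's walk time
t_queue = n_people / (specific_flow * exit_width)   # time to pass the exits
t_evac = max(t_walk, t_queue)
print(f"walking ~ {t_walk:.0f} s, queuing ~ {t_queue:.0f} s, "
      f"estimated evacuation ~ {t_evac/60:.1f} min")
```

Even this crude form shows the value of the approach: doubling the exit width halves the queuing term, and the calculation immediately reveals whether walking or queuing governs.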

Risk management

There are two main ways in which computers can be used to support risk management: risk modelling and risk analysis. Risk modelling involves incorporating uncertainty within what might otherwise be termed ‘deterministic’ traditional methods of analysis. As an example, large numbers of routine analyses might be carried out, each with slightly different but plausible soil parameters, in order to examine the impact of variable ground conditions on a design case, e.g. tunnel face stability. The output is then no longer a single ‘factor of safety’ but a range of partial factors expressing the range of strengths mobilised across the analyses. The engineer chooses (for example) most probable, minimum and maximum values of strength, and the computer repeatedly picks random values of the variable for use in the stability analyses, as sketched below.
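A minimal Monte Carlo sketch of this idea follows, applied to undrained face stability via the Broms & Bennermark stability ratio, with the undrained strength sampled from an assumed lognormal distribution; all values are illustrative.

```python
# Monte Carlo sketch: undrained face stability using the Broms & Bennermark
# stability ratio N = (sigma_s + gamma*z - sigma_t) / s_u, with instability
# taken (crudely) as N exceeding a critical value of about 6. The undrained
# strength s_u is sampled from an assumed lognormal distribution.

import numpy as np

rng = np.random.default_rng(0)
n_trials = 100_000

gamma = 18.0      # unit weight of ground (kN/m^3), assumed
z = 20.0          # depth to tunnel axis (m), assumed
sigma_s = 0.0     # surface surcharge (kPa)
sigma_t = 100.0   # face support pressure (kPa), assumed
N_crit = 6.0      # critical stability ratio for undrained clay

# Lognormal undrained strength: mean 60 kPa, coefficient of variation 30%.
mean_su, cov = 60.0, 0.30
sigma_ln = np.sqrt(np.log(1.0 + cov**2))
mu_ln = np.log(mean_su) - 0.5 * sigma_ln**2
s_u = rng.lognormal(mu_ln, sigma_ln, n_trials)

N = (sigma_s + gamma * z - sigma_t) / s_u
p_fail = np.mean(N > N_crit)
print(f"median N = {np.median(N):.2f}, P(N > {N_crit:.0f}) ~ {p_fail:.3%}")
```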

Output may be given as a curve showing the probability of achieving a particular value of partial factor and, where a number of variables are considered, a ‘tornado’ chart that ranks the sensitivity of the output to variations in each variable. Simple analyses carried out within Excel can use the spreadsheet add-in @RISK, but for most purposes it is necessary to combine random generation of parameter values with random generation of spatial characteristics (for example, to examine the effects of natural spatial variability and anisotropy), as sketched below. The challenge of carrying out such analyses within a finite element or finite difference code is obvious; whilst this type of analysis presently remains beyond the reach of most practitioners, it will not be long before computing power allows its routine use.
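The spatial side of the problem can also be illustrated in a few lines: the sketch below generates a one-dimensional, spatially correlated random field of soil strength by Cholesky factorisation of an exponential covariance (one standard random-field recipe). The correlation length, mean and scatter are all assumed values.

```python
# Sketch of a 1D spatially correlated random field of soil strength,
# generated by Cholesky factorisation of an exponential covariance.
# Correlation length, mean and scatter are assumed values.

import numpy as np

rng = np.random.default_rng(1)
x = np.linspace(0.0, 50.0, 101)   # points along the tunnel axis (m)
theta = 10.0                      # correlation length (m), assumed
mean_su, sd_su = 60.0, 15.0       # strength mean and std deviation (kPa)

# Exponential covariance: strong correlation between nearby points,
# decaying with separation distance.
dist = np.abs(x[:, None] - x[None, :])
cov = sd_su**2 * np.exp(-dist / theta)
L = np.linalg.cholesky(cov + 1e-10 * np.eye(len(x)))  # jitter for stability

s_u_profile = mean_su + L @ rng.standard_normal(len(x))
print(f"min {s_u_profile.min():.1f} kPa, max {s_u_profile.max():.1f} kPa "
      f"along {x[-1]:.0f} m of tunnel")
```

Each such profile could then feed one run of a stability analysis, so that the Monte Carlo loop samples spatial patterns of strength rather than a single lumped value.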

At present, one of the most widely used risk management tools is risk analysis. Risk analysis does not, as might be thought, involve complex computation. Rather, it is a simple technique aimed at identifying and qualitatively evaluating the risks to the successful completion of a project, at ensuring that all parties to the project are aware of those risks, and at getting those best able to control them to do so. Spreadsheets are often used as a basis for recording the data derived from the process; a minimal illustration is sketched below. In a recent development, CIRIA has produced an add-on for Excel, known as ‘RiskCom’, which can be used as a more formal basis for risk analysis and for developing standard catalogues of hazards and risks to be evaluated on each new project.
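The sketch below is a toy version of such a register (it is not RiskCom): each hazard is scored on simple likelihood and consequence scales, ranked by their product, and assigned an owner, the party best able to control it. The hazards listed are invented examples.

```python
# Toy qualitative risk register of the kind usually kept in a spreadsheet
# (a sketch, not CIRIA's RiskCom). Each hazard is scored on 1-5 likelihood
# and consequence scales, ranked by their product, and assigned an owner.

risks = [
    # (hazard, likelihood 1-5, consequence 1-5, owner) -- invented examples
    ("Unforeseen ground conditions at portal", 3, 4, "Contractor"),
    ("Settlement damage to adjacent buildings", 2, 5, "Designer"),
    ("TBM intervention needed in unstable ground", 2, 4, "Contractor"),
    ("Utility strike during shaft excavation", 3, 3, "Contractor"),
]

for hazard, like, cons, owner in sorted(risks, key=lambda r: -r[1] * r[2]):
    print(f"score {like * cons:2d}  {hazard}  [owner: {owner}]")
```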

The use of computer software in design is now so widely accepted that its actual benefits are rarely quantified. While computer technology can be a powerful tool for engineers, its sophistication can be deceptive. In a recent benchmarking study, a simple tunnel design problem was set, yet the participants produced widely different predictions of ground movements, even when using the same software (2). As the old adage goes, “rubbish in = rubbish out”. A firm understanding of how the programs themselves function, as well as sound engineering judgement, is required. In the past it has been this engineering judgement that has made up for the deficiencies of simple analytical design methods, and it is just as important now.

Finally, one should remember that this is a developing field. Much of the development is carried out at universities, and there is plenty of scope for the tunnelling industry to improve its use of computer technology by drawing on this cutting-edge academic research.