Development of underground structures, deep excavations or cut-and-cover routes has become almost the only way to improve infrastructure in large urban areas, many of which contain historic cores unsuited to modern living. Constructing these underground facilities involves the short-term removal of soil or rock, the consequent loss of support to the surrounding ground and changes in the in-situ stresses, followed after some finite time, depending on the construction sequence, by the insertion of the permanent structural works, which ideally replace the lost support.

The process is often inherently dangerous and, even if complex construction procedures and equipment are employed, must result in loss of material and deformations in the surrounding soil. These losses reach the ground surface and deform existing buildings, roads and services. The magnitude and significance of the deformations depend on the size, location and geometry of the system, details of the stress conditions, the soil properties and the construction process, including any preconditioning of the soil and the speed of construction.

Predicting deformations

Despite the availability of many advanced computer packages, predicting deformations with so many variables is costly and time consuming, and remains very suspect, particularly with regard to modelling the construction process and, specifically, the influence of speed of construction. During project design, all structures and services along a proposed tunnel route need to be assessed for potential damage, and are typically divided into the following risk categories:

  • Class (i) – Seriously at risk and requiring special compensation work or underpinning

  • Class (ii) – Possibly at risk, special works may be necessary if unexpected conditions arise

  • Class (iii) – Unaffected buildings

The assessment method is currently based on greenfield assumptions and an inverted Gaussian settlement distribution, using an empirical volume loss that depends on the construction method and a trough-width factor (k) that depends on the soil type.
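As a rough illustration, the greenfield trough can be sketched as follows; the tunnel geometry, volume loss and k value below are invented for the example, not taken from any particular project:

```python
import math

def settlement(x, z0, volume_loss, k, diameter):
    """Greenfield surface settlement (mm) at transverse offset x (m)
    from the tunnel centreline, using the inverted Gaussian trough.

    z0          depth to tunnel axis (m)
    volume_loss fraction of excavated face area lost (e.g. 0.01 = 1%)
    k           empirical trough-width factor, dependent on soil type
    diameter    excavated tunnel diameter (m)
    """
    i = k * z0                                        # trough-width parameter (m)
    v_s = volume_loss * math.pi * diameter ** 2 / 4   # settlement volume per metre (m^2)
    s_max = v_s / (math.sqrt(2 * math.pi) * i)        # maximum settlement, centreline (m)
    return 1000 * s_max * math.exp(-x ** 2 / (2 * i ** 2))

# e.g. an 8m diameter tunnel, axis 25m deep, 1.5% volume loss, k = 0.5 (clay)
for x in (0, 10, 20, 30):
    print(f"x = {x:2d} m : {settlement(x, 25.0, 0.015, 0.5, 8.0):5.1f} mm")
```

The trough is widest and shallowest in soils with a high k; conservative and realistic assessments differ mainly in the volume loss assumed.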

Engineers and designers at proposal stage tend to be cautious; consequently, more structures tend to be categorised as class (i) and (ii) risk than will in fact prove to be the case during construction. Intervention methods such as compensation grouting, ground freezing and jacking are all costly, require time and resources, and can cause delay. In addition, they can be very visible and hence potentially unpopular.

There are as yet no analytical techniques that can reduce this problem, but a change in engineering philosophy can yield major savings. This approach was termed 'The Observational Method' by Peck, and several successful cases have been detailed by Powderham (Mott MacDonald). The technique involves making realistic, or even slightly optimistic, assessments to complement the conventional one mentioned above. This identifies the structures that definitely require special work, and allows flexibility in handling those falling between class (ii) in the conservative assessment and class (iii) in the realistic assessment. For these structures, remedial or protection work is carried out only if their real performance shows that unacceptable behaviour is developing. Of course, to enable this approach it is essential to have a rapid-response system of planned remedial action in place.

The key to assessing real performance, and to validating the effectiveness of the remedial measures, is the provision of accurate, reliable and timely data on which decisions can be made. High-quality field instrumentation is required to provide this data and, as the problem is dominated by both magnitude and rate of movement, direct measurement of the deformation of the soil and of the structure itself is the primary data.

Providing this data has seen a trend develop in recent years (within the UK at least) for instrumentation to be procured as a separate sub-contract, covering installation, commissioning, baseline readings, a means of viewing data quickly, and hand-over, all within the main construction contract. This method has been used extensively on the Jubilee Line extension and the Heathrow Express rail link, and all current main CTRL contracts contain such sub-contracts. There are those who do not favour this approach, especially as it encourages a 'low bid' route to procuring something that must maintain its accuracy and robustness in very difficult conditions (i.e. it is not easy); but, again using CTRL as an example, Rail Link Engineering (RLE) requires an audit trail to show that competitive tendering is established and actively used. The current notable exception to this approach is Heathrow's Terminal 5 project, where BAA has set up first and second tier supplier agreements covering extensive work packages that involve partnering between contractors and consultants.

Whatever the purchasing route, there is no doubt that as instrumentation becomes more complex, with fully mapped graphical software packages often duplicating data to the web, the choice of both instrumentation and its suppliers and installers has become more critical than ever.

Developing technology

Compounding the problem for specifiers and engineers is the introduction of new instrumentation, needed to meet growing cost and accuracy demands, where a catch-22 of unwillingness to use what may be perceived as an untried measurement device makes the required track record almost impossible to build up. This was particularly evident with the introduction of electrolevel sensors in the early 1990s, when many rumours of poor performance (in some cases entirely justified) did not help matters. Now, after many well-documented installations, electrolevels are a widely specified, accepted and used tool, and the arguments one used to hear are thankfully rare.

Just as electrolevels were the new technique 10 or more years ago, today the ability to survey structures and excavations remotely and automatically using one or more robotic total stations appears to be the latest instrumentation growth area. By no means simple to accomplish, once correctly installed and set up such a system can quickly collect data in three axes, rather than a single axis, for each measurement point, with the targets themselves being very discreet.

Typically these targets are prisms, read at 30-minute or hourly intervals, with the instrument position established by resection from three or more reference targets outside the zone of influence of the excavation and/or structure being monitored. Using this technique, a repeatability of 0.2mm in X, Y and Z can be reliably achieved, with a true accuracy of better than 1mm, all without physical contact between measuring tool and target.

When combined with a secondary backup monitoring system, such as arrays of conventional sensors and/or regular manual surveys, it is possible to establish and maintain a great deal of confidence in the automated system by comparing results at known common points within the primary and secondary systems. This confidence can be very necessary if the system is to be relied upon to provide data in circumstances such as the operating hours of a railway, where the construction activity in question may affect one or more operating parameters.
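A minimal sketch of such a cross-check, assuming both systems report settlements at named common points; the point names, readings and 1mm tolerance below are invented for illustration:

```python
# Cross-check between primary (automated total station) and secondary
# (manual survey) settlement readings at common monitoring points.
# All values are illustrative, not from any real installation.

TOLERANCE_MM = 1.0

primary   = {"P01": -2.3, "P02": -4.1, "P03": -1.0}   # settlement, mm
secondary = {"P01": -2.1, "P02": -5.5, "P03": -1.2}

def cross_check(primary, secondary, tol):
    """Return the common points where the two systems disagree by more
    than tol, mapped to the size of the discrepancy in mm."""
    suspect = {}
    for point in primary.keys() & secondary.keys():
        diff = abs(primary[point] - secondary[point])
        if diff > tol:
            suspect[point] = round(diff, 2)
    return suspect

print(cross_check(primary, secondary, TOLERANCE_MM))  # → {'P02': 1.4}
```

A discrepancy flags either a genuine local movement missed by one system or an instrument fault, and in either case prompts a manual check before the automated data is relied on.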

Sophisticated systems in use

All of this complexity leads to a mass of data being collected, which needs to be compared and acted on if maximum benefit is to be gained from the (often considerable) effort and expenditure of setting up the instrumentation system. A good example is the new Airside Road Tunnel at Heathrow, currently under construction, which has an extensive automated monitoring system, backed up by manual precise levelling, designed to integrate all types of sensor, including those on the TBM itself. As the machine advances, automated extensometers, inclinometers and surface survey points read by robotic total stations provide displacement data that is compared directly with the various parameters being relayed in real time from the TBM to the surface. As the TBM passes under greenfield areas, such as the grass between the taxiways, the data collected is used to hone the operational parameters of the TBM to limit potentially harmful settlements. This data may also be used in finite element analysis to predict likely deformations as the TBM crosses above or below a sensitive structure (such as the Heathrow Express tunnel), which is itself extensively monitored, with the measured movements instantly compared with the model.

For all of this to work effectively, a monitoring office has been set up in which the design engineer and the tunnelling contractor have their respective data collection and interpretation PCs next to one another, ensuring that the data (both design and actual) are correctly used and quickly responded to. During sensitive parts of the drive, such as the Heathrow Express tunnel crossing, 24-hour cover is provided by all parties, including the instrumentation provider, and a Site Emergency Plan is prepared to cover all possible scenarios. If necessary, all this data can be posted on a secure website to allow off-duty staff and interested outside parties to see summaries of current data.

In some instances an 'intelligent' alarm system may be provided, not only to warn of an event of concern but also to confirm that monitoring is taking place and that data is being collected. For ease of use, the alarm has a 'green' light indicating that all is well and a 'red' light indicating the need to talk to the tunnelling team. The green light has various fail-safe mechanisms, including an alert if any part of the measurement chain, from tunnel to monitoring office to control room, fails. The chain is then 'refreshed' by another automated system, giving the duty controller confidence that the green light will only stay on if everything really is working.
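The green-light logic described above amounts to a watchdog over heartbeat messages from each link in the measurement chain: the light stays green only while every link has reported recently. A minimal sketch, with invented link names and timeout:

```python
import time

HEARTBEAT_TIMEOUT_S = 120  # maximum heartbeat age before a link counts as stale

class ChainWatchdog:
    """Watchdog over the measurement chain: 'green' only while every
    link (sensors, office, control room) has a fresh heartbeat."""

    def __init__(self, links, timeout=HEARTBEAT_TIMEOUT_S):
        self.timeout = timeout
        self.last_seen = {link: None for link in links}

    def heartbeat(self, link, now=None):
        """A link in the chain reports that it is alive."""
        self.last_seen[link] = time.time() if now is None else now

    def light(self, now=None):
        """'green' if every link has reported within the timeout, else 'red'."""
        now = time.time() if now is None else now
        for last in self.last_seen.values():
            if last is None or now - last > self.timeout:
                return "red"
        return "green"

wd = ChainWatchdog(["tunnel_sensors", "monitoring_office", "control_room"])
for link in wd.last_seen:
    wd.heartbeat(link, now=0.0)
print(wd.light(now=60.0))    # all heartbeats fresh -> green
print(wd.light(now=300.0))   # heartbeats stale -> red
```

The key property is that silence defaults to red: a failed sensor, a dropped link or a crashed PC all extinguish the green light, rather than leaving a stale "all well" indication.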

This may all seem over-complicated, but in October this year the systems outlined above enabled the first of two crossings by the 9.3m diameter twin-bore Airside Road Tunnel TBM only 3.5m above the live Heathrow Express tunnel, itself some 5.8m in diameter. All involved had access to the information needed not only to ensure the live tunnel could continue to operate in complete safety, but also to minimise potential deformations and the time the TBM took to cross overhead.

Finite element analysis of the drive, carried out before tunnelling began and refined using data from the initial drive under the greenfield area, had predicted the amount and nature of the small elastic deformation of the Heathrow Express tunnel. Data was collected from some 80 prisms in arrays around the Heathrow Express tunnel, read by two high-end robotic total stations (Leica TCA2003s), and from a secondary electrolevel system, both backed up by regular manual measurements of track level and alignment. Using all of this data, the team was able to watch the deformation take place as the TBM crossed, and the subsequent relaxation occur exactly as expected, so that by the time the TBM was 50m past the Heathrow Express tunnel the predicted long-term deformation had all but occurred.

The point, of course, is that traditionally substantial preventative measures, some of which may themselves have caused permanent rather than temporary deformation, would have been used to protect the live tunnel during the new drive. By adopting the approach that "protection work will only be carried out if real observations show that unacceptable behaviour is developing", none of these proved necessary.

The second crossing is scheduled for around Easter 2003. The experience gained during the first crossing is being used to refine the monitoring alarm levels and the response action plan. Again, 24-hour cover will be provided by all parties and, with confidence levels high following the first crossing, it is fair to say the second is anticipated with some enthusiasm.

The work described above at Heathrow is a very good example of the ethos described in the first part of this article: the adoption of what may at first appear a complex and expensive monitoring programme which, correctly implemented and acted upon, brings significant cost, benefit and programme advantages to the project as a whole.