Even before Covid limited travel opportunities, one of the most significant challenges for infrastructure asset managers and engineers was finding how to view infrastructure safely. This held true whether for inspection, for asset management, or simply out of curiosity.

Until recently, sharing real-view conditions meant laser scanning and point-cloud delivery. Point clouds are great for capturing geometry but poor at colourising in the dark, and while they give CAD users some critical dimensional information, their file size and awkward navigation make them less friendly for sharing across a much wider audience and range of disciplines. Sometimes the cost of LiDAR data acquisition, processing, cleaning and publishing makes them difficult to justify economically. Overall, a great tool, but just one way of cracking this particular problem.

Astrea, a small company based in Glasgow, Scotland, saw this as a problem to be solved. Drawing on experience in the film and games sector, and combining 32-bit photogrammetry with artificial intelligence (AI), it set about developing a cost-effective and straightforward method of capturing reality using photographs, not point clouds.

The technology to capture these images is pretty basic: anyone can do it with an off-the-shelf DSLR camera. The breakthrough was finding a way to improve the images. We can present users with a daylight view, captured under any lighting conditions, and share it without any need to download software or the model. We call these ‘Smart 360’ images: ‘smart’ because we can capture any scene (in tunnels and mines, under bridges, inside structures, or at night) and view it in near-daylight simulation; ‘smart’ because anyone with an internet browser can view them, providing what we call the ‘be there from anywhere’ experience; and ‘smart’ because we can also use them to share, store, find and manage any documentation about that object, scene, or structure. 32-bit processing provides true colours and luminosity values with sub-millimetre pixel size, so each pixel becomes a rich data source. Hence the name Richpix.

The Astrea team had to find a way to represent true colours and brightness across an image scene. In some scenes there can be major contrast differences across the frame. The camera will optimise the exposure time, f-stop, and ISO used to capture the frame, but can still leave areas under- or overexposed. Besides losing texture and detail during photogrammetry processing, these clipped areas create holes in the data, point clouds, and subsequent mesh models. But 32-bit processing has enabled Astrea to enhance these specific areas of the image while leaving the rest unadjusted.
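The article does not describe Astrea's actual pipeline, but the standard way to recover detail in clipped areas is to merge bracketed exposures into a 32-bit radiance map with a weighting that distrusts near-clipped samples (a simplified Debevec-style merge). The sketch below is illustrative only; the function names and the assumption of a linear camera response are mine, not Astrea's.

```python
import math

def merge_exposures(pixels, exposure_times):
    """Merge bracketed 8-bit samples of one pixel into a floating-point
    radiance estimate (simplified Debevec-style weighted log average).

    pixels: list of 8-bit values (0-255), one per exposure.
    exposure_times: shutter times in seconds, one per exposure.
    """
    def weight(z):
        # Hat weighting: trust mid-tones, distrust clipped extremes.
        return min(z, 255 - z)

    num, den = 0.0, 0.0
    for z, t in zip(pixels, exposure_times):
        w = weight(z)
        if w == 0:
            continue  # fully clipped sample carries no information
        # Assume a linear camera response: radiance ~ value / time.
        num += w * math.log((z + 1) / 255.0 / t)
        den += w
    if den == 0:
        return None  # clipped in every exposure: a hole in the data
    return math.exp(num / den)  # relative scene radiance (unclamped float)
```

A pixel blown out in the long exposure but well-exposed in the short one still yields a radiance value; only a pixel clipped in every frame becomes a hole, which is exactly the failure mode the paragraph above describes.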

The application of 32-bit image processing evolved from the team’s forensic crime scene, CGI, and games industry experience. Many professional photographers use High Dynamic Range imaging (HDRi) mainly for aesthetic purposes; for Astrea, it is about maximising the detail and data that can be extracted from an image. 32-bit is a step further into image processing and uses a technique called image-based lighting (IBL). In any film, to make scenes look realistic, every CGI object, vehicle, animated character, or model must be lit in the same way as the real actors and environment it shares a scene with. Lighting and CGI artists use 360° HDRi imagery to provide ‘light maps’ of accurate colour and, crucially, each pixel’s luminosity values. These 32-bit light maps are also amazingly detailed and controllable images.
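The per-pixel luminosity the light maps preserve can be made concrete with the standard Rec. 709 relative-luminance formula for linear-light RGB. This is a general industry formula, not Astrea's code; the highlight values used in the example are invented for illustration.

```python
def relative_luminance(r, g, b):
    """Relative luminance of a linear-light RGB triple, using the
    Rec. 709 / sRGB primaries' standard weights."""
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

# In a 32-bit float light map these channels are unclamped, so a bright
# highlight keeps its true luminosity instead of clipping at 1.0:
highlight = relative_luminance(5.0, 4.8, 4.5)  # HDR values above 1.0
```

An 8-bit pipeline would flatten that highlight to pure white; the unclamped 32-bit value is what lets IBL light a CGI object the same way the real scene was lit.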

Most current digital twins are delivered as point clouds or textured meshes, designed to be opened, viewed, and used with CAD packages. Commonly, those in design and maintenance need these. However, Astrea identified that digital twin information could be of use to a far wider audience than just those familiar with CAD software: for example, health and safety teams, workers in familiarisation and site induction, and those carrying out risk and ecological assessments could all benefit from access.

Regarding accessibility and shareability, we imagined emergency responders having site layouts and virtual tours available on any device: in the cab on the way to the scene, onsite, or in a control room. It is not just engineers, maintenance professionals, consultants, and discipline specialists who can have a close-up view from wherever they may be, but also suppliers and procurement bodies. A simple link in tenders, for example, lets suppliers see the site conditions rather than just 2D drawings, so there are no surprises for them. Clients tell us it reduces both procurement process costs and the final bid price.

You also have an as-built document: a record for heritage conservation or contract handover, before-and-after comparison, and condition monitoring over time. Our latest version can also integrate with the new generation of phone-based LiDAR devices and 360° imagery, and if necessary we can view everything in 3D, with point clouds, IFCs and mesh models in the same viewer.

As the team uses 360° high-definition imagery, it is also possible to build a 3D model from the images should there be a need, and to establish survey control in the pictures. Fortunately, tunnels are an ideal subject for 360° photogrammetry: there is texture in every direction to stitch, and camera positions can provide sufficient overlap. With survey control in place, the user can achieve the same accuracy as, and significantly better mesh models than, point-cloud capture. It is important to remember that the team at Astrea is not capturing points but using actual photographs in its version of reality capture and production of asset visual twins.
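The overlap requirement has a simple geometric consequence for planning a tunnel capture: once you know how much of the tunnel a single frame covers along its axis, the spacing between camera stations follows from the overlap fraction you need. The sketch below is my own back-of-envelope planning helper, using the article's 180m tunnel; the footprint and overlap figures are assumptions, not Astrea's parameters.

```python
import math

def station_spacing(footprint_m, overlap):
    """Distance between camera stations so consecutive frames share the
    given overlap fraction of their footprint along the tunnel axis."""
    return footprint_m * (1.0 - overlap)

def stations_needed(tunnel_length_m, footprint_m, overlap):
    """Number of capture positions to cover the tunnel at that overlap."""
    spacing = station_spacing(footprint_m, overlap)
    return math.ceil(tunnel_length_m / spacing) + 1

# Assumed example: a 180m tunnel, 4m frame footprint, 75% overlap
# -> 1.0m between stations, 181 camera positions.
```

This is why manual drone piloting is awkward for photogrammetry: the stations must be evenly and densely spaced, which is easy on foot with a tripod but hard to hold by hand in the air.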

Some structures, and almost every ‘interior’ mission, make drones difficult to fly and control. Photogrammetry requires overlap, and it is tricky to pilot a drone manually while maintaining the required consistency of image capture. Pre-programmed flights, which rely on GPS positioning, simply will not work in tunnels and steep cuttings. An example of such a project was the Frangoch Tunnel in North Wales, where Astrea worked alongside York-based tunnel inspection company Inspire (Structures).

Opened in 1867, Aberdovey No 1 is one of four tunnels on the Cambrian coast railway in North Wales, a four-hour round trip from the engineering office. As with most railway structures, safe access was only available for a few hours during the night shift. With only a simple head torch for lighting and one additional torch, the team could shoot Smart 360° images delivering complete coverage of the 180m-long tunnel in just a few hours. Any structure can be visually documented with this technique to a GSD (ground sample distance, the size of a pixel on the surface) finer than 0.1mm, at over 100m/hr, with one person and one camera. The kit costs a tenth of the cost of a laser scanner. Engineers and asset managers back at HQ could view their new visual twin through any web browser using a simple link. It could also be shared with experts much further afield: a key example was a 120-year-old bridge renewal project in London, where, in just one day, 180 subject experts across disciplines from Chicago to Bangalore could see, share and collaborate with the as-built visual twin. For emergency response and geotechnical events, it is entirely possible to have a visual twin within an hour.
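The sub-0.1mm GSD claim can be sanity-checked with the standard pinhole relation GSD = pixel pitch × distance / focal length. The camera specs below are illustrative full-frame DSLR figures chosen by me, not the actual kit used on the project.

```python
def gsd_mm(pixel_pitch_um, distance_m, focal_length_mm):
    """Ground sample distance (footprint of one pixel on the surface)
    for a pinhole camera: GSD = pixel pitch * distance / focal length."""
    pitch_mm = pixel_pitch_um / 1000.0
    distance_mm = distance_m * 1000.0
    return pitch_mm * distance_mm / focal_length_mm

# Assumed example: 4.3 micron pixels, 50mm lens, 1m from the tunnel wall
# -> GSD of about 0.086mm, i.e. finer than 0.1mm per pixel.
```

At tunnel-wall working distances, an ordinary DSLR therefore resolves finer than 0.1mm per pixel, which is consistent with the figure quoted above.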

It is simple technology, but it gives tunnel engineers, in both construction and maintenance, a new set of tools for documentation, visualisation, collaboration, and improved safety. Just as crucial for deployment, it does not stress the IT department or make any significant holes in a budget. In the current health climate, the benefits of tools like these, which enable remote working and decision-making while reducing the need for travel and minimising contact, are becoming even more critical. From Birkenhead to Bogotá, from Bangalore to Brisbane, anyone can be there from anywhere.