LONGREAD: how the world is being digitised

Let's say you're interested in an investment property, but it's in another city and you can't be there to inspect it in person. Instead, you visit a website and do a virtual walkthrough in 3D, not just on a flat screen: fully furnished, with realistic lighting and final surface colours, before it's even built.

Or you're an auto designer and want to see how a prototype rear fin performs if you lower the pitch by a degree or two. You take your 3D model, place it in a virtual wind tunnel and take new readings to see what you have to tweak when you fabricate it.

" With the processing power of supercomputers or the cloud, 3D models are getting bigger, more detailed and easier to share than ever."
Drew Turney, Freelance journalist

Almost anything can be imagined, built and manipulated with software that gives us the power to digitise the world around us, in real time and in three dimensions.

From an app on your smartphone to scanners that capture buildings or whole cities, the uses for digital models of the world around us are endless. They're already saving enormous amounts of money, helping protect the environment and even enabling new forms of art.

This industrial upheaval is achieved by a combination of sensors, computing power and data storage, all of which are getting both cheaper and more powerful. Sensors can record visual, audio, thermal (measuring heat), distance or elevation data to plot countless parameters that let us model reality and tweak it however we like. That’s why some argue the disruption is the equivalent of electricity or the internal combustion engine.

With the processing power of supercomputers or the cloud, 3D models are getting bigger, more detailed and easier to share than ever.

A 3D printed model of a city. PHOTO: Autodesk

BRINGING SCIENCE TO LIFE

Anthropologist Louise Leakey is continuing her family's search for the origins of modern humans with her foundation's project AfricanFossils.org.

Most of the fossil record is stacked away in trays in dusty museum laboratories, bogged down by restrictions and regulations about who can see or touch the specimens. But digitally scanning the surface of fossils in very fine detail puts a virtual representation of the material in the hands of anyone who cares to view or download the 3D file.

"The accessibility of digital models of the classic skulls and specimens provides a much better opportunity to educators and students in understanding the science," Leakey says by email from her workplace at Lake Turkana, northern Kenya.

The balance to be struck is between putting a sample in the hands of as many people as possible who might contribute to the science, and minimising physical handling of the original.

Highly detailed 3D models mean other scientists or even students can send the file to a 3D printer and get a close approximation of the original specimen (you can do the same thing with a 3D model of the Apollo 11 command module, courtesy of the Smithsonian).

Apollo 11 Command Module | 3D Documentation

Leakey says digital models of skulls and fossils don't just save time and money in research; they also stand a better chance of inspiring the next generation about the science itself.

"[The] digital collection [plays] an equally important role in driving the interest and in inspiring enthusiasts about this field," she says.

THE VIRTUAL CITY

The downtown area of Washington DC in the US has long had problems with both traffic congestion and flooding from rain. With a goal of becoming the greenest city in the US by 2032, the city undertook a huge project to redesign stormwater management and the energy usage of nearby buildings, collecting 75 percent of rainwater across the district and retrofitting buildings to net zero-energy standards.

The ambitious plan to dig a huge water catchment area and parking lot under the National Mall would have looked very different even a few years ago: countless drawings and plans dug out of dusty archives, many of them out of date, with huge gaps in the data for areas that had never been accurately recorded or surveyed.

A REM render of Washington DC. PHOTO: Autodesk

A visual representation of the plan is often the best way to sell it to clients, policymakers or the public, as Moiz Kapadia, the sustainable cities specialist who worked on the plan, says.

"Visualisations and videos make it easier to tell the story of the why sustainability initiatives need happen," he says, "it allows the public and policy makers to 'get it' much faster."

Successful 3D modelling is all about the quality of the data. In the case of the green Washington plan, the landscape was modelled from free and publicly available GIS (geographic information system) data, while proprietary information was used for the buildings.

Renders could save energy by helping retrofit windows. PHOTO: Autodesk

US Department of Energy simulation tools and proprietary weather data were combined to model the flow of water. Put together, it all makes for a highly visual tool that's easy to manipulate.

BEYOND THE REAL WORLD

Now here's an interesting adjunct. If you're left with a complete digital model of a city, couldn't you license or sell it to a game company or movie studio who wants to make a post-apocalyptic zombie survivalist game or movie set there?

It's not only possible, it's common practice. As Ben Guthrie, product manager for virtual production and film & TV solutions at 3D software publisher Autodesk, explains, taking a virtual model of something real and chopping it up into elements to be repurposed is called 'kit-bashing'.

The city of Paris was 3D scanned and digitised for the 2014 action movie Edge of Tomorrow, with certain buildings partially destroyed to depict the war-torn city.

"Fictitious buildings or props often come down to the most cost effective and realistic-looking method, and 3D scanning is the method of choice these days," Guthrie adds.

ANIMATION FOR A NEW AGE

If you saw Avatar (and who didn't?), you've seen motion capture technology in action: a high-resolution imager scans the face and/or body of an actor and transposes their visage, mannerisms and movement directly onto a completely computer-animated character. It's the reason each alien looked a lot like the actor who played it.

PHOTO: Atomic Fiction

But 3D face and body scanning is used in movies, TV, advertising and games for a lot more than just aliens and monsters. In 2015 Joseph Gordon-Levitt played high-wire walker Philippe Petit in the Robert Zemeckis film The Walk. Rather than have the actor train to walk across a gulf 400 metres up, the filmmakers made everything around him digital, from the original World Trade Center towers and the city of New York to the wire he stood on.

For shots closer to the ground where real wire walking was used, a trained wire walker performed the stunt and an animated 3D facial scan of Gordon-Levitt was composited onto his body.

PHOTO: Atomic Fiction

PHOTO: Atomic Fiction

Faceware Technologies was the company behind the virtual Joseph Gordon-Levitt, and VP of business development Peter Busch describes the technique as “capturing a person at a moment in time”.

For props and simple shapes you can simply pass a handheld scanner back and forth in front of the object. It shoots out light, sensors measure the time the light takes to reflect back, and the computer maps the distance to each point, building up a 3D model of the object.

PHOTO: Atomic Fiction

But for a more detailed subject like a person, especially in movies where realism is key, Busch recommends a light stage. The actor stands in the middle of a frame, or a whole room, covered with lights that fire in sequence to capture data not just about shape and geometry but also texture, recording every little bump, nook and cranny and producing a software file called a 3D rig.

For an early example of how a light stage captured reality and produced a digital actor, see 2009's Emily Project.

PHOTO: USC-ICT

The 3D rig is a kind of wireframe of all the expressions the animation is capable of. Animators or compositors down the line take the rig and animate it into specific expressions or shapes, and the software remembers all the associated movements and behaviours that accompany it.

For example, a slider in the software that goes from zero (no smile) to one (full smile) will automatically animate crinkles on the forehead or crow's feet around the eyes as the software renders the expression.
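
For the technically curious, here's a minimal sketch in Python of what such a slider does under the hood: it simply blends between two stored shapes of the same mesh. The arrays and function names are invented for illustration and aren't Faceware's actual tooling.

    import numpy as np

    def blend_expression(neutral_verts, smile_verts, slider):
        """Blend a face mesh between 'no smile' (0.0) and 'full smile' (1.0).

        neutral_verts, smile_verts: (N, 3) arrays of vertex positions for the
        same mesh captured in two different expressions.
        """
        slider = float(np.clip(slider, 0.0, 1.0))
        return (1.0 - slider) * neutral_verts + slider * smile_verts

    # Toy mesh with three vertices; a production rig has tens of thousands
    neutral = np.array([[0.0, 0.0, 0.0], [1.0, 0.0, 0.0], [0.5, 1.0, 0.0]])
    smile   = np.array([[0.0, 0.1, 0.0], [1.0, 0.1, 0.0], [0.5, 1.2, 0.1]])

    print(blend_expression(neutral, smile, 0.5))  # halfway between the two shapes

In a real rig the same slider also drives the secondary shapes, so the forehead crinkles and crow's feet appear automatically as the smile builds.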

With all the textural information in the rig, it also means that when the scene is lit further along the production chain, the light will reflect off the digital face the way it would off the real actor.

'The Walk' visual effects supervisor Kevin Baillie - Variety Artisans

SAVING THE PLANET

Coral is in trouble. It seems every week we read about a new study warning us that treasures like Australia's Great Barrier Reef are retreating, their coral bleached or killed by ocean acidification.

A good part of the effort to stem the tide of our dying oceans is accurate measurement so we know the rate of decline, and traditional methods give us only very rough estimates.

Divers would traditionally lay grids of PVC pipe across coral beds and make a visual guess about what percentage of the surface area in a given square had suffered bleaching. To measure changes accurately you have to dig coral up, which kills it and defeats the purpose of the research.

Now a program called Hydrous is transforming the world of underwater conservation through 3D modelling. Researchers need only swim slowly past coral beds, taking pictures with very high resolution underwater cameras.

In the same way many consumer apps let you take a few pictures of yourself and end up with a 3D model (more below), sensor readings of an object from more angles let the 3D engine plot it more accurately.

Powerful computing systems (or distributed computing networks found in the cloud) process the data and return an accurate 3D model you can flip, spin, invert or manipulate as much as you want.

In Hydrous' case, such accuracy has seen the measurement errors of coral beds plunge to less than 5 percent, giving scientists a far clearer idea of how much coral we're losing.
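
To give a sense of how a measurement falls out of a model like that, here's a minimal Python sketch that sums the area of every triangle in a mesh. It assumes the reconstruction arrives as vertices plus triangles, a common format for photogrammetry output, and is an illustration rather than Hydrous' actual pipeline.

    import numpy as np

    def mesh_surface_area(vertices, triangles):
        """Total surface area of a triangle mesh.

        vertices:  (N, 3) array of 3D points
        triangles: (M, 3) array of vertex indices, one row per triangle
        """
        a = vertices[triangles[:, 0]]
        b = vertices[triangles[:, 1]]
        c = vertices[triangles[:, 2]]
        # Each triangle's area is half the length of the cross product of two edges
        cross = np.cross(b - a, c - a)
        return 0.5 * np.linalg.norm(cross, axis=1).sum()

    # Sanity check: a unit square split into two triangles has area 1.0
    verts = np.array([[0, 0, 0], [1, 0, 0], [1, 1, 0], [0, 1, 0]], dtype=float)
    tris  = np.array([[0, 1, 2], [0, 2, 3]])
    print(mesh_surface_area(verts, tris))

Run the same calculation on scans taken a year apart and the difference gives you the rate of decline.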

MAKING BETTER BUILDINGS

In 2010, the Empire State Realty Trust engaged a renowned consulting organisation to make the iconic Empire State Building more energy efficient.

The first step was a BIM (building information model), which gives a picture of everything from energy usage to the movement of human traffic throughout a building. Back then it took nine months to do all the surveying, measuring and plotting.

Today, with an array of visual, electromagnetic, photogrammetric (more below) and other smart sensors, a BIM can be called up in minutes and constantly updated with live data. A BIM can tell owners and managers which power outlets are in use, what the light levels and humidity are, and people patterns such as the spots where workers gather socially or where they tend to work on weekends.

It makes managing things like the heating, cooling or lighting of certain areas much more immediate and responsive.
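
As a rough illustration of the kind of rule that responsiveness enables, here's a small Python sketch. The sensor fields and thresholds are invented for the example and don't belong to any particular BIM product.

    from dataclasses import dataclass

    @dataclass
    class RoomReading:
        room: str
        occupied: bool        # from occupancy sensors
        lux: float            # measured light level
        temperature_c: float  # measured air temperature

    def building_actions(reading, target_temp_c=22.0, min_lux=300.0):
        """Turn one live reading into simple actions for lighting and HVAC."""
        actions = []
        if not reading.occupied:
            actions.append("switch lights off")
            actions.append("relax temperature setpoint")  # save energy in empty rooms
        else:
            if reading.lux < min_lux:
                actions.append("raise lighting")
            if abs(reading.temperature_c - target_temp_c) > 1.5:
                actions.append("adjust HVAC towards target")
        return actions

    print(building_actions(RoomReading("Level 3 meeting room", True, 180.0, 25.0)))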

But you can do even more with a BIM. If you're planning to increase production in a factory and need to extend a conveyor belt through a packaging area, a BIM will let you make the changes virtually before you spend a cent, finding the most cost-effective and efficient way to do it.

You don't want to start a very expensive construction project only to find there's an air conditioning duct or forklift route in your way.

The same principle – of tweaking digital models – can be used in almost any field or industry. The computer can tell you how best to design a road bridge to cross a mountain pass, using the least amount of materials, making it the easiest possible ride and exerting as little impact on the environment as possible.

As we saw above, seeing how different car designs perform in simulations of reality is far cheaper than building a single prototype.

What is BIM? (Building Information Modelling) - NBS National BIM Library

HOW TO COLLECT THE DATA

Digitising the world is all about the information, and because the cost of capture and processing keeps falling, you can produce digital models of bridges, cars, buildings or people incredibly cheaply. In fact you can do it in apps right on your smartphone.

The most obvious sensory information for most purposes is visual, and it's possible to build 3D models from almost any collection of images. A halfway decent representation of a desk toy needs only a dozen or so. The fine detail necessary for industrial purposes calls for higher resolution shots from more angles.

But the principle behind the algorithms that assemble 3D models is the same. In fact you can collect a few dozen images of an object from a Google image search, each with different lighting, resolution, camera and time of capture, and still get a workable result.

But we can get more than just colours and shapes from photos. Photogrammetry is the science of mapping surface points of objects in images. Digital representations of entire cities can be made from satellite pictures.
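
The geometric heart of photogrammetry is triangulation: if two cameras in known positions both see the same surface point, that point must sit where their viewing rays nearly cross. Here's a minimal Python sketch of that single step; real pipelines estimate the camera positions and matching points from the images themselves, which is where most of the hard work lives.

    import numpy as np

    def triangulate(o1, d1, o2, d2):
        """Estimate the 3D point seen by two cameras.

        o1, o2: camera positions; d1, d2: unit view directions toward the same
        surface point. Returns the midpoint of the shortest segment between the
        two viewing rays (noise means they rarely intersect exactly).
        """
        w0 = o1 - o2
        a, b, c = d1 @ d1, d1 @ d2, d2 @ d2
        d, e = d1 @ w0, d2 @ w0
        denom = a * c - b * b                 # near zero means the rays are parallel
        s = (b * e - c * d) / denom
        t = (a * e - b * d) / denom
        return 0.5 * ((o1 + s * d1) + (o2 + t * d2))

    # Two cameras two metres apart, both looking at the same point at [1, 1, 5]
    o1, o2 = np.array([0.0, 0.0, 0.0]), np.array([2.0, 0.0, 0.0])
    d1 = np.array([1.0, 1.0, 5.0]); d1 /= np.linalg.norm(d1)
    d2 = np.array([-1.0, 1.0, 5.0]); d2 /= np.linalg.norm(d2)
    print(triangulate(o1, d1, o2, d2))  # roughly [1. 1. 5.]

Repeat that for thousands of matched points and you have the point cloud a 3D model is built from.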

Even more accurate 3D maps can be created with a technology called 'lidar'. Short for 'light detection and ranging', lidar is also called laser scanning. A laser is pointed at an object and the distance to the surface is measured by the time it takes for the light to return to the sensor.
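
The arithmetic is pleasingly simple, and it's the same principle the handheld prop scanner mentioned earlier relies on: the pulse travels to the surface and back at the speed of light, so the one-way distance is half of speed times time. A tiny Python sketch, with the pulse time invented for illustration:

    SPEED_OF_LIGHT = 299_792_458.0  # metres per second

    def lidar_distance(round_trip_seconds):
        """One-way distance to the surface from a pulse's round-trip time: d = c * t / 2."""
        return SPEED_OF_LIGHT * round_trip_seconds / 2.0

    # A pulse that comes back after about 667 nanoseconds hit something roughly 100 m away
    print(lidar_distance(667e-9))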

Not traditionally associated with visual images that contain colour, lidar is used extensively in areas like mining, archaeology, architecture, military and spaceflight – anywhere an accurate measurement of a distant or sensitive surface (such as a region of the earth) is important.

Even more seemingly futuristic is the BRDF (bidirectional reflectance distribution function) scanner. In the example below, one of the kitchens is a real photo taken to showcase the products of an appliance manufacturer, and the other is a computer-rendered simulation.

The BRDF scanner lets you model the reflective properties of materials like the wood of the floor and the granite of the benchtops. A process called ray tracing then simulates how the light in the scene will reflect off those surfaces: the molecular make-up of wood, granite and so on causes photons of light to scatter in particular ways, which the human eye perceives as colour.
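
A real BRDF scan captures far richer data than this, but the simplest case, a perfectly matte ('Lambertian') surface, gives a flavour of how measured material properties become pixel colours in a ray tracer. The numbers below are invented for illustration.

    import numpy as np

    def lambertian_shade(albedo, normal, light_dir, light_colour):
        """Colour of a perfectly diffuse (Lambertian) surface point.

        albedo:       RGB reflectivity of the material (the simplest thing a
                      BRDF scan can tell you about a surface)
        normal:       unit surface normal at the point being shaded
        light_dir:    unit vector from the surface point toward the light
        light_colour: RGB intensity of the light
        """
        cos_theta = max(np.dot(normal, light_dir), 0.0)   # no light from behind
        return albedo * light_colour * cos_theta

    # A warm light hitting a grey, granite-like benchtop at 45 degrees
    albedo = np.array([0.55, 0.55, 0.55])
    normal = np.array([0.0, 1.0, 0.0])
    light_dir = np.array([0.0, 1.0, 1.0]) / np.sqrt(2.0)
    light_colour = np.array([1.0, 0.95, 0.9])
    print(lambertian_shade(albedo, normal, light_dir, light_colour))

A production renderer evaluates a measured BRDF instead of this single cosine term, which is why glossy granite and matte timber end up looking different under the same light.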

DIGITISE THE WORLD TO SAVE IT

Called everything from reality capture to world modelling, scanning the cities, objects and people around us is contributing to some of the most cutting edge practices in construction, art and countless other areas.

It's been used to reduce environmental waste and to introduce young people to science and engineering careers like never before, and it's no doubt made an impact on your industry or the tools you use every day, even if you don't know it.

Drew Turney is a freelance journalist specialising in science and technology

The views and opinions expressed in this communication are those of the author and may not necessarily state or reflect those of ANZ.
