Using Augmented Reality to Visualize Engineering Designs

This week I experimented with using an augmented reality app at a public meeting to display this simple visualization of one of our projects.

3D model of a project using Augmented Reality

The app did seem to help: it gave people not only an overall snapshot of how the roadway will look when finished, but also showed specific design and operational details that were difficult to describe in words. So I figured I'd post a quick explanation of the software I used to create the model and the app I used to host and display the visualization, in case anyone else is interested in trying something similar at their own meetings.

To begin, I would have liked to export the project directly out of CAD into the program where I assembled the 3D model, because this allows for a more accurate representation and saves some steps. I've done this before using AutoCAD, and it was very straightforward. Unfortunately, we use MicroStation at work instead of AutoCAD, and so far I have been unable to find a way to make this work with that software. The main problem seems to be that although MicroStation has 3D export capability, it will not let me export anything with a thickness, so everything ends up flat – and the flat geometry cannot be stretched in the "Z" direction even after I import it into other 3D programs. I've spoken with a representative of the company, and as I've indicated in previous posts, he said they are not interested in supporting this because they don't see a need for engineers to use this type of feature – as he put it, "we're engineers, not gamers." Because of this, I had to create the 3D objects in other programs and use a PDF of the plan view as a guide for placing the objects. Between that limitation and my time constraints, I only modeled a portion of the project where there were no complex shapes.
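If you run into the same flat-export problem, one workaround is to give the flat linework a thickness in another program before assembling the model. The sketch below is only an illustration of that idea, not part of my actual workflow: it extrudes a made-up rectangular footprint into a prism and writes a Wavefront OBJ file that Blender and similar 3D programs can import.

```python
# Hedged sketch: extrude a flat 2D footprint (e.g., traced from a flat CAD export)
# into a solid prism and write it as a Wavefront OBJ file.
# The footprint coordinates and height below are made up for illustration.

def extrude_footprint_to_obj(footprint, height, path):
    """Write an OBJ prism built by extruding a closed 2D footprint upward in Z."""
    n = len(footprint)
    with open(path, "w") as f:
        # Bottom ring (z = 0) followed by top ring (z = height).
        for x, y in footprint:
            f.write(f"v {x} {y} 0.0\n")
        for x, y in footprint:
            f.write(f"v {x} {y} {height}\n")
        # Bottom and top faces (OBJ vertex indices are 1-based).
        f.write("f " + " ".join(str(i + 1) for i in range(n)) + "\n")
        f.write("f " + " ".join(str(i + 1 + n) for i in reversed(range(n))) + "\n")
        # Side walls: one quad per footprint edge.
        for i in range(n):
            j = (i + 1) % n
            f.write(f"f {i + 1} {j + 1} {j + 1 + n} {i + 1 + n}\n")

# Example: a 3 m x 10 m median footprint extruded to a 0.15 m (6 in) curb height.
median = [(0.0, 0.0), (10.0, 0.0), (10.0, 3.0), (0.0, 3.0)]
extrude_footprint_to_obj(median, 0.15, "median.obj")
```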

The program I used to create and assemble the 3D objects is OpenSimulator. It's a free, open-source program that runs as a server and allows you to create and texture 3D objects, then export them as models. You can set up OpenSimulator on your own desktop using a package like Sim-on-a-Stick, or you can install it on your computer and connect it to a grid service, then access the interface with a viewer. I used the second option: I connected to a service called OSGrid, then used the Singularity viewer to log in and build. The main difference between building this way and using a more traditional 3D program like Blender is that building in OpenSimulator is much more intuitive, because you create using an avatar, or 3D representation of yourself. OpenSimulator has also greatly simplified the creation and texturing of objects.

Here is an "aerial view" of what my "build" looked like inside OSGrid right before I exported it as a 3D model.

As a side note, what is nice about building in OpenSimulator is that you can use your avatar to walk through the project and get a feel for how it will function. If we were also building a streetscape, the avatar would help with the placement of elements. And from what I understand, if I had an Oculus Rift device, I could put it on and immerse myself in the design as if I were actually there. That is something I hope to eventually try as a design tool once I get a chance to buy one of those devices.

The only elements in the photo above that I could not create in OpenSimulator were the cars and the curbed medians. As you can see from the attribution note in the photo, the cars were 3D models I downloaded from the Kator Legaz website and then uploaded into OpenSimulator using Singularity. I also uploaded a median that I created in Blender, because I wanted the top to be curved like a regular curb – OpenSimulator does not allow for the creation of a shape like that, so I had to build it in Blender and then import the 3D model into OpenSimulator. All of the graphics, or textures, applied to the models I created in a graphics software package and then uploaded into OpenSimulator. There are many graphics programs I use, but if you are looking for a good, free one, you can always use GIMP.

Once everything was assembled, I used the export-3D-model feature in Singularity to create a Collada file of my build. Then I used my account on Augment to upload the model so I could access and view it with the Augment app on my iPhone and iPad. Having the model available at the meeting made it convenient to show people what the project will look like when built. For example, when I was having trouble explaining how the inlets impact the bike lane, I was able to use the model to show someone how the inlets effectively reduce the bike lane from 7.5 feet in width to 5 feet if people do not want to ride over the inlet. Overall, I would say having the model did enhance our ability to share the project with the community, and I hope to build on this experience to create more complex and detailed models in the future. If you want to check out the model yourself, you can access it here:

3D Model of Protected Bike Lane
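One more note on the Collada step: if you want to sanity-check the exported .dae before uploading it, a few lines of Python will list what is in it. This is only a sketch under my own assumptions – it uses the third-party pycollada package, and the file name is made up – not anything the Augment service requires.

```python
# Hedged sketch: inspect an exported Collada (.dae) file before uploading it to an
# AR hosting service. Assumes the third-party "pycollada" package is installed
# (pip install pycollada); the file name is made up for the example.
import numpy as np
from collada import Collada

mesh = Collada("protected_bike_lane.dae")

for geom in mesh.geometries:
    for prim in geom.primitives:
        verts = prim.vertex                      # numpy array of (x, y, z) positions
        lo, hi = verts.min(axis=0), verts.max(axis=0)
        print(f"{geom.name}: {len(verts)} vertices, "
              f"extents {np.round(hi - lo, 2)} (model units)")
```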


Augview – a Window to Your Underground Assets

 

Water Main Installation

Augview, founded by Michael Bundock in 2012 in New Zealand, is the first commercial mobile application I have seen offered to the public works industry that allows agencies to geospatially capture, store, and display underground utilities in 3D through a tablet or other mobile device. The software, through the use of GIS, shows operators their water, sewer, or other underground lines superimposed in 3D on the ground in a geospatially accurate position. Users can then query the lines as with any other online GIS and access data about each utility, such as size, material, age, and any other stored attribute. And if a locator finds a discrepancy in a line's location, or finds a new line, he can enter it into the software and immediately verify that the updated or new location is accurate.
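To make the idea of querying lines in the field a little more concrete, here is a minimal sketch of that kind of spatial lookup. It is not Augview's actual interface or API (which I have not seen the inside of); it is a generic example using the open-source shapely package, with made-up lines and attributes.

```python
# Hedged sketch of the kind of spatial query described above: find the utility
# lines within a search radius of a field point and list their stored attributes.
# This is NOT Augview's API; it is a generic example using the shapely package
# (pip install shapely) with made-up coordinates and attributes.
from shapely.geometry import LineString, Point

utilities = [
    {"id": "WM-104", "type": "water main", "size": '8"', "material": "ductile iron",
     "installed": 1978, "geometry": LineString([(0, 0), (120, 0)])},
    {"id": "SS-221", "type": "sanitary sewer", "size": '10"', "material": "clay",
     "installed": 1962, "geometry": LineString([(0, 6), (120, 8)])},
]

field_point = Point(60, 5)     # where the locator or inspector is standing (project coords, feet)
search_radius = 10.0           # feet

for line in utilities:
    dist = line["geometry"].distance(field_point)
    if dist <= search_radius:
        print(f"{line['id']}: {line['size']} {line['material']} {line['type']}, "
              f"installed {line['installed']}, {dist:.1f} ft away")
```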

One example I can think of where I could have used this type of device was when we found a patched area in a roadway on one of our projects. It was one of those typical failures where you can see someone repaired something, but there's still something going on, because a small hole keeps opening back up with a void underneath. A lot of times this is caused by a hole in a sewer that allows the soil above the pipe to wash into the line, leaving a void under the pavement. I knew the city had a sewer running along the roadway near that area, and I noticed a water shut-off box nearby in the parkway. Because in our area the sewer lines used to be run alongside the water lines, I suspected it could be a failure in the building sewer. The business owner came out to comment and mentioned there had been a problem there, but from what she described it was difficult for me to tell whether it had been the city sewer or the building owner's line. If I had had Augview, I could have seen how all these lines related to one another and where they were located, which would have given a much better indication of exactly which line might have failed. Of course, public works professionals already try to make this determination using paper maps, but if the failure is in the building owner's line, it is much easier to explain the problem to them with a 3D representation of everything than to expect them to read a utility atlas.

I would have also liked to have an application like Augview for managing our water network. Our crews could have used it to document each valve's position when they opened or closed it. Then we could have just driven by to see whether we had opened them all back up after we repaired the break, or noticed when a valve between our pressure zones was accidentally opened.

It would also be useful to use Augview to look at non-utility data, such as roadway ratings, in the field. Each year when we went out to rate the roadways, Augview could perhaps color each roadway based on the rating we assigned the year before in our GIS. This would keep us from juggling paper maps in the truck while also trying to view and assess the pavement.
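As a sketch of what that coloring could involve behind the scenes (the rating bands and colors below are my own made-up example, not anything Augview provides):

```python
# Hedged sketch: map last year's pavement ratings (stored in GIS) to overlay colors
# so a viewer could tint each roadway segment. Rating bands and colors are made up.
def rating_color(rating):
    """Return an RGBA overlay color for a 1-10 pavement condition rating."""
    if rating >= 8:
        return (0, 170, 0, 120)     # green, semi-transparent: good
    if rating >= 5:
        return (255, 200, 0, 120)   # yellow: fair
    return (200, 0, 0, 120)         # red: poor

segments = {"Main St (1st-2nd)": 9, "Main St (2nd-3rd)": 6, "Oak Ave (1st-2nd)": 3}
for name, rating in segments.items():
    print(name, rating, rating_color(rating))
```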

Past articles on this site have also imagined that one day a product like Augview could assist contractors as they build, by displaying not only the underground lines but also superimposing the plan onto the site. And I don't think it will be long before this type of implementation is extended to display real-time data too. I can see a day when we will be able to look up at the water tower and actually see the level of water in it, or see the flows running into, out of, and through each process at our water or wastewater plants. It would also be interesting to drive by our lift stations and see the whole area colored red or green rather than just the little red/green run light. Lift stations are another facility that could display flows, seal failures, water levels, or any other type of data.

While Augview has so far been implemented primarily in New Zealand, Melanie Langlotz, business development manager, said she is "also looking for interested parties in the U.S. who can see the possibilities." So I believe it won't be long before we see Augview in use throughout the U.S. and other countries.

You can find out more about Augview by watching the video below or visiting their website or other social media sites:


Adding Augmented Reality to Your Zoning Ordinance

Augmented reality holds promise and opportunity for the public works industry. With the development of this technology through efforts like the Smart Vidente Project and the recent release of development tools by Layar, we are moving closer to actual implementation on the job. Over the next few months, I plan to try setting up some layers for work using the Layar service, so I was discussing this with another engineer at work and trying to come up with some ideas to try out. We thought about setting up virtual notifications programmed to pop up when we approach certain properties in the city. These notifications could alert us to drainage issues, special structures in the area, and other information we would be interested in.
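To give a feel for the kind of logic we were imagining, here is a minimal sketch of a proximity notification: check the user's position against a list of tagged properties and surface the note when they get close. The coordinates, notes, and trigger distance are all made up for illustration, and this is not tied to Layar's actual service.

```python
# Hedged sketch of the proximity-notification idea: compare the user's GPS position
# to a list of tagged properties and surface a note when within a trigger distance.
# Coordinates, notes, and the threshold are made up; this is not Layar's API.
import math

def distance_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres (haversine formula)."""
    r = 6371000.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp, dl = math.radians(lat2 - lat1), math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

properties = [
    {"address": "101 Main St", "lat": 41.7800, "lon": -88.0100,
     "note": "Rear-yard drainage complaint (2011); special inlet structure."},
    {"address": "230 Oak Ave", "lat": 41.7815, "lon": -88.0152,
     "note": "Oversized culvert under drive; check before any curb work."},
]

def nearby_notes(user_lat, user_lon, trigger_m=50.0):
    """Return the notes for any tagged property within trigger_m of the user."""
    return [p for p in properties
            if distance_m(user_lat, user_lon, p["lat"], p["lon"]) <= trigger_m]

for p in nearby_notes(41.7801, -88.0102):
    print(p["address"], "->", p["note"])
```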

Dragon

But as we imagined this future of virtual objects left waiting to be discovered, the other engineer came up with a brilliant question: if anyone could set out virtual objects like this, what would prevent the world from becoming a virtual junkyard? He had an excellent point. Right now the technology is so new, we don’t yet have this problem. If I walk through our city today with the Layar app on my smartphone, there is a good chance I will see no virtual objects at all. But once the technology becomes ubiquitous, we could walk through our cities looking for something with our augmented reality app and find ourselves bombarded by virtual junk. That junk would also detract from augmented reality uses such as wayfinding and utility location. Worse, people could leave virtual junk on private property. Maybe someone gets upset with someone else and decides to leave a virtual sign with some not-so-nice wording on that person’s lawn. So there will probably come a day when some regulation is needed to define where virtual objects can be placed. Once you start considering this and how it fits into our current system of governance and regulations, it becomes quite an intriguing discussion. Local government regulates and permits the placement of objects in the right of way, so should this permitting be extended to virtual objects placed in the right of way? And who should have the authority to regulate what is placed on private property?

Developing the framework for regulating augmented reality in public and private spaces will take some thoughtful consideration and time. But setting this up now will make it that much easier when we get the first call from one of our citizens complaining because someone put a virtual, 20-foot pink dragon breathing fire on their property.


Augmented Reality for Public Works

Construction site

Augmented reality (AR) has been gaining ground over the last couple of years, most likely because of the growing number of applications incorporating AR and the increasing capabilities of the supporting technology. But while the advances have been useful and impressive, I have not seen much related to the public works industry. This surprises me, because AR could be incredibly useful here and could increase efficiency and decrease costs. So I thought I would post a few ideas for ways AR could be applied to the public works field, with the hope that someone takes up the challenge and implements these tools:

Utility Locates:
Utility locating can be a pain, but it is important for preventing damage to utilities and injury to the people working near them. Current tools of the locating trade include a map on a laptop or on paper; locating devices for accurately pinpointing the utility; shovels, picks, and probes; and paint or flags for marking the location in the field. The reason this task is so challenging is the need to rely on maps that are often not accurate enough to allow the locator to just walk right up to the utility.

For example, a locator might have trouble finding a water shut-off valve in someone’s yard if it is buried under snow or dirt. If there are accurate measurements to the valve, the locator uses a measuring tape and the map to find its general location. If there are not accurate measurements, which is often the case, the locator has to search the area with the locating device. Once a probable position is determined, the locator digs for the valve with a shovel, which can mean digging multiple holes before the valve is found. If other utilities are in the area, readings can be inaccurate, which makes finding the valve even harder. The whole process can be very time consuming.

Some cities have their valves in a GIS, allowing the locator to walk to the approximate location with the help of a GPS device. This is very useful, but how much better and more intuitive would it be if the valve could be projected digitally onto the ground using augmented reality? The locator drives up to the site, gets out of the vehicle, puts on a headset or picks up a mobile device, and all of the utilities show up on the ground through augmented reality.
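Even before full AR overlays are available, simple GIS-plus-GPS math can point a locator in the right direction. Here is a rough sketch (with made-up coordinates) that reports how far away and at what compass bearing the mapped valve sits from where the locator is standing:

```python
# Hedged sketch: point a locator from their GPS position toward a valve stored in GIS.
# Uses a flat-earth approximation (fine over tens of metres); coordinates are made up.
import math

def offset_to(target_lat, target_lon, here_lat, here_lon):
    """Return (distance_m, compass_bearing_deg) from 'here' to the target."""
    m_per_deg_lat = 111_320.0
    m_per_deg_lon = 111_320.0 * math.cos(math.radians(here_lat))
    north = (target_lat - here_lat) * m_per_deg_lat
    east = (target_lon - here_lon) * m_per_deg_lon
    dist = math.hypot(north, east)
    bearing = (math.degrees(math.atan2(east, north)) + 360) % 360
    return dist, bearing

valve = (41.78042, -88.01510)          # mapped shut-off valve (from GIS)
locator = (41.78031, -88.01535)        # where the locator is standing (from GPS)

dist, bearing = offset_to(*valve, *locator)
print(f"Valve is about {dist:.1f} m away at a bearing of {bearing:.0f} degrees")
```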

Engineering Design:
Using this same idea, engineering design could be greatly simplified. If an engineer needs to improve a road by installing curb and gutter and a new storm sewer, maps must be collected and utilities marked in the field to designate the locations of gas, electric, water, and so on. Only then can the engineer determine the best place to put the curb and sewer. If all an engineer had to do was drive out to the job and use augmented reality, the best locations for the new improvements could be determined faster and with more accuracy.

Engineers could also use this when a resident calls with a problem. Many times, when we respond to residents, we do not know exactly what the problem is until we get to the site, so we might not have everything we need to determine whether we can help. But if the resident had a question related to a utility, such as needing to tap into our sewer or water main, or had a drainage problem and needed to connect to our storm sewer, I could not only determine right away whether there was a feasible solution, but also show the resident by having them use the technology. Seeing the line on the ground would mean more to them than looking at a line on a map. And it would mean even more if we could somehow animate the line to show water flowing.

Maintenance and Construction:
Augmented reality could also be used to make sure crews are working on the right asset in the field. If we could digitally mark the manhole that needs to be fixed, the tree needing to be pruned, or the area where we want landscaping planted, we could reduce confusion and errors in the field.

And if a contractor is installing a pipe, he could use augmented reality to see where he needs to dig. This could also assist the city in showing property owners where improvements will be made. Residents could use AR technology and actually see how the new road will look.

I could have also used AR when I was putting up trim at my last house. It would have kept me from drilling into a pipe and could have helped me find the studs.

There are a lot of other uses we could figure out to help us better perform our jobs in public works. Hopefully this post helps generate some more ideas and maybe even challenges someone to develop an AR-for-public-works tool.


Virtual Cemeteries

Managing a public cemetery is just one of the many tasks handled by a public works department. Typically we take care of cutting grass, repairing monuments, paving/plowing roads, removing leaves, burying people, selling lots, and handling the documentation and requests for grave locations. Little has changed over the years in how these duties are administered. But now, new tools like mobile devices, virtual worlds, and augmented reality offer us the ability to enhance delivery of some of these services.

Augmented reality (AR), or overlaying a computer-generated image onto the real environment, is now available with the use of a mobile device like the iPhone. So how can this be used in the cemetery?

First let’s see how the City of Manor, Texas, used AR to create a Christmas greeting with the help of Muzar.org: http://www.flickr.com/photos/cityofmanor/4203935446/

Applying this concept to cemeteries, cities could contract with organizations like Muzar.org to allow people to post digital content for their loved one’s grave site. This content could be images of the loved one, the family, or even the home in which they lived. As a genealogist, I could also see the benefit of displaying documents related to the person’s life. Perhaps eventually people would be offered the chance to save this digital information for their own family history files.

At some point, AR could perhaps also allow us to enter a loved one’s name while standing in the cemetery and have a virtual path displayed on the ground leading us to the grave. The technology could also allow the city to have unsold lots display a certain color when a person scans the cemetery with a mobile device.
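Behind features like wayfinding to a grave or highlighting unsold lots there would be nothing more exotic than a searchable lot record. A minimal sketch of that underlying lookup, with made-up records, might look like this:

```python
# Hedged sketch: look up graves by name and list unsold lots from a simple record set,
# the kind of data a wayfinding or lot-coloring overlay would be driven by.
# All records and coordinates below are made up for illustration.
cemetery_lots = [
    {"section": "A", "lot": 12, "name": "Jane Doe", "lat": 41.7701, "lon": -88.0203, "sold": True},
    {"section": "A", "lot": 13, "name": None,       "lat": 41.7701, "lon": -88.0204, "sold": False},
    {"section": "B", "lot": 4,  "name": "John Roe", "lat": 41.7705, "lon": -88.0210, "sold": True},
]

def find_grave(name):
    """Return the lots whose interment name matches the search string."""
    return [lot for lot in cemetery_lots
            if lot["name"] and name.lower() in lot["name"].lower()]

def unsold_lots():
    """Return the lots an AR view could highlight as available."""
    return [lot for lot in cemetery_lots if not lot["sold"]]

print(find_grave("doe"))    # -> section A, lot 12 with its coordinates
print(unsold_lots())        # -> section A, lot 13
```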

The City of Manor is also using QR codes – here is an example of their use in a city park: http://www.flickr.com/photos/cityofmanor/2780890639/. Cities could use these codes in the cemetery to convey information. The codes could be placed on or near the graves, or on maps printed from a city website or distributed at the cemetery, and visitors could scan them while on site. The codes could link to information posted by the family, to information held by the city about that gravesite, or even, at some point, to information about the person on sites like Ancestry.com.
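Generating the codes themselves is simple. Here is a minimal sketch using the open-source qrcode Python package; the URL is a made-up placeholder for wherever a city would host the gravesite record:

```python
# Hedged sketch: generate a QR code image that links to a gravesite record page.
# Assumes the third-party "qrcode" package with Pillow (pip install qrcode[pil]);
# the URL pattern is a made-up placeholder, not a real city site.
import qrcode

record_url = "https://example-city.gov/cemetery/section-A/lot-12"
img = qrcode.make(record_url)          # returns a PIL image of the code
img.save("section-A_lot-12_qr.png")    # print this and mount it at the grave or on a map
```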

Finally, something I have not yet seen but wonder about is the use of virtual worlds. If a cemetery were recreated in a 3D application such as Second Life, a person could visit the cemetery virtually. This technology also allows people to attach information, images, and video. I also wonder if someday we will be able to link our avatar’s movements to our own. Then, while standing in the real cemetery, we could access the virtual cemetery on our mobile device and walk our avatar to the virtual grave while we walk to the real one. This would allow us to experience whatever was placed at the virtual grave while we are standing at the real grave.

As an add-on: @RogerSmolski passed along a link to an interesting post about the use of QR codes in cemeteries in Japan: QR Code from The Grave

Also, I am trying to add a disclaimer at the end of each post indicating that the opinions expressed here are my own and are not meant in any way to reflect those of my employer.


A Little Bit of GIS in My Life

So today Rick Knights, a technical service associate with WTH Engineering, stopped by to continue his work helping us implement our city geographic information system, or GIS. Rick is a certified ENP, or emergency number professional, who is very familiar with how the 911 system works in the U.S. We began discussing the many intricacies of the 911 system and how it all evolved, and then started in on GIS – where we are and where we are going. Wow – this was one of those conversations that begins low-key and ends with a major revelation.

Because I am excited about the potential of virtual worlds and augmented reality and how it all fits with GIS, we began going down that road. While we were debating the need for accuracy in parcel boundary data, Rick threw out this idea: perhaps someday we would develop parcel data to the point that it accurately reflects the actual property boundary, so that the data itself could serve as the established property line. As a former land surveying technician, I was not too certain how this would be received by that industry. But when I brought augmented reality into the conversation, Rick raised the intriguing idea of projecting property lines onto the actual ground using that technology. How useful that would be when determining setbacks or resolving property disputes.

From there we moved on to the idea of taking GIS data that is officially established and maintained by utilities and government agencies and running it through an augmented reality interface to create virtual lines on the actual ground.

For anyone who has worked construction, you can see what this could mean. Using the proper technology, you could project utility lines onto the ground – both proposed and existing. With this technology, perhaps one day, I could PROJECT MY DESIGN PLAN DRAWN IN CAD FOR A PARTICULAR AREA ON THE ACTUAL GROUND ON WHICH IT WILL BE BUILT! Then I could walk around making sure everything fits with existing grades, structures, and so on. And I could better show the residents who always want to know, “how far is the road going to be widened into my yard?”
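As a very rough sketch of the first step behind that kind of overlay, the corner points of a design element stored in GIS or CAD (as latitude/longitude) can be converted into ground offsets from wherever the viewer is standing, which an AR display could then draw. The coordinates below are made up, and the real registration problem (survey-grade accuracy, orientation, terrain) is far harder than this:

```python
# Hedged sketch: convert GIS/CAD corner points (lat/lon) into local east/north offsets
# from the viewer's position - the kind of ground-relative geometry an AR overlay
# would draw. Flat-earth approximation; all coordinates are made up.
import math

def to_local_offsets(points, viewer_lat, viewer_lon):
    """Return (east_m, north_m) offsets of each (lat, lon) point from the viewer."""
    m_per_deg_lat = 111_320.0
    m_per_deg_lon = 111_320.0 * math.cos(math.radians(viewer_lat))
    return [((lon - viewer_lon) * m_per_deg_lon, (lat - viewer_lat) * m_per_deg_lat)
            for lat, lon in points]

# Proposed curb line from the design plan (made-up coordinates).
curb_line = [(41.78110, -88.01620), (41.78118, -88.01585), (41.78126, -88.01550)]
viewer = (41.78105, -88.01600)   # where the engineer is standing

for east, north in to_local_offsets(curb_line, *viewer):
    print(f"curb point: {east:+.1f} m east, {north:+.1f} m north of viewer")
```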

Having read about and looked into augmented reality, I cannot believe my mind had not already made this leap! How useful that would be during the design and even the construction process. Can you imagine having the proposed sewer line projected onto the ground during construction so the backhoe operator can always see the alignment, along with all the other utility locations?

Perhaps others have already made this leap with this particular technology, and I have just not yet come across it. So far, everything I have seen has been about projecting a proposed building design onto a lot for planning, or projecting internal body systems onto a person’s skin. The idea of using this technology in the engineering and public works industry to virtually display designs and utilities on the ground, or internal building systems on a wall, is exciting to me because it brings the technology down to the level at which it becomes useful on a daily basis to city personnel.

So thanks, Rick, for the 911 info and help, and for that little extra glimpse of the promise GIS can bring to my world.
