How Perkins+Will London is Using VR to Improve Coordination Meetings

Angel Say · Published in Resolve · 4 min read · Oct 1, 2018

The following is a case study written by David Sewell, Design Applications Manager of Perkins+Will London on the topic of integrating VR into early design stage coordination meetings. You can hear a presentation that includes this case study at the upcoming BILT Europe and Autodesk University conferences.

Recap: By using VR during early-stage design coordination meetings, Perkins+Will London is 1) finding more issues, 2) getting more engagement from project stakeholders, and 3) spending less time “driving” 3D programs and more time reviewing during meetings.

At Perkins+Will London we are always looking for better ways to ensure the collective design intent information is accurately coordinated for use as a base for the construction team.

Flagging issues in VR with InsiteVR’s speech-to-text annotations

At the beginning of this coordination process, when things are more fluid, we have traditionally used Navisworks to highlight issues by visual inspection rather than automated clash detection. This process forms part of a coordination cycle in which clash viewpoints are saved and compiled into a report that is circulated ahead of a regular coordination meeting. While this has been successful, it takes time to federate models from Revit exports and then identify and mark up coordination issues using tools that not everyone is familiar with.

Our prior attempts to mirror this process in VR proved cumbersome and difficult to track.

InsiteVR recently shipped a release that incorporates some of the built-in workflows we would associate with early Navisworks use, and we have since been using these in our coordination process.

We started by walking the model on our standard office Vive setup, but the same workflow can be shared across multiple VR devices, or even laptops, with just an internet connection and the InsiteVR app. This has been especially useful on a recent project, as one of our design partners is based in Italy and we’re in London.


Flagging Issues with Speech-To-Text Annotations

Walking a model at real scale makes coordination issues easy to detect. Each issue can then be labelled with a text bubble via voice recognition. It feels a little like having a Siri or Alexa assistant during your review sessions: press a button, say your comment out loud, and the VR headset records it; the app then transcribes it into text that you can attach to the elements in question. It’s just very simple and intuitive.
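InsiteVR’s internals aren’t public, so as a rough illustration only, the press-to-talk flow described above can be sketched in a few lines of Python. The names here (`ReviewSession`, `flag_issue`) and the stubbed transcriber are hypothetical stand-ins, not InsiteVR’s actual API.

```python
from dataclasses import dataclass, field
from datetime import datetime
from typing import Callable, List, Tuple

@dataclass
class Annotation:
    text: str                 # transcribed comment
    position: Tuple[float, float, float]  # where in the model it was placed
    author: str
    timestamp: datetime

@dataclass
class ReviewSession:
    transcribe: Callable[[bytes], str]  # speech-to-text backend (stubbed below)
    author: str
    annotations: List[Annotation] = field(default_factory=list)

    def flag_issue(self, audio: bytes, position: Tuple[float, float, float]) -> Annotation:
        """Press a button, speak, and the comment becomes a text bubble."""
        note = Annotation(
            text=self.transcribe(audio),
            position=position,
            author=self.author,
            timestamp=datetime.now(),
        )
        self.annotations.append(note)
        return note

# A trivial stub standing in for a real speech-to-text service.
session = ReviewSession(transcribe=lambda audio: audio.decode(), author="Reviewer")
note = session.flag_issue(b"Duct clashes with beam at gridline C", (12.0, 3.5, 2.8))
print(note.text)  # Duct clashes with beam at gridline C
```

In a real pipeline the lambda would be replaced by a call to an actual speech recognition service; the point is simply that each spoken comment becomes a positioned, attributed record in the session.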

These labels are automatically saved as viewpoints and can be navigated to simply by selecting them, as you would any saved view.

Reviewing previously flagged issues in VR to ensure resolution

Automatic Meeting Notes

Once the clashes are identified and annotated, a PDF report can be produced instantly, comprising a saved viewpoint, an automatic markup around the area where the annotation was placed, a comment describing the identified issue, a timestamp, and the author.
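The report contents listed above (viewpoint, comment, timestamp, author) amount to a simple record per issue. As a minimal sketch only, here is how such records might be compiled into an agenda; this is a hypothetical plain-text stand-in for InsiteVR’s PDF output, and all names are invented for illustration.

```python
from dataclasses import dataclass
from datetime import datetime
from typing import List

@dataclass
class Issue:
    viewpoint: str   # name of the saved viewpoint
    comment: str     # transcribed annotation text
    author: str
    timestamp: datetime

def build_report(issues: List[Issue]) -> str:
    """Compile flagged issues into a plain-text agenda, oldest first."""
    lines = ["Coordination Review Report", "=" * 26]
    for i, issue in enumerate(sorted(issues, key=lambda x: x.timestamp), start=1):
        lines.append(
            f"{i}. [{issue.viewpoint}] {issue.comment} "
            f"({issue.author}, {issue.timestamp:%Y-%m-%d %H:%M})"
        )
    return "\n".join(lines)

issues = [
    Issue("VP-02", "Pipe run blocks access hatch", "DS", datetime(2018, 9, 20, 10, 5)),
    Issue("VP-01", "Duct clashes with beam at gridline C", "DS", datetime(2018, 9, 20, 9, 40)),
]
print(build_report(issues))
```

Sorting by timestamp means the agenda follows the order issues were flagged during the walkthrough, which maps naturally onto a meeting run-through.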

The report was issued prior to the coordination meeting to form an agenda; the relevant parties then opened the model in VR and discussed each issue by clicking on the saved annotation, which teleported them directly to the issue in question.

The Results: Simplified and Efficient Coordination

The feedback was that it is an invaluable tool, and the team commented that some of the issues discovered by walking the model at real scale might not have been identified during on-screen visual inspection at a reduced scale. It also simplified what the team often considers a time-consuming, difficult process, especially for those unfamiliar with typical industry-standard coordination software.

Although this is not an automated clash detection tool with a transparent audit trail, we are finding it genuinely useful on collaborative projects at the appropriate point in the design programme.

VR Interview with David Sewell of Perkins+Will London

Interested in hearing more? David will be presenting “The Reality of Virtual Reality” at BILT Europe with Dan Chasteen on October 11th, and “XR in Design: How Perkins+Will Uses XR to Complement the Design Process Globally” at Autodesk University with Iffat Mai, XR Guru, on November 14.
