Office of Naval Research
Grant #N000141010143
Grant #N000141310438

Supporting the Assembly Process by Leveraging Augmented Reality

Project description
To date, drawings still serve as the main means of assembly guidance. However, drawing-based guidelines are not sufficient for the complexity of assembling large-scale apparatus such as computer systems or other intricate components. The challenge involves:

  • (a) defining standard formats to describe the assembly process, as well as a common storage method and a repository for assembly documentation to refer to when needed;
  • (b) creating and storing a well-documented, self-documenting, and accessible assembly guideline;
  • (c) deploying the assembly guideline in a non-intrusive way;
  • (d) assessing the user’s progress in real time by comparing the user’s assembly with a model captured from the expert (including recognizing the parts, the tools, and the current state of the assembly, a complex problem that falls within the object recognition area);
  • (e) providing non-intrusive hints or feedback about the current and next state of the assembly, which is also complex: besides requiring object recognition, it must account for the many possible paths that lead to a correct assembly.
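
One way to approach challenges (a), (d), and (e) is to encode the expert's assembly as a set of steps with precedence constraints, so a novice's progress can be checked in real time while any of several valid step orders is accepted. The sketch below is a minimal illustration of that idea in Python; the class name, step identifiers, and data layout are assumptions for illustration, not the project's actual format.

    # Minimal sketch (assumed names and data layout, not ARPA's actual format):
    # the expert's assembly is stored as a set of steps plus precedence
    # constraints, so a novice's progress can be validated and any valid
    # step order is accepted.
    from dataclasses import dataclass, field

    @dataclass
    class AssemblyPlan:
        steps: set                                     # all step ids in the expert's assembly
        precedes: dict = field(default_factory=dict)   # step id -> set of prerequisite step ids

        def is_consistent(self, done: set) -> bool:
            # True if every completed step had all of its prerequisites completed.
            return all(self.precedes.get(s, set()) <= done for s in done)

        def valid_next_steps(self, done: set) -> set:
            # Steps not yet done whose prerequisites are all completed.
            return {s for s in self.steps - done
                    if self.precedes.get(s, set()) <= done}

    # Example: the two screws may be fastened in either order, but only after
    # the bracket has been placed.
    plan = AssemblyPlan(
        steps={"place_bracket", "fasten_left_screw", "fasten_right_screw"},
        precedes={"fasten_left_screw": {"place_bracket"},
                  "fasten_right_screw": {"place_bracket"}},
    )
    print(plan.valid_next_steps({"place_bracket"}))
    # -> {'fasten_left_screw', 'fasten_right_screw'}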

The purpose of this project was to develop an innovative Augmented Reality Product Assembly (ARPA) system able to:

  • Implement the learn-by-example philosophy. The system learns by watching a human expert perform an assembly and creates assembly guidelines that capture the process the expert followed (a minimal sketch of this idea appears after this list).
  • Deploy virtual tutoring for novices. The system recognizes the process a novice is following while trying to assemble the pieces and provides appropriate feedback.
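
Below is a minimal sketch of the learn-by-example idea, assuming a recognition front end emits timestamped (action, part) observations while the expert works; the observation fields, function name, and guideline format are hypothetical, not ARPA's actual interfaces.

    # Sketch only: collapse a time-ordered stream of recognized expert actions
    # into an ordered guideline that can later be replayed for a novice.
    # The ExpertObservation fields and the output format are assumptions.
    from dataclasses import dataclass
    from typing import List

    @dataclass
    class ExpertObservation:
        timestamp: float   # seconds into the expert's demonstration
        action: str        # e.g. "pick", "place", "fasten"
        part: str          # part recognized in the scene

    def build_guideline(observations: List[ExpertObservation]) -> List[str]:
        # Per-frame recognition tends to report the same action many times,
        # so consecutive duplicates are dropped.
        steps: List[str] = []
        last = None
        for obs in sorted(observations, key=lambda o: o.timestamp):
            key = (obs.action, obs.part)
            if key != last:
                steps.append(f"{obs.action} {obs.part}")
                last = key
        return steps

    demo = [
        ExpertObservation(1.0, "place", "bracket"),
        ExpertObservation(1.2, "place", "bracket"),        # duplicate frame
        ExpertObservation(4.5, "fasten", "left screw"),
        ExpertObservation(9.0, "fasten", "right screw"),
    ]
    print(build_guideline(demo))
    # -> ['place bracket', 'fasten left screw', 'fasten right screw']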

Video – Final version of the system

BB Final Video from ANGLE Lab ASU on Vimeo.