
The project is about AR navigation on Android devices. Our goal is to create a navigation application that provides users with a more intuitive wayfinding experience on the street and a more immersive display of store information. We expect the application to be used on a heads-up 3D device such as Google Cardboard. Thus, besides our two main functions, we implemented a voice-control and hand-tracking user interface to provide better interaction on a head-mounted device.

The project consists of three main parts: navigation (by Zhenyu), location-based information display (by Jing), and user interface (by Junxiang).

PART 1. Navigation / Zhenyu Yang

The navigation module provides geographic information to the AR application. It consists of four main steps: obtaining the destination via voice input from the user, requesting direction data from the Google Maps servers, decoding the location data, and visualizing the directions by rendering an arrow and multiple waypoints in the 3D virtual space.
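
To give a sense of how the middle two steps fit together, here is a minimal sketch that requests a walking route from the Google Maps Directions web service and decodes its overview polyline into waypoints with Google's published polyline algorithm. The class and parameter names are illustrative placeholders, not our exact code, and error handling and API-key management are omitted.

import java.net.HttpURLConnection;
import java.net.URL;
import java.util.ArrayList;
import java.util.List;
import java.util.Scanner;
import org.json.JSONObject;

class DirectionsSketch {

    // Request a walking route and decode its overview polyline into waypoints.
    static List<double[]> fetchWaypoints(String origin, String destination,
            String apiKey) throws Exception {
        URL url = new URL("https://maps.googleapis.com/maps/api/directions/json"
                + "?origin=" + origin + "&destination=" + destination
                + "&mode=walking&key=" + apiKey);
        HttpURLConnection conn = (HttpURLConnection) url.openConnection();
        String json = new Scanner(conn.getInputStream())
                .useDelimiter("\\A").next();
        String encoded = new JSONObject(json).getJSONArray("routes")
                .getJSONObject(0).getJSONObject("overview_polyline")
                .getString("points");
        return decodePolyline(encoded);
    }

    // Google's polyline algorithm: each coordinate is a signed delta,
    // packed into 5-bit chunks and scaled by 1e5.
    static List<double[]> decodePolyline(String encoded) {
        List<double[]> points = new ArrayList<>();
        int index = 0, lat = 0, lng = 0;
        while (index < encoded.length()) {
            for (int axis = 0; axis < 2; axis++) {
                int result = 0, shift = 0, b;
                do {
                    b = encoded.charAt(index++) - 63;
                    result |= (b & 0x1f) << shift;
                    shift += 5;
                } while (b >= 0x20);
                int delta = ((result & 1) != 0) ? ~(result >> 1) : (result >> 1);
                if (axis == 0) lat += delta; else lng += delta;
            }
            points.add(new double[]{lat / 1e5, lng / 1e5});
        }
        return points;
    }
}

Each decoded waypoint can then be placed into the virtual scene, with the arrow oriented toward the next one along the route.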

PART 2. Location-based Information Display / Jing Yan

The second function of our AR navigation application is to display location-based information as an overlay on the real scene. Among all the information available on the street, we chose to display store information, in particular restaurant information, which is closely tied to our daily life.

Scenario

Imagine a common scenario: you are walking along a street full of restaurants, but you have no idea what they are serving or which one you should go to. People used to look at menus and window displays of sample dishes; nowadays, we search on Yelp for photo references uploaded directly by other users.

Our application moves one step forward by skipping that searching process. The first time a restaurant is detected, all the related information is rendered automatically using the web-based Google Place API. Ideally, we also hope that users can customize their augmented-world display by deciding whether or not to show the information.

Mixed Reality Mechanisms & Technical Implementation

Here is the basic pipeline of the implementation:

Step 1. Get the geographical location (sent from the GPS module), and track the storefront with Vuforia target sets (logo images are uploaded as the dataset).

Step 2. Get the store location (longitude, latitude) and store name (keyword).

Step 3. Request real-time JSON for that store from the Google Place API, and parse the JSON into basic data (store name, rating, price level, image reference, place ID); a code sketch follows this list.

Step 4. Render the basic data as text (using the Texample library) and images (as textures) on screen in OpenGL ES.
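
To make Steps 2 and 3 concrete, here is a minimal sketch of the Place request and the JSON parsing, using the public Nearby Search endpoint. The class name, the 50 m radius, and the key handling are illustrative assumptions; our actual code differs in detail.

import java.net.HttpURLConnection;
import java.net.URL;
import java.net.URLEncoder;
import java.util.Scanner;
import org.json.JSONObject;

class PlaceInfoSketch {

    // Look up the tracked store by location + keyword and pull out the
    // fields that the information card displays.
    static JSONObject fetchStoreInfo(double lat, double lng, String storeName,
            String apiKey) throws Exception {
        URL url = new URL(
                "https://maps.googleapis.com/maps/api/place/nearbysearch/json"
                + "?location=" + lat + "," + lng
                + "&radius=50"
                + "&keyword=" + URLEncoder.encode(storeName, "UTF-8")
                + "&key=" + apiKey);
        HttpURLConnection conn = (HttpURLConnection) url.openConnection();
        String json = new Scanner(conn.getInputStream())
                .useDelimiter("\\A").next();
        JSONObject place = new JSONObject(json)
                .getJSONArray("results").getJSONObject(0);

        String name = place.getString("name");            // store name
        double rating = place.optDouble("rating", 0.0);   // e.g. 4.3
        int priceLevel = place.optInt("price_level", -1); // 0 (free) to 4
        String placeId = place.getString("place_id");
        String photoRef = place.getJSONArray("photos")    // image reference
                .getJSONObject(0).getString("photo_reference");
        return place;
    }
}

The photo_reference returned here is later exchanged for an actual bitmap through the Place Photos endpoint, and that bitmap is uploaded as an OpenGL texture for Step 4.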

Information Display

As a graphic designer, I feel it is not easy to make a nice layout in an augmented view. Because the design is rendered over frames of real-world video, the background is complicated and uncontrollable, and it is difficult to keep the information standing out clearly while also creating a unique aesthetic. For this reason, I am using a card-based design, which is clear, easy to understand, and modular. The shape of a card also gives the user better affordance in the interaction design.

Additionally, the 2D cards can be improved by constructing them into a dynamic 3D object: they fold up automatically when users pay no attention to them, and open up when users focus on them. Furthermore, to create a better information hierarchy and a more customized display, virtual buttons can be added to the 3D cards to allow basic choices. (Here is the concept sketch, followed by a code sketch of the folding behavior.)
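
A hypothetical sketch of that folding behavior follows; none of these names come from the project, and "focus" is approximated simply by how close the card sits to the center of the user's view.

import android.opengl.Matrix;

// Hypothetical gaze-driven fold: the card eases open while it sits near the
// center of the user's view, and folds back up when attention moves away.
class FoldingCard {
    private float foldAngle = 90f;   // 90 = folded flat, 0 = fully open

    // cardX, cardY: the card's center projected to normalized device
    // coordinates (-1..1); near the origin means the user is looking at it.
    void update(float cardX, float cardY) {
        boolean focused = cardX * cardX + cardY * cardY < 0.04f;
        float target = focused ? 0f : 90f;
        foldAngle += (target - foldAngle) * 0.1f;   // ease a little each frame
    }

    // Apply the fold as a rotation about the card's top edge before drawing.
    void applyTo(float[] modelMatrix) {
        Matrix.rotateM(modelMatrix, 0, foldAngle, 1f, 0f, 0f);
    }
}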


Demonstration Screenshots

outdoor setting

indoor setting


PART 3. User Interface / Junxiang Yao

For the user interface and interaction part, the goal is to experiment with button controls in an augmented reality environment.

Voice Control 
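
On Android, voice input can be built on the platform's SpeechRecognizer; the sketch below shows the general shape. The "navigate to" command phrase is an assumption for illustration, not our actual command grammar.

import android.content.Context;
import android.content.Intent;
import android.os.Bundle;
import android.speech.RecognitionListener;
import android.speech.RecognizerIntent;
import android.speech.SpeechRecognizer;
import java.util.ArrayList;

class VoiceCommandSketch {

    // Listen for one utterance and route a "navigate to ..." command.
    static void startVoiceCommand(Context context) {
        Intent intent = new Intent(RecognizerIntent.ACTION_RECOGNIZE_SPEECH);
        intent.putExtra(RecognizerIntent.EXTRA_LANGUAGE_MODEL,
                RecognizerIntent.LANGUAGE_MODEL_FREE_FORM);
        final SpeechRecognizer recognizer =
                SpeechRecognizer.createSpeechRecognizer(context);
        recognizer.setRecognitionListener(new RecognitionListener() {
            @Override public void onResults(Bundle results) {
                ArrayList<String> texts = results.getStringArrayList(
                        SpeechRecognizer.RESULTS_RECOGNITION);
                String heard = texts.get(0).toLowerCase();
                if (heard.startsWith("navigate to ")) {
                    String destination = heard.substring(12).trim();
                    // hand the destination string to the navigation module
                }
            }
            // The remaining callbacks are required by the interface.
            @Override public void onReadyForSpeech(Bundle params) {}
            @Override public void onBeginningOfSpeech() {}
            @Override public void onRmsChanged(float rmsdB) {}
            @Override public void onBufferReceived(byte[] buffer) {}
            @Override public void onEndOfSpeech() {}
            @Override public void onError(int error) {}
            @Override public void onPartialResults(Bundle partial) {}
            @Override public void onEvent(int eventType, Bundle params) {}
        });
        recognizer.startListening(intent);
    }
}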

Navigation & Store Information

// The project is developed with OpenGL ES, the Android SDK, Vuforia, and the Google Place API. Full documentation website: https://sites.google.com/view/arnavigation

Team: Jing Yan / Zhenyu Yang / Junxiang Yao

CS 291A Mixed and Augmented Reality | instructor: Tobias Höllerer