Vuforia and Gravity Jack’s browsAR code stack
At Gravity Jack we have a unique love of Vuforia, having been there since its humble beginnings at Georgia Tech. The platform has gained significant traction, and its team truly recognizes where augmented reality (AR) is headed as an industry.
In fact, several of Gravity Jack’s patents overlap with the capabilities of Qualcomm’s Vuforia. Combined with our close working relationship and our ability to customize and optimize the Vuforia engine, Gravity Jack’s team of scientists, mathematicians, and computer vision software engineers can create and extend like no other agency or studio. Vuforia’s SDK is the best of its breed, and combined with our team and extensive knowledge, we can change the world.
A Vuforia SDK-based AR application combined with browsAR’s unique optimizations uses the display of the mobile device as a “magic lens,” a looking glass into an augmented world where the real and virtual appear to coexist. The application renders the live camera preview on the display to represent a view of the physical world, then superimposes virtual 3D objects on that preview so they appear tightly coupled to the real world.
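What keeps an augmentation “stuck” to its target is the per-frame projection of virtual geometry through the tracked target’s pose. The sketch below is purely illustrative (it is not the Vuforia API): a 3x4 pose matrix carries a virtual point from target space into camera space, and a pinhole camera model maps it onto the camera image. The function and variable names are our own.

```python
# Illustrative sketch (not the Vuforia API): project a virtual 3D point
# through a tracked target's pose into camera-image coordinates. Doing
# this every frame is what makes the augmentation track the target.

def project(pose, intrinsics, point):
    """Apply a 3x4 pose matrix [R|t], then a pinhole camera model."""
    fx, fy, cx, cy = intrinsics
    x, y, z = point
    # Transform the point from target space into camera space.
    xc = pose[0][0]*x + pose[0][1]*y + pose[0][2]*z + pose[0][3]
    yc = pose[1][0]*x + pose[1][1]*y + pose[1][2]*z + pose[1][3]
    zc = pose[2][0]*x + pose[2][1]*y + pose[2][2]*z + pose[2][3]
    # Perspective divide, then map onto the image plane.
    return (fx * xc / zc + cx, fy * yc / zc + cy)

# Identity rotation, target 2 m straight in front of the camera.
pose = [[1, 0, 0, 0.0],
        [0, 1, 0, 0.0],
        [0, 0, 1, 2.0]]
intrinsics = (800, 800, 320, 240)  # fx, fy, cx, cy (example values)
u, v = project(pose, intrinsics, (0.0, 0.0, 0.0))
print(u, v)  # the target's origin lands at the image center: 320.0 240.0
```

As the tracker updates the pose each frame, the same projection keeps the rendered content registered to the moving target.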
An application developed for Vuforia with browsAR technology will give your users a more compelling experience:
- Faster local detection of targets
- Cloud recognition against a database of up to 1 million targets
- User-defined targets for run-time target generation
- Cylinder targets – Detection and tracking of images on a cylindrical surface
- Text recognition – Recognition and tracking of printed text (words)
- Robust tracking – Augmentations stick to the target and are not easily lost as the device moves
- Simultaneous tracking of up to five targets
- Better results in real-world conditions – low light, partially covered targets
- Optimizations that ensure better and more realistic graphics rendered on the target
This diagram provides an overview of the application development process with the platform. The platform consists of the Vuforia Engine (inside the SDK), the Target Management System hosted on the developer portal, and optionally, the Cloud Target Database.
A developer uploads the input image for the target they want to track. The target resources can then be accessed by the mobile app in two ways:
- Accessed from a cloud target database using web services
- Downloaded in a device target database to be bundled with the mobile app
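Those two access paths can be thought of as interchangeable target sources: a bundled database answers lookups locally, while the cloud database answers them through a web service. The sketch below is a hypothetical illustration of that architecture; the class names and fields are our own, not the Vuforia SDK.

```python
# Hypothetical sketch of the two ways an app obtains target resources
# (illustrative names only, not the Vuforia SDK API).

class DeviceTargetDatabase:
    """Targets bundled with the app and loaded from local storage."""
    def __init__(self, targets):
        self._targets = {t["name"]: t for t in targets}

    def lookup(self, name):
        return self._targets.get(name)

class CloudTargetDatabase:
    """Targets fetched on demand from a recognition web service."""
    def __init__(self, query_service):
        # A callable standing in for the HTTP request to the service.
        self._query = query_service

    def lookup(self, name):
        return self._query(name)

bundled = DeviceTargetDatabase([{"name": "poster", "width_m": 0.5}])
cloud = CloudTargetDatabase(lambda n: {"name": n, "width_m": 1.0})

def find_target(name):
    """Prefer the bundled database, fall back to the cloud service."""
    return bundled.lookup(name) or cloud.lookup(name)
```

In practice the trade-off is latency versus scale: bundled targets recognize instantly offline, while the cloud path scales to very large databases at the cost of a network round trip.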
For text recognition, the developer can specify a set of words that the application can recognize, using the following text data sets:
- Word lists in the VWL binary format (Vuforia Word List)
- Additional word lists, which can be specified via simple text files
- Optional word list filters (black or white lists) to explicitly include/exclude the recognition of specific words
The word lists and filter files are bundled with the mobile app and loaded at runtime using the API.
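The filtering behavior described above amounts to simple set logic over the base word list. The following is a minimal sketch under our own assumptions (it does not reflect the VWL binary format or the actual API): a white list restricts recognition to specific words, a black list excludes them.

```python
# Minimal sketch of word-list filtering (illustrative only; not the VWL
# format or the Vuforia API). A base word list plus an optional white
# list or black list determines which detected words are reported.

def filter_recognized(detected, word_list, white_list=None, black_list=None):
    """Return only the detected words the app should report."""
    allowed = set(word_list)
    if white_list is not None:
        allowed &= set(white_list)   # explicitly include only these
    if black_list is not None:
        allowed -= set(black_list)   # explicitly exclude these
    return [w for w in detected if w in allowed]

word_list = ["sale", "open", "closed", "exit"]
detected = ["open", "exit", "lorem"]   # "lorem" is not in the word list
print(filter_recognized(detected, word_list, black_list=["exit"]))
# -> ['open']
```

Because the lists are bundled with the app and loaded at runtime, the recognizable vocabulary can be tuned per release without retraining anything.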
Sounds pretty easy, right? Well, to be honest, it is. When you eat, sleep and breathe augmented reality, of course!