Learning More About Space Using LEGO®, Augmented Reality, and the HP Reveal App
Introduction
I have been particularly interested in Augmented Reality (AR) lately. Most of the work I have researched in the medium involves a smartphone scanning a physical object to produce a digital model or projection, which is then used as an educational or training tool. That experience is fun, but it reminds me of a person scanning a QR code to launch content on their phone. The result may not be a 3D model or hologram as it is with AR, but the process and steps involved are identical: open an app on a phone, scan something, experience something else. If this is the smartphone approach to AR, the novelty people enjoy today may fade over time, much as QR codes are regarded now, so new, creative experiences must be developed. I thought it would be interesting and different if the only way a person could scan a physical object to launch something digital was to first create that physical object themselves. Since LEGO is all about getting hands-on with materials and building something out of smaller parts, I thought it would be novel and fun to create an experience where a person builds a LEGO set that can then be scanned to reveal information that was otherwise unobtainable. To further the impact, I chose to build and create an AR experience for the Saturn V rocket set, part of LEGO's "Ideas" series, which has an educational component. This makes sense given that this is the rocket NASA used to take humans to the Moon in 1969, considered one of the most amazing engineering achievements of the 20th century.
Procedure Used
After purchasing the set and opening its contents, I noticed the instruction manual has the user complete the build in stages, similar to how the rocket, once launched, fired in stages to propel the crew and payload to the Moon. This was a good approach given that the number of pieces meant the build would take over five hours, and it is important to let the builder take breaks in logical places, much as an author writes a book in chapters. This quickly led me to the idea of creating multiple AR-scannable objects that collectively educate the builder on the 1969 Apollo 11 Moon mission, rather than a single lesson revealed by scanning the completed rocket all at once. This approach also rewards the builder/learner at various stages of the build rather than with one final reward at the end, and it breaks the lesson into smaller components, much like microlearning divides a large amount of content into more consumable pieces. The instruction manual LEGO includes also has a few pages of content educating the learner/builder on the Apollo program and the Saturn V rocket, so I leveraged those materials in the learning solution via an animated video made in PowerPoint, providing continuity and familiarity between the materials included in the box and those revealed later. In general, my approach was to build a section of the Saturn V rocket and then develop an AR-scannable object from that built section. I used the HP Reveal app, formerly known as Aurasma, to do the AR coding; the app has been used in education before to create AR experiences. The AR objects I developed were made public, so anyone with the Saturn V LEGO rocket kit can now scan these objects to reveal the additional content.
Obstacles Faced
Most of the obstacles were related to the content I wanted to leverage. The HP Reveal app allows you to pair an image or video with a scanned object, but the videos must be actual files below a certain size rather than links to those files. This meant that to leverage excellent educational videos about Apollo 11 and the Saturn V rocket, I first had to download the videos from YouTube, compress them, save them to my phone, and then attach them to the object during coding. I would have preferred to keep those videos as links so as not to infringe on intellectual property rights, even for a concept project. The same could be said of the LEGO instruction manual images and Apollo mission images I used to create a video; I had no legal right to use them but did so to prove the project was possible.
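The file-size constraint above can be checked before attaching a video to an overlay. Here is a minimal sketch of that check; the 100 MB cap is a placeholder assumption, since HP Reveal's actual limit is not documented in this write-up:

```python
import os

# Hypothetical size cap; HP Reveal's real limit is not stated here.
MAX_BYTES = 100 * 1024 * 1024


def fits_overlay_limit(path: str, max_bytes: int = MAX_BYTES) -> bool:
    """Return True if the video file is small enough to attach as an overlay."""
    return os.path.getsize(path) <= max_bytes
```

A check like this would flag which downloaded videos still need another compression pass before they can be attached.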
Results
HP Reveal does not keep metrics on how many times the objects you create AR experiences from are scanned, by you or by others, which is unfortunate because that would be a great indicator of how often these objects have been and will be scanned. However, the results of the project can be seen in this part of the overall project video (the link is time-stamped to jump directly to the results of the experience).
What Was Learned
Aside from being reminded of how fun LEGO builds are, I feel I successfully proved a new avenue is possible for AR projects. I also learned this approach is probably best reserved for scanning objects to reveal images rather than videos, or the videos should be very short. When a scanned object reveals a video, the video restarts if the phone used to scan it is even temporarily interrupted, whether by an external notification or call, or because the scanned object leaves the camera's field of view. Since the videos I made were each a few minutes long, this turned a fun, enjoyable experience into a slightly less pleasant one.
Next Steps
I would like to see HP Reveal expand its feature set, and I would like to broaden my research of AR apps to see what other possibilities are out there. I would like to see 3D images/holograms come to more mainstream apps, and to see learners interact more with revealed objects. For example, scanning an object could reveal a menu of options, or unlock content that is then saved on the user's device, so that once unlocked it can be consumed on demand rather than requiring another scan each time.