Augmented reality is a term describing applications that capture sensory data and overlay functional information onto it. The data is captured via a mobile device's built-in sensors, such as the camera, GPS receiver, and accelerometer. Nowadays, such applications are actively used in education and science, as well as by various professional establishments.
What is the difference between augmented reality and virtual reality? Virtual reality is more complex in its operating mechanisms: it completely replaces the elements users visually perceive with alternative ones predetermined by the application's script. This process usually requires additional devices to provide a sense of immersion (e.g. a virtual reality headset). Applications based on the augmented reality concept, in turn, can simply use a mobile device's screen to provide the user with a proper experience.
How do augmented reality applications work?
There are two essential operating principles of augmented reality applications: marker recognition, and the transformation of data received through GPS and other built-in sensors. A detailed analysis of these two operating mechanisms is presented below.
- Operating principle #1 – utilization of markers. Such applications work by juxtaposing pre-installed markers with data received through the camera of a user's smartphone or tablet. Both very simple objects (UPC and QR codes) and very complex ones (color shades, people's faces) can function as markers. What happens when a marker match is detected? The AR application executes a certain script (for example, it "draws" new virtual objects over the environment, similar to the popular mobile game Pokemon Go).
- Operating principle #2 – utilization of data received through the GPS receiver, gyroscope, accelerometer, etc. Data received from the device's built-in sensors is processed, and the information requested by the user is displayed on the device's screen (for example, applications that inform users about local sights operate on this principle).
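The marker-matching loop described in principle #1 can be sketched in plain JavaScript. The marker IDs, the `renderModel` helper, and the `onFrame` callback here are illustrative placeholders, not the API of any real library:

```javascript
// Map of pre-installed markers to the scripts they should trigger
// (hypothetical IDs for illustration).
const markerActions = {
  'qr:museum-entrance': (pose) => renderModel('info-panel', pose),
};

// Stand-in for real 3D rendering over the camera image: just record
// what would be drawn and where.
function renderModel(name, pose) {
  return { name, pose };
}

// Called once per camera frame with the markers a detector found.
function onFrame(detectedMarkers) {
  return detectedMarkers
    .filter((m) => markerActions[m.id])        // keep known markers only
    .map((m) => markerActions[m.id](m.pose));  // run the matching script
}
```

In a real application, `renderModel` would hand the marker's pose to a 3D engine; the filter-then-execute shape of the loop is the essential part.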
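Principle #2 can likewise be sketched in plain JavaScript: given a GPS fix and a list of sights, pick the nearest one to annotate on screen. The haversine distance formula is standard; the `nearestSight` helper and its 500 m radius are assumptions for illustration:

```javascript
const EARTH_RADIUS_M = 6371000;

function toRad(deg) {
  return (deg * Math.PI) / 180;
}

// Great-circle distance between two lat/lon points (haversine formula).
function distanceMeters(a, b) {
  const dLat = toRad(b.lat - a.lat);
  const dLon = toRad(b.lon - a.lon);
  const h =
    Math.sin(dLat / 2) ** 2 +
    Math.cos(toRad(a.lat)) * Math.cos(toRad(b.lat)) * Math.sin(dLon / 2) ** 2;
  return 2 * EARTH_RADIUS_M * Math.asin(Math.sqrt(h));
}

// Pick the nearest point of interest within `maxMeters` of the user.
function nearestSight(user, sights, maxMeters = 500) {
  let best = null;
  for (const s of sights) {
    const d = distanceMeters(user, s);
    if (d <= maxMeters && (!best || d < best.distance)) {
      best = { name: s.name, distance: d };
    }
  }
  return best;
}
```

A sights app would run this on every location update and overlay the result on the camera view.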
Now that we have covered the data recognition mechanisms of AR applications, let us choose a toolset dedicated to their creation. Below is an overview of popular frameworks commonly used in augmented reality development.
4 important stages in the development of augmented reality applications
Choosing the framework. Before you choose the best framework for developing an application based on the augmented reality concept, you must decide whether it is going to be native (i.e. dedicated to a single platform) or cross-platform. As practice shows, the latter option has been in demand among clients for the past few years due to its moderate financial and time requirements.
Native rendering is standard when building an augmented reality app with React Native. The experience of most developers indicates that it is one of the most valuable characteristics of this cross-platform development environment: it improves an application's responsiveness, which can be an issue when working with Ionic, for example. What can we say about Appcelerator Titanium? That framework imposes special requirements on installed libraries, which in certain ways limits augmented reality app developers. Another peculiarity of Appcelerator Titanium is the high price of its license: $39-$249 per workplace.
Let us summarize. Considering a number of extra advantages (user-friendliness, extensive technical support, API wrapper independence, suitability for complex scripts, and a special set of tools such as React Native Motion Manager, React Native Camera, and React Native Animatable), it is safe to say that React Native is an excellent choice if you want to build AR-based applications.
The operating principle of marker-recognition libraries such as JSARToolKit and JSAruco consists of the following processes. The camera image is captured continuously, frame by frame. The received data is then analyzed against pre-installed markers. Based on that analysis, the application obtains the position of each marker in the frame (specifically, its rotation and location). What happens to that data? The application uses it to render 3D models over the real image captured by the device's camera.
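As a minimal sketch of that last step, here is a 2D simplification of using a marker's detected location and turn to place a model's vertices in the frame. A real library hands back a full 3D transformation matrix; `placeModel` and its input shape are hypothetical:

```javascript
// Place model vertices over the camera image using the marker's
// detected position (x, y) and rotation (angle, in radians).
// 2D simplification of the 3D matrix a real detector would return.
function placeModel(vertices, marker) {
  const { x, y, angle } = marker;
  const cos = Math.cos(angle);
  const sin = Math.sin(angle);
  return vertices.map(([vx, vy]) => [
    x + vx * cos - vy * sin, // rotate the model by the marker's turn...
    y + vx * sin + vy * cos, // ...then move it to the marker's location
  ]);
}
```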
Regarding differences between the libraries, there are a few. In JSARToolKit, users can define the dimensions of uploaded images themselves. The smallest images are processed very rapidly; however, due to their small size, recognition can lack detail. In addition, JSARToolKit lets you customize the gradation threshold, which automatically defines which color a given pixel belongs to. JSAruco is less flexible: this library requires the size of an image to be comparable to the size of the 3D rendering canvas. Furthermore, problems can appear in JSAruco when the canvas is very wide (over 700 pixels); in such cases, marker detection can become quite a complex process.
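The gradation threshold mentioned above boils down to binarization: deciding, pixel by pixel, whether a grayscale value counts as marker (black) or background (white). A minimal sketch, with an assumed default threshold of 128:

```javascript
// Classify each grayscale pixel (0-255) as black (0) or white (255).
// Marker detectors run on the resulting binary image; tuning the
// threshold trades robustness to lighting against false positives.
function binarize(grayPixels, threshold = 128) {
  return grayPixels.map((p) => (p < threshold ? 0 : 255));
}
```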
Arranging the AR application's basic functionality. To create an application based on augmented reality technology and ensure it lives up to the anticipated requirements, you will have to work through a set of essential elements:
- Animation. Developers working with the React Native framework actively use React Native Animatable when creating AR applications; it makes the design more dynamic and attractive to users. In addition, React Native's approach to animation is fundamentally different from CSS thanks to its Animated API, which prevents lag under heavy load. Moreover, to achieve a full-fledged user experience, you can also add React Native Sound for playing sound clips.
- Image scanning. Scanning objects delivered through a mobile device's camera (UPC codes, video, photos) can be implemented via the special React Native Camera module, which is still under active development.
- Location recognition. React Native Motion Manager provides a wrapper around the device's accelerometer, gyroscope, and magnetometer. These tools can be very useful when developing AR applications based on location recognition.
- User interaction. React Native provides the PanResponder API for handling users' gestures. It offers thorough touch recognition and protection from unintended gestures. This tool can be especially useful when developing applications targeted at the entertainment industry, where a player can use different types of touch commands.
- Testing the application. In the experience of many developers, some images, places, or surfaces that should be identified by your application can sometimes fail to be recognized at all. While testing your software solution, you must take into consideration every factor influencing the "readability" of the streamed video, such as lighting or objects in the foreground blocking the view. You must also think about a data backup plan: a strategy for saving previously received data, which is extremely important in case of GPS or internet connection failure.
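The value-mapping idea behind the Animated API mentioned in the animation bullet can be illustrated in plain JavaScript. This is a simplified sketch, not React Native's actual implementation (which also supports easing curves and native-driver execution):

```javascript
// Map a driver value within inputRange onto outputRange, the way
// Animated.Value.interpolate maps e.g. a 0..1 progress value onto
// an opacity, rotation, or position.
function interpolate(value, [inMin, inMax], [outMin, outMax]) {
  const t = (value - inMin) / (inMax - inMin); // normalized progress
  return outMin + t * (outMax - outMin);
}
```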
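For the user-interaction bullet, here is a hedged sketch of turning a PanResponder-style gesture state (its `dx`/`dy` displacement fields) into a named gesture, with a small threshold that filters out unintended touches. The function name and the 30-pixel threshold are illustrative choices:

```javascript
// Classify a completed touch by its total displacement.
// Small movements count as a tap (protecting against accidental
// swipes); larger ones become a directional swipe.
function classifyGesture({ dx, dy }, minDistance = 30) {
  if (Math.abs(dx) < minDistance && Math.abs(dy) < minDistance) return 'tap';
  return Math.abs(dx) > Math.abs(dy)
    ? (dx > 0 ? 'swipe-right' : 'swipe-left')
    : (dy > 0 ? 'swipe-down' : 'swipe-up');
}
```

In a real app this would run inside `onPanResponderRelease`, which receives the gesture state.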
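The data backup plan from the testing bullet can be as simple as caching the last good GPS fix so the app can fall back on it when the signal or connection drops. A minimal sketch; the API shape is an assumption:

```javascript
// Keep the most recent valid location so a lost GPS signal degrades
// gracefully instead of blanking the AR overlay.
function makeLocationCache() {
  let lastFix = null;
  return {
    // Store a new fix if it is valid; ignore null/failed readings.
    update(fix) {
      if (fix) lastFix = fix;
      return fix;
    },
    // Latest known-good fix, used as the fallback.
    current() {
      return lastFix;
    },
  };
}
```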
The cost of augmented reality app development with React Native
Obviously, the exact cost of developing an AR application based on the React Native framework cannot be determined at once. Depending on the complexity of the functionality, the project's timeline can vary dramatically, directly affecting the cost of developing an AR app.
“How do I calculate the cost, then?” To help our clients estimate the expenses involved in developing new AR-based software, we have made an approximate calculation of the man-hours required to create its essential components:
- involvement of augmented reality tools (about 10 hours)
- development of 2D/3D objects (about 10 hours)
- development of operating scripts for an application based on AR technology (over 30 hours)
- CMS configuration (about 20 hours)
- backend (about 30 hours)
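The hour breakdown above can be turned into a rough budget figure; the hourly rate in the example below is a placeholder, not a quote from any vendor:

```javascript
// Approximate hour estimates from the breakdown above.
const estimatedHours = {
  arTools: 10,    // involvement of augmented reality tools
  models2d3d: 10, // development of 2D/3D objects
  arScripts: 30,  // operating scripts for the AR application
  cms: 20,        // CMS configuration
  backend: 30,    // backend
};

// Multiply the total hours by an assumed hourly rate.
function estimateCost(hours, hourlyRate) {
  const totalHours = Object.values(hours).reduce((sum, h) => sum + h, 0);
  return { totalHours, cost: totalHours * hourlyRate };
}
```

At a hypothetical $50/hour this yields 100 hours and $5,000; treat both numbers as a lower-bound sketch rather than a quote, since the script-development item alone is marked "over 30 hours".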
Prospects for further development of augmented reality technologies
Apple presented a new software toolkit for creating augmented reality applications at WWDC 2017. The package is called ARKit; it works with iOS 11 and is fully optimized for both the iPhone and iPad. Let us review its main features below.
- Visual Inertial Odometry. This technology helps detect environmental objects more precisely by combining the device's camera feed with concurrent CoreMotion data received through the iPhone's or iPad's gyroscope, accelerometer, magnetometer, barometer, and step counter. Such a comprehensive approach to identifying captured objects is currently one of the most precise on the market, and it does not require additional software calibration mechanisms.
- Definition of lighting levels and recognition of surfaces in a frame. ARKit can identify horizontal surfaces in a frame. Thanks to this capability, it can place virtual objects over real-life scenes as correctly as possible. Furthermore, mechanisms that measure the exact level of lighting allow adaptive illumination of AR elements, so that the scene with objects laid over it looks seamless and balanced.
- Accelerated rendering. ARKit takes advantage of the powerful Apple A9 and A10 processors and is able to redraw augmented reality objects within a period of time that stays under the threshold of comfortable animation perception – 150 ms.
As we can see, the list of ARKit's general features is quite impressive, complemented by its ability to integrate with third-party platforms, Unity and Unreal Engine, which a huge number of designers work with. Considering that, it is safe to say that ARKit is indeed a multipurpose tool for developing AR applications at an essentially new level.
ApplikeySolutions has a large team of developers who create applications focused on augmented reality concepts and based on React Native. We are currently a leading AR development company promoting its own applications in the field of mobile development in Ukraine. We have quite an extensive list of achievements, among them over 200 successfully finished projects and over 60 clients from across the world. In addition, our four years of active business experience in the domestic market have shaped ApplikeySolutions' team of highly qualified professionals, able to provide a high level of service amid fierce competition. In particular, we implement a number of projects based on AR concepts.
We use the React Native framework when developing AR applications, which allows us to create affordable software adapted for all major mobile platforms. This way, you do not have to spend extra financial and time resources on a large-scale startup launch.
Already have some ideas for AR-based applications? Contact us today and we will gladly work with you on the most beneficial terms and in the most timely manner.