What is ARKit?
AR stands for Augmented Reality. ARKit is Apple’s framework for building AR apps and games. It is a developer tool, so end users never need to worry about it; it is a development platform used by the developers of an iOS app development company.
The ARKit library was introduced by Apple at WWDC 2017. It allows developers to build apps using AR (Augmented Reality) technology.
~ AR apps allow us to view virtual objects in the real environment through the camera.
~ AR describes user experience that adds 2D or 3D elements to the live view from a device’s camera in a way that makes those elements appear to inhabit the real world.
~ For 2D objects, the SpriteKit library is used. For 3D objects, SceneKit is used.
~ With 2D AR, you can add overlays or signposts that respond to geographic location or visual features in real time. With 3D AR, you can visualize how a painting might look on the walls of your living room without having to leave your home.
~ At WWDC 2018, Apple released ARKit 2.0 with a slew of brand-new APIs and features for augmented reality development.
~ ARKit uses the device’s motion sensors combined with visual information from the camera to track the real world. Some clever math uses this tracking data to map the real 3D world onto your 2D screen.
What’s new in ARKit 2?
ARKit has some new additions that make it a lot better and more fun to use, giving developers new options to build with. Let’s take a look at what has changed in ARKit 2.
The rendering of augmented reality objects was already quite good, and Apple has improved it further. ARKit 2 unlocks the ability for museums or similar organizations to “scan” their exhibits and have information panels appear above, say, a statue.
3D Object Detection and Tracking
ARKit 1.5 added support for 2D image detection, letting you trigger an AR experience based on 2D images like posters, artwork or signs. ARKit 2 extends this to full 2D image tracking, so you can include movable objects like product boxes or magazines in your AR experiences. ARKit 2 also adds the ability to recognize known 3D objects like sculptures, toys or furniture.
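As a rough sketch of how this is enabled, a world-tracking configuration can be given reference images and reference objects to detect. The resource group names “AR Images” and “AR Objects” below are assumptions; they must match groups you create in your Xcode asset catalog.

```swift
import ARKit

// A minimal sketch of enabling ARKit 2 image and object detection.
// "AR Images" and "AR Objects" are assumed asset-catalog resource groups.
func makeDetectionConfiguration() -> ARWorldTrackingConfiguration {
    let configuration = ARWorldTrackingConfiguration()

    // Full 2D image tracking: detected images keep being tracked as they move.
    if let images = ARReferenceImage.referenceImages(inGroupNamed: "AR Images",
                                                     bundle: .main) {
        configuration.detectionImages = images
        configuration.maximumNumberOfTrackedImages = 2
    }

    // 3D object detection: recognizes previously scanned real-world objects.
    if let objects = ARReferenceObject.referenceObjects(inGroupNamed: "AR Objects",
                                                        bundle: .main) {
        configuration.detectionObjects = objects
    }
    return configuration
}
```

When ARKit recognizes one of these references, the session adds an `ARImageAnchor` or `ARObjectAnchor` that your renderer can attach content to.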
Saving and Loading World Maps
You can persist AR spaces and objects that are tied to physical objects (like toys) or physical spaces (like classrooms), so you can pick up where you left off last time.
According to Apple’s description, a world map comprises the anchors, objects and other features that ARKit uses to be aware of the space around the user. ARKit 2 brings the ability to persist these world maps, which opens the ability to share these experiences with others or save them for use in the same application later.
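A sketch of the persistence flow, under the assumption that the app has chosen some file location (`worldMapURL` below is hypothetical): the current map is requested from the session, archived to disk, and later fed back into a new configuration.

```swift
import ARKit

// A sketch of persisting an ARWorldMap so a session can be restored later.
// "worldMapURL" is a hypothetical file location chosen for this example.
func saveWorldMap(from session: ARSession, to worldMapURL: URL) {
    session.getCurrentWorldMap { worldMap, error in
        guard let map = worldMap else {
            print("Cannot get world map: \(error?.localizedDescription ?? "unknown")")
            return
        }
        do {
            let data = try NSKeyedArchiver.archivedData(withRootObject: map,
                                                        requiringSecureCoding: true)
            try data.write(to: worldMapURL, options: .atomic)
        } catch {
            print("Failed to save world map: \(error)")
        }
    }
}

// Restoring: assign the unarchived map to a configuration's initialWorldMap.
func restoreConfiguration(from worldMapURL: URL) throws -> ARWorldTrackingConfiguration {
    let data = try Data(contentsOf: worldMapURL)
    let map = try NSKeyedUnarchiver.unarchivedObject(ofClass: ARWorldMap.self,
                                                     from: data)
    let configuration = ARWorldTrackingConfiguration()
    configuration.initialWorldMap = map
    return configuration
}
```

The same archived `ARWorldMap` data can also be sent to another device, which is the basis for shared experiences.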
Shared Experiences
Multiple users will be able to use their iOS devices to see the identical virtual environment, each from their own viewpoint. Apple is releasing a block-breaking multiplayer game as a sample-code demo, and Lego showed off a virtual play space where up to four players can interact with a merged virtual and real play space together.
The USDZ File Format
Apple worked with Pixar to create a new file format for augmented reality objects called USDZ. It will be supported by third-party apps, including ones by Autodesk, Adobe, Sketchfab and more. USDZ is a new file format used to display and share 3D content natively in iOS, with optimizations for storage and sharing. On iOS 12, built-in apps such as Safari, Messages, Mail, News and Notes can natively Quick Look USDZ files of virtual objects in 3D or AR.
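Third-party apps can offer the same Quick Look experience through the QuickLook framework. A minimal sketch, assuming a hypothetical bundled file named “toy_robot.usdz”:

```swift
import UIKit
import QuickLook

// A sketch of previewing a USDZ model with Quick Look on iOS 12+.
// "toy_robot.usdz" is a hypothetical file assumed to be in the app bundle.
class USDZPreviewDataSource: NSObject, QLPreviewControllerDataSource {
    func numberOfPreviewItems(in controller: QLPreviewController) -> Int {
        return 1
    }

    func previewController(_ controller: QLPreviewController,
                           previewItemAt index: Int) -> QLPreviewItem {
        // NSURL conforms to QLPreviewItem, so the file URL can be returned directly.
        return Bundle.main.url(forResource: "toy_robot",
                               withExtension: "usdz")! as NSURL
    }
}

func presentUSDZ(from viewController: UIViewController,
                 dataSource: USDZPreviewDataSource) {
    let preview = QLPreviewController()
    preview.dataSource = dataSource
    viewController.present(preview, animated: true)
}
```

Quick Look handles the 3D and AR viewing modes itself, so no ARKit code is needed for this path.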
Environment Texturing
Environment textures are cube-map textures that depict the view in all directions from a specific point in a scene. In 3D asset rendering, environment textures are the basis for image-based lighting algorithms, where surfaces can realistically reflect light from their surroundings. ARKit can generate environment textures during an AR session using camera imagery, allowing SceneKit or a custom rendering engine to provide realistic image-based lighting for virtual objects in your AR experience.
When using augmented reality, it’s necessary to make objects blend in with the environment around them. In the first version of ARKit, features such as ambient light estimation tried to make the virtual object “fit in” with the scene. ARKit 2 allows objects to reflect the textures around them.
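Enabling this is a one-line configuration change; a minimal sketch:

```swift
import ARKit

// A sketch of enabling automatic environment texturing in ARKit 2.
// ARKit then generates cube-map textures from camera imagery as the
// session runs, which renderers use for image-based lighting.
func makeEnvironmentTexturingConfiguration() -> ARWorldTrackingConfiguration {
    let configuration = ARWorldTrackingConfiguration()
    configuration.environmentTexturing = .automatic
    return configuration
}
```

With `.automatic`, ARKit places environment probe anchors for you; `.manual` lets you position `AREnvironmentProbeAnchor`s yourself.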
ARKit apps typically use SceneKit for rendering. SceneKit is the native 3D rendering engine for iOS, with direct hooks into ARKit.
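The hook between the two is `ARSCNView`, which owns both an `ARSession` and a SceneKit scene and composites SceneKit content over the camera feed. A minimal sketch of a view controller wiring them together:

```swift
import UIKit
import ARKit
import SceneKit

// A minimal sketch of wiring ARKit to SceneKit through ARSCNView.
class ARViewController: UIViewController {
    let sceneView = ARSCNView()

    override func viewDidLoad() {
        super.viewDidLoad()
        sceneView.frame = view.bounds
        view.addSubview(sceneView)
        sceneView.scene = SCNScene()   // SceneKit content drawn over the camera feed
        sceneView.autoenablesDefaultLighting = true
    }

    override func viewWillAppear(_ animated: Bool) {
        super.viewWillAppear(animated)
        // Start world tracking when the view appears.
        sceneView.session.run(ARWorldTrackingConfiguration())
    }

    override func viewWillDisappear(_ animated: Bool) {
        super.viewWillDisappear(animated)
        sceneView.session.pause()
    }
}
```

Any `SCNNode` added to `sceneView.scene.rootNode` then appears anchored in the real world.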
There are a few requirements for using ARKit.
An A9 or later processor: only the iPhone 6s and newer, the 2017 iPad and the iPad Pro models can run ARKit.
You will need plenty of space. To play ARmageddon, you will need clear space so you can capture bugs without falling over your furniture.
If your room has plain white tiles, furniture and walls, or is too dark, things won’t work well; the camera needs contrast in order to identify surfaces and the distances of objects.
3D model format
3D models are used in the DAE and SCN formats.
We can create custom 3D models using the free Blender software. There are a few native formats available in SceneKit that we can use to load a 3D model. The DAE format allows us to have multiple objects in the scene file, including cameras and lights as well as any geometry.
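Loading such a file is a one-liner in SceneKit; a sketch, where the file name “model.dae” and the node name “robot” are hypothetical and would match whatever you exported from Blender:

```swift
import SceneKit

// A sketch of loading a DAE model from the app bundle.
// "model.dae" and the node name "robot" are assumed examples.
func loadModelNode() -> SCNNode? {
    guard let scene = SCNScene(named: "model.dae") else { return nil }
    // A DAE file can contain several objects (geometry, cameras, lights);
    // pull out the specific node we want by the name set in the modeling tool.
    return scene.rootNode.childNode(withName: "robot", recursively: true)
}
```

The returned node can then be added to an `ARSCNView`’s scene like any other SceneKit node.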
Sizing and Units
SceneKit makes use of meters as a unit of measurement for sizing and physics simulations. So when there is a mention of size, including the Scene Editor in Xcode, it is always in context to meters.
In Blender, you also need to ensure you’re working in meters. Blender’s default is a meter unit, but it’s always safer to check than deal with a giant or minuscule model in your scene.
SceneKit operates in a “Y-up” system, meaning the Y-axis is pointing up, whereas Blender works with the Z-axis pointing up. We should have knowledge of this before exporting our scene and when loading it into SceneKit. Usually, this isn’t a problem, as the exporter normally takes care of the conversion. Depending on whether you’re using a custom exporter or are working in a different coordinate system, you may need to rotate the model inside of your modeling application.
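If the conversion does need to be done in code, one common fix is a quarter-turn about the X axis; a sketch:

```swift
import SceneKit

// A sketch of compensating for a Z-up model in SceneKit's Y-up system,
// in case the exporter did not perform the conversion.
func applyYUpCorrection(to modelNode: SCNNode) {
    // Rotate -90 degrees about the X axis so the model's Z-up
    // orientation becomes SceneKit's Y-up.
    modelNode.eulerAngles.x = -Float.pi / 2
}
```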
How World Tracking Works
To create a correspondence between real and virtual spaces, ARKit uses a technique called visual-inertial odometry. This process combines information from the iOS device’s motion-sensing hardware with the computer vision analysis of the scene visible to the device’s camera. ARKit recognizes notable features in the scene image, tracks differences in the positions of those features across video frames, and compares that information with motion sensing data. The result is a high-precision model of the device’s position and motion.
World tracking also analyzes and interprets the contents of the scene. Use hit-testing methods to find real-world surfaces corresponding to a point in the camera image. If you enable the plane detection setting in your session configuration, ARKit identifies flat surfaces in the camera image and reports their positions and sizes. You can use hit-test results or detected planes to place or interact with virtual content in your scene.
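A sketch of both halves of that workflow, assuming a SceneKit-backed view: plane detection is turned on in the configuration, and a tap location is hit-tested against detected planes to position a node.

```swift
import ARKit
import SceneKit

// A sketch of enabling plane detection in the session configuration.
func makePlaneDetectionConfiguration() -> ARWorldTrackingConfiguration {
    let configuration = ARWorldTrackingConfiguration()
    configuration.planeDetection = [.horizontal]
    return configuration
}

// Call from a tap handler; "point" is a location in the view's coordinates.
func placeNode(_ node: SCNNode, at point: CGPoint, in sceneView: ARSCNView) {
    // Hit-test against planes ARKit has already detected.
    guard let result = sceneView.hitTest(point, types: .existingPlaneUsingExtent).first
    else { return }
    // The last column of the world transform holds the hit position in world space.
    let t = result.worldTransform.columns.3
    node.position = SCNVector3(t.x, t.y, t.z)
    sceneView.scene.rootNode.addChildNode(node)
}
```

This is the standard pattern for “tap to place” interactions: the detected plane supplies a real-world surface, and the hit test converts a 2D screen point into a 3D world position on it.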
In this article, we have covered what’s new in ARKit 2, its file format, its coordinate system, and more. With these new formats and features, we can take our applications to the next level. These features will benefit the developers of a Mobile app development company, allowing them to advance their apps.
Happy Coding! and feel free to share your own experience on “[email protected]”.