It will be no surprise to you that technology is changing and enhancing the world we live in. The digitalisation of everything from personal banking to song recognition, from self-parking cars to motion-sensing robots, is a shining example of how the line between the human and the digital is constantly being pushed.
This wider exploration of the human world has led to the creation, experimentation, and development of technologies that can both alter our understanding of reality and change the world as we know it. No surprise, then, that huge technological advancements like AR and VR are becoming household terms.
What are the differences, you ask?
Augmented Reality (AR) alters the perception of a real-world environment for its users, whereas Virtual Reality (VR) (which we’re all more familiar with) replaces the real-world environment entirely. VR is completely immersive by design, and in some ways that’s restrictive: some people find VR experiences too claustrophobic or too overwhelming, and can even suffer negative physical effects, succumbing to ‘virtual reality sickness’ with motion-sickness-like symptoms. These kinds of experiences paved the way for a new alternative reality to be experimented with, aka Augmented Reality (AR).
How AR is integrated into the modern world is always a hot topic for those in the tech world: how to utilise it for consumer experiences, technological improvements, industry efficiencies, and more. But it was actually the U.S. Air Force that developed the first AR application in 1992, with ideas and theories about overlaying electronic data onto real life germinating since the beginning of the 20th century. If you’re not familiar with AR, think of the heads-up overlay seen through the eyes of James Cameron’s T-800 in The Terminator (1984), an early fictional example of how it could be utilised.
This is where Mixed Reality (MR) comes into play too, and it’s something our solutions constantly explore: the idea of real and virtual worlds coming together to create new environments and visualisations, allowing the physical and digital to co-exist. The phrase ‘let’s get phygital’ comes to mind!
Who's experimenting with AR?
The world’s biggest names in tech are quickly coming on board with AR and investing huge resources in developing AR products, Apple in particular with its hotly anticipated ‘ARKit’. Retail giants like Ikea are poised to take advantage of ARKit’s release with their own virtual AR catalogue, allowing customers to envision products to scale in their own homes.
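For the curious, here’s what that kind of to-scale placement can look like in code. This is a minimal, illustrative Swift sketch using Apple’s ARKit and SceneKit rather than Ikea’s actual implementation; the view controller and the ‘chair.scn’ model name are placeholders. The idea: detect a horizontal surface, tap to place, and let real-world units keep the model true to scale.

```swift
import UIKit
import ARKit
import SceneKit

// Illustrative only, not Ikea's implementation: detect a horizontal plane,
// then tap to anchor a 3D model (authored in metres, so it appears to scale).
class PlacementViewController: UIViewController {
    let sceneView = ARSCNView()

    override func viewDidLoad() {
        super.viewDidLoad()
        sceneView.frame = view.bounds
        view.addSubview(sceneView)
        sceneView.addGestureRecognizer(
            UITapGestureRecognizer(target: self, action: #selector(placeModel(_:))))
    }

    override func viewWillAppear(_ animated: Bool) {
        super.viewWillAppear(animated)
        // World tracking with plane detection finds real surfaces (floors, tables).
        let config = ARWorldTrackingConfiguration()
        config.planeDetection = [.horizontal]
        sceneView.session.run(config)
    }

    @objc func placeModel(_ gesture: UITapGestureRecognizer) {
        let point = gesture.location(in: sceneView)
        // Hit-test the tap against detected planes to get a real-world position.
        guard let hit = sceneView.hitTest(point, types: .existingPlaneUsingExtent).first,
              let scene = SCNScene(named: "chair.scn"),          // placeholder asset
              let model = scene.rootNode.childNodes.first else { return }
        let t = hit.worldTransform.columns.3
        model.position = SCNVector3(x: t.x, y: t.y, z: t.z)
        sceneView.scene.rootNode.addChildNode(model)
    }
}
```

The key point for AR product visualisation is that ARKit works in metres, so a correctly authored model needs no manual scaling to look right in your living room.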
Google also dipped their toes into AR early in the exploration of this technology, launching ‘Google Glass’ in 2013 to a consumer public that wasn’t yet ready for it. We are glad to report, however, that the famous ‘Glass’ is on the cusp of a revival through adoption by the manufacturing industry and autism-friendly applications, so this is one to watch! Other notable mentions would be Microsoft’s MR headset, the HoloLens 2, and the head-mounted display technology of ‘Magic Leap’.
The general public is also coming round to AR technology. One of the biggest milestones in that process was the launch of ‘Pokémon Go’ in 2016, and the totally unprecedented ‘Pokémania’ that followed. Within a month of its release the mobile game had been downloaded 100 million times, and 231 million people engaged in over a billion social media interactions about the game in July 2016 alone.
An epic introduction to the world of AR, and how it can enhance our interaction with true reality!
Our ‘phygital’ relationship with AR
The real world is really important in the context of AR. Physical objects have a place and a value in the AR world, and provide context for the digital elements of AR applications. What the PufferSphere offers within this augmented world is a physical AND digital solution, and this is why we’re constantly interested in AR’s interplay with it. The PufferSphere doesn’t rely on AR, but it is an intuitive, real-world presence that can complement it and form the basis for unique experiences.
Whether through the space industry’s explorations (how we visualise and contextualise planets, solar systems, and orbits), Earth observation (how we live-track satellite activity), the visualisation of weather patterns (and their relationship with the world as a whole), the study of the moon and tides, or the interplay of atmospheric effects, the PufferSphere delivers a digital world in physical space, extending the intuitive qualities of information and making them the centre of the conversation. The PufferSphere also offers the opportunity to intensify these experiences through AR overlays and intuitive applications that extend the display into the virtual space around the sphere.
Our slogan, ‘We Make Digital Real’, echoes the very sentiment of AR and VR: providing ‘phygital’ objects that everyone can interact with, debate, and use for further understanding. AR additions can extend the glow and warmth of our digital campfires.
PufferTouch technology and AR
At Pufferfish, we’re always looking at how our technology interacts with other technologies and how that can enhance the PufferTouch experience. It’s no surprise then that we’re constantly experimenting, advancing, and improving our understanding of the AR landscape. Some of the ways in which we’ve done this are as follows:
Our first step was experimenting with markers placed around the Sphere. We quickly found that, for an object that can be viewed from all sides (as the PufferSphere can), markers around the Sphere are not ideal: they could be occluded by the spherical shape itself, and markers placed on the wider unit were not always within the viewing device’s field of view.
We then looked at using markers embedded in the digital content shown on the Sphere. These markers proved far more functional and could be used in different forms on the PufferSphere: at the top level, discoverable under a button, or through another interface. We were happy to see successful integrations that trigger AR augmentations from key points on the Sphere, and we will continue to experiment with these functions and with minimising the visual scale and impact of the markers.
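To make that idea a little more concrete, here’s a minimal Swift sketch of the content-embedded marker approach using ARKit’s image tracking. It isn’t our production pipeline: the ‘SphereMarkers’ resource group and the simple highlight overlay are assumptions for illustration. The principle is that a marker graphic rendered in the sphere’s content is registered as a reference image, and the viewing device anchors AR content wherever it recognises that image.

```swift
import UIKit
import ARKit
import SceneKit

// Illustrative sketch: recognise marker graphics rendered in on-sphere content
// and attach an AR overlay at each detected marker.
class MarkerOverlayViewController: UIViewController, ARSCNViewDelegate {
    let sceneView = ARSCNView()

    override func viewDidLoad() {
        super.viewDidLoad()
        sceneView.frame = view.bounds
        view.addSubview(sceneView)
        sceneView.delegate = self

        // "SphereMarkers" is a placeholder AR Resource Group holding the marker images.
        guard let markers = ARReferenceImage.referenceImages(inGroupNamed: "SphereMarkers",
                                                             bundle: nil) else { return }
        let config = ARImageTrackingConfiguration()
        config.trackingImages = markers
        config.maximumNumberOfTrackedImages = 2
        sceneView.session.run(config)
    }

    // Called when a marker is recognised; place a translucent highlight over it.
    func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor) {
        guard let imageAnchor = anchor as? ARImageAnchor else { return }
        let size = imageAnchor.referenceImage.physicalSize
        let plane = SCNPlane(width: size.width, height: size.height)
        plane.firstMaterial?.diffuse.contents = UIColor.cyan.withAlphaComponent(0.5)
        let overlay = SCNNode(geometry: plane)
        overlay.eulerAngles.x = -.pi / 2   // lay the plane flat on the detected marker
        node.addChildNode(overlay)
    }
}
```

Worth noting: ARKit’s reference images are assumed to be flat, so the sphere’s curvature is one more reason we keep experimenting with smaller, less visually intrusive markers.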
Then, in collaboration with another partner, we started exploring globe-based AR using the outline of the Earth’s landmasses as the marker. We saw hugely exciting and intuitive results from this experiment, and while we can’t say much more at this stage, what we can say is: watch this space!
The blending of dimensions
The ‘professional tech’ space is also exploring this pulling together of functionalities, this blurring of the technological and physical worlds, and is centred on trying to make the blending of dimensions as seamless as possible, bringing AR, VR, and MR to life in flexible, durable, and usable ways.
We were thrilled to see this idea of ‘seamless interoperability’ rising to the top of many of the conversations we had at ISE in Amsterdam. See more on this in our recent ISE2019 Lab post.
Ultimately, the PufferSphere and AR relationship continues with further testing and adaptation, but these are truly exciting advancements for the Pufferfish team, as they will allow us to further enrich how we experience digital content.
And who doesn’t want that?!