UI Design in 2020: Designing for the Users of the Future
User experience is more important than ever, and we are fast approaching the day when user interfaces will mean much more than screens on devices. By 2020, an estimated 30% of web browsing sessions will be done without a screen. As interfaces shift away from recognisable, screen-based forms, what will UI look like in 2020?
User interface design usually refers to the physical interaction with screens, pages, and visual features such as buttons and icons on a device, but it also covers non-physical interfaces such as voice, augmented reality and gesture. User interface (UI) design must not only be appealing to prospective users; it must also be practical and built with those users in mind. The goal of UI design is to make the user's interaction as straightforward and effective as possible, with a focus on capability and responsiveness, to create a good user experience. This has been key to the success of thousands of products over the past few decades. But as more UIs evolve away from recognisable, screen-based models, the future of UI looks increasingly screenless.
So where does that leave designers? A new generation of voice, augmented reality and gesture-driven devices is already here, and it raises an uncomfortable question: are screen-honed design skills becoming redundant? Far from it. The same UI design fundamentals that made screen-based products succeed will be vital to the success of these new products.
By 2020, the need to use your hands and eyes for browsing may be replaced by voice and speech alone. Voice user interfaces (VUIs) make human-to-device interaction possible through a voice and speech platform that triggers a response automatically. A familiar example is Siri, the smartphone virtual assistant controlled by the user's voice; Siri is currently the most popular virtual assistant, with over 40 million monthly active users. Voice- and audio-based products such as Amazon's Alexa-powered Echo and Google Home have also taken off in a huge way over the past few years. The main benefit of a voice user interface is that it offers a hands-free, eyes-free way for users to interact with a device while focusing their attention elsewhere. An analysis of smart speaker ownership by Voicebot found that 75% of households will own a smart speaker by 2020. At present, however, people's willingness to engage with VUIs in public spaces is still relatively low: speaking to a VUI can feel awkward, so many users avoid their virtual assistants in public. While there are still wrinkles to iron out in voice interface design, these problems are solvable, and the technology is evolving faster than our habits around it.
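The "triggers a response automatically" step can be pictured as intent routing: once speech has been transcribed to text, the system decides which action the utterance maps to. Here is a minimal, illustrative sketch in Python; the intent names and keyword rules are assumptions for demonstration, not the API of any real assistant such as Siri or Alexa.

```python
# Toy intent router for a voice UI. Assumes a speech-to-text engine has
# already produced a transcript; all intents and keywords are illustrative.

INTENT_KEYWORDS = {
    "play_music": {"play", "music", "song"},
    "set_timer": {"timer", "alarm", "remind"},
    "weather": {"weather", "forecast", "rain"},
}

def route_intent(transcript: str) -> str:
    """Pick the intent whose keywords best match the transcribed utterance."""
    words = set(transcript.lower().split())
    best_intent, best_score = "unknown", 0
    for intent, keywords in INTENT_KEYWORDS.items():
        score = len(words & keywords)  # count of overlapping keywords
        if score > best_score:
            best_intent, best_score = intent, score
    return best_intent
```

Real VUI platforms use statistical language understanding rather than keyword overlap, but the design problem is the same one this sketch exposes: the interface has no buttons, so every user goal must be recoverable from words alone.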
By 2020, an estimated 100 million consumers will shop in augmented reality. AR technology delivers captivating experiences that blend the real world with virtual objects: in AR apps, 3D virtual objects are overlaid on the device's view of the real world, creating the illusion that they exist in the space around you. This technology is changing the way users receive information. Boeing, for example, uses AR glasses to instruct technicians as they wire and repair multitudes of planes a year. The wearable technology helps them recognise and precisely connect thousands of wires, using only their eyes and voice to control it. As a result, Boeing cut production time by 25% and reduced error rates to almost zero.
Retailers such as Ikea are also jumping on the augmented reality trend. The Ikea app lets users point their phone at an area of their home and generate an augmented view of the same space, complete with digital furniture. According to research from MavenEcommerce, consumers have already demonstrated their appetite for this kind of experience. This emerging technology will bring enormous changes to UI design.
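The "overlay" at the heart of these AR experiences is, at its simplest, a camera projection: a 3D anchor point (say, the corner of a virtual sofa) is mapped onto 2D screen pixels every frame. The sketch below shows a basic pinhole-camera projection; real frameworks like ARKit and ARCore handle this internally, and the focal length and image-centre values here are made-up example numbers.

```python
# Illustrative pinhole projection of a camera-space 3D point to screen
# pixels, the core of placing virtual furniture "in" a room. The focal
# length and principal point (cx, cy) are assumed example values for a
# 1280x720 view, not values from any real AR framework.

def project_point(x: float, y: float, z: float,
                  focal: float = 800.0,
                  cx: float = 640.0, cy: float = 360.0) -> tuple:
    """Project a 3D point (metres, z pointing away from the camera) to pixels."""
    if z <= 0:
        raise ValueError("point is behind the camera")
    u = focal * (x / z) + cx  # horizontal pixel coordinate
    v = focal * (y / z) + cy  # vertical pixel coordinate
    return (u, v)
```

Note how the division by z makes distant objects shrink toward the image centre, which is what sells the illusion that the digital furniture sits at a fixed spot in the room as the user moves their phone.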
Gesture-based technology is already being built into products by organisations such as Reemo, whose wrist-worn controller acts as a mouse driven by sensors and cameras. Similarly, the MYO armband connects to a device and lets users control it through arm and hand gestures, enabling gesture control of objects such as lights, appliances, blinds and computer hardware. Much like voice user interfaces, however, gestural interfaces carry a social stigma today: they require users to behave in unorthodox ways, waving their arms and hands around, and this fear of social abnormality currently hinders acceptance. For gestural UI to succeed, designers will need to build on accepted, simple everyday actions rather than exaggerated ones. Gesture technology will matter most over the next few years where it can make key interactions with everyday objects easy and instinctive.
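Under the hood, a gestural interface turns raw motion data from sensors into a small vocabulary of named gestures. As a hedged sketch of that idea, the function below classifies a net hand displacement by simple thresholding; the threshold value and gesture names are illustrative assumptions, not the Reemo or MYO APIs.

```python
# Toy gesture classifier: map a net hand displacement (e.g. integrated
# from an armband's motion sensors) to a coarse named gesture.
# The threshold and gesture vocabulary are assumed, illustrative values.

SWIPE_THRESHOLD = 0.5  # assumed minimum net travel, in metres

def classify_gesture(dx: float, dy: float) -> str:
    """Classify a net displacement (dx right, dy up) into a coarse gesture."""
    if abs(dx) < SWIPE_THRESHOLD and abs(dy) < SWIPE_THRESHOLD:
        return "none"  # small motions are ignored, so idle hands do nothing
    if abs(dx) >= abs(dy):
        return "swipe_right" if dx > 0 else "swipe_left"
    return "swipe_up" if dy > 0 else "swipe_down"
```

The design point the sketch makes explicit is the one raised above: the threshold decides how exaggerated a gesture must be. Set it high and users must wave their arms around; set it low enough to catch subtle, everyday motions and the interface stops feeling socially abnormal.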
Essentially, the next surge of UI design will be gestural, and voice UI will no doubt have improved significantly by then. The future of the user interface is heading in a direction where screens make up the smallest share of overall interaction. Users are becoming accustomed to engaging with devices in new ways, so it is important that designers and developers start thinking now about how to move UIs past screens, buttons, boxes and icons. Alternative UIs are already here, and the more welcoming we are, the better chance they have to enrich our world beyond screens. A screen is just one medium for an interface, and we increasingly expect our devices to work whether or not we are looking at them. So the question we need to ask is: do we need screens at all?