Here's What Healthcare CIOs Can Learn From Apple's Worldwide Developer Conference
Apple has always staged its product announcements with theatrical flair, but the unveiling of its latest iPhone operating system at this month's Worldwide Developer Conference did more than that. The company's plan for iOS 14 centers on easy-to-access interfaces that present information to users clearly.
Healthcare organizations, whose patients contend with a similar glut of information and confusion, would be wise to study Apple's approach and take notes. The key enhancements in iOS 14 aim to give iPhones more of the flexibility found on larger form factors. For example, iOS 14 brings "picture in picture" to the iPhone, letting users keep a video or FaceTime session on screen while performing other actions, such as opening apps. The video window can be moved around the screen, resized or even slid off-screen while the audio continues to play.
The new iOS also revamps the way Siri operates, amping up its capabilities with advanced technologies such as artificial intelligence and Apple's Neural Engine. The visual interface has been redesigned to make interactions more seamless, with search results appearing at the top of the screen. Search capabilities have been expanded as well, and Apple says Siri now has access to 20 times the information it had three years ago.
Dictation capabilities also have been enhanced in the new iOS. Siri can now start a recording that is quickly transcribed into text or saved as an audio message, and on-device neural processing makes the dictation experience faster and more accurate. Onboard technology also enables Siri to help with translation: a new translation app that works offline can translate text and voice between any of 11 languages, showing both sides of the conversation when the phone is held in landscape mode.
Wearables also get a boost from the new operating system. For example, watchOS 7 will automatically detect handwashing: using motion detection, the watch recognizes when the wearer is washing their hands, then uses audio and haptic signals to ensure the wash lasts long enough to be effective.
Features for healthcare platforms to consider
All of this offers real potential for the healthcare industry. The omnichannel experience of looking up information while conducting a video chat is an especially useful development, as more clinicians have shifted to virtual interactions with patients amid concerns over face-to-face meetings in the Covid-19 era.
From the patient's perspective, smoother video interactions can improve telehealth consultations. Patients don't want to open a separate app to look up information or read their medical records during a consultation, so the integrated experience that iOS 14 offers can make virtual encounters easier for everyone.
Advances in voice technology can be a big help too. Physicians can already use Siri dictation for documentation tasks, and the new translation capabilities give clinicians a way to communicate with patients who speak a different language. The Apple Watch's ability to monitor and encourage effective handwashing, meanwhile, can help reduce the potential for Covid-19 infection.
Like other smartphones, Apple's iPhones and the wearables that pair with them under iOS 14 can also support contact tracing, though privacy concerns persist among users. Combining capabilities like contact tracing with the design-thinking approach Apple exhibits is a formula for success that healthcare CIOs would do well to follow.