Key takeaways:
- Apple has introduced new features that will allow people with disabilities to interact with others and their environment through an iPhone.
- Apple says the features are designed to assist with cognitive, vision, hearing, and mobility accessibility.
- Some of the features will allow iPhone users to type to speak during calls or create a synthesised voice that sounds like them.
Apple may be quietly foraying into the AI race, just not in the most obvious way.
On Tuesday, May 16, 2023, the tech giant previewed software features that are "for cognitive, vision, hearing, and mobility accessibility, along with innovative tools for individuals who are non-speaking or at risk of losing their ability to speak."
Apple says the features were built in deep collaboration with groups that represent people with disabilities, and that they will help users with disabilities operate their iPhones with greater independence.
Making your iPhone sound like you
One of the features is called Live Speech. It will let people who cannot speak type what they want to say and have it spoken aloud during phone calls, FaceTime calls, and in-person conversations. They can also save commonly used phrases for quick access mid-conversation.
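Apple has not shared how Live Speech is built, but the core mechanic, speaking typed text aloud, can be sketched with AVSpeechSynthesizer, the public text-to-speech API that already ships on iPhone. The voice choice and phrase list below are illustrative assumptions, not Apple's implementation:

```swift
import AVFoundation

// Minimal sketch of a Live Speech-style flow: typed text is spoken aloud.
// AVSpeechSynthesizer is Apple's public text-to-speech API; the voice
// choice and saved phrases here are illustrative assumptions.
let synthesizer = AVSpeechSynthesizer()

func speak(_ typedText: String) {
    let utterance = AVSpeechUtterance(string: typedText)
    utterance.voice = AVSpeechSynthesisVoice(language: "en-GB")
    synthesizer.speak(utterance)
}

// Commonly used phrases, saved for quick reuse during a conversation.
let savedPhrases = ["I'll call you right back.", "Thanks, talk soon."]
speak(savedPhrases[0])
```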
For those at risk of losing their ability to speak, a feature called Personal Voice will let them create a synthesised voice that sounds like their own in about 15 minutes.
According to Apple, iPhone users will be able to "create a Personal Voice by reading along with a randomised set of text prompts to record 15 minutes of audio on iPhone or iPad."
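For developers, Apple later exposed Personal Voice through additions to AVFoundation. Assuming the iOS 17 SDK's requestPersonalVoiceAuthorization call and isPersonalVoice voice trait, an app could speak with a user-created voice roughly as sketched below; treat the availability details as assumptions rather than something stated in Apple's preview:

```swift
import AVFoundation

let synthesizer = AVSpeechSynthesizer()

// Sketch assuming the iOS 17 AVFoundation additions for Personal Voice.
// Apps must be granted permission before they can use a voice the user
// created on their own device.
AVSpeechSynthesizer.requestPersonalVoiceAuthorization { status in
    guard status == .authorized else { return }

    // Look for a voice the user trained on-device.
    let personalVoice = AVSpeechSynthesisVoice.speechVoices()
        .first { $0.voiceTraits.contains(.isPersonalVoice) }

    let utterance = AVSpeechUtterance(string: "Hi, it's really me.")
    utterance.voice = personalVoice   // nil falls back to the default voice
    synthesizer.speak(utterance)
}
```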
How the blind will see with Apple
Point and Speak is an upcoming feature designed to help blind and low-vision people read the physical world through the iPhone. Apple said it will enable people with vision disabilities "to interact with physical objects that have several text labels."
Using a household keypad as an example, Apple said the feature will read out the text on each button as the person moves a finger across it. Point and Speak combines input from the iPhone's camera, its LiDAR scanner, and on-device machine learning to recognise the text.
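Apple has not detailed the pipeline behind Point and Speak, but the text-recognition step it describes is exactly what the on-device Vision framework provides. A minimal sketch of that one building block follows (the function name and CGImage input are illustrative; the LiDAR and finger-tracking parts are omitted):

```swift
import CoreGraphics
import Vision

// Minimal sketch of on-device text recognition with Apple's Vision
// framework, one plausible building block behind Point and Speak.
// This is not Apple's implementation.
func recognizeLabels(in image: CGImage) throws -> [String] {
    let request = VNRecognizeTextRequest()
    request.recognitionLevel = .accurate   // runs entirely on-device

    let handler = VNImageRequestHandler(cgImage: image, options: [:])
    try handler.perform([request])

    // Keep the top candidate string for each region of text found.
    let observations = request.results ?? []
    return observations.compactMap { $0.topCandidates(1).first?.string }
}
```

In the shipping feature, recognition like this would be paired with the LiDAR scanner's depth information to work out which label the user's finger is pointing at.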
Apple did not give a specific date for when these features will be rolled out, but says they will arrive later this year.
Apple has an unusual AI strategy
According to Bernard Marr, an author and strategic adviser to influential companies, Apple's AI strategy centres on its devices.
While the newly previewed features may look like a fresh foray into the AI space, Marr notes that the company has been shipping notable AI innovations at least since the iPhone X in 2017, and earlier still with its virtual assistant, Siri.
While companies like OpenAI, Google, and Microsoft take a mostly software-based approach to AI, Marr says Apple's vision is "powerful handheld devices that are capable of running their own machine learning on datasets gathered via their own array of sensors."
Is Apple's AI strategy smart?
Although Apple is taking a different approach to AI innovation, some feel it is not the best one. One article argues that Apple's AI strategy is not smart: it limits the company's AI to people who use its devices. OpenAI and Google have shown that openly available technologies are essential to AI development, and with Apple's tightly closed ecosystem, it is fair to ask whether it can compete with the likes of OpenAI and Google.