iPhone helps blind users
Apple has introduced a new iPhone feature called Screen Recognition, which uses machine learning to automatically identify and label buttons, sliders, and other on-screen elements.
Apple has long invested in features for users with disabilities. VoiceOver, the iOS screen reader, is a powerful tool for visually impaired users, but it works well only when developers have manually labeled their apps' UI elements.
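For context, the manual labeling that VoiceOver traditionally depends on looks something like this in UIKit. This is a minimal sketch with a hypothetical view controller and button; the point is the `accessibilityLabel` step, which Screen Recognition effectively supplies automatically when developers leave it out.

```swift
import UIKit

// Hypothetical example: an icon-only button that a developer
// labels by hand so VoiceOver can announce it.
final class PlayerViewController: UIViewController {
    private let playButton = UIButton(type: .system)

    override func viewDidLoad() {
        super.viewDidLoad()
        // An image-only button says nothing useful to VoiceOver
        // users unless it is explicitly labeled.
        playButton.setImage(UIImage(systemName: "play.fill"), for: .normal)
        playButton.isAccessibilityElement = true
        playButton.accessibilityLabel = "Play"   // what VoiceOver speaks
        playButton.accessibilityTraits = .button // how it is announced
        view.addSubview(playButton)
    }
}
```

When this step is skipped, VoiceOver can only describe the element generically, which is exactly the gap Screen Recognition aims to close.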
Screen Recognition, available in iOS 14, is a computer-vision system trained on thousands of images of apps to recognize the appearance of buttons and the meaning of icons.
Such systems are very flexible: depending on the data you feed them, they can become expert at recognizing cats, facial expressions, or the different parts of a user interface.
iPhone users can invoke the feature in any app, and it labels every element on the screen in under a second.
Screen Recognition has to cover everything a sighted user can see and interact with, such as images in nearly any position, common icons, and menu items.
Apple said: "We have been pursuing accessibility improvements like image descriptions, and in iOS 13 we labeled icons automatically, but the new feature is a big step up." The company added: "We can look at the pixels on the screen and identify the hierarchy of objects you can interact with, and all of this happens on the device within tenths of a second."
Thanks to the flexibility of machine-learning systems and the speed of the iPhone's built-in AI accelerator, Screen Recognition is both adaptable and fast.
This new feature makes it easier for visually impaired users to access millions of apps.
Apple may consider bringing Screen Recognition to other platforms, such as the Mac, in the future. The catch is that the model cannot simply be transferred to desktop applications, which differ greatly from mobile apps.