In a previous article (part one), we mentioned that Apple has updated and added features in the Accessibility section of iOS 14. These features are aimed mainly at people with special needs, but they benefit everyone else as well, under the slogan "iPhone for all."
Back Tap gives you two more shortcuts to the actions you use the most
In previous versions of iOS, you could use the side button or the Home button to launch various accessibility options: with a triple-click, you could quickly start any feature you had enabled. This is still available in iOS 14, but there is now a new method as well.
In the Accessibility settings, under "Touch," you will find a new option called "Back Tap." Open it and you will see two choices, "Double Tap" and "Triple Tap," each of which can be assigned a useful shortcut.
These actions are triggered simply by tapping the back of the iPhone with a finger. You do not need to press any button; just tap the back of the iPhone two or three times to execute the assigned action.
The great thing about this feature is how easily it works with shortcuts. We tried it with a shortcut we created for the video-saving tool in the iPhone Islam app: as soon as you copy a link and tap the back of the iPhone, the tool runs and downloads the video in no time. You can also create a camera shortcut, so that a tap on the back of the iPhone opens the camera and starts recording without anyone noticing.
Unlike the triple-click accessibility shortcut, these gestures offer many more options: you can control the volume, lock the screen, open Notification Center, scroll up or down while browsing the web, or undo typing instead of shaking the iPhone, and much more. Shortcuts created with the Shortcuts app are also supported.
Voice Control improved in more languages
Voice Control, introduced in iOS 13, lets you operate the iPhone using only your voice. In iOS 14 it has been significantly improved, notably with support for British English and Indian English.
Headphone Accommodations help you hear better
In the "Audio/Visual" accessibility settings, there is a new option called "Headphone Accommodations." When you turn it on, you can tune how audio sounds in your headphones.
You can change the sound from a balanced tone to one tuned for high-frequency or mid-frequency sounds. There is also a setting to adjust how much soft sounds are boosted: slight, moderate, or strong. You can hear these changes in real time using the "Play Sample" button. Best of all, you can choose to apply these settings to media (music, movies, podcasts, and so on), to phone calls, or to both.
It can be customized further on AirPods and Beats headphones
If you own AirPods, AirPods Pro, or Beats headphones, there is an additional Headphone Accommodations option that lets you create a "Custom Audio Setup." It walks you through a series of listening comparisons, so you can pick the sound that suits your hearing best and build a custom audio profile. AirPods Pro also support Transparency Mode, which makes it easier to hear quieter sounds around you.
Sound Recognition alerts you to important sounds
Using the iPhone's Neural Engine, iOS 14 can detect sounds in the background. When enabled from the Accessibility menu, the Sound Recognition feature continuously listens to the sounds around you for the specific sounds you choose, all without draining the battery.
With the feature turned on, tap "Sounds" and toggle any of the categories: alarms (fire, siren, smoke), animals (cat, dog), household (appliances, car horn, doorbell, door knock, running water), or people (baby crying, shouting).
These sounds are detected using artificial intelligence on the iPhone itself, so nothing is recorded or sent anywhere. When a sound is detected, you will see a notification and may feel a vibration, depending on your Sounds & Haptics settings.
Real-Time Text lets you multitask
Since 2017, people with disabilities have been able to use RTT (Real-Time Text) to communicate with others during a phone call. Unlike messaging, there is no need to press Send: the text appears on the recipient's screen as it is typed, simulating a voice call. In iOS 14 you can now multitask with this feature. When you are outside the Phone app, and thus cannot see the conversation, you will receive RTT notifications.
FaceTime detects sign language
Usually in Group FaceTime, the tile of the person currently speaking is enlarged, unless you disable that feature. But someone using sign language does not speak aloud, so FaceTime would not normally recognize or focus on them. Starting in iOS 14, when FaceTime detects someone using sign language, it makes them prominent in the video call as the active speaker.