Using AI to interpret flight instruments?

With the development of vision AI algorithms, I'm interested in finding out whether an ML model could be trained to interpret aircraft instrumentation in real time and give a pilot feedback through voice and tones.

I'm an aviation enthusiast who is blind but has the piloting skills to fly an airplane using voice and audible cues. I've already done it with an iPhone app, but the iPhone's internal sensors have too much lag for it to be effective.

This summer I flew in an actual airplane, using the iPhone app as my interface, and was able to control the aircraft's bank, pitch, and heading. Because of the lag, though, I think a vision AI approach would be more effective, not only in an actual airplane but also in the flight simulators I use. All of my flying was closely supervised, with a licensed pilot in the left seat.
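To make the idea concrete, here is a minimal sketch (purely hypothetical, not from any existing app) of the feedback side of such a system: turning attitude readings into audio cue parameters, with tone frequency tracking pitch and stereo pan tracking bank. In a camera-based system the readings would come from a vision model watching the attitude indicator; here they are plain inputs, and all the parameter names and scaling values are assumptions for illustration.

```python
def attitude_to_cue(pitch_deg, bank_deg,
                    base_hz=440.0, hz_per_deg_pitch=10.0,
                    max_bank_deg=60.0):
    """Map one attitude reading to (frequency_hz, stereo_pan).

    Frequency rises with nose-up pitch and falls with nose-down.
    Stereo pan tracks bank: -1.0 is full left, +1.0 is full right.
    All constants are illustrative, not tuned values.
    """
    # Tone frequency: level flight sits at base_hz, each degree of
    # pitch shifts it by hz_per_deg_pitch.
    freq = base_hz + hz_per_deg_pitch * pitch_deg

    # Clamp bank to the expected range, then scale to [-1, 1] for pan.
    bank = max(-max_bank_deg, min(max_bank_deg, bank_deg))
    pan = bank / max_bank_deg
    return freq, pan


# Level flight: centered tone at the base frequency.
print(attitude_to_cue(0.0, 0.0))     # (440.0, 0.0)

# 5 degrees nose-up, 30 degrees right bank: higher tone, panned right.
print(attitude_to_cue(5.0, 30.0))    # (490.0, 0.5)
```

The same mapping would work whether the pitch and bank numbers come from a simulator's data feed or from a vision model reading a real panel, which is what makes the camera approach appealing: only the input side changes.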

I'd like to bring flying within reach of the blind community and encourage those with an aviation interest to understand what flying an airplane is all about and to learn the skills. Blind people cannot become licensed pilots, of course, but the understanding is a worthwhile endeavor in itself, and flight simulators are a viable learning tool and avenue for participation.

Is there anyone out there with the enthusiasm and vision to help me put this together? I can provide more information. Can we establish contact and a dialogue? Please reply by PM.