Our heads are loaded with sensory capabilities that go well beyond the two eyes. Our proprioception, balance, and mental mapping allow us to move our heads around and take in visual data from almost any direction at a glance, then internally model that three-dimensional space as the universe around us. Meanwhile, our ears handle direction finding for sounds and synthesize that information with our visual processing.
On top of that, the tactile feedback of the steering wheel and the vibration of the car itself (felt by the body and heard by the ears) give us plenty of sensory information for understanding our speed, acceleration, and the mechanical condition of the car. The squeal of tires, the screech of brakes, and the indicators on our dash are all part of the information we use to understand how we're driving.
Much of it is learned through experience. But the fact is, I can tell when I have a flat tire or when I'm hydroplaning even if I can't see the tires. I can feel inclines or declines that affect my speed or lateral movement even when there aren't easy visual indicators, like at night.
Just adding to your point: when F1 drivers were asked to play a racing sim, they couldn't perform at their real-life level, because, as they put it, no matter how good the sim is, it doesn't provide the feedback of a real car.
To be fair, 98% of drivers seem barely able to hold a straight line and can't see past the end of their hood, let alone do shoulder checks or hear anything over the stereo turned up to 11. So I'd take my chances with the half-baked autopilot that can at least discern what a red light looks like.
I followed one gentleman for about 10 blocks before he stopped and I could tell him he was missing the entire rear-left tire of his car. There were a lot of sparks and metal screeching. Not a clue.