The system's vision doesn't have to be any better than human vision. If it "sees" equally well (comparable acuity and sensitivity across the visible EMR spectrum) but:
- can see in 360 degrees (it can)
- is always paying attention (it is)
- can make better decisions (work in progress...)
- can make the above decisions more quickly (it should be able to once it learns how to make the best decisions)
then it will be a much better driver than humans with equal vision.
Our roadways are designed for human vision and cognitive abilities, and not even perfect human vision at that: there's a fair amount of latitude in how poor one's vision can get before one loses the privilege of driving. I see people arguing that cameras are needed on the front bumper to see around corners. Not true: the front cameras in the mirror console are already positioned more anteriorly than a driver's head. Seeing further into the IR/UV spectrum than a human can might allow earlier perception of wildlife at night, for instance, but that isn't necessary to be as good as a human. Radar/lidar/ultrasonics could add information that makes an autonomous vehicle better than a human driver, but they aren't necessary to make it equally good.
While driving as well as a human may seem like a low bar, keep in mind that most accidents are not a result of the driver's vision (assuming the driver meets the standards for driving). Most accidents stem from inattention, distraction, impairment, poor judgement, driving inappropriately for conditions, or mechanical failure of the vehicle. None of these would be overcome with better vision.
The only failure I see in current camera positioning is the lack of a forward-facing camera on the driver's side of the vehicle to enable passing on 2-lane undivided highways. Maybe the B-pillar cameras allow for this; I haven't seen enough footage from them to know one way or the other, but I'm skeptical. The other issue I can see with the current cameras is limited dynamic range and susceptibility to glare when driving into the sun while it's near the horizon.