Where did we say it was non-functional? If it is demo quality, then by definition it is functional. The issue is whether it is reliable enough to stop for traffic lights in consumer use (given that the risk of a traffic accident is quite high if the detection is wrong).
However, traffic light detection AFAIK has not been used by automakers in any form, even a limited one: not for visualization as with stop signs (AP1 shows stop signs), not for warnings, nor for something milder like slowing down.
See above. I'm not talking about "functional", but about how hard it is to make it reliable (the corner cases).
Please show one example of traffic light detection (don't lump traffic sign detection into it; we already agreed that is an easier problem) being used in a production vehicle (esp. for years, and for actually stopping the vehicle).
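To make the reliability concern above concrete: even a high per-encounter detection accuracy adds up to frequent failures once you multiply by real-world exposure. A minimal sketch; the accuracy and exposure figures here are illustrative assumptions, not Mobileye or Tesla numbers:

```python
# Illustrative only: how per-encounter detection accuracy scales with exposure.
# All numbers below are assumptions chosen for the sake of argument.

def expected_misses(accuracy: float, encounters_per_day: int, days: int) -> float:
    """Expected number of missed or wrong detections over a period,
    assuming each traffic-light encounter is an independent trial."""
    return (1.0 - accuracy) * encounters_per_day * days

# A commuter passing 40 traffic lights a day, over a year:
per_car = expected_misses(accuracy=0.99, encounters_per_day=40, days=365)
print(f"~{per_car:.0f} bad detections per car per year")   # ~146

# Across a hypothetical fleet of 100,000 cars:
print(f"~{per_car * 100_000:,.0f} bad detections fleet-wide per year")
```

This is why "99% accurate" means very different things for a visualization feature versus a feature that decides whether the car stops.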
First of all, AP1 DOESN'T display stop signs; that's AP2.
"Tesla’s new Autopilot update detected and displayed stop signs, but it didn’t act on them"
Secondly, AP1 in debug mode shows data for both stop signs AND traffic lights. (verygreen can confirm)
Actually, traffic light and sign detection are just as easy, according to Mobileye's Amnon Shashua. The hardest part, per him, is long-range stop line detection, which Mobileye has already solved; determining the relevancy of traffic lights is even easier than that, similar to the relevancy of stop signs, although a little more complicated.
But basically you are saying that if a feature is not implemented by an automaker, it's only demoware.
So that would mean EyeQ3 was just a demo before any automaker actually implemented it. It's amazing that Mobileye was able to sell demoware to automakers; talk about swindling entire companies. Bravo!
Look, take a wider view: before AP1 and Volvo Pilot Assist 2 (which surpasses AP1), EyeQ3 would by your logic have been considered very crappy demoware, since almost all automakers used only the lane detection and forward car detection features, and implemented even those poorly. You know what this means: all the other features of EyeQ3 were useless demoware to them, right?
This includes holistic path planning,
pedestrian detection,
detection of cars in other lanes and multi-lane detection,
semantic free space,
camera-only AEB,
semantic lane details,
pavement markings, and many more.
Those were all useless demoware, even though Mobileye said they had over 99% accuracy in their pedestrian detection, camera-only AEB, sign and traffic light detection, etc.
Obviously, that was the case until Tesla came along and not only used the typical lane and forward car detection, but also used HPP to drive when lane markings disappear and in conditions like snow. They also used the speed limits from traffic sign detection to adjust their adaptive cruise control set speed, went even further by using pedestrian detection, and added displaying cars in adjacent lanes by combining the lane and car detection.
Now, all these features in AP1 didn't arrive at once, even though they were available in EyeQ3 from the get-go.
But by your logic, the other automakers who didn't implement these features considered them demoware.
Again, completely illogical nonsense.
Plus, there are other things that even Tesla didn't implement: they only show cars in other lanes, not the lanes themselves when no cars are present; and things like semantic free space (which could be used to implement an actual, REAL Summon feature if AP1 had a wide-FOV camera), camera-only AEB, or pothole and bump detection (adjusting the suspension to handle them), and many, many more features.
Finally, Nissan ProPILOT, I believe, was the first automaker implementation of EyeQ3's camera-only AEB.
In fact, the Nissan Serena doesn't even have a radar. So was Mobileye's camera-only AEB demoware all these years, until Nissan implemented it last year? Or the semantic free space that Audi's L3 system uses? Or the debris detection (covering thousands of object types, including traffic cones, etc.) that the Audi L3 system uses as well? Or its use of detected signs beyond just speed limits, in the Audi L3 system coming out in 8 days?
Again, completely illogical.
Lastly, REM is using the traffic light/sign detection and relevancy already in the current EyeQ3 to build its world-scale HD map.
Again, if it didn't have 99% accuracy like their other features, it would be useless, as the whole mapping and management pipeline is completely automatic.
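And high per-detection accuracy is exactly what lets a crowd-sourced map stay automatic: errors from single drives can be voted away by aggregating many drives over the same spot. REM's actual aggregation method isn't public, so this is just a generic majority-vote sketch with assumed numbers:

```python
from math import comb

def majority_wrong(p_correct: float, n_obs: int) -> float:
    """Probability that a simple majority vote over n independent
    binary observations lands on the wrong answer."""
    p_err = 1.0 - p_correct
    # The majority is wrong when more than half the observations are wrong.
    return sum(comb(n_obs, k) * p_err**k * p_correct**(n_obs - k)
               for k in range(n_obs // 2 + 1, n_obs + 1))

# With 99%-accurate single observations:
print(majority_wrong(0.99, 1))   # 0.01    -> one drive: 1 in 100 map entries wrong
print(majority_wrong(0.99, 11))  # ~4e-10  -> 11 drives: essentially never wrong
# With 75%-accurate observations, voting helps far less:
print(majority_wrong(0.75, 11))  # ~0.034  -> still ~3% wrong after 11 drives
```

Which is the point being made: a 75%-accurate detector would poison an automatic pipeline, while a 99%-accurate one converges quickly.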
You think Mobileye would put out a feature that is only 75% accurate? You realize that every accident involving their system hurts them badly, as you can see from the Tesla accident. Secondly, as I mentioned before, almost all automakers use(d) EyeQ3 for all camera functionality in their self-driving-car prototypes; Nissan and Audi, for example.
@AnxietyRanger
@MarcusMaximus