Uber has admitted that there is a “problem” with the way its autonomous vehicles cross bike lanes. The firm began testing the cars on public roads just two days after a cycling campaigner had pointed the issue out, having told him that its engineers would work on it.
San Francisco cycling campaigner Brian Wiedenmeier attended a demonstration of Uber’s autonomous vehicle last week and says he twice saw it make an “unsafe right-hook-style turn through a bike lane.”
California state law requires a right-turning car to merge into the bike lane before making the turn, so that it does not cut across the path of a cyclist who is continuing straight on.
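For illustration only, here is a minimal sketch of how that merge-before-turn rule might be expressed as planner logic; the function, names and the 200 ft threshold are assumptions for the sketch, not Uber's actual code.

```python
# Purely illustrative sketch of the merge-before-turn rule described above.
# Names and the threshold are assumptions, not Uber's actual planner.

MERGE_WINDOW_FT = 200  # assumed distance within which the merge should begin

def plan_right_turn(distance_to_intersection_ft, bike_lane_present, bike_lane_clear):
    """Decide the next manoeuvre for a car intending to turn right."""
    if not bike_lane_present:
        return "turn_right_from_kerb_lane"
    if distance_to_intersection_ft <= MERGE_WINDOW_FT:
        if bike_lane_clear:
            return "merge_into_bike_lane_then_turn"   # what the law expects
        return "yield_to_cyclist_then_merge"          # never cut across a through cyclist
    return "hold_position_in_traffic_lane"

# The reported "right-hook" behaviour is the missing merge step:
print(plan_right_turn(150, bike_lane_present=True, bike_lane_clear=True))
```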
Writing on the San Francisco Bicycle Coalition website, Wiedenmeier said: “I told staff from Uber’s policy and engineering teams about the safety hazards of their autonomous vehicle technology. They told me they would work on it. Then, two days later, they unleashed that technology on San Francisco’s streets. Your streets.”
Uber spokeswoman Chelsea Kohler told the Guardian that “engineers are continuing to work on the problem” and said that the company had instructed drivers to take control when approaching right turns on streets with bike lanes.
Coalition spokesman Chris Cassidy said: “The fact that they know there’s a dangerous flaw in the technology and persisted in a surprise launch shows a reckless disregard for the safety of people in our streets.”
Wiedenmeier says that launching autonomous vehicle technology before it is regulated and safe is unacceptable, and the San Francisco Bicycle Coalition has therefore launched a petition telling Uber to address the issue immediately.
Jeffrey Tumlin, director of Oakland’s department of transportation, said that bike lanes presented a unique challenge for driverless technology due to the speed and manoeuvrability of cyclists. “It can be more difficult to predict their behaviour,” he said, adding: “I get uncomfortable with private industry doing their experimentation in the public right of way without first collaborating with the public.”
One of Uber's self-driving vehicles was recently caught on a dashcam driving through a red light. California's attorney general reacted by instructing the firm either to acquire a test permit for its vehicles or to cease autonomous driving immediately, or face the consequences.
Uber's vice-president of Advanced Technologies, Anthony Levandowski, reacted by saying that Uber could not "in good conscience" comply with regulations that it doesn't believe apply to it. "You don't need a belt and suspenders if you're wearing a dress," he commented, somewhat bizarrely.
John Simpson, privacy project director for non-profit consumer organisation Consumer Watchdog, has since said: "We believe their activity is a criminal offence under the Motor Vehicle Code, punishable with up to six months in jail."
25 comments
And I know it's only got 1 lane marking, but 3 of the entrances have 3 lanes on them
https://www.google.co.uk/maps/place/Caversham+Rd,+Reading+RG1/@51.4638597,-0.9745163,227a,20y,303.44h,45t/data=!3m1!1e3!4m5!3m4!1s0x48769b197bed75ff:0xbee36ce57d10a8c1!8m2!3d51.4607172!4d-0.9755443
Those are just the 2 roundabouts nearest to me, ones that I ride around. Scary ones.
I don't drive. By the time I should have learnt, I had been run over too many times to ever want to do it to anybody. Those driver aids have made it much safer for the ones inside the vehicles, but death rates for us cyclists and pedestrians keep going up.
And
https://www.google.co.uk/maps/place/Caversham+Rd,+Reading+RG1/@51.4617969,-0.9747469,64m/data=!3m1!1e3!4m5!3m4!1s0x48769b197bed75ff:0xbee36ce57d10a8c1!8m2!3d51.4607172!4d-0.9755443
that one has slightly more than 2
Bud, "Our desire, (selfishness?) to get to our destination a few seconds before the bloke in the car next to us creates mayhem everyday. But we put it all, your stat 11 people every 2 days, down to acidents, yet we do not accept the techology to deliver safety improvements and we do not learn. [2]"
You spelt it wrong, but you said it.
Bud, couple of things,
Google have said they are not going to make cars. As I said, I kind of like their attitude, but I don't see that attitude selling the heavy metal.
I couldn't see the video. The idea of autonomous cars all communicating and coordinating with each other is wonderful, but that requires near-universal adoption, which will take a generation at least, and has no place for us. Volvo make very safe cars, for the occupants, but please believe me, for those on the outside, they really, really hurt. And don't get me started on the attitude of their truck division...
Roundabouts are nothing like traffic light controlled junctions.
And 11 people don't die every 2 days because of acide... Acci..ac.. No, I cannot say it; it's the use of that word that makes it acceptable, that allows those deaths, every 2 days, from the desire to just get about. Ambulance men and paramedics may use that word, and from them it's kind of OK; a traffic officer never will, they use incident, collision, crash, because they see them and the results and have to tell the families of the victims. A local bobby might, even though I asked him not to, multiple times. It means it's one of those things that just happens, that it couldn't be avoided, that no one was to blame and no lessons could be learned. RDRF (the Road Danger Reduction Forum) sum up my attitude on this one.
There are some councils in this country where 3 people have to die at a junction before they will consider changing it. And there are similar "rules" about the siting of speed cameras, to enforce the law that everyone should be obeying. The airline and rail industries start with safety.
If it's 5 a day with human drivers then 4 a day will be acceptable for autonomous vehicles. Not for me.
Show me a busy roundabout with more than 2 lanes that doesn't now have traffic lights. The control has been taken away from the driver already, bar those that disobey.
I think it was your stat, but I get the point, and I do not believe I referred to them as accidents either; collisions are not accidental, in fact most are avoidable. Lessons can always be learned, but not by a closed mind. Oh, we are back to the driver knowing best again. Not sure if you drive or what you may drive, but if we removed all the tech from any vehicle you had, such as ESP, ABS etc, would you feel as safe?
Interesting thought about airlines and rail, especially after the Croydon tram. Safety is enforced upon them by the regulator, it is not voluntary compliance, yet humans continue to recklessly override the safety systems. Aircraft are built to standards because EASA will not approve them to fly unless they comply. Some operators continually fight this because it costs them money (profit). Road haulage operators also will not install technology that will save lives unless it is enforced. Me, I am happy to get what I can and trust it.
My biggest wonder about AI cars is how all the different AIs will play together. Will BMW AI be more aggressive than Smart AI? Surely the whole thing would be better if they all used the same driving AI?
Getting old people into these things will probably take loads of time off your average journey. I was stuck behind some codger doing 35 in a 60 the other day and the conditions easily allowed the speed limit. Even if the AI did 50 instead of 35 it would be a godsend. Old people will probably be the last group of people persuaded into them, though.
Two arguments there bikingbud: I'm largely in agreement with your driverless car points, but largely in disagreement with 'don't ride up the inside and you won't get squashed'.
But while we're on driverless cars... Implementation will be a nightmare. Interaction with human drivers willing to cut up and take the mick out of easily identifiable carbots will see them deferring everywhere...
I can see it working if we went big bang with driverless cars everywhere, so the cars could interact on the same level (maybe even communicate so they knew exactly what their neighbours were doing) as each other. But I doubt that'll happen and I'm skeptical that they'll reasonably figure out illogical human behaviour, and that's something humans are actually quite good at.
Bud, you make some very good points, but you did say "if you don't go down the inside you will not get squashed". Now I've been fairly squashed, several times, and I was nowhere near "going up the inside", and I'm sure many of the contributors on here have been too.
We all know we have to take extra care when filtering, especially around large vehicles, in fact we have to take extra care whenever we are anywhere near other vehicles. Lay off that bit, and let's discuss the tech.
I kind of like the Google thing, small, cautious cars, and "The answer is almost always 'slam on the brakes'", but they are not going to make cars. The big manufacturers will continue to make big fast cars, and people will continue to buy them; no one will want a car that will stick to the speed limit and slow down "just in case". They will want a car that drives like they do, that swerves (as little as possible) so as not to lose much important speed. And the person in the driving seat won't be concentrating, so they are never going to be able to do anything if the computer messes up (the Tesla death).
The sky is a very big place, where safety is a priority. Incredibly well managed and regulated. Where every death is a lesson to learn. The Shoreham air crash has changed how all air shows are run, the 11 deaths, whilst incredibly tragic, represent just 2 days of slaughter on our roads.
Just look at any 2-3 lane roundabout at rush hour. Now I'm not saying humans do it well (really I'm not), but it's an incredibly complex system, and Google cars would never make it through, and that does not sell.
Aircraft land at busy airports; google SESAR or NextGen and 4-D trajectory management. You comment upon a roundabout, but, almost like a software-driven four-way stop, vehicles can negotiate priority between themselves; red-light jumpers don't do this.
Yes, it does require a saturation level to achieve the aim, which is why ICAO has imposed these technologies for commercial air traffic.
It is humans that are the problem, both interacting with and accepting the technology.
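To make the "software-driven four-way stop" idea concrete, here is a toy sketch in which vehicles announce an arrival time and cross in that order; it is purely illustrative, and real schemes such as SESAR/NextGen 4-D trajectory management are far richer than this.

```python
# Toy sketch of junction priority negotiation: vehicles broadcast an arrival
# time and cross in arrival order. Illustrative only, not a real protocol.

def negotiate_priority(requests):
    """requests: list of (arrival_time_s, vehicle_id); returns the crossing order."""
    return [vehicle for _, vehicle in sorted(requests)]

print(negotiate_priority([(3.2, "car_B"), (1.1, "car_A"), (2.7, "bike_C")]))
# -> ['car_A', 'bike_C', 'car_B'] - a red-light jumper simply ignores the protocol
```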
Correct, it's not just Google that will build cars, but see this from Volvo from 2011:
https://www.youtube.com/watch?v=kJwWBzfTnMk
Vehicles communicating, letting each other know they have come off the gas or have started applying the brakes or steering, can happen in milliseconds if not microseconds. Response time for drivers to realise the car in front has changed speed? You can estimate that as you may wish, but watching cascading brake lights on the motorway does indicate we don't do it very well.
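A rough back-of-the-envelope sketch of that cascade, comparing an assumed human perception-reaction time with an assumed vehicle-to-vehicle relay latency; the figures are illustrative, not measured data.

```python
# Rough comparison of how a braking event propagates down a queue of cars with
# human reaction delays versus near-instant vehicle-to-vehicle messaging.
# The figures below are assumptions for illustration only.

HUMAN_REACTION_S = 1.5   # commonly quoted perception-reaction time (assumed)
V2V_LATENCY_S = 0.05     # tens of milliseconds per relayed message (assumed)

def delay_to_last_car(n_cars, per_car_delay_s):
    """Each following car starts braking per_car_delay_s after the one ahead."""
    return (n_cars - 1) * per_car_delay_s

for label, delay in [("human drivers", HUMAN_REACTION_S),
                     ("V2V-equipped cars", V2V_LATENCY_S)]:
    print(f"{label}: 10th car starts braking after {delay_to_last_car(10, delay):.2f} s")
```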
I have been driving a car with Adaptive Cruise Control for about 9 years: set the speed, set the gap, and it sorts it all out. Even if I wish to go at 70, if the car in front is not going 70 it will follow and maintain the gap I have set, accelerating once the car in front picks up. If some muppet jumps in front because he thinks the gap is for him, it will slow down and reset the gap. Simples. Or I could act like a lot of drivers who feel the road is theirs: switch it off and close the gap, reducing my, and others', safety margin. And new features are coming to all new cars, autopark anyone? The same sensors can be fitted to reversing lorries and lorries that may wish to change lane or turn left or right. Is that a safety improvement you'd buy?
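For the curious, the "set the speed, set the gap" behaviour can be sketched as a simple proportional controller; this is an assumption-laden illustration, not any manufacturer's actual ACC logic.

```python
# Minimal sketch of gap-keeping adaptive cruise control: hold the set speed
# unless a lead vehicle forces the car to back off to maintain the set gap.
# Gains, units and names are illustrative assumptions.

def acc_target_speed(set_speed, lead_speed, gap_m, set_gap_m, k=0.5):
    """Speeds in m/s; follow the lead car while trying to hold the set gap."""
    if gap_m is None:                 # nothing detected ahead: cruise at set speed
        return set_speed
    gap_error = gap_m - set_gap_m     # positive means the gap is too big
    desired = lead_speed + k * gap_error
    return min(set_speed, max(0.0, desired))

# A slower car pulls into the gap: the controller backs off below the set speed,
# then recovers towards the set speed as the gap reopens.
print(acc_target_speed(set_speed=31.0, lead_speed=25.0, gap_m=20.0, set_gap_m=40.0))  # -> 15.0
```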
Our desire, (selfishness?) to get to our destination a few seconds before the bloke in the car next to us creates mayhem everyday. But we put it all, your stat 11 people every 2 days, down to acidents, yet we do not accept the techology to deliver safety improvements and we do not learn.
From my perspective the technology exists that can improve safety. We can all complain that it doesn't work but I think few really understand the benefits, possibly the same mindset that drivers may not understand cyclists or motorcyclists.
Cheers
Well, exactly. Which is why there's a significant danger that the logic of self-driving cars will be a push to remove all those pesky humans from the roads and streets. So more-and-more restrictions on cycling or walking.
And more and more cars cruising around, including many with nobody in them at all (as if cars with just one person in them aren't wasteful enough already).
And the selfish humans will be deciding which self-driving cars to buy. And the producers of those vehicles will have to cater to them if they wish to attract customers.
Stats from the DfT:
Seems bizarre that we can accept this when humans are driving, doesn't it?
The point I was contesting is that human intervention/vigilance clearly does not pass the fitness-for-purpose test. I think we can all agree that being on the road as a cyclist, driver, motorcyclist or pedestrian is hazardous, but if you decide to go up the inside of a vehicle then you are taking unnecessary risks and are more likely to become a statistic. So protect yourself as far as you can. Seems simple and straightforward to me. Supported by RoSPA:
Why are you sceptical about the cars? Is it fear of the unknown or some other deep-seated phobia? Some info on collisions involving Google cars:
My bold.
The cars operate in a deterministic way: if they are unsure they will pause, hence the six-second delay above and the issue with the trackstanding cyclist. However, when humans do random things such as jumping lights, falling asleep, speeding or not maintaining sufficient space, the vehicle is limited in what it can actually do and it becomes a victim.
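That "pause when unsure" behaviour might look something like the sketch below, assuming a planner that only proceeds when its prediction confidence clears a threshold; the threshold and names are illustrative, not Google's actual system.

```python
# Sketch of "pause when unsure": proceed only when the best guess about another
# road user's intent is confident enough. Threshold and names are assumptions.

CONFIDENCE_THRESHOLD = 0.8

def next_action(predicted_intents):
    """predicted_intents: list of (probability, intent) guesses about another road user."""
    if not predicted_intents:
        return "proceed"
    if max(p for p, _ in predicted_intents) < CONFIDENCE_THRESHOLD:
        return "hold_position"        # intent unclear, e.g. a trackstanding cyclist
    return "proceed_with_caution"

# A trackstanding cyclist looks roughly 50/50 between going and waiting:
print(next_action([(0.5, "cyclist_goes"), (0.5, "cyclist_waits")]))  # -> hold_position
```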
Luddites were also sceptical about the benefits of progress and stymied developments through their actions.
Do you fly as a passenger? Do you have any idea what goes on in an aircraft? It's not witchcraft you know
Happy to chat further if you wish.
Firstly because I have seen how these sort of AI developments have played out in the past. The wider topic has a long history of over-optimistic promises.
So far the self-driving cars don't seem very close to mass adoption. A handful of slow electric cars on restricted routes, mostly with humans ready to jump in for the tricky bits. And as your own list acknowledges, they've been involved in a fair few collisions already. (It doesn't matter if they've been hit by non-robo-cars; that's part of the problem, they will exist in an environment where not everything is a self-driving car.)
So let's wait and see, no? But I am sure I won't see mass adoption of fully-autonomous vehicles in my lifetime.
And then there are the social issues involved, even if they do eventually come to pass. There's the effect on traffic volumes, the effect on people's attitudes to active travel ('the algorithms would be so much simpler if we could get those awkward bikes and pedestrians off the road'). The effect on obesity.
(It's just occurred to me to wonder if they might also be of great use to terrorists, but that's just a passing thought).
What will happen with regard to the volume of cars produced and the resources used to do so? (Self-driving cars will be used a lot more and parked unused a lot less, so they will wear out much quicker, so more cars will have to be built - good for the car industry, I guess, perhaps not so good for the environment)
And are you sure that the corporations who program the things will be any more benevolent than the individual drivers who currently drive them? The latter will constitute the market that the former will have to pander to.
Look what happened with the computer-controlled gaming of the emissions tests. Did that bit of in-car electronics do the rest of us any good?
I think that scandal gives an insight into how corporate-controlled AI is likely to work out. Code won't necessarily be written with the best interests of anyone outside the vehicle in mind. It's all about the politics (i.e. power), not technology, and that is really, really hard to predict.
The big problems that need to be solved are political and social. It seems to me that technology rarely fixes any of those.
Flying is a totally different issue - the skies are nothing like as crowded with other objects as are the roads. No pedestrians or cyclists in the sky.
Besides, I'm not keen on the current scale of air travel either, but that's a different topic.
Oh, and just wait till the robo-taxis attain self-awareness and form themselves into a Robo-LTDA and start Tweeting and posting comments on the Daily Mail.
So it is rise of the machines
Seems like Uber has adopted Facebook's old "Move Fast and Break Things" motto.
Time will tell if it is a smart guiding principle for how it deals with its administrative entanglements. But IMHO it is criminally unwise as a guiding directive for its self-driving cars.
This doesn't sound too bad to me. Uber is going down the "they're not autonomous, we've got a driver as backup" route, so it seems reasonable for them to continue testing. After all, testing is for identifying the defects and then hopefully fixing them.
This particular problem is to do with the car not filtering into the bike lane (as required), rather than recklessly turning into a cyclist. The rule is there to prevent drivers turning recklessly into cyclists, so I'm not saying that it's a pointless law, but I can appreciate that the most important point is to not hit anything.
It looks like Uber is annoying a lot of people at the moment and getting some bad press (mostly deserved) but this issue doesn't look as bad as the red light one the other day.
Has he never seen Gok? A dress with no belt!
I'm still wondering about the human error with the red light running: did the car start slowing and the driver floor it (in front of a marked police car)? One thing that, hopefully, autonomous cars will not do is attempt to accelerate through problems. From yesterday's Guardian: "Google similarly seemed unbothered by the thorny thought experiment [the trolley problem], telling the Guardian in the summer: "The answer is almost always 'slam on the brakes'""
The human 'driver' will at critical moments be more vigilant, and better able to act decisively, than the car's observation and control system? That is not reassuring.
Really?
How naive is that thought?
How many people have been killed because they put themselves in a hazardous position with vehicles? Yes, things may unfold in a way that might not be safe, but if you don't go down the inside you will not get squashed. Observe the traffic around you and establish your own safety bubble. Hence the real safety comes from designed and constructed separation in a holistic transport system.
Also, just slamming the brakes on has an often repeated, significant and tragic outcome on motorways, as well as a less dangerous but still very frustrating outcome on urban/city roads. So acting decisively may not be acting safely.
Automated cars are the way forward: they observe, and can control, multiple events much quicker and more precisely than humans. But when interacting with humans who do many random things, frequently deliberately, as in road rage, close and risky overtakes and brake-checking following traffic, the deterministic operation of the automated car will have minor areas where it does not know what to do and suspends:
http://gizmodo.com/a-cyclists-track-stand-totally-befuddled-one-of-googl...
Comment from the obviously enlightened rider: "the odd thing is that even tho it was a bit of a CF, I felt safer dealing with a self-driving car than a human-operated one."
Perhaps we need the bloke with the red flag in front of all vehicles again [kiss] or is it a deeper seated fear, Terminator Rise of the Machines?
BTW, how many people have we humans killed on the roads so far this year?
Wow! What?
The answer to your question is 'not very many', unless by 'putting themselves in a hazardous position with vehicles' you mean choosing to cross the road on foot, walk on the pavement near a road, or sit in a coffee shop or on a train that a motorist might manage to drive into...or just cycle on roads. Really no idea what you are trying to say there.
As for 'if you don't go down the inside you will not get squashed': I mean, that's quite obviously not true, but I can't tell if it's a deliberate untruth or if you are genuinely that naive.
And then you follow that up with a final sentence which seems to take an entirely different position. I can't figure out what you are trying to say there.
And I'm very, very, skeptical about self-driving cars.
Firstly because I don't think they are anywhere near ready yet. In fact I have a suspicion the whole project may eventually be quietly abandoned. There's a long history of AI projects that turned out to be far more difficult than anticipated - it's always the last, awkward, real-world complications that scupper the whole thing.
And secondly, if they do actually happen, the social consequences are going to be far harder to predict than some people seem to think. You still will have to deal with human nature, even if it gets expressed through the market and the political system rather than via individual driver behaviour.
It sounds like the problem isn't right hooking when a cyclist is there, but the failure of the Uber vehicles to enter the bike lane prior to the right turn in accordance with Californian law.
It would be interesting to see what would happen if there was a cyclist in the bike lane when the Uber vehicle wants to turn right, and whether it has been programmed to let the cyclist pass first.
Furthermore, I would expect each vehicle to be programmed to know what jurisdiction it is operating in, and to respond to changes in legislation, while at the same time prioritising not endangering other road and "sidewalk" users ahead of legal compliance.
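As a purely hypothetical sketch of that idea, the rule lookup and the safety override might be structured like this; the rule table and names are invented for illustration and are not any vendor's real implementation.

```python
# Hypothetical sketch: per-jurisdiction rules, with safety outranking strict
# legal compliance. The rule table below is illustrative only.

JURISDICTION_RULES = {
    "california": {"right_turn": "merge_into_bike_lane_first"},   # per the article
    "uk":         {"left_turn":  "check_nearside_for_cyclists"},  # illustrative only
}

def choose_manoeuvre(jurisdiction, manoeuvre, hazard_detected):
    """Look up the local rule, but always yield if a hazard is detected."""
    if hazard_detected:
        return "yield"   # not endangering others outranks the local rule
    return JURISDICTION_RULES.get(jurisdiction, {}).get(manoeuvre, "default_cautious_behaviour")

print(choose_manoeuvre("california", "right_turn", hazard_detected=False))  # merge_into_bike_lane_first
print(choose_manoeuvre("california", "right_turn", hazard_detected=True))   # yield
```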
Worth also remembering that human drivers are, as we all know, quite capable of driving very badly.
Can't wait for Uber's driverless lorries. Not.
So the immediate control action, pending a software rewrite, is to return the car to manual control. Scary stuff indeed.
Uber scary
It does seem somewhat reckless to test lethal weapons on the streets with the public as the possible victims. Given the litigious nature of American society, and the level of compensation and punitive damages, the first collision could cripple Uber.
"You don't need a belt and suspenders if you're wearing a dress," I'm sure there are plenty of women who wear stockings who would disagree!
Ummm, American English I think. I imagine he means what you & I would call "belt & braces". The highly decorative but rather impractical item of female clothing we call suspenders would be a garter belt in the colonies. Braces, meanwhile, are not devices for holding up trousers but wire contraptions children wear on their teeth to show they have wealthy parents. It's a strange country...