
A seven-second self-driving video
In the video, we watch from inside the car’s cabin as the robo-Tesla edges close to a pedestrian who enters an intersection. The in-cabin display clearly shows the software registering the presence of the completely unprotected and vulnerable human, but the software, it appears, never triggers compliance with the relevant law that says cars must yield to pedestrians.
The text of the tweet couldn’t be more jolly about the (in this case) victimless moving violation, calling it “One of the most bullish / exciting things I’ve seen on Tesla.”
Whole Mars Catalog, which posted the video, is the online identity of what appears to be a devoted Tesla and SpaceX fan.
For a rundown of this not-at-all-obscure law, check the third second of the video, where a sign reading “STATE LAW YIELD TO [PICTURE OF PEDESTRIAN] WITHIN CROSSWALK” is visible just to the right of the pedestrian in the crosswalk not being yielded to.
Elon Musk recently touted the latest update to Tesla’s Full Self-Driving beta, version 11.4.1, as a steep improvement.
The video left some Twitter users concerned.
The Whole Mars Catalog Twitter account has, so far, been full-throated in its defense of FSD 11.4.1.
That a California driver might well ignore the law in a similar situation is irrefutable (the author of this article is a reluctant Los Angeles driver). But whether we should program our robotic cars to drive this way too should, perhaps, be a matter for public debate, and maybe for the legal system to decide. Ideally, such a democratically informed decision would be made before this new norm is rolled out as a software update and exuberantly lapped up by fans of a billionaire, who then get to experiment with it on public roads.