Safe Integration of Automated Driving Systems-Equipped Commercial Motor Vehicles
Docket ID: FMCSA-2018-0037-0131
Written on August 27-28, 2019, by
Vicki Simons, President
NKBJ InfoNet, LLC
Having nearly 3 years of experience driving a commercial motor vehicle; having been married to a man who has driven professionally for over 16 years; and having published, for over 10.5 years, online information geared to helping professional truck drivers save money (which is distributed to thousands of readers), I am commenting on the FMCSA's Proposed Rule: Safe Integration of Automated Driving Systems-Equipped Commercial Motor Vehicles.
For the purpose of my comments, I use the terms "autonomous," "self-driving," and "driverless" interchangeably.
There is no question that in its proper place, automation can save a great deal of time by doing what humans would otherwise do manually, such as a washing machine set to automatically wash, rinse, and spin water out of laundry.
On the front end, one is compelled to ask, "Why are automated driving systems wanted or needed?"
Some people have identified autonomous vehicles as the solution for those with mobility issues.
While some people may argue that there are vast differences between transporting people and transporting freight, when it comes to self-driving vehicles, there needs to be
Theoretically, once a segment of the transportation system has been automated, it will perform flawlessly.
Thus, it is thought that by removing the human component from transportation and relying strictly upon computers and machines to do the work, all of the potential flaws and limitations that humans inherently bring with them to the industry will be removed.
But will those flaws and limitations actually be removed?
And even if they are, will other flaws and limitations take their place?
Those who want to implement automated driving systems may believe that they stand to save a great deal of money by replacing human drivers.
This article goes so far as to say that UPS made a "minority investment" in TuSimple for the purpose of replacing truckers.
Predictions vary as to when truly driverless vehicles will be on the road, but before they can be launched on a wide scale, comprehensive laws must be in place.
In this article, we read, "Autonomous truck company, TuSimple, believes they will meet their stated goal of beginning to replace America's truck drivers by the end of 2020..."
So time is of the essence when it comes to making sure that the law is comprehensive when it covers the topic of automated driving systems.
Some people are pushing for autonomous vehicles because they believe that their use will increase safety.
While it is certainly commendable that various entities want to improve safety on the roads by removing human factors from driving, it must be remembered that car drivers are assigned the majority of factors in fatal car-truck crashes.
Furthermore, traffic slowdowns and sudden stops can potentially lead to trucks
A disturbing article published on October 24, 2018, declared: "Self-driving cars will have to decide who should live and who should die. Here's who humans would kill."
In response to part of that article, if the brakes fail on a "self-driving" vehicle -- and assuming there are no passengers aboard -- why not automatically veer to a place where no life will be in danger, like a shoulder?
The askers of the questions in the article obviously want to herd people into an "either-or" mindset, when there may be other, previously unconsidered possibilities.
Experienced professional truck drivers wrote:
"See the difference is whether or not you TRUST technology. I can barely trust it to make a phone call correctly. I would never sleep behind technology going 65 mph down a highway, let alone with 80,000#. Every single day we, as truck drivers get to decide if people live or die. Never in my lifetime would I be OK with a computer making those decisions."
"… Anyone who has ever experienced
- the aftermath of a computer glitch;
- what happens after a computer file becomes corrupted;
- the tragedy of a crashed hard drive; or
- the loss of photos after a camera has been dropped accidentally;
should think hard about this. …"
Can computers and machines make mistakes?
Anything that runs on technology is only as good as the hardware, software, and human programming behind it.
While software can be programmed to anticipate certain things -- like a calendar program alerting the user that an appointment is coming up -- machines, hardware and software do not have the reasoning ability of humans.
Wikipedia lists the following definition: "The standard Safe Practices for Motor Vehicle Operations, ANSI/ASSE Z15.1, defines defensive driving skills as 'driving to save lives, time, and money, in spite of the conditions around you and the actions of others.'"
Can an automated driving system "drive defensively" -- or is that a skill relegated to humans only?
CCJDigital.com posted a video about a car-truck crash, with whose assessment we adamantly disagree.
It is our considered opinion regarding this accident that:
Some truckers have rightly been criticized for over-reliance on Global Positioning System (GPS) information for routing, including:
While these are legitimate concerns regarding human drivers, we read in the caption of the top image in an April 18, 2019, article:
"A 'Differential GPS' guidance system with 2-centimeter resolution keeps Alaskan snowplow drivers on the road even in zero-visibility white-out conditions."
Granted, there are big differences between snowplows and big trucks.
In late 2018, my husband Michael and I took a test drive in "a car with some 'driver assist' or autonomous technologies." One of these new technologies is called "Lane Departure Alert".
Upon what does such "driver assistance" rely for staying in one's lane?
If driver assist or automated driving systems rely on the painted lines on roads, what happens when trucks travel on roads with badly faded lines, no lines at all, or lines that have been covered by snow?
If driver assist or automated driving systems rely on GPS tracking, what would happen if the GPS service became unavailable for some reason -- or the GPS unit on the truck stopped functioning?
How will autonomous trucks react in places where lanes are shifted, changed or closed?
The only mentions of ADS trucks on the roads that I have read about to date involve test runs or operations under limited, favorable conditions.
In bad winter weather, trucks must be able to do more than just stay in their lanes; they must maintain traction on the road.
A June 20, 2019, article stated, "The hard part of commercial driving is the nuance and experience involved that is required to navigate a tractor-trailer in real-world conditions" and these are "Things [that] a computer can't do. Not yet, anyway."
In response to this article about a crash between a human-driven truck and an autonomous vehicle carrying passengers, I wrote the following in a trucking Facebook group:
"… then why is this the trucker's fault, especially since, by his maneuvers, he was preventing other accidents from taking place?"
I'm thinking ahead about an army of autonomous vehicles zipping around each other in backing situations like this.
How many more "live" truckers will be blamed for accidents involving vehicles with automated driving systems?
A July 16, 2019, article addressed driving "autonomous trucks" via remote control or "remote teleoperators".
Other aspects about remote operation of trucks that must be considered are:
Is it enough for drivers to control only "the first and last mile" of a trip?
In my opinion, this theory may work for routes that are mostly made up of highways and where shippers and receivers are located very close to highways.
But unless safeguards are put in place, automation will most likely never work in large cities with a great deal of traffic and tight turns.
Will limitations be put in place regarding when and where a remote operator will take over from an autonomous truck?
Here is a list of reasons why a truck cannot be stopped in time to avoid a collision:
Many of these causes, obviously, can be prevented by a human driver.
Can the brakes on an autonomous truck ever fail?
If so, who is responsible for that?
Checks and balances must be put in place to make sure trucks can stop in time, every time.
In May 2014, it was reported that Toyota was replacing robots with humans to make even better robots.
It is ironic that, even as the industry seeks to replace humans, a June 21, 2019, article reports that Volvo Trucks "wants its vehicles to think like humans".
When it comes to automated driving systems:
A June 12, 2019, article addressed drivers still being required in trucks with Level 2 driver assistance technology:
"As a driver assistance system, vehicles at Level 2 can simultaneously steer, brake and manage speed without driver intervention for short periods, but the driver must remain attentive and be prepared to regain control of the vehicle at any time -- unlike several now-infamous Tesla drivers who put a little too much faith into that Level 2 system." (emphasis added)
The article addresses "Lane Keep Assist" (LKA) and "Lane Departure Protection" features of Daimler trucks, stating that "LKA is not a self-driving feature."
Regarding "self-driving", we read in a March 19, 2018, article:
"A car with the [Uber] ride-sharing service was operating in self-driving mode with a human in the driver's seat when it struck and killed a woman in Arizona overnight."
The article did not answer these questions:
We later learned that Uber had not informed the "operator" of a self-driving vehicle that "all advanced driver assistance functions were disabled when it was put into computer control mode."
This included the fact that the "automatic braking feature" was not enabled.
When the time comes for drivers to drive autonomous vehicles, each vehicle needs to have a panel of highly visible switches on the dashboard that show which assistance functions are either on or off.
At least two different kinds of truck equipment failure are cited for killing other motorists:
For example, an August 15, 2019, article reports that a motorist was killed when her vehicle was struck by the loose tire from a big rig.
After reading about accidents like this, I wonder: Did the trucker do a thorough pre-trip inspection on his vehicle -- or was this a mechanical failure that could not have been prevented?
Furthermore, regarding automated vehicles:
According to this source, "Road hazard means a hazard encountered while driving a motor vehicle and may include, but not be limited to, potholes, rocks, wood debris, metal parts, glass, plastic, curbs or composite scraps".
Regarding road hazards, I made the following comment through a trucker Facebook group regarding a June 12, 2019, article:
"There's no substitute for 'boots on the ground'. Eventually, one of these driverless trucks is going to encounter *something* bad that could have been avoided had a real live human being been present and watching."
Autonomous vehicles are not flawless, as we learned from federal investigators after a Tesla Model 3 crashed into a semitrailer in Florida on March 1 while "operating on the company's semi-autonomous Autopilot system".
Again, drivers need to know which systems are operational in each vehicle with an automated driving system at any given time, and what their limitations are.
It was reported on April 3, 2019, that hackers have identified two methods of tricking Tesla autopilot technology into "driving into wrong lanes or towards oncoming traffic."
Until autonomous vehicle manufacturers develop some kind of "hack-proof" system of operating vehicles, this technology is going to have major limitations and may kill people.
Notice in an August 4, 2014, article the author's assertion that humans need to be present in autonomous trucks.
… The human driver is able to switch control of the truck to the vehicle's embedded system and ride hands-free as a passenger. …
… The most important part to remember is that there is still a human in control of a switch to turn on-and-off the autonomous control. …
According to an April 12, 2018, article, "As of this writing, however, humans triumph over computers in numerous real-world tasks -- ranging from identifying a bicycle or a particular pedestrian on a crowded city street to reaching for a cup of tea and moving it smoothly to one's lips -- let alone conceptualization and creativity."
There are numerous legal implications for drivers of trucks with automated driving systems, including these:
I have asked a lot of questions and will ask a few more here that do not seem to fit anywhere else.
Every vehicle with an automated driving system -- no matter how basic or advanced -- must be held to the same standards of equipment maintenance and safe operations as a vehicle that is driven entirely by a human.
There have already been accidents involving vehicles with an automated driving system.
Over time, technology will likely improve, but the motoring public needs to have proof that such technology will not hurt them.
What is the U.S. Department of Transportation going to do to ensure that no more people will be hurt by vehicles with automated driving systems?
Vicki Simons, President
NKBJ InfoNet, LLC