Comments to FMCSA on Automated Driving Systems

Safe Integration of Automated Driving Systems-Equipped Commercial Motor Vehicles

Docket ID: FMCSA-2018-0037-0131

https://www.regulations.gov/comment?D=FMCSA-2018-0037-0131


Written on August 27-28, 2019, by

Vicki Simons, President

NKBJ InfoNet, LLC

Truck-Drivers-Money-Saving-Tips.com

Having nearly 3 years of experience driving a commercial motor vehicle; having been married to a man who has driven professionally for over 16 years; and having published online money-saving information for professional truck drivers for over 10.5 years (information distributed to thousands of readers), I am commenting on the FMCSA’s Proposed Rule: Safe Integration of Automated Driving Systems-Equipped Commercial Motor Vehicles.

For the purpose of my comments,

  • Vehicles equipped with “automated driving systems” (ADS) may also be called autonomous, driverless, or self-driving vehicles; and
  • My main focus will be on vehicles within the trucking industry.

INTRODUCTION

There is no question that, in its proper place, automation can save a great deal of time by doing what would otherwise be done manually by humans, such as a washing machine set to automatically wash, rinse, and spin water out of laundry.

On the front end, one is compelled to ask, “Why are automated driving systems wanted or needed?”

Some people have identified autonomous vehicles as the solution for those with mobility issues.

While some people may argue that there are vast differences between transporting people and transporting freight, when it comes to self-driving vehicles, there needs to be

  • A level of uniformity among all vehicles; and
  • A set of specifications put in place that is unique to commercial motor vehicles (especially as it concerns vehicle dimensions and weight).

Theoretically, when a segment of the transportation system has been automated, it will perform

  • Flawlessly;
  • Every time; and
  • Around the clock except for times of required maintenance.

Thus, it is thought that by removing the human component from transportation and relying strictly upon computers and machines to do the work, all of the potential flaws and limitations that humans inherently bring with them to the industry will be removed.

But will those flaws and limitations actually be removed?

And even if they are, will other flaws and limitations take their place?

MONEY AND TIME

Those who want to implement automated driving systems may believe that they stand to save a great deal of money by

  • not having to pay truckers wages, benefits, wait time, etc.;
  • not being limited by truckers’ Hours of Service compliance; and
  • not having to invest in products that keep truckers
    • compliant with the law (example: sleeper berths) and
    • comfortable (example: climate control systems).

This article goes so far as to say that UPS made a “minority investment” in TuSimple for the purpose of replacing truckers.

Predictions vary as to when truly driverless vehicles will be on the road, but before they can be launched on a wide scale,

  • many issues need to be addressed and
  • many questions need to be answered.

In this article, we read, “Autonomous truck company, TuSimple, believes they will meet their stated goal of beginning to replace America’s truck drivers by the end of 2020…”

So time is of the essence when it comes to making sure that the law is comprehensive when it covers the topic of automated driving systems.

SAFETY

Some people are pushing for autonomous vehicles because they believe that their use will increase safety.

While it is certainly commendable that various entities want to improve safety on the roads by removing such factors as

  • Distracted driving;
  • Fatigued driving;
  • Impaired driving (due to substances used by drivers); and
  • Medical emergencies;

it must be remembered that car drivers are assigned the majority of contributing factors in fatal car-truck crashes.

Furthermore, traffic slowdowns and sudden stops can potentially lead to trucks

  • either jackknifing
  • or being rear-ended by other motorists.

ETHICS

A disturbing article was published on October 24, 2018, which revealed that “Self-driving cars will have to decide who should live and who should die. Here’s who humans would kill.”

In response to part of that article, if the brakes fail on a “self-driving” vehicle — and assuming there are no passengers aboard — why not automatically veer to a place where no life will be in danger, like a shoulder?

The askers of the questions in the article obviously want to herd people into an “either-or” mindset, when there may be other, previously unconsidered possibilities.

TRUSTING COMPUTERS AND MACHINES

Experienced professional truck drivers wrote:

“See the difference is whether or not you TRUST technology. I can barely trust it to make a phone call correctly. I would never sleep behind technology going 65 mph down a highway, let alone with 80,000#. Every single day we, as truck drivers get to decide if people live or die. Never in my lifetime would I be OK with a computer making those decisions.”

I replied:

“… Anyone who has ever experienced

– the aftermath of a computer glitch;

– what happens after a computer file becomes corrupted;

– the tragedy of a crashed hard drive; or

– the loss of photos after a camera has been dropped accidentally;

should think hard about this. …”

Can computers and machines make mistakes?

Anything that runs on technology is only as good as:

  • The hardware (which can fail);
  • The software and programming (which can be faulty); and
  • Its resistance to being hacked.

While software can be programmed to anticipate certain things — like a calendar program alerting the user that an appointment is coming up — machines, hardware and software do not have the reasoning ability of humans.

Wikipedia lists the following definition: “The standard Safe Practices for Motor Vehicle Operations, ANSI/ASSE Z15.1, defines defensive driving skills as ‘driving to save lives, time, and money, in spite of the conditions around you and the actions of others.'”

Can an automated driving system “drive defensively” — or is that a skill relegated to humans only?

CCJDigital.com posted a video about a car-truck crash, and we adamantly disagree with its assessment.

It is our considered opinion regarding this accident that:

  • The setup — based on the speed and location of the vehicles as well as the configuration of the exit ramp and barrels — made a crash almost inevitable from the beginning;
  • The only evasive action that could have been taken by the trucker would have been to slow down, but it is not the responsibility of the driver in the lane of travel to slow down for a car preparing to pass in the exit lane; and
  • Not even an automated driving system would have been able to “anticipate” the actions of the car driver.

GPS RELIANCE AND DRIVER ASSIST TECHNOLOGY

Some truckers have rightly been criticized for over-reliance on Global Positioning System (GPS) information for routing.

While these are legitimate concerns regarding human drivers, we read in the caption of the top image in an April 18, 2019, article:

“A ‘Differential GPS’ guidance system with 2-centimeter resolution keeps Alaskan snowplow drivers on the road even in zero-visibility white-out conditions.”

Granted, there are big differences between snow plows and big trucks.

In late 2018, my husband Michael and I took a test drive in “a car with some ‘driver assist’ or autonomous technologies.” One of these new technologies is called “Lane Departure Alert”.

Upon what does such “driver assistance” rely for staying in one’s lane:

  • Painted lines on the road;
  • GPS tracking; or
  • Something else?

If driver assist or automated driving systems rely on the painted lines on roads, what happens when trucks travel on roads with badly faded lines, no lines at all, or lines that have been covered by snow?

If driver assist or automated driving systems rely on GPS tracking, what would happen if the GPS service became unavailable for some reason — or the GPS unit on the truck stopped functioning?

How will autonomous trucks react in places where lanes are shifted, changed or closed?
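
To illustrate the kind of fallback logic these questions call for, here is a minimal, purely hypothetical sketch in Python. It assumes (purely as an assumption for illustration, not as a description of any manufacturer’s actual system) that the truck reports a confidence score for its lane-marking camera and an availability flag for its GPS receiver, and that it must choose between continuing automated lane keeping, falling back to GPS guidance, or handing control to a human.

```python
# Hypothetical illustration only -- not any manufacturer's actual logic.
# Assumes the ADS exposes a lane-marking confidence score (0.0 to 1.0)
# and a flag for whether high-precision GPS positioning is available.

from dataclasses import dataclass


@dataclass
class SensorStatus:
    lane_marking_confidence: float  # 0.0 = no visible lines, 1.0 = crisp lines
    gps_available: bool             # False if the GPS signal or unit has failed


def lane_keeping_decision(status: SensorStatus) -> str:
    """Decide how this hypothetical system responds to degraded inputs."""
    if status.lane_marking_confidence >= 0.8:
        # Painted lines are clearly visible; camera-based lane keeping continues.
        return "continue automated lane keeping"
    if status.gps_available:
        # Lines are faded or snow-covered, but high-precision GPS can still
        # hold the lane -- the snowplow example in the article above.
        return "fall back to GPS-based guidance and alert the driver"
    # Neither lane markings nor GPS are trustworthy: a human (on board or
    # remote) must take over, or the truck must slow and pull over safely.
    return "request human takeover; if none, slow down and stop on the shoulder"


if __name__ == "__main__":
    print(lane_keeping_decision(SensorStatus(0.9, True)))   # clear lines
    print(lane_keeping_decision(SensorStatus(0.3, True)))   # faded lines, GPS ok
    print(lane_keeping_decision(SensorStatus(0.3, False)))  # faded lines, no GPS
```

The point of the sketch is simply that every one of these degraded-sensor paths must be defined, tested, and disclosed before such trucks operate on public roads.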

LOCATIONS, TERRAIN AND TRACTION

The only mentions of ADS trucks on the roads that I have read about to date involve test runs or operations

  • Either in or between southern states (Arizona, Florida and Texas);
  • Or during non-winter months (Colorado, Nevada and Ohio).

In bad winter weather, trucks must be able to do more than just stay in their lanes; they must maintain traction on the road.

A June 20, 2019, article stated, “The hard part of commercial driving is the nuance and experience involved that is required to navigate a tractor-trailer in real-world conditions” and these are “Things [that] a computer can’t do. Not yet, anyway.”

REMOTE OPERATION OR DRIVER-IN-TRUCK?

In response to this article about a crash between a human-driven truck and an autonomous vehicle carrying passengers, I wrote the following in a trucking Facebook group:

OK, so

  • if the autonomous vehicle didn’t automatically respond to its surroundings;
  • if the attendant didn’t have control over the autonomous vehicle;
  • if the attendant had had control over the autonomous vehicle and could have prevented the accident;
  • if the autonomous vehicle was operated by remote control but the remote operator didn’t prevent the collision; and
  • if the trucker — who is very experienced in making deliveries to that location — could not reasonably have anticipated the autonomous vehicle being on its passenger side;

then why is this the trucker’s fault, especially since by his maneuvers, he was preventing other accidents from taking place?

I’m thinking ahead about an army of autonomous vehicles zipping around each other in backing situations like this.

How many more “live” truckers will be blamed for accidents involving vehicles with automated driving systems?

A July 16, 2019, article addressed driving “autonomous trucks” via remote control or “remote teleoperators”.

Other aspects about remote operation of trucks that must be considered are:

  • The number of trucks that any one remote operator can monitor simultaneously; and
  • How remote operators will keep themselves from being distracted behind their computer monitors.

Is it enough for drivers to control only “the first and last mile” of a trip?

In my opinion, this theory may work for routes that are mostly made up of highways and where shippers and receivers are located very close to highways.

But unless safeguards are put in place, automation will most likely never work in large cities with a great deal of traffic and tight turns.

Will limitations be put in place regarding when and where a remote operator will take over from an autonomous truck?

STOPPING TRUCKS

Here is a list of reasons why a truck may not be stopped in time to avoid a collision:

  • failure to keep brakes in good condition;
  • failure to keep brakes properly adjusted;
  • carrying too much weight for the brakes to handle;
  • distracted driving;
  • not maintaining safe following distance;
  • driving too fast, including
    • on downhill grades or
    • in poor visibility situations (such as in snow, fog and smoke);
  • being cut off by another vehicle;
  • mechanical failure of otherwise properly adjusted brakes; and
  • (futuristically speaking) when an autonomous vehicle overrides a driver’s or remote control operator’s brake usage.

Many of these causes, obviously, can be prevented by a human driver.

Can the brakes on an autonomous truck ever fail?

If so, who is responsible for that?

Checks and balances must be put in place to make sure trucks can stop in time, every time.
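
As a back-of-the-envelope illustration of why stopping in time is so unforgiving, the following sketch computes a total stopping distance from three inputs: travel speed, the perception-and-reaction (or system-latency) time, and the average deceleration the brakes can deliver. The specific numbers are assumptions chosen only for illustration; they are not measured values for any particular truck or braking system.

```python
# Back-of-the-envelope stopping-distance estimate (illustrative assumptions only).

MPH_TO_MPS = 0.44704      # miles per hour -> meters per second
METERS_TO_FEET = 3.28084


def stopping_distance_ft(speed_mph: float,
                         reaction_time_s: float,
                         deceleration_mps2: float) -> float:
    """Total stopping distance in feet: reaction distance plus braking distance.

    reaction distance = v * t
    braking distance  = v^2 / (2 * a)
    """
    v = speed_mph * MPH_TO_MPS
    reaction_distance = v * reaction_time_s
    braking_distance = (v ** 2) / (2 * deceleration_mps2)
    return (reaction_distance + braking_distance) * METERS_TO_FEET


if __name__ == "__main__":
    # Assumed values: 65 mph, 1.5 s of reaction or system latency, and an
    # average deceleration of 3.0 m/s^2 for a heavily loaded truck on dry
    # pavement (worn brakes, extra weight, or slick roads would lower this).
    print(f"{stopping_distance_ft(65, 1.5, 3.0):.0f} ft")   # roughly 600 ft
    print(f"{stopping_distance_ft(65, 1.5, 1.5):.0f} ft")   # degraded braking
```

Under these assumed values, a loaded truck at 65 mph needs roughly 600 feet to stop; halving the assumed braking capability pushes the total past 1,000 feet. Small changes in latency or brake condition translate into hundreds of feet, whether the one applying the brakes is a human, a remote operator, or a computer.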

RETURN FROM ROBOTS TO HUMANS

In May 2014, it was reported that Toyota was replacing robots with humans to make even better robots.

It’s ironic that, even as manufacturers seek to replace humans, a June 21, 2019, article reports that Volvo Trucks “wants its vehicles to think like humans.”

REQUIRING HUMAN INTERACTION

When it comes to automated driving systems:

  • What situations will require a trucker’s intervention?
  • Will the truck ever override the driver’s actions?
  • As long as automated trucks allow for human interaction, will a human always be able to override the truck’s computer?

A June 12, 2019, article addressed drivers still being required in trucks with Level 2 driver assistance technology:

“As a driver assistance system, vehicles at Level 2 can simultaneously steer, brake and manage speed without driver intervention for short periods, but the driver must remain attentive and be prepared to regain control of the vehicle at any time — unlike several now-infamous Tesla drivers who put a little too much faith into that Level 2 system.” (emphasis added)

The article addresses “Lane Keep Assist” (LKA) and “Lane Departure Protection” features of Daimler trucks, stating that “LKA is not a self-driving feature.”

SELF-DRIVING FEATURES IN VEHICLES

Regarding “self-driving”, we read in a March 19, 2018, article:

“A car with the [Uber] ride-sharing service was operating in self-driving mode with a human in the driver’s seat when it struck and killed a woman in Arizona overnight.”

The article did not answer these questions:

  • If there was a “human in the driver’s seat”, did the driver see the pedestrian that the car was about to hit?
  • If the driver in the car saw the pedestrian — and if it was apparent that a collision was imminent — did the driver attempt to take any corrective action (like applying the brakes)?
  • If the driver applied the brakes, did the car “take over” or “override” the driver’s actions?

We later learned that Uber had not informed the “operator” of a self-driving vehicle that “all advanced driver assistance functions were disabled when it was put into computer control mode.”

This included the fact that the “automatic braking feature” was not enabled.

When the time comes for drivers to drive autonomous vehicles, each vehicle needs to have a panel of highly visible switches or indicators on the dashboard showing which assistance functions are on and which are off.
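
As a purely hypothetical sketch of what could sit behind such a panel, the snippet below models a simple read-only status table of assistance functions and renders each one as clearly ON or OFF. The function names are invented for illustration and do not correspond to any manufacturer’s actual feature set.

```python
# Hypothetical status model for a dashboard readout of assistance functions.
# Function names are invented for illustration only.

from typing import Dict


def format_status_panel(functions: Dict[str, bool]) -> str:
    """Render each assistance function as clearly ON or OFF, one per line."""
    lines = []
    for name, enabled in sorted(functions.items()):
        lines.append(f"{name:<28} {'ON' if enabled else 'OFF'}")
    return "\n".join(lines)


if __name__ == "__main__":
    status = {
        "Automatic emergency braking": False,  # the condition at issue in the Uber case above
        "Lane keep assist": True,
        "Adaptive cruise control": True,
    }
    print(format_status_panel(status))
```

A driver glancing at such a readout before departure would have known, in the Uber case described above, that automatic braking was disabled.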

EQUIPMENT FAILURE AND INSPECTIONS

At least two different kinds of truck equipment failure have been cited in the deaths of other motorists:

  • Brake failure; and
  • Wheel separation.

For example, an August 15, 2019, article reports that a motorist was killed when her vehicle was struck by a loose tire from a big rig.

After reading about accidents like this, I wonder: Did the trucker do a thorough pre-trip inspection on his vehicle — or was this a mechanical failure that could not have been prevented?

Furthermore, regarding automated vehicles:

  • What systems are in place to detect truck problems?
  • Who does a pre-trip inspection on them — and how often?
  • How often will driverless trucks undergo regular maintenance?

ROAD HAZARDS

According to this source, “Road hazard means a hazard encountered while driving a motor vehicle and may include, but not be limited to, potholes, rocks, wood debris, metal parts, glass, plastic, curbs or composite scraps”.

Regarding road hazards, I made the following comment through a trucker Facebook group regarding a June 12, 2019, article:

“There’s no substitute for ‘boots on the ground’. Eventually, one of these driverless trucks is going to encounter *something* bad that could have been avoided had a real live human being been present and watching.”

AUTONOMOUS VEHICLE FLAWS

Autonomous vehicles are not flawless, as we learn from federal investigators following the fatal crash of a Tesla Model 3 into a semitrailer in Florida on March 1; the car was found to be “operating on the company’s semi-autonomous Autopilot system”.

Again, drivers need to know which systems are operational in each vehicle with an automated driving system at any given time, and what their limitations are.

It was reported on April 3, 2019, that hackers have identified two methods of tricking Tesla autopilot technology into “driving into wrong lanes or towards oncoming traffic.”

Until autonomous vehicle manufacturers develop some kind of “hack-proof” system of operating vehicles, this technology is going to have major limitations and may kill people.

HUMANS OVER COMPUTERS

Notice in an August 4, 2014, article the author’s assertion that humans need to be present in autonomous trucks:

… The human driver is able to switch control of the truck to the vehicle’s embedded system and ride hands-free as a passenger. …

… The most important part to remember is that there is still a human in control of a switch to turn on-and-off the autonomous control. …

According to an April 12, 2018, article, “As of this writing, however, humans triumph over computers in numerous real-world tasks — ranging from identifying a bicycle or a particular pedestrian on a crowded city street to reaching for a cup of tea and moving it smoothly to one’s lips — let alone conceptualization and creativity.”
nautil.us/issue/59/connections/why-is-the-human-brain-so-efficient (no longer online)

LEGAL IMPLICATIONS AND FINANCIAL REPERCUSSIONS FOR TRUCKERS

There are numerous legal implications for drivers of trucks with automated driving systems, including these:

  • Who is responsible or liable if the truck makes a critical “decision” that the trucker would not have made — and the truck becomes involved in an incident or accident?
  • What happens if another vehicle cuts in front of the autonomous truck too closely?
  • What sequence of events would follow, and could the truck ever be involved in a more costly accident than the trucker would have been had he/she been in control?
  • How will an autonomous truck “react” to being operated in certain weather conditions, particularly rain, fog, snow, sleet and wind?
  • If a trucker is ticketed or fined for his/her role in operating an autonomous truck when the truck is involved in an incident or accident, what sort of legal argument could be offered in his/her defense?
  • After an autonomous truck is involved in an incident or accident that it caused, will the trucker’s
    • CSA score take a hit?
    • MVR include the accident?
    • Future trucking applications have to include that accident?
    • Personal auto insurance rates go up?

OTHER QUESTIONS

I have asked a lot of questions and will ask some more here that don’t seem to fit anywhere else.

  • By how much will “vehicle to vehicle” (V2V) technology need to improve before autonomous trucks are completely reliable?
  • How will ADS trucks be programmed to always avoid accidents at railroad crossings, at low clearances, and on non-truck routes?
  • Can autonomous trucks be programmed to avoid rear-ending other vehicles in slowed or stopped traffic each and every time — and also avoid being rear-ended by other vehicles?
  • Will trucks with automated driving systems be programmed to pass slower vehicles — and if so, how will that programming prevent them from passing improperly?
  • How can an automated driving system prevent accidents when others improperly pass it or pull out in front of it?
  • When one or more vehicles are on the shoulder or in the breakdown lane, will autonomous trucks be programmed to obey “move over” laws?
  • How can an automated driving system help prevent crashes with bicycles and motorcycles?
  • How many cameras will need to be installed — and at what points on a vehicle — to help prevent self-driving vehicles from being involved in accidents?
  • How will the FMCSA assign a crash score to a vehicle with an automated driving system?
  • Regarding speed, how will an ADS vehicle of any kind be able to detect not only the standard speed on a road, but also a temporary reduction in speed, such as in a construction zone or due to weather?
  • Will autonomous trucks be brought into weigh stations for weight checks or other types of inspections?
  • Can a driverless truck be placed out of service — even at a temporary inspection station?

CONCLUSION

Every vehicle with an automated driving system — no matter how basic or advanced — must be held to the same standards of equipment maintenance and safe operations as a vehicle that

  • either does not have such a system
  • or is driven by a human being.

There have already been accidents involving vehicles with an automated driving system.

Over time, technology will likely improve, but the motoring public needs to have proof that such technology will not hurt them.

What is the U.S. Department of Transportation going to do to ensure that no more people will be hurt by vehicles with automated driving systems?

Vicki Simons, President

NKBJ InfoNet, LLC

Truck-Drivers-Money-Saving-Tips.com

