The trucking company moved a driver to a vehicle with a lower maximum truck speed and poorer pulling power.
Then they had the audacity to suggest that the driver was wasting time when it took him longer to do his work!
The way it was described to us:
- his old truck could go as fast as 75 mph and
- his new lease truck could go only as fast as 66 mph (once it finally got up to speed after running through all the gears).
Who was "winning" in the process?
We took it upon ourselves to make a few calculations (with a few assumptions built in).
If you decide to personalize your calculations, please note that your results may be different from ours.
When we drove for Swift -- in the days when they speed-limited the trucks to 57 miles per hour -- we figured that as a transcontinental team, our average speed including all stops was about 43 miles per hour.
We figured that this put our average at about three-quarters of our maximum speed:
43 / 57 = 75.4%
Note, that was for a coast-to-coast team at the time.
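If you want to plug in your own numbers, that ratio is a one-line calculation. Here it is as a short Python sketch; the 43 mph average and 57 mph governed speed are our own figures from the Swift days, so substitute yours.

```python
# Average speed as a share of maximum (governed) speed.
# The 43 mph team average and 57 mph governed speed are the article's figures.
average_mph = 43
governed_mph = 57
ratio = average_mph / governed_mph
print(f"{ratio:.1%}")  # prints "75.4%"
```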
The driver described above runs locally within a radius of about 200 miles, so the amount of stopping, in-city driving and waiting on customers increases dramatically.
Until his logs are examined, we will not know the exact percentage of maximum speed he had in his old truck or has in his new truck.
However, we're going to hazard a guess that it is around 60%.
75 mph * 60% = 45 miles per hour average speed
66 mph * 60% = 39.6 miles per hour average speed
That's a difference of 5.4 miles per hour average speed.
Let's find out how much extra time it will take the driver to drive the same distance with reduced truck speed, based on our assumptions.
5.4 miles per hour * 10 hours per day = 54 miles/day
54 miles/day / 66 miles/hour = 0.82 hours per day
0.82 hours per day * 60 minutes per hour = 49.2 minutes extra to travel the same distance as the old truck. (Dividing by the new truck's 66 mph maximum assumes the make-up miles are driven at full highway speed, so this is a conservative lower bound; at the 39.6 mph average speed, the extra time would be closer to 82 minutes.)
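The whole extra-time estimate can be reproduced in a few lines of Python. The 60% utilization factor and the 10-hour driving day are our assumptions from above, and dividing the daily shortfall by the new truck's 66 mph maximum mirrors the (conservative) method used here.

```python
# Extra daily driving time caused by a lower maximum truck speed.
old_max_mph = 75
new_max_mph = 66
utilization = 0.60     # assumed average speed as a share of maximum
hours_per_day = 10     # assumed driving hours per day

old_avg = old_max_mph * utilization                      # 45.0 mph
new_avg = new_max_mph * utilization                      # 39.6 mph
shortfall_miles = (old_avg - new_avg) * hours_per_day    # about 54 miles/day
extra_hours = shortfall_miles / new_max_mph              # about 0.82 hours
print(f"{extra_hours * 60:.1f} extra minutes per day")   # about 49 minutes
```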
Obviously, maximum speed affects travel time.
According to a 2008 article from MSNBC:(1)
"Truckers and industry officials say slowing a tractor-trailer rig from 75 mph to 65 mph increases fuel mileage by more than a mile a gallon..."
For a truck running twin 120-gallon fuel tanks (240 gallons total), a gain of more than a mile per gallon translates into at least 240 more miles of range per full fill-up.
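That range claim is simple to verify: capacity times the mileage gain gives the extra miles per fill-up.

```python
# Extra range per fill-up from a 1 mpg fuel-economy gain on 240 gallons.
tank_capacity_gal = 240     # twin 120-gallon tanks
mpg_gain = 1.0              # "more than a mile a gallon" saved
extra_range_miles = tank_capacity_gal * mpg_gain
print(extra_range_miles)    # prints 240.0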
Farther down in that same article, we read that the cost of fuel surpassed (at that time) the cost of labor as the biggest expense for some carriers.
So let's see if that's true in this case.
If a truck goes from, let's say, 6 miles per gallon to 7 miles per gallon in truck fuel economy, then the costs of operating that truck (fuel costs only being considered) go down.
Using the same 200-mile radius (400 mile round trip) figure as above:
400 miles per day / 6 miles per gallon = 66.7 gallons per day
400 miles per day / 7 miles per gallon = 57.1 gallons per day
66.7 - 57.1 ≈ 9.5 gallons per day difference
If the price of fuel is $4.00 per gallon:
9.5 gallons * $4.00 per gallon diesel = $38 per day saved
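Here is the fuel-saving arithmetic as a Python sketch you can rerun with your own numbers. The 400-mile day, the 6 vs. 7 mpg figures, and the $4.00/gallon diesel price are all the assumptions stated above.

```python
# Daily fuel (and dollar) savings from a 6 mpg -> 7 mpg improvement.
miles_per_day = 400    # 200-mile radius, round trip
diesel_price = 4.00    # dollars per gallon (assumed)

gallons_at_6 = miles_per_day / 6    # about 66.7 gallons/day
gallons_at_7 = miles_per_day / 7    # about 57.1 gallons/day
gallons_saved = gallons_at_6 - gallons_at_7
print(f"{gallons_saved:.1f} gallons, "
      f"${gallons_saved * diesel_price:.2f} saved per day")  # about $38/day
```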
Where does the savings in fuel cross the line with respect to driver pay?
We have already shown that, under our assumptions, the slower truck costs at least 49 extra minutes (about 0.82 hours) per day to cover the same number of miles. Dividing the $38 daily fuel saving by those 0.82 extra hours gives a break-even wage of roughly $46 per hour. Unless the local driver is earning more than that, the company should save money by cutting back on truck speed.
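The break-even wage follows directly from the two numbers estimated above: the daily fuel saving and the extra driver-hours it costs. Both inputs are our estimates, not measured figures.

```python
# Hourly wage at which extra driver time cancels out the daily fuel saving.
daily_fuel_saving = 38.00   # dollars per day (estimated above)
extra_hours = 0.82          # extra driving hours per day in the slower truck
breakeven_wage = daily_fuel_saving / extra_hours
print(f"${breakeven_wage:.2f} per hour")  # about $46/hour
```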
Money saving tip: For the truck driver being paid by the mile, reducing truck speed equates to a pay cut -- because he/she is paid according to the number of miles he/she can drive in a driving shift.
This will be offset only if the per-mile pay rate is increased.
For the local truck driver being paid by the hour, reducing truck speed means extra money for extra time spent doing the same work.
Travel time to go the same distance is lengthened when speed is reduced.
On the other hand, it takes away from "home time."
Finally, an owner-operator should take into account not just the fuel savings from cutting back on truck speed, but also other effects (such as wear and tear on the engine and tires).
Each truck has its own "best" maximum speed and we encourage you to find out what that is.