Hoffa on self driving trucks

dudebro

Well-Known Member
Anyone ever wonder what type of retaliation would be taken if this were to ever happen? I mean, you already see people shooting up the workplace for losing their job.

I can already imagine people purposely getting into accidents with the driverless vehicle. Multiple truck tires slashed a day. It would be crazy. Because if UPS can get rid of us, then you know many other companies have already done the same thing.

Sorry to kind of go off topic. Just always wondered how people would react to losing their livelihood.
They would be vilified, called terrorists in the media, have all the racist or sexist posts they've ever made in private groups on social media made public, etc., so you'd have to look like a loony toon to support them.
 

brett636

Well-Known Member
I find all the talk of self-driving trucks to be a bit disturbing. Not because they could one day replace me, which I feel is unlikely in the 25 years I have left to drive, but because the race to self-driving vehicles will end up getting people killed, like the man in the Tesla. There are simply too many situations where the judgement of a human outweighs that of a software program. In the Tesla situation, the car simply did not see that huge semi truck in front of it. As we all know, anything built by a human can and will fail, and the electronic devices enabling a self-driving car will not be immune to this simple fact. Planes have had autopilots for years, which control a lot of the factors of flight these days, yet we still have humans in the cockpit to take over in case of failure or if the plane ends up in conditions that are beyond the autopilot's ability to compensate. Then of course we have the issue of hacking and terrorism. What if some person or group is able to take control of the millions of self-driving cars in this supposed future, which, as Ford has indicated, will have no means for the humans inside to take control? They will watch helplessly as the cars are turned into 3,000 lb. battering rams running into each other, other people, and structures. Finally, I think we can all agree that as a species we are horribly flawed, and my question to anyone who believes this is our future is this: how can we, as a flawed people, create something that has to be flawless, and do so by the millions?

The question isn't whether self-driving vehicles are in the near future; the question is how many of our lives are expendable in the eyes of the ego-driven maniacs who really believe they can create a perfectly flawless machine to move us around.
 

dudebro

Well-Known Member
The question isn't whether self-driving vehicles are in the near future; the question is how many of our lives are expendable in the eyes of the ego-driven maniacs who really believe they can create a perfectly flawless machine to move us around.

You're right. That IS the most important question. But you pose this as though the only concern is jobs. What if you knew that humans behind the wheel of motor vehicles are on pace to kill 37,000 people this year in the US alone, and that vehicular deaths are going UP this year vs. previous years?

The mistake I think most people make is that they're totally fine with 40,000 people a year dying in cars due to human error, but they can't tolerate a SINGLE fatality in an autonomous vehicle.

You bring up all the modalities of electronic failure. But you minimize the fact that humans get fatigued, high, drunk, inattentive, angry, distracted, etc., and cause mayhem that way, and autonomous vehicles won't.
 

LeadBelly

Banned
You're right. That IS the most important question. But you pose this as though the only concern is jobs. What if you knew that humans behind the wheel of motor vehicles are on pace to kill 37,000 people this year in the US alone, and that vehicular deaths are going UP this year vs. previous years?

The mistake I think most people make is that they're totally fine with 40,000 people a year dying in cars due to human error, but they can't tolerate a SINGLE fatality in an autonomous vehicle.

You bring up all the modalities of electronic failure. But you minimize the fact that humans get fatigued, high, drunk, inattentive, angry, distracted, etc., and cause mayhem that way, and autonomous vehicles won't.
When is there too much technology, to the point where you can't afford to eat anymore?
 

sandwich

The resident gearhead
You're right. That IS the most important question. But you pose this as though the only concern is jobs. What if you knew that humans behind the wheel of motor vehicles are on pace to kill 37,000 people this year in the US alone, and that vehicular deaths are going UP this year vs. previous years?

The mistake I think most people make is that they're totally fine with 40,000 people a year dying in cars due to human error, but they can't tolerate a SINGLE fatality in an autonomous vehicle.

You bring up all the modalities of electronic failure. But you minimize the fact that humans get fatigued, high, drunk, inattentive, angry, distracted, etc., and cause mayhem that way, and autonomous vehicles won't.
You talk as though safety is the main reason self-driving cars are here. Big companies don't care about the safety of people. The only reason self-driving cars exist is to eliminate jobs, period, the end. They just have to sell the safety aspect. And they have you hooked like a fool. Neither the government nor large companies care about human deaths. Do you think Freightliner cares if 40,000 people a year die in car accidents, or do you think they care about selling 4 million self-driving trucks to big companies who are itching to eliminate their workforce? "Safety" is smoke and mirrors to get the public to eat it up.
 

brett636

Well-Known Member
You talk as though safety is the main reason self-driving cars are here. Big companies don't care about the safety of people. The only reason self-driving cars exist is to eliminate jobs, period, the end. They just have to sell the safety aspect. And they have you hooked like a fool. Neither the government nor large companies care about human deaths. Do you think Freightliner cares if 40,000 people a year die in car accidents, or do you think they care about selling 4 million self-driving trucks to big companies who are itching to eliminate their workforce? "Safety" is smoke and mirrors to get the public to eat it up.

Exactly. dudebro completely missed the point and went straight for jobs, when I said from the beginning that jobs are not my main concern.

When we have companies like GM, who cannot design an ignition switch capable of doing the simple job of keeping a car running, how are we to expect them to design an array of sensors and cameras that will work 100% of the time to keep us safe as we travel down the road? Where does the ego of creating this perfect machine end, and where does the reality begin that nothing we create is perfect? The safety side of it sounds great, but I highly doubt it will be so rosy once the technology is mass produced by the millions. When that self-driving semi truck fails and rams into your car, killing your entire family, are you going to think, "Well, at least the number of human-caused fatalities is down overall!"? Don't get me wrong, I am all for technology making our lives easier, but the race to replace humans with technology won't necessarily make things better.
 

DriveInDriveOut

Inordinately Right
Bottom line is, Hoffa never said self-driving vehicles aren't going to happen; he said they're not going to put current union drivers out of a job. I agree with him.
 

FAVREFAN

Well-Known Member
I find all the talk of self-driving trucks to be a bit disturbing. Not because they could one day replace me, which I feel is unlikely in the 25 years I have left to drive, but because the race to self-driving vehicles will end up getting people killed, like the man in the Tesla. There are simply too many situations where the judgement of a human outweighs that of a software program. In the Tesla situation, the car simply did not see that huge semi truck in front of it. As we all know, anything built by a human can and will fail, and the electronic devices enabling a self-driving car will not be immune to this simple fact. Planes have had autopilots for years, which control a lot of the factors of flight these days, yet we still have humans in the cockpit to take over in case of failure or if the plane ends up in conditions that are beyond the autopilot's ability to compensate. Then of course we have the issue of hacking and terrorism. What if some person or group is able to take control of the millions of self-driving cars in this supposed future, which, as Ford has indicated, will have no means for the humans inside to take control? They will watch helplessly as the cars are turned into 3,000 lb. battering rams running into each other, other people, and structures. Finally, I think we can all agree that as a species we are horribly flawed, and my question to anyone who believes this is our future is this: how can we, as a flawed people, create something that has to be flawless, and do so by the millions?

The question isn't whether self-driving vehicles are in the near future; the question is how many of our lives are expendable in the eyes of the ego-driven maniacs who really believe they can create a perfectly flawless machine to move us around.

The human death rate is 1 per 84 million miles driven on U.S. roads. Tesla's death rate under "Autopilot" is 1 per 222 million as of 10 days ago. By the end of the month, it will be 200% safer than humans in its "beta" mode. They are on level 2 of 4 levels of autonomous driving, and it's already 200% safer than humans.
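
For anyone who wants to sanity-check what those two figures imply, here's a quick back-of-the-envelope sketch in Python. It takes the 84 million and 222 million numbers above at face value and ignores differences in fleet, road type, and driving conditions, so treat it as illustrative only:

```python
# Back-of-the-envelope comparison of the two fatality rates quoted above.
# Figures are taken at face value from the post: roughly 1 death per
# 84 million human-driven miles vs. 1 death per 222 million Autopilot miles.
HUMAN_MILES_PER_DEATH = 84_000_000
AUTOPILOT_MILES_PER_DEATH = 222_000_000

# Fatality rate = deaths per mile driven (lower is safer).
human_rate = 1 / HUMAN_MILES_PER_DEATH
autopilot_rate = 1 / AUTOPILOT_MILES_PER_DEATH

# How many times lower the Autopilot fatality rate is per mile.
ratio = human_rate / autopilot_rate
print(f"Autopilot fatality rate is roughly {ratio:.1f}x lower per mile")  # ~2.6x
```

By that math the quoted figures work out to roughly 2.6 times fewer fatalities per mile, which is the ballpark behind the "200% safer" claim, with the caveat that Autopilot miles skew toward highway driving and the sample is still small.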

You might be safe as a driver for your 25 years. All the major truck manufacturers are already making and testing some form of autonomous semi truck. UPS will most likely start bringing these in in the 8-10 year range. Obviously current drivers would be grandfathered in. It may happen where a new contract is negotiated where, say, 25% of all new feeder jobs would be autonomous. Then the next contract might be 50%, then 75%, etc., until no new jobs would be filled by humans and all current drivers would be used where needed. So it would happen over time, not overnight.

As you say, humans are flawed. Right now, the biggest danger to a Tesla Model S driving on Autopilot is the unpredictable human driving the other vehicle. According to the experts, the hardest thing in developing self-driving software is the unpredictability of humans.

As far as hacking goes, 99% of new cars today are already hackable. If you use your phone connected to your Bluetooth connection in your car, it's hackable. If you use internet connectivity in your car, it's hackable. Peeps need to watch 60 Minutes more. They covered all these topics multiple times, including Amazon's drone delivery program, which is going to kick UPS's (our) luddite asses if we don't get our heads out of our asses in a big hurry.

In order to defeat your enemy, you must know your enemy. All the naysayers in here need to get out of the way and stop holding progress back. Our competitors are doing it, so we'd better freakin' get caught up, or you/we/all will be out of a job in 10 years. Drones and autonomous driving are coming. Either we excel at that to compete or we fall by the wayside. Your choice.
 

FAVREFAN

Well-Known Member
You're right. That IS the most important question. But you pose this as though the only concern is jobs. What if you knew that humans behind the wheel of motor vehicles are on pace to kill 37,000 people this year in the US alone, and that vehicular deaths are going UP this year vs. previous years?

The mistake I think most people make is that they're totally fine with 40,000 people a year dying in cars due to human error, but they can't tolerate a SINGLE fatality in an autonomous vehicle.

You bring up all the modalities of electronic failure. But you minimize the fact that humans get fatigued, high, drunk, inattentive, angry, distracted, etc., and cause mayhem that way, and autonomous vehicles won't.

Spot on man, spot on.
 

DriveInDriveOut

Inordinately Right
The human death rate is 1 per 84 million miles driven on U.S. roads. Tesla's death rate under "Autopilot" is 1 per 222 million as of 10 days ago. By the end of the month, it will be 200% safer than humans in its "beta" mode. They are on level 2 of 4 levels of autonomous driving, and it's already 200% safer than humans.

You might be safe as a driver for your 25 years. All the major truck manufacturers are already making and testing some form of autonomous semi truck. UPS will most likely start bringing these in in the 8-10 year range. Obviously current drivers would be grandfathered in. It may happen where a new contract is negotiated where, say, 25% of all new feeder jobs would be autonomous. Then the next contract might be 50%, then 75%, etc., until no new jobs would be filled by humans and all current drivers would be used where needed. So it would happen over time, not overnight.

As you say, humans are flawed. Right now, the biggest danger to a Tesla Model S driving on Autopilot is the unpredictable human driving the other vehicle. According to the experts, the hardest thing in developing self-driving software is the unpredictability of humans.

As far as hacking goes, 99% of new cars today are already hackable. If you use your phone connected to your Bluetooth connection in your car, it's hackable. If you use internet connectivity in your car, it's hackable. Peeps need to watch 60 Minutes more. They covered all these topics multiple times, including Amazon's drone delivery program, which is going to kick UPS's (our) luddite asses if we don't get our heads out of our asses in a big hurry.

In order to defeat your enemy, you must know your enemy. All the naysayers in here need to get out of the way and stop holding progress back. Our competitors are doing it, so we'd better freakin' get caught up, or you/we/all will be out of a job in 10 years. Drones and autonomous driving are coming. Either we excel at that to compete or we fall by the wayside. Your choice.
I can't wait until planes get autopilot. It's going to be so much cheaper to fly once we get rid of all those overpaid pilots.
 

brett636

Well-Known Member
The human death rate is 1 per 84 million miles driven on U.S. roads. Tesla's death rate under "Autopilot" is 1 per 222 million as of 10 days ago. By the end of the month, it will be 200% safer than humans in its "beta" mode. They are on level 2 of 4 levels of autonomous driving, and it's already 200% safer than humans.

These stats may be true, but if you lose a loved one due to a self-driving vehicle failure, are you going to lean on these stats as a means of comfort?
 

wide load

Starting wage is a waste of time.
If a post gets reported and it violates our rules, I will delete it. Browncafe averages 1,700 posts a day. It's amusing that some members believe that I read them all. I usually only have time to read whatever was reported.

If I want to browse through the forum, union topics would never be where I'd read. I have no interest in union topics. That, and some of you simply bully other members and make personal attacks. I see no value in that.
Stop screaming at us.
 

sandwich

The resident gearhead
The human death rate is 1 per 84 million miles driven on U.S. roads. Tesla's death rate under "Autopilot" is 1 per 222 million as of 10 days ago. By the end of the month, it will be 200% safer than humans in its "beta" mode. They are on level 2 of 4 levels of autonomous driving, and it's already 200% safer than humans.

You might be safe as a driver for your 25 years. All the major truck manufacturers are already making and testing some form of autonomous semi truck. UPS will most likely start bringing these in in the 8-10 year range. Obviously current drivers would be grandfathered in. It may happen where a new contract is negotiated where, say, 25% of all new feeder jobs would be autonomous. Then the next contract might be 50%, then 75%, etc., until no new jobs would be filled by humans and all current drivers would be used where needed. So it would happen over time, not overnight.

As you say, humans are flawed. Right now, the biggest danger to a Tesla Model S driving on Autopilot is the unpredictable human driving the other vehicle. According to the experts, the hardest thing in developing self-driving software is the unpredictability of humans.

As far as hacking goes, 99% of new cars today are already hackable. If you use your phone connected to your Bluetooth connection in your car, it's hackable. If you use internet connectivity in your car, it's hackable. Peeps need to watch 60 Minutes more. They covered all these topics multiple times, including Amazon's drone delivery program, which is going to kick UPS's (our) luddite asses if we don't get our heads out of our asses in a big hurry.

In order to defeat your enemy, you must know your enemy. All the naysayers in here need to get out of the way and stop holding progress back. Our competitors are doing it, so we'd better freakin' get caught up, or you/we/all will be out of a job in 10 years. Drones and autonomous driving are coming. Either we excel at that to compete or we fall by the wayside. Your choice.
If I'm going to lose my job to automation, I will try as hard as I can to bring the company down with me.
 