A quick Google search for average car age suggests it's nearly 12 years [0]. Even if everyone buying a new car bought an EV, we're probably 12 years away from 50% of the cars on the road being EVs. And since average age is trending upward, we're more likely looking at 13-15 years. And again, that assumes 100% of new vehicles purchased are EVs, which definitely isn't the case.
This doesn't even begin to look at self-driving. We're still years away from reliable level 4 automation.
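The back-of-the-envelope above can be expressed as a toy fleet-turnover model. All numbers here are assumptions, not data: an average fleet age of 12 years implies a vehicle lifetime of roughly 24 years under a uniform age distribution, and the EV share of new sales is set (optimistically, as the comment notes) to 100%:

```python
# Toy fleet-turnover model. Assumptions (not real data):
# - every vehicle lasts LIFETIME years, retiring oldest-first
# - a fraction EV_SHARE of each year's new sales is electric
LIFETIME = 24   # assumed lifetime; average age ~12 implies roughly 2x that
EV_SHARE = 1.0  # optimistic: 100% of new sales are EVs

fleet = [0.0] * LIFETIME  # one cohort per model year; all gas today

years = 0
while sum(fleet) / LIFETIME < 0.5:
    fleet.pop()                # oldest cohort retires
    fleet.insert(0, EV_SHARE)  # new cohort enters the fleet
    years += 1

print(years)  # -> 12: years until half the fleet is electric
```

With these assumptions the model reproduces the comment's estimate: 1/24 of the fleet turns over each year, so 50% penetration takes 12 years even at 100% EV sales; any lower `EV_SHARE` pushes that out further.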
> Self driving cars will have their AI-winter time soon
I don't see this happening any time soon. There might be a contraction in highly speculative investments (i.e., VC cash) aimed at autonomous driving systems. However, even if tech never gets beyond level 2, this will still be a high-growth and high-wage field for the next couple of decades as automobile manufacturers and their tier one suppliers incorporate existing ADAS across the lineup.
And that's to say nothing of the many limited domains where level 4 is definitely doable, including closed-environment mining and manufacturing sites, as well as adjacent industries (e.g., maritime) where even levels 1 and 2 could help a lot.
Self-driving is a scorching hot fireball right now. It might cool down as people realize that level 4-5 is not happening any time soon, but a prolonged winter is hard to believe.
> And that's to say nothing of the many limited domains where level 4 is definitely doable, including closed-environment mining and manufacturing sites, as well as adjacent industries (e.g., maritime) where even levels 1 and 2 could help a lot.
Yeah, I have no doubt about special purpose level 4 in limited environments. My comment was about seeing general purpose 4 and 5.
The definition of level 5 being "under all roadway and environmental conditions that can be managed by a human driver" means that whatever mechanism being used would have a better "brain" (at least in the area of "driving") than the human one, correct?
If yes: kind of scary, as such capability would probably have to include the comprehension/understanding of e.g. "context", the ability to "abstract", and the ability to generalize from indirectly related information, etc., to understand potential upcoming dangers.
For example, recognizing the potential danger when driving behind a truck whose trailer wobbles continuously left and right (but can still keep its lane) because of a partially flat tire would probably require a very advanced AI.
Yes... the problem of autonomous cars is likely that of strong, general purpose AI.
Unfortunately, the politics around autonomous cars insists that human beings are basically idiots behind the wheel, so the problem seems more trivial than it is. After all, how hard can it be to design an AI smarter than an idiot?
> For example, recognizing the potential danger when driving behind a truck whose trailer wobbles continuously left and right (but can still keep its lane) because of a partially flat tire would probably require a very advanced AI.
I think it would be surprising if self-driving cars didn't launch in the next 10 years. Consider that most people never drive more than 10 miles from their home on a given day. It's pretty easy to imagine even "limited" AI vehicles being able to serve vast swaths of the population.
In that case, a lot of people would likely want to avoid the hefty cost of owning, maintaining, and insuring a vehicle, let alone driving, which everyone knows is more dangerous than most things a person does in a given day.
> Consider that most people never drive more than 10 miles from their home on a given day.
This is simply untrue if you're considering average commute distance of drivers in major cities.
> hefty price of owning, maintaining, and insuring a vehicle
These costs are highly variable, and many reasonable options are quite cheap.
I don't see self-driving cars catching on any time soon in the US. Possibly in Europe. I think they will be limited to smaller vehicles that operate in glorified bike lanes, only on known routes.
> This is simply untrue if you're considering average commute distance of drivers in major cities.
I'm sure that'd be correct if you'd said suburbs and rural areas, but according to this[0], the average commute for almost every major city is under 10 miles.
>In that case, likely a lot of people would want to not have to invest in the hefty price of owning, maintaining, and insuring a vehicle, let alone driving which everyone knows is more dangerous than most things a person does in a given day.
All of the costs are worth it, as long as I don't have to share transportation with anyone else.
Even if self-driving cars become the norm in developed countries, there is no way they'll be so in the rest of the world.
In many places, there are no lines drawn on the road and you have to imagine the lanes. It could be a space big enough for 3 or 4 lanes. It makes me wonder if cars would recognize the general direction of the road and not drive diagonally through the imaginary lanes when there's a turn.
Some lanes may also have horrible holes that seem like they could easily take your wheel off if you fall on them with speed. Sometimes they're sinkholes, other times they're part of a construction job that was left midway for months seemingly until someone has a horrible accident. Some lanes look like they're at risk of becoming large sinkholes, and you'd rather avoid them lest the whole car suddenly falls meters below the ground. For both of these, you know they're there, and you know they're basically unavoidable once they become visible. How would you communicate these risks to the car?
Jaywalkers on high-speed, high-traffic highways might be common due to a lack of bridges or any other alternative for crossing the road. They coordinate their movements with the incoming traffic, and the drivers also coordinate their movements with them. Behaving unexpectedly, even something as simple as changing lanes at a distance, could be fatal because it changes the shape of the incoming traffic the jaywalkers depend on once they've decided to start crossing. Emergency stopping might worsen the situation where tailgaters are common.
Pedestrians might also be suicidal: you might be able to discern their intent from a distance by watching their behavior, but the car is not going to interpret that, and it'll get close enough for the pedestrian to throw themselves into the road when it can no longer avoid them.
Advancing when a traffic light turns green might generally be unsafe in zones where it's common for cars to cross at high speed when the light is about to turn, or has just turned, red. Would cars be on the lookout for high-speed traffic coming from the left or right at a distance?
Simply put, there's a lot to be on the lookout for, especially where pedestrian infrastructure, road infrastructure, traffic law enforcement, driving education, etc. are lacking. If self-driving cars become the norm in developed countries, I think it'll be in great part because they're not lacking in any of these things, and the car software can deal with a somewhat consistent environment. That wouldn't be the case elsewhere, though.
As for my opinion on the issue at hand: as a software developer, I wouldn't put so much trust in software as to not have a way to take manual control when that software has the ability to bring physical harm or death to me and others. Even disregarding the possibility of malice through malware, and assuming whatever code executes was written with the best intentions, there are just too many ways for things to go wrong in this problem domain, and the consequences can be deadly.
Good design lies in simplicity. I cannot imagine the behemoth of code that must be required to implement safe automated driving. That's a lot to put faith in when lives are on the line.
Obviously I don't have a dog in the race of how other people raise their own kids, so I'm a little hesitant to wade in to you guys' discussion.
That said, I mean:
"...If my parents said they were limiting my internet I would've laughed in their face..."
???
If I had witnessed an exchange between parents and their child where the child "...laughed in their face...", I'm certain I would not have characterized whatever parenting style those parents were using as "good".
It would have been ridiculous for my parents to suggest it. They trusted me to drive around pretty much anywhere and hold a job without getting in much trouble. How much more dangerous could the internet be than driving through a sketchy area?
Their trust made me far more responsible than I would have been if they tried to treat me like a child when I was in my mid-late teens. I knew if I screwed up it was on me and would've felt bad that they misplaced their trust. Kids loooovvveee being treated like adults, I wish more people would use that as leverage.
I think many parents these days make the transition from guardian to mentor years too late.
While I can't speak to 'all parents', I can say that as a parent of five, kids develop significantly differently. We've been treating our oldest daughter as basically an adult since she was 13 or 14; the next-oldest is getting there, at 15. Next couple down the line we have to monitor more closely.