Self-Driving Cars
- Waymo secures bigger award against workers who went to rival Uber
- Comma 2 CES Demo
- Intel’s Mobileye has a plan to dominate self-driving—and it might work
- Waymo's CEO John Krafcik at Fortune's Brainstorm Tech dinner at CES
- Elon 4 years ago today: "In ~2 years, summon should work anywhere connected by land & not blocked by borders, eg you're in LA and the car is in NY"
- Emphasis on Cameras over Lidar on Autonomous Vehicles Sets Mobileye Apart From Competition, CEO says
- Waymo Driverless video from inside car
- QUESTION: Eyes vs Cameras
- Hesai releases PandarQT, an Ultra-wide FOV LiDAR optimized for blind spot detection
- Autonomous Vehicle Technologies – A CES2020 Gallery
Waymo secures bigger award against workers who went to rival Uber Posted: 10 Jan 2020 11:52 AM PST
Comma 2 CES Demo Posted: 09 Jan 2020 06:25 PM PST
Intel’s Mobileye has a plan to dominate self-driving—and it might work Posted: 10 Jan 2020 12:26 PM PST
Waymo's CEO John Krafcik at Fortune's Brainstorm Tech dinner at CES Posted: 09 Jan 2020 07:13 PM PST
Elon 4 years ago today: "In ~2 years, summon should work anywhere connected by land & not blocked by borders, eg you're in LA and the car is in NY" Posted: 10 Jan 2020 01:26 PM PST
Emphasis on Cameras over Lidar on Autonomous Vehicles Sets Mobileye Apart From Competition, CEO says Posted: 10 Jan 2020 01:17 AM PST
Waymo Driverless video from inside car Posted: 09 Jan 2020 03:43 PM PST
QUESTION: Eyes vs Cameras Posted: 09 Jan 2020 07:04 PM PST
There are a few different approaches to self-driving cars, but for the purposes of this discussion let's focus on camera-based self-driving. With tools such as machine learning and neural networks, it seems only a matter of time until self-driving vehicles become equal to or better than human drivers (how much time is anyone's guess). Computers are far faster than we are, are never distracted, and can keep improving, which points toward a self-driving future. However, my main concern is the limitation of cameras relative to the capability of the human eye. The instantaneous dynamic range our eyes can handle far exceeds what modern cameras can achieve, especially in low light. It doesn't matter how good the algorithm is if the camera can't physically "see" something. How do camera-based self-driving vehicles plan to overcome this?
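For context on the dynamic-range gap the question raises, here is a minimal back-of-the-envelope sketch in Python. The sensor figures are hypothetical, not measurements of any real camera: dynamic range in stops is log2(full-well capacity / read noise), and multi-exposure HDR capture, which automotive camera stacks commonly rely on, widens the usable range by roughly the spacing between the bracketed exposures.

```python
import math

def dynamic_range_stops(full_well_electrons: float, read_noise_electrons: float) -> float:
    """Dynamic range in photographic stops: log2 of the ratio between the
    brightest signal a pixel can hold (full-well capacity) and its noise floor."""
    return math.log2(full_well_electrons / read_noise_electrons)

# Hypothetical, illustrative figures (not any specific sensor):
single_exposure = dynamic_range_stops(full_well_electrons=10_000, read_noise_electrons=2.5)

# HDR sensors combine several exposures of different lengths; the spacing
# between the longest and shortest exposure roughly adds that many stops
# to the usable range of the merged frame.
exposure_ratios = [1, 1 / 16, 1 / 256]          # e.g. 3 captures, 4 stops apart
extra_stops = math.log2(max(exposure_ratios) / min(exposure_ratios))
merged = single_exposure + extra_stops

print(f"single exposure: ~{single_exposure:.1f} stops")
print(f"3-exposure merge: ~{merged:.1f} stops")
```

On these assumed numbers a single exposure covers roughly 12 stops and a three-exposure merge roughly 20, which is why HDR capture (rather than better algorithms alone) is the usual engineering answer to the high-contrast, low-light scenes the post describes.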
Hesai releases PandarQT, an Ultra-wide FOV LiDAR optimized for blind spot detection Posted: 09 Jan 2020 07:00 PM PST
Autonomous Vehicle Technologies – A CES2020 Gallery Posted: 09 Jan 2020 05:43 PM PST