Driverless vehicles capable of monitoring and reacting to the health of their passengers will be hitting the roads by 2030, electronics firm Mitsubishi Electric claims.
The technology could tell whether its passenger is paying attention, or detect a sudden deterioration in their health – such as a heart attack – by tracking their vital signs.
And if someone is suddenly ill while in control of the vehicle, it could even pull over and phone their doctor or call for emergency help, the company said.
The Japanese motor firm unveiled a new concept driverless vehicle — dubbed the EMIRAI S — during the Tokyo Motor Show, which ran from October 24–November 4.
The design of the EMIRAI S features sensors that can monitor the comfort and health of its passengers and change in-car settings in response.
According to Mitsubishi’s executive officer for automotive equipment, Hiroshi Onishi, driverless vehicles like the EMIRAI S could help tackle ‘societal challenges.’
‘To give an example, one of the issues is an ageing society,’ he said at the unveiling.
‘What we have already actually experienced is that there is an increased number of accidents involving older drivers.’
The EMIRAI S, he explained, would use sensors to track its passengers’ vital signs, from which it could act in response.
‘So with this kind of sensing system, you will hopefully be able to get the number of these accidents down.’
‘If it detects the driver is not paying attention or is not feeling well, it can initiate an emergency call, or stop the car at the side of the road and call a doctor, if necessary,’ said Mr Onishi.
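The behaviour Mr Onishi describes amounts to a simple decision rule: monitor the driver, then escalate from an alert to pulling over and summoning help. As a rough illustration only – the function name, vital-sign field and thresholds below are invented, not Mitsubishi's implementation – it might be sketched like this:

```python
# Illustrative sketch of the monitoring logic described in the article.
# The thresholds and names are assumptions for demonstration purposes.
def respond(attentive: bool, heart_rate_bpm: int) -> str:
    """Choose a response based on driver attention and a vital sign."""
    if not attentive:
        return "alert driver"
    if heart_rate_bpm < 40 or heart_rate_bpm > 160:
        # A sudden deterioration: pull over and call for help.
        return "pull over and call doctor"
    return "continue"

print(respond(True, 180))   # an abnormally high heart rate triggers help
```

A real system would of course fuse many sensors over time rather than apply a single threshold, but the escalation structure is the same.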
The company also demonstrated how the EMIRAI S could connect to businesses at a destination — such as an airport — to allow passengers to order food and other services to be either ready for their arrival or even delivered to them.
The vehicle will include voice recognition technology that is capable of processing requests and distinguishing between different voices.
Mr Onishi expects the firm’s driverless vehicles to be hitting the roads ‘by about 2030’ — and to travel alongside traditional vehicles.
Mitsubishi is currently developing new systems to control and guide autonomous vehicles during their journeys.
These include an approach which combines satellite data with high-definition 3D maps to offer positioning accuracy of around 25 centimetres.
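One common way such a combination works — and this is a general illustration, not a description of Mitsubishi's system — is to snap a coarse satellite fix onto the nearest point of a mapped lane centreline. The lane points and fix below are invented:

```python
# Hedged sketch: correct a coarse satellite position estimate by snapping
# it to the nearest point on a high-definition map's lane centreline.
def snap_to_map(fix, centreline):
    """Return the centreline point closest to the satellite fix (metres)."""
    return min(centreline, key=lambda p: (p[0] - fix[0])**2 + (p[1] - fix[1])**2)

# A straight lane along x = 0, sampled every 10 cm (invented map data).
lane = [(0.0, y / 10) for y in range(0, 100)]

# A satellite fix roughly 80 cm off the lane snaps back onto it.
print(snap_to_map((0.8, 4.23), lane))
```

Production systems use far richer matching against 3D map features, but the principle — letting the map constrain the satellite estimate — is what yields centimetre-level accuracy.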
Mr Onishi said that managing the mix of human-driven and autonomous vehicles on roads is a challenge that still needs to be addressed.
‘That’s a difficult issue because we will still have people driving cars and they make mistakes,’ he said.
‘The question is who needs to look after who? Who comes first?’
‘That is still something that the manufacturers can’t decide, it is basically a societal issue where everyone needs to work together.’
HOW DO SELF-DRIVING CARS ‘SEE’?
Self-driving cars often use a combination of normal two-dimensional cameras and depth-sensing ‘LiDAR’ units to recognise the world around them.
However, others make use of visible light cameras that capture imagery of the roads and streets.
They are trained with a wealth of information and vast databases of hundreds of thousands of clips which are processed using artificial intelligence to accurately identify people, signs and hazards.
In LiDAR (light detection and ranging) scanning – which is used by Waymo – one or more lasers send out short pulses, which bounce back when they hit an obstacle.
These sensors constantly scan the surrounding areas looking for information, acting as the ‘eyes’ of the car.
While the units supply depth information, their low resolution makes it hard to detect small, faraway objects without help from a normal camera linked to them in real time.
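The ranging principle described above — a pulse goes out, bounces off an obstacle and returns — can be sketched in a few lines of Python. The timing value below is an invented example, not from any particular sensor:

```python
# Minimal sketch of LiDAR time-of-flight ranging.
C = 299_792_458.0  # speed of light in metres per second

def distance_from_pulse(round_trip_seconds: float) -> float:
    """Distance to an obstacle from a laser pulse's round-trip time.

    The pulse travels out and back, so the one-way distance is half
    the round-trip time multiplied by the speed of light.
    """
    return C * round_trip_seconds / 2.0

# A pulse returning after about 200 nanoseconds implies an obstacle
# roughly 30 metres away.
print(round(distance_from_pulse(200e-9), 1))
```

Because light covers about 30 cm per nanosecond, timing must be measured extremely precisely — which is one reason LiDAR units are expensive.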
In November last year Apple revealed details of its driverless car system that uses lasers to detect pedestrians and cyclists from a distance.
The Apple researchers said they were able to get ‘highly encouraging results’ in spotting pedestrians and cyclists with just LiDAR data.
They also wrote they were able to beat other approaches for detecting three-dimensional objects that use only LiDAR.
Other self-driving cars generally rely on a combination of cameras, sensors and lasers. Volvo’s self-driving cars, for example, use around 28 cameras, sensors and lasers.
A network of computers process information, which together with GPS, generates a real-time map of moving and stationary objects in the environment.
Twelve ultrasonic sensors around the car are used to identify objects close to the vehicle and support autonomous drive at low speeds.
A wave radar and camera placed on the windscreen reads traffic signs and the road’s curvature and can detect objects on the road such as other road users.
Four radars behind the front and rear bumpers also locate objects.
Two long-range radars on the bumper are used to detect fast-moving vehicles approaching from far behind, which is useful on motorways.
Four cameras – two on the wing mirrors, one on the grille and one on the rear bumper – monitor objects in close proximity to the vehicle and lane markings.