Autonomous driving technology is advancing around the world, and it is expected to help solve current social issues by reducing accident-related deaths, easing driver shortages, and providing new transit methods. Japan has launched a government-led project, and in 2021 a Japanese manufacturer released a vehicle equipped with Level 3 capabilities that can conditionally handle all driving operations. Reporter Lemi Duncan experiences the functions of a Level 3-equipped vehicle, automated water taxis solving island transit problems, and futuristic vehicles achieving human-like communication.
We sit at the wheel of the car of the future.
It's equipped with state-of-the-art driving technology.
It brings a new experience to the freeway.
Here we go. Hands-free mode has been activated.
It's really hands free! Amazing.
This is autonomous driving.
The car turns the wheel, accelerates, and brakes without input from the driver.
And more...
Pressing that button activates the system to
judge and execute lane changes.
Wow! Even lane changes are automated!
It's changing.
It changed lanes. That's really something.
Currently, development of autonomous driving technology is progressing globally.
In 2021, a vehicle with Level 3 autonomous driving, which takes conditional control of all driving systems, was released in Japan.
In April of 2023, a law was passed allowing use of Level 4, which requires no human input, in selected regions and environments.
Several reasons exist for the worldwide development of autonomous driving.
With the lack of drivers in the shipping industry, as well as mobility needs for the elderly, the technology is poised to address issues on a global scale.
Another important reason is the more than 1.3 million lives lost yearly in automobile accidents.
Autonomous driving is looked upon as a way to reduce those numbers, building a safer society.
The technology for autonomous driving also
works towards reducing accidents.
We need to make that our priority.
With the major innovations that autonomous driving could offer society, just how far has its technology advanced?
Hi, I'm Lemi Duncan.
On today's episode, we'll be looking at the cutting edge of autonomous driving technology currently in development in Japan.
We're not far from a world of driverless cars.
Let's check it out!
The quest to realize autonomous driving is being led by the Japanese government.
In 2014, the Cabinet Office led organizations such as the Ministry of Land, Infrastructure, Transport and Tourism, the Ministry of Economy, Trade and Industry,
automobile and equipment manufacturers, business ventures, and universities, commencing the national project "SIP-adus."
At the helm of the project is Kuzumaki Seigo.
Kuzumaki describes the meaning behind the united push for autonomous driving.
Automobile development involves very high costs.
Competitive areas are developed independently
by each company.
The cooperative areas are determined by
the companies, and done by SIP.
Automobile development is known to be fiercely competitive.
But in some areas, it's more efficient for manufacturers to work together.
One such area is the detailed map data used in navigation for autonomous driving.
SIP-adus leads in its construction.
I'm involved with a concept called dynamic maps.
It's like a highly accurate map on which various real-time
information and planned traffic regulations are updated.
Dynamic maps are digital maps that use a base of static information such as roads, lanes, and buildings, and update it with changing information on things like traffic, construction, accidents, pedestrians, and traffic signals.
The system accurately judges the position of the vehicle, and uses predictive driving based on the traffic conditions around it.
The government manages the infrastructure for this data, while manufacturers develop the vehicles that utilize it.
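The layered structure of a dynamic map, a static base overlaid with constantly refreshed information, can be sketched in a few lines. The class and field names here are hypothetical illustrations, not part of any actual dynamic-map specification.

```python
from dataclasses import dataclass, field

# Static layer: features that rarely change (roads, lanes, buildings).
STATIC_LAYER = {
    "lane_42": {"type": "lane", "geometry": [(0.0, 0.0), (0.0, 100.0)]},
    "building_7": {"type": "building", "geometry": [(5.0, 10.0)]},
}

@dataclass
class DynamicMap:
    """A static base map overlaid with frequently updated information."""
    static: dict
    dynamic: dict = field(default_factory=dict)  # traffic, roadwork, etc.

    def update(self, key, info):
        # Real-time feeds (accidents, signals, pedestrians) land here.
        self.dynamic[key] = info

    def view(self):
        # The vehicle sees the static base merged with current conditions.
        merged = dict(self.static)
        merged.update(self.dynamic)
        return merged

dmap = DynamicMap(static=STATIC_LAYER)
dmap.update("lane_42_traffic", {"type": "congestion", "lane": "lane_42"})
print(len(dmap.view()))  # static features plus one dynamic entry
```

The point of the split is exactly what the narration describes: the slow-changing base can be built cooperatively once, while each vehicle only needs to ingest the fast-changing overlay.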
However, autonomous driving comes with its own risks.
In 2021, a vehicle collided with a pedestrian at a Tokyo intersection, raising doubts as to the safety of the technology.
Level 3 autonomous driving had been unveiled in March of that year.
Autonomous driving is classified under a worldwide standard into Levels 1 to 5.
With Level 3, the vehicle can assume control in certain situations, such as heavy traffic, and operate all driving functions, including the steering wheel, accelerator, and brakes.
The hands-free driving and lane changing that I showed you earlier were possible with just Level 2.
Level 2 operates the gas, brakes, and steering, but only as assistance to the driver.
The driver holds responsibility for the vehicle and needs to pay attention to its surroundings at all times.
So what's new with Level 3, then?
Oh, we’re slowing down.
OK. We are stuck in traffic.
This is Level 3. It activates autonomous driving
in situations like heavy traffic.
Even if you release the wheel or look away,
the system handles the driving for you.
When Level 3 activates, it switches from driver operation to the autonomous system.
The driver can shift their vision away from the road, and even view the built-in television.
If an accident occurs during Level 3 driving, the responsibility is generally considered to be the system's.
I'm not moving my arms or legs,
and yet it automatically accelerates and brakes.
It's very convenient.
The system should allow the driver stress-free transit.
We consider that to be its greatest value.
What went into the development of Level 3-enabled vehicles?
We spoke with Shikama Mahito, head engineer of the manufacturer's autonomous driving technology development.
What was the most important technology in realizing Level 3?
Although it's very difficult to make one single choice,
sensing technology is of course important.
Different sensors each have their pros and cons.
The technology seeks to make the best use of them.
To create autonomous driving technology that would be certain to prevent accidents, sensing technology was a key element.
Level 3-enabled vehicles carry a total of 10 sensors and 2 cameras.
As to why so many sensors are necessary,
they replace the driver's field of vision.
When people face forward, their field of vision is limited.
Sensors can detect danger more quickly than drivers
across 360 degrees, allowing the vehicle to take the safest action.
That brings us to the 2 cameras equipped on this vehicle.
They interpret the images they film to judge the surroundings.
They're capable of distinguishing other cars, pedestrians, traffic signals, and signs.
However, their sensing capabilities are negatively affected by factors like darkness, fog, and glare.
Those weaknesses are compensated for by radar and LiDAR sensors.
Radar reflects millimeter radio waves off of surrounding objects, using the time it takes them to return to determine the objects' distance and speed.
Although its sensing accuracy isn't reduced at night or in bad weather, it has difficulty precisely judging small objects,
or materials like cardboard with non-reflective surfaces.
Then there's the infrared sensor known as LiDAR, or "light detection and ranging."
It reflects lasers off of objects to judge things like distance, direction, position, and shape.
Since infrared light has a shorter wavelength than millimeter waves, LiDAR is well-suited for discerning small objects, as well as shapes.
But its accuracy is affected by weather conditions such as fog, rain and snow.
Both kinds of sensors have their strengths and weaknesses.
By combining their information and working together,
they yield results allowing more accurate movement.
By combining radar, LiDAR, and camera image recognition, the sensors are able to accurately scan the surroundings and judge them in place of a human's field of vision.
But being aware of the vicinity isn't enough to make an accident-proof car.
This device was prepared to accomplish the task.
This is our driving simulator.
It looks very dynamic.
An actual car is encircled by a screen on all sides.
This simulator develops something that could be compared to the brain.
It researches the actions the vehicle should take in various situations.
First, what kind of dangers are posed on an actual road?
To gather that data, driving tests covering around 1.3 million kilometers were conducted on highways across Japan.
From that data, situations of potential danger such as being cut off by another vehicle or blocked by falling objects were extracted.
Then those situations were recreated within the simulator.
A number of civilian drivers also drove the simulator, which recorded their responses to the hazardous situations and compiled the data.
I took a drive in the simulator and tested it out for myself.
It's an odd sensation.
Who's going to cut me off?
You avoided the accident.
I was getting pretty nervous.
The simulator recorded how people responded to situations like these.
It then used the information to create lifelike responses within autonomous driving.
All of this work culminated in the world's first release of a Level 3-equipped car.
Automobiles aren't the only area where autonomous driving technology is advancing. Another is ships.
Autonomous ships have been developed to solve challenges currently faced by Japan.
We examine the science necessary to let them deal with marine obstacles such as waves and shifting tides.
The Seto Inland Sea is Japan's largest inland sea.
Its tranquil waters are dotted with over 700 islands.
Around 150 of them are inhabited, but many are not connected by bridges, making the residents dependent on liner ships with routes to mainland Japan.
Hiroshima's Ujina Port is a major shipping and distribution hub.
In January of 2023, an autonomously piloted ferry business was attempted here.
With no staff operating them, the ships make the voyage entirely automatically.
After simply selecting the destination on a touch panel, the ship decides the route, controlling the direction and speed on its own.
The law requires an operator to be present, but once he presses the tablet, he does nothing to steer the ship or adjust its speed.
From shore to shore, every inch of the voyage is automated.
What amazed me the most was how rather than just follow
a set route, the ship is aware of obstacles.
How it could stop or avoid them in response really surprised me.
The ship was developed by Kimura Yujin, who put his experience in AI and robots to use.
There are over 400 inhabited islands across Japan,
where many people reside.
Water travel is a very integral element
of their day-to-day lives.
Despite that importance, there are issues
to contend with like labor shortages.
Those factors had made it very difficult
to provide sustained services to the islands.
Recent years have seen advancing depopulation and a shortage of ship operators, causing routes to be rearranged or dropped entirely.
People living on the islands were in danger of losing their only means of transportation.
In response, Kimura stepped up to develop the autonomously piloted ship.
But developing for water revealed unique challenges not found in cars.
Kimura began his field trials in 2021.
With its calm waters, the Seto Inland Sea is an ideal testing site for the technology.
But its shifting tides sometimes impede the passage of ships.
Kimura began by creating a map with the areas where stoppages can occur marked in red.
Students from a local college studying ship navigation aided him in this process.
They used ultrasonic waves to detect the ocean depth, and marked areas where nets had been stationed on the map.
The result was a unique map just for autonomous piloting.
The vessel uses this map to decide the route between the current location and the destination, but actual autonomous piloting is more complicated.
Since factors like the weather, waves, and tides exert a major effect on it, the system needs to be advanced enough to deal with unexpected situations.
Like autonomous driving automobiles, ships use sensors with radar, LiDAR, and cameras, but Kimura was especially focused on the cameras' image recognition technology.
That image recognition proves vital in certain situations.
On this day, another ship approaches from straight ahead.
This is the screen of our navigation system.
The blue mark is our ship, and the nearing ship is marked with red.
To avoid any risk of collision, our ship changes course in response to it.
What process went into improving image recognition so greatly?
Here, research is being performed to distinguish other vessels filmed with the cameras.
Collaborating with Kimura on autonomous ship development is Yokoyama Tomoaki.
These ships are equipped with 4 cameras,
providing a 360 degree range of vision.
It searches out other vessels and marks them
with a blue frame, estimating the distance.
Then the course of the ship is calculated to avoid them.
The image recognition technology used to identify and mark other ships is made possible using AI.
Yokoyama gathered a large number of images picturing ships, and marked where they appear.
Then, he allowed AI to memorize the data.
As it became able to pick out their characteristics, the AI learned how to recognize ships within images.
However, it wasn't all smooth sailing.
The boats sway.
Unlike cars, the boats sway on the waters,
causing camera images to sway as well.
Strong swaying with the waves disrupts the AI, and could prevent it from avoiding a collision.
But their efforts achieved image sensing technology that could accurately judge other ships even among the shifting waters.
First it obtains information on the ship's position,
such as the direction of the swaying and which way it's facing.
It then uses that information to internally calculate
which way the camera is facing.
Given that, it adjusts its recognition of the surrounding ships
and their locations.
Their company seeks to improve image recognition technology to a level of safety where ships could operate completely unmanned.
What if we could make a system where one person
could safely and remotely control several ships?
I think we could contribute to
solving the problem of labor shortages.
Ships could be so accessible that instead of a rare experience,
they'd be integrated into daily life.
Social implementation of this technology could
provide people with new access to the sea.
Autonomous driving technology continues its steady advancement.
In the near future, cars will not only be able to drive, but also to think and communicate, like a member of the family.
The world of science fiction is knocking at our door!
What other advancements will next-generation autonomous driving bring?
Japanese manufacturers are hard at work answering that question.
Some manufacturers are developing vehicles with greatly boosted evasion capabilities in the event of emergencies.
Others focus their development on technology that responds to driver-side problems.
One manufacturer is creating an especially futuristic form of vehicle in their research lab.
This work in progress is known as the Micromobility.
It's designed to allow safe transit with very simple controls.
It accomplishes autonomous driving without radar or LiDAR sensors, making use of just cameras.
Analyzing the camera information, it understands the lanes and intersections as it drives.
Oh, nice!
And, maybe, I want to turn left here.
When arriving at an intersection, the passenger only needs to press the joystick in the desired direction.
The car makes the turn for them.
Oh, nice and smooth.
The speed is automatically adjusted based on the angle of the corner.
But the manufacturer has an even more futuristic vehicle to show us.
What kind of technology are you developing here?
Come to Beat's Burger.
Understood.
Wow! It comes right over to you.
This manufacturer is in the process of developing a next-generation vehicle.
It's capable of responding to human voice, automatically coming when called.
The car has sophisticated enough communication abilities to think and make suggestions to the user.
Come to Beat's Burger.
Understood.
It avoids the areas where other people are sitting.
I have arrived near Beat's Burger.
Are you looking at your smartphone?
That's me.
The vehicle converses with the user and recognizes that he's using his phone, asking whether it's him.
I'll stop in front of you.
Could you stop at that vending machine?
I'll stop near the blue vending machine.
Actually, could you stop by that car?
Due to danger in this area,
may I stop near the yellow cone?
That's fine.
Understood.
The vehicle judged parking near a car to be dangerous.
Instead, it suggested that it stop in a safer nearby area.
It views the surroundings, and considers appropriate communication.
I have arrived.
Thanks.
It shows a consideration that's human-like.
That's right. In human conversation,
one person doesn't always just follow the other's instructions.
They often figure out where to meet
while holding a conversation.
That's true.
We're recreating that in the vehicle.
The next-generation vehicle realizes highly lifelike communication.
It's all made possible by uniquely developed AI technology.
The AI behind the vehicle puts an emphasis on mutual communication.
The user and vehicle talk to decide the location together, or the vehicle can make suggestions.
The company holds a vision for society in which the next-generation vehicles become a facet of everyday life.
Hey, stop over here!
Hey Misa, there's a fence.
So stop a little further down.
All right!
Thanks for waiting.
Autonomous driving will greatly increase our quality of life.
But more importantly it will decrease accidents and lead to a safer future in transportation.
I was deeply touched by the dedication of the developers and their passion to create a future where we can feel safer in vehicles.
I'm really looking forward to that future.