Autonomous driving is a key talking point within the transport and automotive industries, and it is not just the consumer market and passenger vehicles that have been exploring the world of autonomy. A new project in Britain has been looking into the autonomous operation of logistics vehicles.
The 5G Connected and Automated Logistics (5G CAL) project has seen autonomy company StreetDrone fit autonomous and teleoperation technology into a Terberg EV truck.
The truck was built for a consortium comprising the North East Automotive Alliance and others, and partners have spent over 23 months of trials understanding this new technology in an operational environment for the very first time in the UK.
From the progress made so far by the team, the 5G CAL project is proof that a truck can be automated to drive set routes, backed up by teleoperation systems connected to a private 5G network that enable remote operation where necessary.
We spoke to Sander van Dijk, head of autonomous systems at StreetDrone, to find out more about the technology and the challenges the team overcame along the way.
Just Auto (JA): Could you tell me about your involvement with the project and your role within the company?
Sander van Dijk (SVD): I joined StreetDrone last year; before that, I spent a lot of time in robotics, AI, machine learning and data science. Then the opportunity came up to help StreetDrone build its self-driving software and autonomous capabilities, so last year I decided to join, and at that time the 5G CAL project was already underway.
My specific role was to lead the self-driving software team, which is responsible for all the higher-level self-driving work that sits on top of the embedded software system. The embedded software system interfaces with the drive-by-wire system and provides functional safety. The self-driving software communicates with the embedded software, and is the piece that uses the LiDAR as part of its sensor suite.
When I came in, the project was at the stage where we were using small test vehicles, like a seven-seater van, to test the software stack. One of the first things I got to do when joining was take it to Sunderland and try to drive it around there autonomously.
At that point I thought, okay, this is quite a major thing; it was already pretty difficult to have the vehicle drive reliably around the route in the cramped spaces that were sometimes found there. I remember thinking, oh, we have to do this whole thing with a truck – that’s going to be quite a challenge!
Could you explain how the technology installed within the Terberg EV truck works and what this involves?
We started out with an electric tug, or yard tractor, which was originally made for manual driving. The first steps were to install drive-by-wire systems: actuators and motors, but also hydraulic pumps to be able to drive the brakes automatically.
That is something StreetDrone already has quite a bit of experience in: retrofitting these kinds of electric vehicles with this kind of system. The new element was definitely the pneumatic braking system, which is very different from that on a regular vehicle, but once that was done we had an interface to drive the actuators.
Then we had to fit the sensors to be able to carry out the perception part of the project. Some of the most important sensors are cameras, so we fitted a whole array of cameras to get a really good surround view of the truck, for the autonomous system but also for the teleoperation system.
That was very important. Besides that, we put on LiDAR and laser scanners to be able to get a 3D sense of the environment, and beyond that some safety sensors, like radar at the front, to have another source of obstacle detection.
Finally, the last one, which was the most difficult and the most custom for this project, was some kind of sensor to be able to figure out the orientation of the trailer. When the truck drives forward, the trailer just follows, but when you are reversing you really need to know what the trailer is doing.
Part of the aim of the project was to come up with a solution that relies on as few adjustments to the trailer as possible, ideally none at all, so that this could work with any trailer that comes into a facility. All the intelligence lies in the truck, and we can focus everything there.
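One way to meet that no-modification constraint (a hedged sketch, not necessarily the approach StreetDrone used) is to estimate the trailer's orientation purely from the truck's own sensors: take the LiDAR returns that hit the trailer's flat front face, fit the face's dominant direction, and read the hitch angle off it. The function name and the assumption that the face points have already been segmented out are both illustrative:

```python
import numpy as np

def trailer_angle_from_lidar(points_xy):
    """Estimate trailer yaw (radians, relative to the tractor) from 2D
    LiDAR returns off the trailer's front face.

    points_xy: (N, 2) points in the tractor frame that an upstream
    segmentation step has attributed to the trailer face (assumed).
    """
    centred = points_xy - points_xy.mean(axis=0)
    # Principal axis of the point cloud ~ direction of the flat face.
    _, _, vt = np.linalg.svd(centred, full_matrices=False)
    face_dir = vt[0]
    # The trailer's heading is perpendicular to its front face.
    angle = np.arctan2(face_dir[1], face_dir[0]) - np.pi / 2
    # Wrap to (-pi/2, pi/2]: the fitted direction is sign-ambiguous.
    return (angle + np.pi / 2) % np.pi - np.pi / 2

# Synthetic check: a trailer face rotated 10 degrees, with sensor noise.
theta = np.deg2rad(10.0)
t = np.linspace(-1.2, 1.2, 50)                 # points along the face
face = np.stack([-t * np.sin(theta), t * np.cos(theta)], axis=1)
face += np.random.default_rng(0).normal(scale=0.005, size=face.shape)
est = trailer_angle_from_lidar(face)
print(np.rad2deg(est))  # close to 10
```

Because only the truck's own LiDAR is involved, the same estimator works on any trailer with a roughly planar front face, which is the point of the design goal above.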
How did you overcome the reversing challenge?
The original goal was to try reversing in different ways. There is a teleoperation system where, in certain very complex scenarios, the autonomy needs authorisation or a remote driver to take over, and then the system can do that.
So on one end, we had a teleoperator do the reversing, but even when using trained drivers, they had a lot of trouble reversing the truck, because it was very different from sitting in the vehicle. Even with all the sensors you have a limited view compared to a human being able to look around, poke their head out of the window, and all those kinds of things.
We had a separate small team looking into this, to figure out what a solution to this sensor issue would be. Then we had to figure out, okay, if we have the sensors sorted, how do you actually manoeuvre it in such a way that it goes backwards, because it is quite an unstable and difficult-to-control system.
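That instability has a clean kinematic explanation. In a simple tractor-trailer model (on-axle hitch assumed; the dimensions below are illustrative, not the real vehicle's), the hitch angle e evolves as e' = (v/L1)·tan(delta) − (v/L2)·sin(e). Driving forward, the sin term damps the hitch angle toward zero; in reverse the sign of v flips, so the same term amplifies it, which is the jackknife tendency a controller has to fight. A minimal simulation makes the asymmetry visible:

```python
import math

def simulate_hitch_angle(v, e0_deg, steer_deg=0.0, L1=3.8, L2=6.0,
                         dt=0.02, t_end=8.0):
    """Integrate the hitch angle e = tractor heading - trailer heading
    for a kinematic tractor-trailer with an on-axle hitch (assumed).

    v      : speed in m/s (negative = reversing)
    L1, L2 : tractor wheelbase and hitch-to-trailer-axle distance
             (illustrative values, not the project vehicle's)
    """
    e = math.radians(e0_deg)
    delta = math.radians(steer_deg)
    for _ in range(int(t_end / dt)):
        # e' = (v / L1) * tan(delta) - (v / L2) * sin(e)
        e += dt * ((v / L1) * math.tan(delta) - (v / L2) * math.sin(e))
    return math.degrees(e)

# Same small 5-degree initial hitch angle, steering held straight:
fwd = simulate_hitch_angle(v=+2.0, e0_deg=5.0)   # decays toward 0
rev = simulate_hitch_angle(v=-2.0, e0_deg=5.0)   # grows toward jackknife
print(fwd, rev)
```

The open-loop divergence in reverse is why even trained teleoperators struggled, and why the autonomy needs both a trailer-angle estimate and active steering feedback to keep e bounded.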
Were there any other challenges, alongside the issue of reversing, that you and the team faced?
Besides reversing, another challenge came when we looked at how the large vehicle interacts with the teleoperation system. How do we get a truck to stop at a barrier?
We also had to work on object detection. Things like sensors and cameras are all in different places, mounted quite a bit higher than on normal vehicles due to the height of the lorry. We have a method that uses the LiDAR quite a lot to figure out where the truck is in 3D space, but the big trailer at the back blocks the view of one of the LiDARs, and that made it a challenge.
We had to work around that by using the other LiDARs, or by increasing the accuracy and reliability of our localisation methods. All these things came out of working with a completely different platform from a regular passenger vehicle.
What would you say are the key benefits of this technology for the industry?
I think efficiency is the main word to use. The idea is not so much about replacing people or putting them out of jobs; it is about making people more efficient. This solution, combined with teleoperation supervision, means you can run more trucks with the same number of people.
There are also safety aspects. It is well known that in regular driving people make a lot of mistakes, and innate human error results in accidents. In professional driving like this, people get tired, overconfident and things like that. So, by automating the truck and taking humans out of the cab and out of the area, you reduce the impact of any accidents.
You also have more opportunities to make things safer, because you can have the vehicle hook up to other vehicles around it and have them talk to and coordinate with each other.
Part of the project was also to install cameras and LiDARs on the infrastructure along the road, so that the autonomous vehicle has much more information to make things safer, more efficient and also more energy-efficient. An autonomous vehicle can drive in a much more optimal, energy-saving acceleration pattern than any human can.
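The energy point can be illustrated with a toy model (my own numbers, not project data): over the same distance in the same time, a stop-and-go speed trace loses far more energy than a steady one, both to aerodynamic drag, whose power grows with the cube of speed, and to kinetic energy discarded in braking, assuming no regeneration:

```python
def losses_per_kg(speeds, dt=1.0, c_drag=4e-4):
    """Energy lost per kg of vehicle over a speed trace (m/s sampled
    every dt seconds): aero drag plus kinetic energy thrown away
    whenever the vehicle slows down (no regenerative braking assumed)."""
    drag = sum(c_drag * v**3 * dt for v in speeds)
    brake = sum(max(0.0, 0.5 * (v0**2 - v1**2))
                for v0, v1 in zip(speeds, speeds[1:]))
    return drag + brake

steady = [10.0] * 100                      # 10 m/s for 100 s = 1,000 m
stop_go = ([20.0] * 25 + [0.0] * 25) * 2  # same 1,000 m in 20 m/s bursts
print(losses_per_kg(steady) < losses_per_kg(stop_go))  # True
```

An automated speed planner can hold the steady profile indefinitely, which a human driver in stop-start yard traffic rarely manages.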
What is the current stage of the technology?
At the moment, the project is very much a proof of concept, and none of these efficiency gains are really there yet. Instead of one driver, we need two people to run it: a person in the teleoperation station, but also a safety driver in the vehicle at all times.
In terms of the sensors and the cost of installing them into the vehicle itself, it is much more expensive than the regular diesel-powered trucks that fleets have running around now. It is still an early stage, but we have shown that the technology is possible; we now need to resolve the major issues that remain.
What are the next steps for the project?
We will start looking at getting more trucks onto a route that is normally driven by three or maybe four trucks, so that the whole route can be automated rather than just one part of it.
We are also looking outside of the fixed facility, to prove the technology in more complex scenarios as well. Maybe there is other infrastructure, traffic lights that we have to pay attention to, or roundabouts and the rest of the road network. We will also try different kinds of trailers, to really expand on this and make sure it is viable.
The next big step is to get the driver completely out of the vehicle and achieve a level four autonomous system solution.
When talking to people about this project, what we often see is a big divide. There are people who think full autonomy is just around the corner, that there are going to be self-driving cars on the streets everywhere, and soon. Then there are people on the other side who say, well, I don’t see this ever happening; I don’t want a self-driving car, and I always want to drive myself.
I think the project we are working on really shows that there is a middle way and a transition as well: there are key areas where autonomy works, is beneficial and is needed. That said, we have to be careful of all the big hype out there.