Conference Paper: DeepFreight: A Model-free Deep-reinforcement-learning-based Algorithm for Multi-transfer Freight Delivery
| Title | DeepFreight: A Model-free Deep-reinforcement-learning-based Algorithm for Multi-transfer Freight Delivery |
|---|---|
| Authors | Chen, Jiayu; Umrawal, Abhishek K.; Lan, Tian; Aggarwal, Vaneet |
| Issue Date | 2021 |
| Citation | Proceedings of the International Conference on Automated Planning and Scheduling (ICAPS), 2021, v. 2021-August, p. 510-518 |
| Abstract | With the freight delivery demands and shipping costs increasing rapidly, intelligent control of fleets to enable efficient and cost-conscious solutions becomes an important problem. In this paper, we propose DeepFreight, a model-free deep-reinforcement-learning-based algorithm for multi-transfer freight delivery, which includes two closely-collaborative components: truck-dispatch and package-matching. Specifically, a deep multi-agent reinforcement learning framework called QMIX is leveraged to learn a dispatch policy, with which we can obtain the multi-step joint dispatch decisions for the fleet with respect to the delivery requests. Then an efficient multi-transfer matching algorithm is executed to assign the delivery requests to the trucks. Also, DeepFreight is integrated with a Mixed-Integer Linear Programming optimizer for further optimization. The evaluation results show that the proposed system is highly scalable and ensures a 100% delivery success while maintaining low delivery time and fuel consumption. |
| Persistent Identifier | http://hdl.handle.net/10722/361603 |
| ISSN | 2334-0835 |
| 2020 SCImago Journal Rankings | 0.470 |
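
The abstract above outlines a two-stage pipeline: a QMIX-trained joint dispatch policy proposes truck movements, a multi-transfer matching step assigns delivery requests to those movements, and a Mixed-Integer Linear Programming pass further refines the result. The Python sketch below is a minimal, hypothetical illustration of that control flow only; the class names, the greedy single-hop matcher, and the placeholder policy are assumptions made for exposition and do not reproduce the authors' implementation.

```python
# Hypothetical sketch of the DeepFreight pipeline described in the abstract.
# All names and data structures are illustrative assumptions, not the authors' code.
from dataclasses import dataclass
from typing import Callable


@dataclass
class Request:
    origin: int        # pickup hub index
    destination: int   # delivery hub index


@dataclass
class Truck:
    node: int          # current hub index
    capacity: int      # remaining package slots


def dispatch(trucks, requests, policy: Callable):
    """Stage 1: a learned joint dispatch policy (QMIX in the paper) proposes
    the next hub for every truck, conditioned on the outstanding requests."""
    return [policy(t, requests) for t in trucks]


def match(trucks, moves, requests):
    """Stage 2: greedily assign each request to a truck whose planned move
    covers it directly; the paper's matcher also allows multi-transfer relays."""
    assignment = {}
    for i, req in enumerate(requests):
        for j, (truck, nxt) in enumerate(zip(trucks, moves)):
            if truck.capacity > 0 and truck.node == req.origin and nxt == req.destination:
                assignment[i] = j
                truck.capacity -= 1
                break
    return assignment


# Stage 3 (not shown): the paper additionally refines the joint solution with a
# Mixed-Integer Linear Programming optimizer.

if __name__ == "__main__":
    trucks = [Truck(node=0, capacity=2), Truck(node=1, capacity=1)]
    requests = [Request(origin=0, destination=2), Request(origin=1, destination=3)]
    # Placeholder "policy": send each truck toward the first request it can pick up.
    naive_policy = lambda t, reqs: next((r.destination for r in reqs if r.origin == t.node), t.node)
    moves = dispatch(trucks, requests, naive_policy)
    print(match(trucks, moves, requests))  # e.g. {0: 0, 1: 1}
```

In the paper the dispatch policy is learned with multi-agent reinforcement learning and the matching step supports relays across several trucks; the placeholder policy and single-hop matcher above collapse both so the example stays self-contained.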
| DC Field | Value | Language |
|---|---|---|
| dc.contributor.author | Chen, Jiayu | - |
| dc.contributor.author | Umrawal, Abhishek K. | - |
| dc.contributor.author | Lan, Tian | - |
| dc.contributor.author | Aggarwal, Vaneet | - |
| dc.date.accessioned | 2025-09-16T04:18:06Z | - |
| dc.date.available | 2025-09-16T04:18:06Z | - |
| dc.date.issued | 2021 | - |
| dc.identifier.citation | Proceedings of the International Conference on Automated Planning and Scheduling (ICAPS), 2021, v. 2021-August, p. 510-518 | - |
| dc.identifier.issn | 2334-0835 | - |
| dc.identifier.uri | http://hdl.handle.net/10722/361603 | - |
| dc.description.abstract | With the freight delivery demands and shipping costs increasing rapidly, intelligent control of fleets to enable efficient and cost-conscious solutions becomes an important problem. In this paper, we propose DeepFreight, a model-free deep-reinforcement-learning-based algorithm for multi-transfer freight delivery, which includes two closely-collaborative components: truck-dispatch and package-matching. Specifically, a deep multi-agent reinforcement learning framework called QMIX is leveraged to learn a dispatch policy, with which we can obtain the multi-step joint dispatch decisions for the fleet with respect to the delivery requests. Then an efficient multi-transfer matching algorithm is executed to assign the delivery requests to the trucks. Also, DeepFreight is integrated with a Mixed-Integer Linear Programming optimizer for further optimization. The evaluation results show that the proposed system is highly scalable and ensures a 100% delivery success while maintaining low delivery time and fuel consumption. | - |
| dc.language | eng | - |
| dc.relation.ispartof | Proceedings of the International Conference on Automated Planning and Scheduling (ICAPS) | - |
| dc.title | DeepFreight: A Model-free Deep-reinforcement-learning-based Algorithm for Multi-transfer Freight Delivery | - |
| dc.type | Conference_Paper | - |
| dc.description.nature | link_to_subscribed_fulltext | - |
| dc.identifier.doi | 10.1609/icaps.v31i1.15998 | - |
| dc.identifier.scopus | eid_2-s2.0-85109417641 | - |
| dc.identifier.volume | 2021-August | - |
| dc.identifier.spage | 510 | - |
| dc.identifier.epage | 518 | - |
| dc.identifier.eissn | 2334-0843 | - |
