Reference
D. Sun, A. Jamshidnejad, and B. De Schutter, "A novel framework combining MPC
and deep reinforcement learning with application to freeway traffic control,"
IEEE Transactions on Intelligent Transportation
Systems, vol. 25, no. 7, pp. 6756-6769, 2024.
Abstract
Model predictive control (MPC) and deep reinforcement learning (DRL) have been
developed extensively as two independent techniques for traffic management.
Although the features of MPC and DRL complement each other well, few existing
studies consider combining the two methods for freeway traffic control. This
paper proposes a novel framework that integrates MPC and DRL for freeway
traffic control and differs from existing MPC-(D)RL methods. Specifically, the
framework adopts a hierarchical structure: an efficient high-level MPC
component operates at a low frequency to provide a baseline control input,
while a DRL component operates at a high frequency to modify the MPC output
online. The control framework therefore requires only limited online
computational resources and, after proper training with sufficient data, can
handle uncertainties and external disturbances. The proposed framework is
implemented on a benchmark freeway network to coordinate ramp metering and
variable speed limits, and its performance is compared with standard MPC and
DRL approaches. Simulation results show that the proposed framework
outperforms standalone MPC and DRL in terms of total time spent (TTS) and
constraint satisfaction, despite model uncertainties and external disturbances.
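The hierarchical structure described in the abstract can be sketched as a simple two-rate control loop. This is a minimal illustration under stated assumptions, not the paper's implementation: the MPC solver, the DRL policy, and the numbers (a baseline speed limit and a congestion-triggered correction) are hypothetical stubs invented for illustration.

```python
# Hedged sketch of the two-level loop: a low-frequency MPC baseline plus a
# high-frequency DRL correction. All class and function names are hypothetical
# placeholders, not the paper's API.

class StubMPC:
    """Placeholder for the expensive MPC optimization (baseline input)."""
    def solve(self, density):
        return 60.0  # e.g., a baseline variable speed limit in km/h

class StubPolicy:
    """Placeholder for the trained DRL policy (online modification)."""
    def act(self, density, baseline):
        # Illustrative rule: lower the limit when density suggests congestion.
        return -5.0 if density > 30.0 else 0.0

def hierarchical_control(mpc, policy, densities, mpc_period=6):
    """Solve MPC every `mpc_period` steps; apply a DRL correction every step."""
    baseline = None
    controls = []
    for k, rho in enumerate(densities):
        if k % mpc_period == 0:
            baseline = mpc.solve(rho)          # low-frequency baseline update
        u = baseline + policy.act(rho, baseline)  # high-frequency modification
        controls.append(u)
    return controls

controls = hierarchical_control(StubMPC(), StubPolicy(), [20.0, 35.0, 40.0, 25.0])
print(controls)  # [60.0, 55.0, 55.0, 60.0]
```

The split keeps the expensive optimization off the fast control loop: only the cheap learned correction runs at every sampling step, which is the source of the limited-online-computation claim in the abstract.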
BibTeX
@article{SunJam:24-006,
author = {Sun, Dingshan and Jamshidnejad, Anahita and De Schutter, Bart},
title = {A Novel Framework Combining {MPC} and Deep Reinforcement Learning
With Application to Freeway Traffic Control},
journal = {IEEE Transactions on Intelligent Transportation Systems},
volume = {25},
number = {7},
pages = {6756--6769},
year = {2024}
}