Machine learning approach for self-learning eco-speed control
Traffic signals cause significant fuel consumption by periodically disrupting traffic flow. This paper proposes a Q-learning based vehicle speed control algorithm to minimise fuel consumption in the vicinity of an isolated signalised intersection. Q-learning is a self-learning algorithm that learns the optimal control action(s) through trial and error. The speed control algorithm is trained in the Aimsun microsimulation platform under varying traffic signal and arrival speed conditions. Training and validation are conducted under a single-vehicle scenario in which only one controlled vehicle is present on the intersection approach. A comprehensive parametric analysis fine-tunes the Q-learning parameters and examines the impact of parameter settings on the algorithm's performance and convergence. Using the chosen parameter setting, the algorithm's performance is demonstrated against a vehicle velocity profile for a baseline scenario in which speed control is disabled. The simulation results indicate that the algorithm can reduce the vehicle's fuel consumption by 15.78% when the suggested driving speeds are adopted.
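The abstract describes tabular Q-learning with epsilon-greedy exploration. A minimal sketch of that mechanism is shown below; the paper's actual state/action/reward design and the Aimsun coupling are not reproduced here, so the signal phases, candidate speeds, and fuel-penalty reward are illustrative assumptions only.

```python
import random

# Illustrative tabular Q-learning for eco-speed advice (assumed setup, not
# the paper's implementation): state = signal phase, action = target speed.

ACTIONS = [20, 30, 40, 50, 60]          # candidate target speeds (km/h), assumed
ALPHA, GAMMA, EPSILON = 0.1, 0.9, 0.2   # learning rate, discount, exploration

def fuel_penalty(speed, phase):
    """Toy reward: penalise approaching a red signal fast (stop-and-go fuel)
    and penalise crawling through a green (inefficient cruising)."""
    if phase == "red":
        return -abs(speed - 20) / 10.0   # prefer slowing toward the red
    return -abs(speed - 50) / 10.0       # prefer cruising through the green

def train(episodes=5000, seed=0):
    rng = random.Random(seed)
    q = {}  # Q-table: (phase, action) -> estimated value
    for _ in range(episodes):
        phase = rng.choice(["red", "green"])
        # epsilon-greedy action selection: explore with probability EPSILON
        if rng.random() < EPSILON:
            a = rng.choice(ACTIONS)
        else:
            a = max(ACTIONS, key=lambda s: q.get((phase, s), 0.0))
        r = fuel_penalty(a, phase)
        # one-step update; no successor state in this simplified episodic toy
        old = q.get((phase, a), 0.0)
        q[(phase, a)] = old + ALPHA * (r - old)
    return q

q = train()
best_red = max(ACTIONS, key=lambda s: q.get(("red", s), 0.0))
best_green = max(ACTIONS, key=lambda s: q.get(("green", s), 0.0))
```

After training, the greedy policy recommends the lowest penalty speed per phase; the paper instead uses a richer state (e.g. distance to the stop line and arrival speed) learned inside the microsimulation.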
- Authors: Gamage, H D; Lee, J
- Publication Date: 2016-11
- Pagination: 14p
- Monograph Title: 38th Australasian Transport Research Forum (ATRF 2016), Melbourne, 16th - 18th November 2016
- TRT Terms: Behavior; Driver improvement programs; Driver training; Drivers; Ecodriving; Education; Environment; Fuel consumption; Mathematical models; Travel behavior
- ATRI Terms: Driver improvement; Driver training; Ecodriving; Environment; Fuel consumption; Modelling; Travel behaviour
- Subject Areas: Environment (I15)
- Accession Number: 01627407
- Record Type: Publication
- Source Agency: ARRB Group Ltd.
- Files: ITRD, ATRI
- Created Date: Feb 27 2017 10:06AM