A Deep Reinforcement Learning-Based Resource Management Game in Vehicular Edge Computing

Vehicular Edge Computing (VEC) is a promising paradigm in which vehicles offload computation tasks to nearby VEC servers to support low-latency vehicular applications. Incentivizing VEC servers to participate in computation offloading and to make full use of their computation resources is of great importance to the success of intelligent transportation services. In this paper, the authors formulate the competitive interactions between the VEC servers and the vehicles as a two-stage Stackelberg game, with the VEC servers as the leaders and the vehicles as the followers. With full information about the vehicles, the VEC server determines the unit price of its computation resources; given the announced unit prices, each vehicle decides how much computation resource to purchase from the VEC server. For the scenario in which vehicles are unwilling to share their computation demands, a deep reinforcement learning-based resource management scheme is proposed to maximize the profits of both the vehicles and the VEC server. Extensive experimental results demonstrate the effectiveness of the proposed resource management scheme based on the Stackelberg game and deep reinforcement learning.
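
To make the full-information leader-follower stage concrete, the following is a minimal sketch of a Stackelberg pricing game solved by backward induction. It is not the authors' model: the logarithmic vehicle utility, the per-unit serving cost, the demand parameters a, and the grid search over prices are all illustrative assumptions; the deep reinforcement learning scheme for the private-demand case is not shown.

```python
import numpy as np

# Hypothetical vehicle demand-intensity parameters (illustrative, not from the paper).
a = np.array([2.0, 3.5, 5.0, 1.5])   # one entry per vehicle

def vehicle_best_response(p, a):
    """Follower stage: each vehicle i maximizes a_i*log(1 + x_i) - p*x_i,
    whose closed-form maximizer is x_i = max(a_i/p - 1, 0)."""
    return np.maximum(a / p - 1.0, 0.0)

def server_profit(p, a, cost=0.2):
    """Leader stage: VEC server revenue minus an assumed per-unit serving cost."""
    x = vehicle_best_response(p, a)
    return (p - cost) * x.sum()

# Backward induction: the leader anticipates the followers' best responses
# and picks the unit price maximizing its profit (grid search for simplicity).
prices = np.linspace(0.25, 5.0, 200)
profits = np.array([server_profit(p, a) for p in prices])
p_star = prices[profits.argmax()]

print(f"leader price p* = {p_star:.2f}")
print(f"follower purchases = {vehicle_best_response(p_star, a).round(2)}")
print(f"server profit = {profits.max():.2f}")
```

In this toy setup the followers' purchase decisions have a closed form, so the leader can optimize its price directly; the paper's DRL-based scheme addresses the harder case where vehicles do not reveal their demands and such a closed-form best response is unavailable.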

Language

  • English

Filing Info

  • Accession Number: 01847806
  • Record Type: Publication
  • Files: TRIS
  • Created Date: Jun 1 2022 9:21AM