Machine Learning with Decision Trees and Multi-Armed Bandits: An Interactive Vehicle Recommender System

Recommender systems guide a user to useful objects in a large space of possible options in a personalized way. In this paper, we study recommender systems for vehicles. Compared to previous research on recommender systems in other domains (e.g., movies or music), there are two major challenges associated with recommending vehicles. First, typical customers purchase far fewer cars than movies or pieces of music, so it is difficult to obtain rich information about a customer's vehicle purchase history. Second, content information about a customer (e.g., demographics and vehicle preferences) is also difficult to acquire during a relatively short stay at a dealership. To address these two challenges, we propose an interactive vehicle recommender system based on a novel machine learning method that integrates decision trees and multi-armed bandits. Decision tree learning effectively selects important questions to ask the customer and encodes the customer's key preferences. With these preferences as prior information, the multi-armed bandit algorithm, using Thompson sampling, efficiently leverages the user's feedback to improve the recommendations in an online fashion. The empirical results show that our hybrid learning method can effectively make interactive vehicle recommendations to users.
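To illustrate the general idea described in the abstract, the sketch below shows one way a Thompson sampling bandit could be seeded with preference scores produced by an earlier decision-tree questioning step. This is a minimal illustration, not the authors' implementation: the class name, the `prior_strength` parameter, and the example vehicle identifiers are hypothetical, and the paper's actual prior construction and reward model may differ.

```python
import random

# Sketch: Thompson sampling over candidate vehicles with Beta priors
# seeded by decision-tree-derived preference scores (assumed to lie in [0, 1]).

class ThompsonSamplingRecommender:
    def __init__(self, prior_scores, prior_strength=5.0):
        # prior_scores: vehicle id -> preference score from the questioning step.
        # prior_strength controls how strongly the tree's prior is trusted.
        self.alpha = {v: 1.0 + prior_strength * s for v, s in prior_scores.items()}
        self.beta = {v: 1.0 + prior_strength * (1.0 - s) for v, s in prior_scores.items()}

    def recommend(self):
        # Sample a plausible "like" probability for each vehicle; suggest the best draw.
        samples = {v: random.betavariate(self.alpha[v], self.beta[v]) for v in self.alpha}
        return max(samples, key=samples.get)

    def update(self, vehicle, liked):
        # Binary feedback: the customer accepted (True) or rejected (False) the suggestion.
        if liked:
            self.alpha[vehicle] += 1.0
        else:
            self.beta[vehicle] += 1.0


if __name__ == "__main__":
    # Hypothetical preference scores from the decision-tree stage.
    tree_prior = {"suv_a": 0.7, "sedan_b": 0.4, "truck_c": 0.2}
    rec = ThompsonSamplingRecommender(tree_prior)
    for _ in range(20):
        v = rec.recommend()
        liked = random.random() < tree_prior[v]  # simulated customer feedback
        rec.update(v, liked)
    print("Most promising vehicle:", rec.recommend())
```

In this sketch, a stronger tree prior lets early recommendations reflect the customer's stated preferences, while the Beta posterior updates let online feedback gradually override a poor prior.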

Language

  • English

Filing Info

  • Accession Number: 01703438
  • Record Type: Publication
  • Source Agency: SAE International
  • Report/Paper Numbers: 2019-01-1079
  • Files: TRIS, SAE
  • Created Date: Apr 30 2019 9:19AM