Vision is the primary way a driver perceives the environment while driving. However, due to its low spatial and temporal resolution, a driver may fail to perceive agents on the road, which can lead to collisions. Modern vehicles are equipped with sensors that perceive the driving environment more reliably, as well as ADAS that provides driving assistance. However, ADAS does not consider the driver's perception, which may result in unnecessary warnings or actions that contradict the driver's intent. These false positives may cause distraction and confusion in complex driving scenarios, posing a safety threat. In this project, we propose a driving assistance system that reduces the number of unnecessary warnings by taking the driver's perception of the driving environment into account. The driver's perception model combines an estimate of how the driving environment has evolved with the driver's observations: the observations are obtained from gaze tracking, and the environment update is estimated from the last observation. In this paper, we formulate the inference problem on the driver's perception, and develop a virtual driving simulator to evaluate the feasibility of the system.
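The combination of a gaze-based observation with an estimate of how the environment has evolved since the last observation can be framed as a recursive predict-update cycle. The sketch below is a minimal illustration, not the paper's actual formulation: the exponential forgetting model, the detection/false-alarm probabilities, and the warning threshold are all assumed parameters chosen for the example.

```python
import math

def predict(belief, dt, decay_rate=0.5):
    """Decay the driver's estimated awareness of an agent as time
    since the last gaze observation grows (assumed exponential
    forgetting model; decay_rate is a hypothetical parameter)."""
    return belief * math.exp(-decay_rate * dt)

def update(belief, gazed, detection_prob=0.9, false_alarm_prob=0.1):
    """Bayesian update of the probability that the driver is aware
    of an agent, given whether gaze tracking registered a fixation
    on it (probabilities are illustrative assumptions)."""
    if gazed:
        num = detection_prob * belief
        den = num + false_alarm_prob * (1.0 - belief)
    else:
        num = (1.0 - detection_prob) * belief
        den = num + (1.0 - false_alarm_prob) * (1.0 - belief)
    return num / den

def should_warn(belief, threshold=0.3):
    """Suppress the warning when the driver is likely already
    aware of the agent, reducing false-positive alerts."""
    return belief < threshold
```

For example, starting from an uninformed belief of 0.5, a registered fixation raises awareness to 0.9 and suppresses the warning, while no fixation drops it to 0.1 and triggers one; this is the mechanism by which unnecessary warnings are filtered out.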