Driver status monitoring systems are a vital component of future smart cars, especially as people spend an increasing amount of time in their vehicles. Heart rate (HR) is one of the most important physiological signals of driver status. To infer the HR of drivers, most existing research has focused on capturing subtle heartbeat-induced vibrations of the torso or has leveraged photoplethysmography (PPG), which detects cardiac-cycle-related blood volume changes in the microvasculature. However, these approaches rely on dedicated sensors that are expensive and cumbersome to integrate, or are vulnerable to ambient noise. Moreover, their HR detection performance does not guarantee reliable computation of heart rate variability (HRV), a more applicable metric for inferring mental and physiological status. Accurate computation of HRV requires precise measurement of beat-to-beat intervals, which so far can only be accomplished by medical-grade devices that attach electrodes to the body. Considering these challenges, we propose a facial expression-based HRV estimation solution. The rationale is to establish a link between facial expression and heartbeat, since both are controlled by the autonomic nervous system. To this end, we developed a tree-based probabilistic fusion neural network, which significantly improves HRV estimation performance compared to conventional random forest or neural network methods and to measurements from smartwatches. The proposed solution relies only on a commodity camera and a lightweight algorithm, facilitating ubiquitous deployment in current and future vehicles. Our experiments are based on 3400 km of driving data from nine drivers collected in a naturalistic field study.