Intelligent data fusion system for predicting vehicle collision warning using vision/GPS sensing

Bao Rong Chang, Hsiu Fen Tsai, Chung Ping Young

Research output: Contribution to journal › Article › peer-review

65 Citations (Scopus)


In this study, a fuzzy approach with fault tolerance is proposed to fuse heterogeneous sensed data and to overcome imprecise collision warnings caused by perturbed input signals during pre-crash warning processing. A second problem, the danger of drowsy driving, involving fatigue level, carbon monoxide concentration, and breath alcohol concentration, is handled by approximate reasoning that derives an extra reaction time used to modify the NHTSA algorithm. Vision-sensing analysis cooperating with the global positioning system is applied to lane-marking detection and collision warning, with neighboring cars exchanging dynamic and static information via inter-vehicle wireless communications. In addition to pre-crash warning, event data recording, which is very useful for on-scene accident reconstruction, is also established. To speed up data fusion on both the quantum-tuned back-propagation neural network (QT-BPNN) and the adaptive network-based fuzzy inference system (ANFIS), a distributed dual platform, DaVinci+XScale_NAV270, is employed. Several tests of the system's reliability and validity were completed successfully, and a comparison of system effectiveness showed that the proposed approach outperforms two current well-known collision-warning systems (AWS-Mobileye and ACWS-Delphi).
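The abstract's central modification, adding an extra reaction time derived from driver-state factors to an NHTSA-style warning rule, can be illustrated with a minimal sketch. The kinematic warning-distance formula, the parameter values, and the linear penalty mapping fatigue, CO concentration, and breath alcohol to added reaction time below are assumptions for illustration only, standing in for the paper's fuzzy approximate reasoning; they are not the authors' implementation.

```python
def warning_distance(v_follow, v_lead, t_react, a_brake=6.0, buffer_m=2.0):
    """Range (m) below which a pre-crash warning should fire.

    v_follow, v_lead : speeds of following and lead vehicle (m/s)
    t_react          : total driver reaction time (s)
    a_brake          : assumed braking deceleration for both cars (m/s^2)
    buffer_m         : fixed safety margin (m)
    """
    # Distance covered while the driver reacts, plus the difference
    # in braking distances between the two vehicles.
    d_react = v_follow * t_react
    d_brake = (v_follow**2 - v_lead**2) / (2.0 * a_brake)
    return d_react + max(d_brake, 0.0) + buffer_m


def extra_reaction_time(fatigue, co_ppm, brac):
    """Illustrative (assumed) penalty: maps fatigue level (0-1), carbon
    monoxide concentration (ppm), and breath alcohol concentration (%)
    to an added reaction time, capped at 1.5 s."""
    penalty = 0.5 * fatigue + 0.002 * co_ppm + 2.0 * brac
    return min(penalty, 1.5)


# Alert vs. impaired driver at 25 m/s closing on a 15 m/s lead car:
# the impaired driver's extra delay enlarges the warning range.
base = warning_distance(25.0, 15.0, t_react=1.0)
t_extra = extra_reaction_time(fatigue=0.6, co_ppm=50, brac=0.03)
impaired = warning_distance(25.0, 15.0, t_react=1.0 + t_extra)
```

The design point is simply that the driver-state penalty enters only through `t_react`, so the base warning rule stays unchanged while an impaired driver gets warned earlier (at a longer range).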

Original language: English
Pages (from-to): 2439-2450
Number of pages: 12
Journal: Expert Systems With Applications
Issue number: 3
Publication status: Published - 2010 Mar 15

All Science Journal Classification (ASJC) codes

  • Engineering (all)
  • Computer Science Applications
  • Artificial Intelligence


