Technical Survey on Sensor-Aided Automatic Parallel Car Parking Systems for Effective Vehicle Navigation
Abstract
Sensor-aided automatic parallel parking systems are a cornerstone of modern Advanced Driver Assistance Systems (ADAS) and of emerging autonomous vehicle technologies. They alleviate urban parking challenges by automating space detection, path planning, and precise vehicle control, thereby reducing driver stress, low-speed collisions (by up to 75%), circling time, and the associated emissions. Early implementations (2000–2015) relied primarily on ultrasonic sensors for basic reversing aids and have since evolved into sophisticated multi-modal architectures incorporating radar, camera, LiDAR, infrared, magnetic, and electromagnetic sensors. Sensor fusion strategies built on Kalman filters, probabilistic occupancy grids, and deep neural networks compensate for individual sensor limitations such as weather sensitivity, noise, and limited range, achieving detection accuracies above 95% in controlled settings. Recent advances (2023–2025) integrate reinforcement learning (RL), diffusion models, 4D imaging radar, transformers, and end-to-end deep learning for robust performance in dynamic, low-visibility urban environments. Path planning employs geometric (Reeds-Shepp curves), optimization-based (particle swarm optimization), and model predictive control methods, while perception benefits from convolutional neural networks (e.g., YOLO) and RL for adaptive decision-making. Commercial systems (Tesla Autopark, BMW Parking Assistant Plus, Ford Active Park Assist) show complementary strengths in vision-based autonomy, precision, and reliability, though challenges persist in adverse weather, computational constraints, sensor interference, regulatory compliance (ISO 26262), and user trust. Real-world benchmarks report success rates of 85–99% under ideal conditions but show marked degradation in clutter, rain, or unstructured lots.
This review underscores the transition toward fully autonomous valet parking (AVP) and smart-city integration via V2X, while identifying critical needs for weather-resilient fusion, verifiable AI, and enhanced human-machine interfaces to accelerate safe, widespread adoption.
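The Kalman-filter sensor fusion discussed above can be illustrated in its simplest scalar form: combining one ultrasonic and one radar range reading into a single, lower-variance gap estimate. This is a minimal sketch only; the sensor variances and readings are illustrative assumptions, not values from any surveyed system.

```python
def kalman_update(x, p, z, r):
    """Scalar Kalman measurement update: fuse prior estimate x
    (variance p) with measurement z (variance r)."""
    k = p / (p + r)          # Kalman gain: weight given to the new measurement
    x_new = x + k * (z - x)  # corrected range estimate
    p_new = (1 - k) * p      # posterior variance, always reduced by fusion
    return x_new, p_new

def fuse_range(ultrasonic_z, radar_z, ultrasonic_var=0.04, radar_var=0.01):
    """Fuse one ultrasonic and one radar range reading (metres).
    Variances are assumed for illustration: ultrasonic noisier than radar."""
    # Take the ultrasonic reading as the prior, then fold in the radar reading.
    return kalman_update(ultrasonic_z, ultrasonic_var, radar_z, radar_var)

# Hypothetical gap to the adjacent parked car: the two sensors disagree slightly.
est, var = fuse_range(1.55, 1.49)
# est ≈ 1.502 m, var ≈ 0.008 (below both input variances)
```

The posterior variance is smaller than either sensor's own variance, which is the basic reason multi-modal fusion outperforms any single sensor in the detection-accuracy figures cited above.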
