Build an autonomous ground vehicle in a modular way, employing sensor fusion at several levels and producing software APIs for the individual sensors, an Attitude & Heading Reference System, a path planner and a map builder.
This project aims to build an Autonomous All Terrain Vehicle capable of navigating almost any terrain through lane following and GPS waypoints loaded before the mission. It will employ stereo vision for detecting obstacles, with the range data augmented by narrow-beam ultrasonic sensors; the data from the two are fused using a Kalman Filter to provide information about obstacles in front of the vehicle. Another Kalman Filter fuses information from the GPS, an IMU and encoders on the driving and steering wheels to provide an estimate of the vehicle's pose and heading. A local map builder then integrates the pose and heading information with the range data to create a map of the immediate surroundings of the vehicle. A path planner plans the path of the vehicle and gives driving commands to a control unit. The control unit interprets the driving commands, breaks them into sub-commands, and executes them by driving the actuators with the appropriate signals.
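The obstacle-range fusion described above can be sketched with the scalar Kalman measurement-update equations. This is a minimal illustration, not the project's actual filter: the function name, the choice of treating the stereo estimate as the prior, and the variance values in the example are all assumptions for demonstration.

```python
def fuse_ranges(stereo_range, stereo_var, sonar_range, sonar_var):
    """Fuse two independent range measurements of the same obstacle.

    Uses the scalar Kalman measurement update: the stereo estimate
    acts as the prior, the ultrasonic reading as the new measurement.
    Variances are the (assumed known) noise of each sensor.
    """
    k = stereo_var / (stereo_var + sonar_var)       # Kalman gain
    fused = stereo_range + k * (sonar_range - stereo_range)
    fused_var = (1.0 - k) * stereo_var              # uncertainty shrinks
    return fused, fused_var

# Stereo says 2.1 m (noisy, var 0.09 m^2); sonar says 2.0 m (var 0.01 m^2).
# The fused estimate leans toward the more accurate ultrasonic reading.
r, v = fuse_ranges(2.1, 0.09, 2.0, 0.01)
```

The fused variance is always smaller than either input variance, which is why combining a high-resolution but noisy stereo range with an accurate but narrow-beam ultrasonic range pays off.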
A global map builder can also be implemented to provide information about the terrain the vehicle has traversed.
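The local map builder's core step, projecting sensor-relative obstacle detections into world coordinates using the fused pose and heading, can be sketched as follows. This is a simplified illustration under assumed conventions (a sparse dictionary grid, a 0.25 m cell size, and bearings measured relative to the vehicle's heading), not the project's implementation.

```python
import math

def update_grid(grid, pose, detections, cell_size=0.25):
    """Mark obstacle cells in a sparse occupancy grid.

    grid       -- dict mapping (col, row) cell index -> hit count
    pose       -- (x, y, heading_rad) from the pose/heading filter
    detections -- list of (bearing_rad, distance_m) obstacle ranges
                  relative to the vehicle (bearing 0 = straight ahead)
    """
    x, y, th = pose
    for bearing, dist in detections:
        # Transform the range reading into world coordinates.
        ox = x + dist * math.cos(th + bearing)
        oy = y + dist * math.sin(th + bearing)
        cell = (int(ox // cell_size), int(oy // cell_size))
        grid[cell] = grid.get(cell, 0) + 1   # accumulate evidence
    return grid

# Vehicle at the origin facing +x, one obstacle 1 m dead ahead.
m = update_grid({}, (0.0, 0.0, 0.0), [(0.0, 1.0)])
```

Accumulating hit counts per cell gives the path planner a simple evidence measure; a full implementation would also clear free cells along each sensor ray.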