dc.contributor.advisor  McCrink, Matthew
dc.contributor.advisor  Whitfield, Clifford A.
dc.creator  Lee, Hongyun
dc.date.accessioned  2017-04-10T17:07:46Z
dc.date.available  2017-04-10T17:07:46Z
dc.date.issued  2017-05
dc.identifier.uri  http://hdl.handle.net/1811/80426
dc.description.abstract  This paper considers the development and implementation of a vision-based onboard flight management computer (FMC) and inertial navigation system (INS) that can estimate and control the attitude and relative position of the vehicle while interacting with an operator in an indoor environment. In this project, a hexacopter is developed that uses an onboard inertial measurement unit (IMU), a small onboard computer, and a Microsoft Kinect sensor. Since directional maneuvering of the hexacopter is achieved by changing the angular position of the vehicle, the project consists of inertial and vision sensor fusion. To accomplish stable control of the multicopter's orientation, a complementary filter is implemented on the IMU. Because the raw sensors do not by themselves provide an accurate angular position of the vehicle, the complementary filter fuses the raw sensor data to provide low-noise, low-drift estimates of the Euler angles. Autonomous indoor flight requires position sensors other than GPS, whose accuracy is approximately 2 to 3 meters and whose signals are typically unavailable in an indoor environment. In this project, a computer vision algorithm is used to provide position estimation. The vision algorithm uses the onboard Microsoft Kinect and computer to execute the EmguCV and Kinect SDK libraries. Since the Kinect provides both color and depth data, the system can detect an object and compute its real-time local coordinates, which are used to correct the position of the vehicle during autonomous indoor flight. In addition, to issue commands such as takeoff, landing, proceed, and retreat during flight, an artificial neural network is used to classify human gestures so that they can serve as commands. The system is expected to maintain vehicle position within 1 meter, a better accuracy than that of low-cost GPS receivers, and to control the corresponding angular orientation for position control. The system is also expected to interact with an operator within a reasonable response time while maintaining the given flight control tasks.  en_US
dc.description.sponsorship  OSU College of Engineering  en_US
dc.language.iso  en_US  en_US
dc.publisher  The Ohio State University  en_US
dc.relation.ispartofseries  The Ohio State University. Department of Mechanical and Aerospace Engineering Undergraduate Research Theses; 2017  en_US
dc.subject  Unmanned Aerial System  en_US
dc.subject  Computer Vision  en_US
dc.subject  Human-Robot Interaction  en_US
dc.subject  Machine Learning  en_US
dc.title  Vision-based Indoor Navigation of a Semi-autonomous and Interactive Unmanned Aerial System  en_US
dc.type  Thesis  en_US
dc.description.embargo  No embargo  en_US
dc.description.academicmajor  Academic Major: Aeronautical and Astronautical Engineering  en_US
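The abstract above describes a complementary filter that fuses gyroscope and accelerometer data into low-noise, low-drift Euler angle estimates. The record contains no code, so the following is only a minimal Python sketch of the standard complementary-filter update for roll and pitch; the blend weight ALPHA and the 100 Hz loop rate are illustrative assumptions, not values from the thesis.

    import math

    ALPHA = 0.98   # blend weight: trust the gyro short-term (assumed value)
    DT = 0.01      # sample period in seconds, i.e. a 100 Hz IMU loop (assumed)

    def complementary_update(roll, pitch, gyro, accel):
        """One filter step.

        roll, pitch : previous angle estimates (rad)
        gyro        : (p, q, r) angular rates from the gyroscope (rad/s)
        accel       : (ax, ay, az) specific force from the accelerometer (m/s^2)
        """
        # Integrate the gyro rates: accurate short-term, but drifts over time.
        roll_gyro = roll + gyro[0] * DT
        pitch_gyro = pitch + gyro[1] * DT

        # Tilt angles from the gravity vector: noisy, but drift-free.
        ax, ay, az = accel
        roll_acc = math.atan2(ay, az)
        pitch_acc = math.atan2(-ax, math.sqrt(ay * ay + az * az))

        # Blend: effectively high-pass the gyro path and low-pass the
        # accelerometer path, which is what suppresses both drift and noise.
        roll = ALPHA * roll_gyro + (1.0 - ALPHA) * roll_acc
        pitch = ALPHA * pitch_gyro + (1.0 - ALPHA) * pitch_acc
        return roll, pitch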

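The abstract also states that the Kinect's color and depth data yield an object's real-time local coordinates for position correction. The project itself uses the EmguCV and Kinect SDK libraries; purely to illustrate the underlying geometry, here is a Python sketch that back-projects a detected depth pixel through a pinhole camera model. The intrinsics are nominal Kinect v1 depth-camera values, assumed for the example.

    # Nominal Kinect v1 depth-camera intrinsics (assumed for illustration).
    FX, FY = 575.8, 575.8   # focal lengths in pixels
    CX, CY = 319.5, 239.5   # principal point for a 640x480 depth image

    def pixel_to_local(u, v, depth_mm):
        """Back-project a depth pixel (u, v) to camera-frame coordinates.

        depth_mm : depth value at (u, v) in millimetres
        Returns (x, y, z) in metres, with z pointing out of the camera.
        """
        z = depth_mm / 1000.0
        x = (u - CX) * z / FX
        y = (v - CY) * z / FY
        return x, y, z

    # Example: an object detected at pixel (400, 260) with a 2.1 m depth reading.
    print(pixel_to_local(400, 260, 2100.0))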

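Finally, the abstract mentions an artificial neural network that classifies operator gestures into flight commands (takeoff, landing, proceed, retreat). The record does not specify the network; as one plausible arrangement, the sketch below runs a forward pass of a small one-hidden-layer classifier over a feature vector of Kinect skeleton joint positions. The layer sizes, the feature choice, and the random placeholder weights are assumptions, not the author's design.

    import numpy as np

    COMMANDS = ["takeoff", "landing", "proceed", "retreat"]

    # Assumed sizes: 20 skeleton joints x 3 coordinates in, 16 hidden units,
    # one output per command. Real weights would come from offline training;
    # random values stand in here so the sketch runs.
    rng = np.random.default_rng(0)
    W1 = rng.normal(scale=0.1, size=(16, 60))
    b1 = np.zeros(16)
    W2 = rng.normal(scale=0.1, size=(4, 16))
    b2 = np.zeros(4)

    def classify_gesture(joints):
        """Map a (60,) joint-position feature vector to a command string."""
        h = np.tanh(W1 @ joints + b1)          # hidden layer
        logits = W2 @ h + b2
        probs = np.exp(logits - logits.max())
        probs /= probs.sum()                   # softmax over the four commands
        return COMMANDS[int(np.argmax(probs))]

    # Example with a dummy feature vector (random weights, so the label is arbitrary).
    print(classify_gesture(rng.normal(size=60)))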