Improving Tele-operated Robotics Using Object Affordances via Computer Vision
Abstract: Michael is the team lead for the TSU team competing in the University Rover Challenge (http://urc.marssociety.org/). The TSU team will design and build an all-terrain rover to complete potential off-world tasks that simulate the work of a Martian rover. The rovers will weigh 50 kg and fit within a 2-meter square. They are to have autonomous capability and a robotic arm that can lift 5 kg.
The tasks the rovers are to complete include caching a soil sample and performing onboard science analysis, extreme retrieval and delivery of objects in the field, servicing equipment in the field, and an autonomous traversal in which the rovers travel between GPS waypoints autonomously and detect markers using camera data.
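The autonomous traversal task depends on basic GPS navigation math. As a minimal sketch (not the team's actual navigation code), the distance and compass heading between two waypoints can be computed with the standard haversine and initial-bearing formulas; the function names and the spherical-Earth radius below are illustrative assumptions.

```python
import math

EARTH_RADIUS_M = 6371000.0  # mean Earth radius; a spherical-Earth approximation

def haversine_distance(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two GPS points given in degrees."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2
    return 2 * EARTH_RADIUS_M * math.asin(math.sqrt(a))

def initial_bearing(lat1, lon1, lat2, lon2):
    """Compass bearing (0-360 degrees, clockwise from north) from point 1 toward point 2."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dlmb = math.radians(lon2 - lon1)
    y = math.sin(dlmb) * math.cos(phi2)
    x = math.cos(phi1) * math.sin(phi2) - math.sin(phi1) * math.cos(phi2) * math.cos(dlmb)
    return (math.degrees(math.atan2(y, x)) + 360.0) % 360.0
```

A rover controller would call these repeatedly: steer toward `initial_bearing(...)` until `haversine_distance(...)` drops below a waypoint-reached threshold, then switch to camera-based marker detection.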
This rover challenge is a high-level design-build competition that requires students to develop software so the robots can operate autonomously using camera data and an open-source computer vision library. The camera data is translated into large arrays of values, and mathematical algorithms provided by the computer vision library are then applied to detect shapes and patterns and to remove noise.
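The pipeline described above can be sketched in a few lines. This is an illustrative NumPy-only example (not the team's actual code, and standing in for the library routines): a camera frame is treated as a 2-D array, a mean filter removes noise, and thresholding locates a bright object.

```python
import numpy as np

# A camera frame is just a 2-D array of brightness values. Here we fabricate
# a noisy 64x64 frame containing one bright square "object" to detect.
rng = np.random.default_rng(0)
frame = rng.normal(20.0, 5.0, size=(64, 64))   # sensor-noise background
frame[20:40, 25:45] += 200.0                   # the bright object

# Noise removal: a 3x3 mean (box) filter, the simplest smoothing kernel,
# implemented by averaging nine shifted views of the edge-padded frame.
padded = np.pad(frame, 1, mode="edge")
smoothed = sum(
    padded[dy:dy + 64, dx:dx + 64] for dy in range(3) for dx in range(3)
) / 9.0

# Pattern extraction: threshold, then take the bounding box of bright pixels.
mask = smoothed > 100.0
ys, xs = np.nonzero(mask)
top, bottom = ys.min(), ys.max()
left, right = xs.min(), xs.max()
print("bounding box (top, left, bottom, right):", (top, left, bottom, right))
```

In practice a library such as OpenCV supplies optimized versions of these steps (Gaussian blur, thresholding, contour finding), but the underlying operations are the same array mathematics.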
A large focus of the competition is therefore how to use mathematics to process camera data so that, in the end, robots can make decisions autonomously. Mankind's success in expanding into the solar system will rely greatly on the quality of autonomous robotics: their ability to interact with the real world and to make critical decisions in real time.
Michael will present mathematical algorithms from the computer vision library and explain how they are used to manipulate large arrays of camera data to identify objects and patterns quickly enough for robots to make valuable decisions in a real-world, applied context.
Michael Osei* and Tu Nyugen