Student research opportunities
Roadside Object Classification in Video, Point-Cloud and Hyperspectral Data
Project Code: CECS_722
This project is available at the following levels:
Engn4200, Engn R&D, Honours, Summer Scholar, Masters, PhD
Keywords:
Point-Cloud, Object Detection, Intelligent Vehicles, Machine Learning, Hyperspectral, Road-Analysis
Supervisors:
Dr Lars Petersson, Dr Gary Overett
Outline:
The NICTA AutoMap Project is extending its object classification abilities to include a large and detailed database of Video+Point-Cloud Data.
This project allows for various avenues of investigation, including novel features that combine normal camera imagery (as in computer vision) with high-resolution point-cloud data collected using LIDAR.
Goals of this project
Detect/classify objects such as roads, trees, power poles and lines, hazards, road signs, and other infrastructure.
Analyse the road surface to extract the road profile, edges, pavement, etc.
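As a minimal illustration of road-surface analysis, a local patch of road points can often be approximated by a plane, whose coefficients give the cross-slope and height of the surface. The sketch below fits z = a·x + b·y + c by least squares on synthetic data; in practice a robust estimator (e.g. RANSAC) would be needed to reject non-road points, and the function name and test data are this sketch's own, not part of the AutoMap codebase.

```python
import numpy as np

def fit_road_plane(points):
    """Least-squares fit of z = a*x + b*y + c to road-surface points (Nx3 array)."""
    A = np.column_stack([points[:, 0], points[:, 1], np.ones(len(points))])
    coeffs, *_ = np.linalg.lstsq(A, points[:, 2], rcond=None)
    return coeffs  # (a, b, c): slopes in x and y plus the height offset

# Synthetic road patch 0.1 m below the sensor with a 2% cross-slope in x.
rng = np.random.default_rng(0)
xy = rng.uniform(-5.0, 5.0, size=(200, 2))
z = 0.02 * xy[:, 0] - 0.1
road = np.column_stack([xy, z])
a, b, c = fit_road_plane(road)
```

Deviations of individual points from the fitted plane then highlight road edges, kerbs, and surface defects.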
Requirements/Prerequisites
Basic knowledge of digital image processing.
Good knowledge of C/C++, Matlab, Linux
Experience shows that students with excellent programming skills are far more likely to succeed at and enjoy this work, since the student will be required to grapple with large datasets and a fairly fast-changing code library. Students without this ability are likely to get bogged down in implementation details.
Students with these skills (or who discover them during the project) have an excellent opportunity to build strong research and programming credentials.
An interest in further PhD study is also considered favourable.
Student Gain
Excellent opportunity for students interested in computer vision, image processing and pattern recognition.
Students will have the opportunity to participate in genuine fundamental research with a commercial focus. Students who discover improved methods will have the opportunity to see their research used on very large databases of real-world data.
Links
AutoMap Project Page
Project Video
Point Cloud Library
AutoMap Video