A minimum cover approach for extracting the road network from airborne LIDAR data

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

19 Scopus citations

Abstract

We address the problem of extracting the road network from large-scale range datasets. Our approach is fully automatic and does not require any inputs other than depth and intensity measurements from the range sensor. Road extraction is important because it provides contextual information for scene analysis and enables automatic content generation for geographic information systems (GIS). In addition to these two applications, road extraction is an intriguing detection problem because robust detection requires integration of local and long-range constraints. Our approach segments the data based on both edge and region properties and then extracts roads using hypothesis testing. Road extraction is formulated as a minimum cover problem, whose approximate solutions can be computed efficiently. Besides detecting and extracting the road network, we also present a technique for segmenting the entire city into blocks. We show experimental results on large-scale data that cover a large part of a city, with diverse landscapes and road types.
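The abstract formulates road extraction as a minimum cover problem solved approximately. The record does not include the paper's actual formulation, but as illustrative background, the standard greedy heuristic for minimum set cover (which achieves a logarithmic approximation factor) can be sketched as follows; the function name and inputs are hypothetical, not taken from the paper:

```python
def greedy_set_cover(universe, subsets):
    """Greedy approximation to minimum set cover.

    Repeatedly picks the subset that covers the most
    still-uncovered elements until the universe is covered.
    """
    universe = set(universe)
    covered = set()
    chosen = []
    while covered != universe:
        # Pick the subset with the largest marginal coverage.
        best = max(subsets, key=lambda s: len(set(s) - covered))
        gain = set(best) - covered
        if not gain:
            raise ValueError("universe cannot be covered by the given subsets")
        covered |= gain
        chosen.append(best)
    return chosen
```

In a road-extraction setting, the universe could correspond to road-like segments and each subset to the segments explained by one road hypothesis; the greedy cover then selects a small set of hypotheses explaining all evidence.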

Original language: English
Title of host publication: 2009 IEEE 12th International Conference on Computer Vision Workshops, ICCV Workshops 2009
Pages: 1582-1589
Number of pages: 8
DOIs
State: Published - 2009
Event: 2009 IEEE 12th International Conference on Computer Vision Workshops, ICCV Workshops 2009 - Kyoto, Japan
Duration: 27 Sep 2009 → 4 Oct 2009

Publication series

Name: 2009 IEEE 12th International Conference on Computer Vision Workshops, ICCV Workshops 2009

Conference

Conference: 2009 IEEE 12th International Conference on Computer Vision Workshops, ICCV Workshops 2009
Country/Territory: Japan
City: Kyoto
Period: 27/09/09 → 4/10/09

