Efficient joint stereo estimation and land usage classification for multiview satellite data

Ke Wang, Craig Stutts, Enrique Dunn, Jan Michael Frahm

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

6 Scopus citations

Abstract

We propose an efficient algorithm to jointly estimate geometry and semantics for a given geographical region observed by multiple satellite images. Our joint estimation leverages an efficient PatchMatch inference framework defined over a lattice discretization of the environment. Our cost function relies on the local planarity assumption to model scene geometry and on neural network classification to determine semantic (e.g. land use) labels for geometric structures. By utilizing the commonly available direct (i.e. space-to-image) rational polynomial coefficient (RPC) satellite camera models, our approach effectively circumvents the need for estimating or refining inverse RPC models. Experiments illustrate both the computational efficiency and the high-quality scene geometry estimates attained by our approach for satellite imagery. To further illustrate the generality of our representation and inference framework, experiments on standard benchmarks for ground-level imagery are also included.
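The direct (space-to-image) RPC model the abstract refers to maps geographic coordinates to pixel coordinates as ratios of cubic polynomials in normalized longitude, latitude, and height. A minimal sketch of such a forward projection is below, assuming the common 20-term RPC00B coefficient ordering; the coefficient values and the `rpc` dictionary layout are illustrative, not taken from the paper or any real sensor model.

```python
# Sketch of a direct (object-space -> image) RPC projection.
# Coefficient layout follows the common RPC00B convention; all
# numeric values used with it here would be hypothetical.

def rpc_poly(coeffs, L, P, H):
    """Evaluate a 20-term cubic RPC polynomial (RPC00B term order)."""
    terms = [
        1.0, L, P, H, L * P, L * H, P * H, L * L, P * P, H * H,
        P * L * H, L**3, L * P * P, L * H * H, L * L * P,
        P**3, P * H * H, L * L * H, P * P * H, H**3,
    ]
    return sum(c * t for c, t in zip(coeffs, terms))

def project(rpc, lon, lat, height):
    """Map (lon, lat, height) to image (row, col) via the direct RPC model."""
    # Normalize ground coordinates with the model's offsets and scales.
    L = (lon - rpc["lon_off"]) / rpc["lon_scale"]
    P = (lat - rpc["lat_off"]) / rpc["lat_scale"]
    H = (height - rpc["h_off"]) / rpc["h_scale"]
    # Each image coordinate is a ratio of two cubic polynomials.
    row = rpc_poly(rpc["row_num"], L, P, H) / rpc_poly(rpc["row_den"], L, P, H)
    col = rpc_poly(rpc["col_num"], L, P, H) / rpc_poly(rpc["col_den"], L, P, H)
    # Denormalize back to pixel coordinates.
    return (row * rpc["row_scale"] + rpc["row_off"],
            col * rpc["col_scale"] + rpc["col_off"])
```

Because this forward mapping is available directly from the image metadata, a lattice cell with a hypothesized plane can be scored by projecting its 3D points into each view, which is why the paper's inference never needs an inverse RPC model.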

Original language: English
Title of host publication: 2016 IEEE Winter Conference on Applications of Computer Vision, WACV 2016
ISBN (Electronic): 9781509006410
State: Published - 23 May 2016
Event: IEEE Winter Conference on Applications of Computer Vision, WACV 2016 - Lake Placid, United States
Duration: 7 Mar 2016 - 10 Mar 2016

Publication series

Name: 2016 IEEE Winter Conference on Applications of Computer Vision, WACV 2016

Conference

Conference: IEEE Winter Conference on Applications of Computer Vision, WACV 2016
Country/Territory: United States
City: Lake Placid
Period: 7/03/16 - 10/03/16

