Generating summaries for large collections of geo-referenced photographs

Abstract
We describe a framework for automatically selecting a summary set of photographs from a large collection of geo-referenced photos. The summary algorithm is based on spatial patterns in photo sets, but can be extended to support social, temporal, and textual-topical factors as well. The summary set can be biased by the user, the content of the user's query, and the context in which the query is made. An initial evaluation on a set of geo-referenced photos shows that our algorithm performs well, producing results that are highly rated by users.
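As a rough illustration of the kind of spatial-pattern-based selection the abstract describes, the following minimal Python sketch bins geo-referenced photos into a latitude/longitude grid, ranks cells by photo density, and returns one representative photo per top cell. All names (Photo, summarize, cell_deg) and the grid-binning heuristic are assumptions for illustration only, not the paper's actual algorithm.

```python
import math
from dataclasses import dataclass

@dataclass
class Photo:
    photo_id: str
    lat: float
    lon: float

def summarize(photos, k=5, cell_deg=0.01):
    """Select up to k representative photos: bin locations into a lat/lon
    grid, rank cells by photo count (a crude proxy for interest), and pick
    the photo nearest each top cell's centroid."""
    cells = {}
    for p in photos:
        key = (round(p.lat / cell_deg), round(p.lon / cell_deg))
        cells.setdefault(key, []).append(p)

    # Denser cells are assumed to be more interesting regions.
    ranked = sorted(cells.values(), key=len, reverse=True)

    summary = []
    for group in ranked[:k]:
        clat = sum(p.lat for p in group) / len(group)
        clon = sum(p.lon for p in group) / len(group)
        # Representative photo = the one closest to the cell centroid.
        rep = min(group, key=lambda p: math.hypot(p.lat - clat, p.lon - clon))
        summary.append(rep)
    return summary

if __name__ == "__main__":
    demo = [
        Photo("a", 37.8024, -122.4058),
        Photo("b", 37.8026, -122.4061),
        Photo("c", 37.8199, -122.4783),
        Photo("d", 37.8021, -122.4057),
    ]
    for p in summarize(demo, k=2):
        print(p.photo_id, p.lat, p.lon)
```

The biasing mentioned in the abstract (by user, query content, or query context) could be layered onto such a scheme by re-weighting the cell ranking, though the paper's specific ranking criteria are not given here.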