The conference agenda provides an overview and details of the sessions. To view sessions on a specific day or in a particular room, select the corresponding date or room link. You may also select a session to explore available abstracts and download papers and presentations.
1Space Research Institute of National Academy of Sciences of Ukraine and State Space Agency of Ukraine; 2University of Kent / KEI at KSE, United Kingdom; 3National Technical University of Ukraine, Ukraine; 4University of Maryland College Park, United States of America; 5EOS Data Analytics, Ukraine
Remote sensing of agricultural land use is one of the essential objectives of the EBRD project "Supporting Transparent Land Governance in Ukraine". The main goals of the project are Land Cover/Land Use classification based on freely available satellite data and the development of efficient automated land-management technologies that utilize remote sensing data.
Within the project we have investigated the applicability of three information platforms: Sen2Agri (developed with ESA support), the Google Earth Engine (GEE) cloud platform, and our own approach based on artificial neural networks.
The results of the pilot project should establish the preconditions for a transparently functioning agricultural land market, improve the efficiency of land use, and lay the foundations for investment in the agricultural sector and rural development.
Proximate sensing of food types and land uses in Thailand using street-level photography and deep learning
Martha Bohm, John Ringland, So-Ra Baek
University at Buffalo, United States of America
We present new tools to exploit street-level imagery to inventory crop types and land uses. We describe two classifiers using Google Street View imagery and a deep convolutional neural network. First, a multi-class classifier distinguishes six crops and three land uses. Second, a specialized detector recognizes the presence of a single species. We tested these tools along roadside transects in Thailand.
The overall accuracy of the multi-class classifier was 83.3%, and for several classes the producer's accuracy exceeded 90%. This performance compares favorably to some remote-sensing classifiers. On the top 40% of images, the overall classifier accuracy was excellent: 99.0%. The area under the specialized detector's receiver operating characteristic curve was 0.9905, likewise indicating excellent performance.
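For readers unfamiliar with the evaluation metrics cited above, the following sketch illustrates how they are conventionally computed. The confusion matrix, class indices, and score values here are made-up toy numbers for illustration only, not the authors' data.

```python
# Illustrative computation of the metrics named in the abstract:
# overall accuracy, per-class producer's accuracy, and ROC AUC.
# All numbers below are hypothetical examples.

def overall_accuracy(cm):
    """Fraction of all samples on the diagonal of confusion matrix cm
    (rows = reference class, columns = predicted class)."""
    correct = sum(cm[i][i] for i in range(len(cm)))
    total = sum(sum(row) for row in cm)
    return correct / total

def producers_accuracy(cm, cls):
    """Of all reference samples of class `cls`, the fraction predicted
    correctly (i.e. recall for that class)."""
    row = cm[cls]
    return row[cls] / sum(row)

def roc_auc(labels, scores):
    """ROC AUC via the Mann-Whitney U statistic: the probability that a
    randomly chosen positive outscores a randomly chosen negative
    (ties count half)."""
    pos = [s for y, s in zip(labels, scores) if y == 1]
    neg = [s for y, s in zip(labels, scores) if y == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# Toy 3-class confusion matrix (hypothetical counts).
cm = [[90, 5, 5],
      [4, 92, 4],
      [6, 3, 91]]
print(overall_accuracy(cm))                        # → 0.91
print(producers_accuracy(cm, 1))                   # → 0.92
print(roc_auc([1, 1, 0, 0], [0.9, 0.8, 0.4, 0.1]))  # → 1.0
```

Producer's accuracy answers "of the ground-truth samples of this class, how many did the classifier find?", which is why it can exceed 90% for some classes even when overall accuracy is lower.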
This approach shows potential for fine-grained analysis over large areas. We are developing it further for regions where home gardens contribute significantly to diets but are poorly characterized by surveys that focus on quantifying macro-economically important crops.