Digital Agriculture Datasets

Agricultural datasets for developing AI and robotics systems applied to agriculture

Image classification

Semantic segmentation

Scene understanding

  • RELLIS-3D: A Multi-modal Dataset for Off-Road Robotics: Semantic segmentation on 2D RGB images and 3D LiDAR pointclouds - https://github.com/unmannedlab/RELLIS-3D/tree/main
  • RUGD Dataset: The RUGD dataset focuses on semantic understanding of unstructured outdoor environments for applications in off-road autonomous navigation. The dataset comprises video sequences captured from the camera onboard a mobile robot platform - http://rugd.vision/
  • GOOSE dataset: GOOSE is the German Outdoor and Offroad Dataset and is a 2D & 3D semantic segmentation dataset framework. In contrast to existing datasets like Cityscapes or BDD100K, the focus is on unstructured off-road environments - https://goose-dataset.de/docs/
  • WildScenes: The WildScenes dataset is a multi-modal collection of traversals within Australian forests. It is divided into five sequences spanning two forest locations and different recording times - https://csiro-robotics.github.io/WildScenes/
  • BotanicGarden: A robot navigation dataset in a botanic garden of more than 48,000 m². Comprehensive sensors are used, including grayscale and RGB stereo cameras, spinning and MEMS 3D LiDARs, and low-cost and industrial-grade IMUs. An all-terrain wheeled robot was employed for data collection, traversing thick woods, riversides, narrow trails, bridges, and grasslands. This yields 33 short and long sequences, forming 17.1 km of trajectories in total - https://github.com/robot-pesg/BotanicGarden
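
For readers new to these datasets, below is a minimal sketch of how a semantic-segmentation sample might be loaded: an RGB frame together with a color-coded label mask, mapped to integer class IDs. The file names and the color-to-class mapping are placeholders rather than the layout of any specific dataset above; each dataset documents its own label format and ontology.

```python
# Minimal sketch: load an RGB frame and its color-coded label mask, then map
# label colors to integer class IDs. File names and the color map below are
# placeholders; consult each dataset's own ontology/label definitions.
import numpy as np
from PIL import Image

# Hypothetical color -> class-ID mapping (RGB triplet to integer label)
COLOR_TO_ID = {
    (0, 0, 0): 0,      # void / background (example)
    (0, 255, 0): 1,    # vegetation (example)
    (255, 128, 0): 2,  # dirt / trail (example)
}

def load_sample(image_path: str, label_path: str):
    """Return the RGB image and an integer class-ID mask of the same size."""
    rgb = np.array(Image.open(image_path).convert("RGB"))
    label_rgb = np.array(Image.open(label_path).convert("RGB"))

    class_ids = np.zeros(label_rgb.shape[:2], dtype=np.uint8)
    for color, class_id in COLOR_TO_ID.items():
        mask = np.all(label_rgb == np.array(color, dtype=np.uint8), axis=-1)
        class_ids[mask] = class_id
    return rgb, class_ids

if __name__ == "__main__":
    # Placeholder file names for illustration only
    rgb, class_ids = load_sample("frame_000001.jpg", "frame_000001_label.png")
    print(rgb.shape, class_ids.shape, np.unique(class_ids))
```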

Synthetic datasets for semantic segmentation

Object detection

Instance segmentation (detection + segmentation)

Tracking

Hyperspectral imaging

Robotics

These are multimodal datasets encompassing data from different sensors, such as RGB, stereo, and RGB-D cameras, LiDARs, IMUs, GPS, thermal cameras, and hyperspectral cameras. They typically do not include semantic labels; a minimal sketch for associating sensor streams by timestamp is given after the list below.

  • Sugar Beets 2016: https://www.ipb.uni-bonn.de/data/sugarbeets2016/
  • CitrusFarm Dataset: CitrusFarm is a multimodal agricultural robotics dataset that provides both multispectral images and navigational sensor data for localization, mapping and crop monitoring tasks - https://ucr-robotics.github.io/Citrus-Farm-Dataset/
  • A high-resolution, multimodal data set for agricultural robotics: A Ladybird's-eye view of Brassica: https://doi.org/10.1002/rob.21877
  • RELLIS-3D: A Multi-modal Dataset for Off-Road Robotics: Semantic segmentation on 2D RGB images and 3D LiDAR pointclouds - https://github.com/unmannedlab/RELLIS-3D/tree/main
  • RUGD Dataset: The RUGD dataset focuses on semantic understanding of unstructured outdoor environments for applications in off-road autonomous navigation. The dataset comprises video sequences captured from the camera onboard a mobile robot platform - http://rugd.vision/
  • GOOSE dataset: GOOSE is the German Outdoor and Offroad Dataset and is a 2D & 3D semantic segmentation dataset framework. In contrast to existing datasets like Cityscapes or BDD100K, the focus is on unstructured off-road environments - https://goose-dataset.de/docs/
  • WildScenes: The WildScenes dataset is a multi-modal collection of traversals within Australian forests. It is divided into five sequences spanning two forest locations and different recording times - https://csiro-robotics.github.io/WildScenes/
  • BotanicGarden: A robot navigation dataset in a botanic garden of more than 48,000 m². Comprehensive sensors are used, including grayscale and RGB stereo cameras, spinning and MEMS 3D LiDARs, and low-cost and industrial-grade IMUs. An all-terrain wheeled robot was employed for data collection, traversing thick woods, riversides, narrow trails, bridges, and grasslands. This yields 33 short and long sequences, forming 17.1 km of trajectories in total - https://github.com/robot-pesg/BotanicGarden
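
Since these datasets bundle sensors running at different rates, a common first step is to associate measurements across streams by timestamp. The following is a minimal, generic sketch of nearest-timestamp matching between two streams; the sample rates and timestamps are synthetic placeholders, and each dataset defines its own clocks and file formats (e.g., rosbags or per-sensor CSV files).

```python
# Minimal sketch: associate two multimodal sensor streams by nearest timestamp.
# The timestamps here are synthetic placeholders; real datasets provide their
# own time sources (e.g., rosbag message stamps or per-sensor CSV files).
import numpy as np

def match_nearest(ts_a: np.ndarray, ts_b: np.ndarray, max_dt: float = 0.05):
    """For each timestamp in ts_a, find the index of the closest timestamp
    in ts_b (assumed sorted), keeping only pairs closer than max_dt seconds."""
    idx = np.searchsorted(ts_b, ts_a)
    idx = np.clip(idx, 1, len(ts_b) - 1)
    left, right = ts_b[idx - 1], ts_b[idx]
    nearest = np.where(ts_a - left < right - ts_a, idx - 1, idx)
    keep = np.abs(ts_b[nearest] - ts_a) <= max_dt
    return np.flatnonzero(keep), nearest[keep]

if __name__ == "__main__":
    cam_ts = np.arange(0.0, 5.0, 1 / 10)           # 10 Hz camera (placeholder)
    imu_ts = np.arange(0.0, 5.0, 1 / 100) + 0.002  # 100 Hz IMU, small clock offset
    cam_idx, imu_idx = match_nearest(cam_ts, imu_ts)
    print(f"matched {len(cam_idx)} camera frames to IMU samples")
```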

Collectors of datasets

  • Quantitative Plant: Website that collects datasets for image classification, semantic segmentation and phenotyping - https://www.quantitative-plant.org/dataset
  • A survey of public datasets for computer vision tasks in precision agriculture: Collection of datasets for weed and fruit detection and segmentation, and for phenotyping tasks (e.g., damage and disease detection, biomass prediction, yield estimation) - https://doi.org/10.1016/j.compag.2020.105760

Tools to create synthetic datasets

  • CropCraft: CropCraft is a Python script that generates 3D models of crop fields, specialized for real-time simulation of robotics applications - https://github.com/Romea/cropcraft
  • TomatoSynth: TomatoSynth provides realistic synthetic training data of tomato plants for deep learning applications, reducing the need for manual annotation and allowing customization for specific greenhouse environments - https://github.com/SCT-lab/TomatoSynth
