Munsell Soil Colour Prediction from the Soil and Soil Colour Book Using Patching Method and Deep Learning Techniques

Abstract

Soil colour is a key indicator of soil health and its associated properties. In agriculture, soil colour gives farmers and advisers a visual guide for interpreting soil functions and performance. Munsell colour charts have been used to determine soil colour for many years, but the process is fallible because it depends on the user’s perception. Since smartphones are widely used and come with high-quality cameras, a popular smartphone was used to capture the images for this study. The study aims to predict Munsell soil colour (MSC) from the Munsell soil colour book (MSCB) by applying deep learning techniques to mobile-captured images. The MSCB contains 14 pages and 443 colour chips, so the number of classes for chip-by-chip prediction is very high and the captured images alone are inadequate for training and validating deep learning models; a patch-based mechanism was therefore proposed to enrich the dataset. The course of action is to measure the prediction accuracy of MSC at both the page level and the chip level by evaluating multiple deep learning methods combined with the patch-based mechanism. The analysis also identifies the best-performing deep learning technique for MSC prediction. Without patching, the accuracy for chip-level prediction is below (Formula presented.) and page-level prediction is below (Formula presented.), whereas with patching the accuracy is around (Formula presented.) for both, which is significant. Lastly, this study provides insights into applying the proposed techniques to real-world soil and achieves higher accuracy with a limited number of soil samples, indicating the method’s potential scalability and effectiveness on larger datasets.
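The patch-based enrichment step can be pictured as a sliding window over each photographed colour chip, so that one image yields many labelled training samples. The following is a minimal Python sketch of that idea; the patch size, stride, and image-handling libraries are illustrative assumptions, not the authors’ exact implementation.

```python
# Minimal sketch of patch-based dataset enrichment: slide a fixed-size
# window over a chip image and collect every patch as a training sample.
# patch_size and stride are assumed values for illustration only.
import numpy as np
from PIL import Image

def extract_patches(image_path, patch_size=32, stride=16):
    """Return all patch_size x patch_size patches from a chip image."""
    img = np.asarray(Image.open(image_path).convert("RGB"))
    h, w, _ = img.shape
    patches = []
    for y in range(0, h - patch_size + 1, stride):
        for x in range(0, w - patch_size + 1, stride):
            patches.append(img[y:y + patch_size, x:x + patch_size])
    return np.stack(patches)

# Each patch inherits the label (page or chip class) of its source image,
# so a single chip photo can contribute dozens of labelled samples.
```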

Original language: English
Article number: 287
Journal: Sensors
Volume: 25
Issue number: 1
DOIs
Publication status: Published - Jan 2025
