Automated Grading of Red Ginseng Using DenseNet121 and Image Preprocessing Techniques

Minhyun Kim, Jiyoon Kim, Jung Soo Kim, Jeong Ho Lim, Kwang Deog Moon

Research output: Contribution to journal › Article › peer-review

4 Scopus citations

Abstract

Red ginseng is ginseng that has been steamed and dried, which enhances its functional properties and extends its shelf-life compared with fresh ginseng. Red ginseng is graded by its appearance and inner quality. However, this conventional process is costly in terms of time and human resources, and has the disadvantage of subjective assessment results. Therefore, a convolutional neural network (CNN) method was proposed to automate the grading process of red ginseng, optimize the preprocessing method, select an accurate and efficient deep learning model, and explore the feasibility of grade discrimination based solely on external quality information, without considering internal quality characteristics. In this study, the effect of five distinct preprocessing methods, namely RGB, binary, gray, contrast-limited adaptive histogram equalization (CLAHE), and Gaussian blur, on the grading accuracy of red ginseng images was investigated. Furthermore, a comparative analysis was conducted on the performance of four models: one CNN model and three transfer learning models (VGG19, MobileNet, and DenseNet121). Among them, DenseNet121 with CLAHE preprocessing achieved the best performance, with an accuracy of 95.11% on the Dataset 2 test set. This finding suggests that deep learning techniques can provide an objective and efficient solution for the grading process of red ginseng without an inner quality inspection.
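The following is a minimal sketch of the kind of pipeline described in the abstract: CLAHE preprocessing of the images followed by transfer learning with an ImageNet-pretrained DenseNet121 backbone. It is not the authors' code; the CLAHE parameters, input resolution, number of grade classes, and training head are illustrative assumptions.

    # Hypothetical sketch of CLAHE preprocessing + DenseNet121 transfer learning.
    # Parameter values, image size, and class count are assumptions, not taken from the paper.
    import cv2
    import numpy as np
    import tensorflow as tf

    NUM_GRADES = 3   # assumption: number of red ginseng grades
    IMG_SIZE = 224   # assumption: DenseNet121 default input resolution

    def clahe_preprocess(bgr_image: np.ndarray) -> np.ndarray:
        """Apply CLAHE to the lightness channel of a BGR image."""
        lab = cv2.cvtColor(bgr_image, cv2.COLOR_BGR2LAB)
        l, a, b = cv2.split(lab)
        clahe = cv2.createCLAHE(clipLimit=2.0, tileGridSize=(8, 8))
        l_eq = clahe.apply(l)
        return cv2.cvtColor(cv2.merge((l_eq, a, b)), cv2.COLOR_LAB2BGR)

    def build_model() -> tf.keras.Model:
        """DenseNet121 backbone pretrained on ImageNet with a new grading head."""
        base = tf.keras.applications.DenseNet121(
            include_top=False, weights="imagenet",
            input_shape=(IMG_SIZE, IMG_SIZE, 3))
        base.trainable = False  # freeze backbone for the transfer learning stage
        inputs = tf.keras.Input(shape=(IMG_SIZE, IMG_SIZE, 3))
        x = tf.keras.applications.densenet.preprocess_input(inputs)
        x = base(x, training=False)
        x = tf.keras.layers.GlobalAveragePooling2D()(x)
        outputs = tf.keras.layers.Dense(NUM_GRADES, activation="softmax")(x)
        model = tf.keras.Model(inputs, outputs)
        model.compile(optimizer="adam",
                      loss="sparse_categorical_crossentropy",
                      metrics=["accuracy"])
        return model

The other preprocessing variants compared in the study (RGB, binary, gray, Gaussian blur) would slot into the same place as clahe_preprocess, so the model and training procedure stay fixed while only the input transformation changes.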

Original language: English
Article number: 2943
Journal: Agronomy
Volume: 13
Issue number: 12
DOIs
State: Published - Dec 2023

Keywords

  • deep learning
  • grading
  • image preprocessing
  • red ginseng
  • transfer learning
