Document Type : Research Paper
Authors
1 Mechanical Engineering of Biosystems Department, Faculty of Agriculture, Shahrekord University, Shahrekord, I.R.Iran
2 Biosystems Engineering Department, Sari Agricultural Sciences and Natural Resources University, Sari, Iran
3 Mechanical Engineering of Biosystems Department, Faculty of Agriculture, Shahrekord University, Shahrekord, I.R. Iran.
4 Mechanical Engineering of Biosystems Department, Faculty of Agriculture, Shahrekord University, Shahrekord, I.R. Iran.
Abstract
Food adulteration is a significant issue that impacts food safety, consumer trust, and regulatory compliance globally. One common form of adulteration in meat products involves the unauthorized addition of cheaper or lower-quality ingredients to higher-value meats. Specifically, the adulteration of chicken gizzard—a low-cost offal—into minced red meat, such as beef and mutton, poses both economic and health concerns. Traditional methods for detecting such adulteration often involve destructive, time-consuming, and costly laboratory analyses, which limit their applicability for rapid quality control.
This study aims to develop and evaluate an intelligent machine vision system capable of detecting chicken gizzard adulteration in minced red meat through non-destructive and rapid analysis. The core objective is to assess the feasibility of using digital color images captured by a mobile phone to identify and quantify the presence of chicken gizzard in mixtures of beef and mutton meat. By leveraging advances in image processing and machine learning, this approach offers a cost-effective and efficient solution for detecting food fraud.
To simulate real-world adulteration scenarios, standard minced red meat samples were prepared with a fixed base composition of 55% mutton and 45% beef. Chicken gizzard was added to these base samples in varying proportions, ranging from 0% (pure red meat) to 100% (pure chicken gizzard), creating a comprehensive gradient of adulteration levels for analysis.
Image acquisition was performed in a typical laboratory environment to evaluate the system’s robustness under less controlled circumstances. Each sample was imaged in two ways: directly from the exposed meat surface and through transparent plastic wrap packaging, simulating commercial retail conditions.
The color features of the captured images were extracted in the RGB (Red, Green, Blue) color space using MATLAB’s image processing toolbox. These color components served as the primary input variables for subsequent modeling.
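In essence, this feature-extraction step amounts to averaging each color channel over the sample region. A minimal sketch of the idea follows, written in Python with NumPy rather than the MATLAB toolbox used in the study, and with a small synthetic reddish patch standing in for a real sample image:

```python
import numpy as np

def extract_rgb_means(image):
    """Return the mean R, G, B values over an H x W x 3 image array."""
    image = np.asarray(image, dtype=float)
    return image.reshape(-1, 3).mean(axis=0)

# Synthetic 4x4 "meat patch": uniform reddish pixels (illustrative values).
patch = np.zeros((4, 4, 3))
patch[..., 0] = 180.0  # red channel
patch[..., 1] = 60.0   # green channel
patch[..., 2] = 50.0   # blue channel

r, g, b = extract_rgb_means(patch)  # -> (180.0, 60.0, 50.0)
```

In practice the averaging would be restricted to a segmented region of interest so that background pixels do not dilute the color statistics.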
Data analysis involved a combination of statistical and machine learning techniques. Principal Component Analysis (PCA) was first applied to visually discriminate the samples and identify the most significant color features related to adulteration. Partial Least Squares Regression (PLSR), a linear modeling technique, was then used to predict the percentage of chicken gizzard adulteration based on the extracted RGB features. To capture nonlinear relationships and enhance prediction accuracy, a Multilayer Perceptron (MLP) neural network was also developed and trained on the dataset.
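The analysis chain described above can be sketched end to end. The fragment below is a minimal Python/NumPy illustration on synthetic RGB means: the adulteration-dependent color trend is invented for demonstration, PCA is computed by eigendecomposition of the feature covariance, and an ordinary least-squares fit stands in for the paper's PLSR (both fit a linear map from color features to adulteration level); the MLP stage is omitted for brevity.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic dataset: mean RGB features across adulteration levels.
# Assumption (illustrative only): redness falls and paleness rises
# as the gizzard fraction increases.
levels = np.repeat(np.arange(0, 101, 10), 5)           # 0..100% in 10% steps
R = 200 - 0.8 * levels + rng.normal(0, 3, levels.size)
G = 60 + 0.5 * levels + rng.normal(0, 3, levels.size)
B = 55 + 0.3 * levels + rng.normal(0, 3, levels.size)
X = np.column_stack([R, G, B])

# PCA via eigendecomposition of the covariance of mean-centred features.
Xc = X - X.mean(axis=0)
eigval, eigvec = np.linalg.eigh(np.cov(Xc, rowvar=False))
order = np.argsort(eigval)[::-1]
scores = Xc @ eigvec[:, order]   # PC1 scores should track adulteration level

# Linear prediction of the adulteration level (OLS here, standing in
# for PLSR, which likewise produces a linear predictor).
A = np.column_stack([X, np.ones(levels.size)])
coef, *_ = np.linalg.lstsq(A, levels, rcond=None)
pred = A @ coef
r2 = 1 - np.sum((levels - pred) ** 2) / np.sum((levels - levels.mean()) ** 2)
```

On real data, PLSR is preferred over plain least squares when the color features are strongly collinear, and the MLP adds the capacity to model nonlinear color–composition relationships.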
The analysis revealed a clear trend in color changes corresponding to increasing levels of chicken gizzard adulteration. This shift in color composition reflects the intrinsic differences in tissue pigmentation between red meat and chicken gizzard, providing a reliable basis for discrimination.
Quantitative evaluation showed that the linear PLSR model achieved a coefficient of determination (R²) of 0.70 and a root mean square error (RMSE) of 16.51 when predicting adulteration levels from images captured without plastic wrap. While this indicates only moderate accuracy, the nonlinear MLP model substantially outperformed PLSR, achieving an R² of 0.97 and an RMSE of 6.67 under the same conditions, demonstrating the superior capability of nonlinear modeling in capturing the complex relationship between color features and adulteration level. For images acquired through plastic wrap, MLP performance declined (R² = 0.90, RMSE = 12.54) owing to light reflections from the covering. Furthermore, the MLP model was implemented as a classifier to categorize samples into discrete adulteration intervals: 0–10%, 10–20%, 20–30%, 30–40%, 40–50%, and above 50%. The classification accuracies achieved were 85%, 96.4%, 92.6%, 73.7%, 76.2%, and 96.7%, respectively, and the average precision, sensitivity, and F1 score of the developed model were 0.975, 0.974, and 0.975, respectively. These results indicate that the system can reliably detect adulteration levels ranging from low to high.
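For reference, precision, sensitivity (recall), and F1 score are computed per class from a confusion matrix and then averaged. The sketch below shows the macro-averaged form on a hypothetical 3-class matrix; the counts are illustrative, not the study's data:

```python
import numpy as np

def macro_metrics(cm):
    """Macro-averaged precision, recall (sensitivity), and F1 from a
    confusion matrix with true classes on rows, predictions on columns."""
    cm = np.asarray(cm, dtype=float)
    tp = np.diag(cm)                      # correct predictions per class
    precision = tp / cm.sum(axis=0)       # TP / (TP + FP), per class
    recall = tp / cm.sum(axis=1)          # TP / (TP + FN), per class
    f1 = 2 * precision * recall / (precision + recall)
    return precision.mean(), recall.mean(), f1.mean()

# Hypothetical 3-class confusion matrix (illustrative counts only).
cm = [[18, 2, 0],
      [1, 17, 2],
      [0, 1, 19]]
p, r, f = macro_metrics(cm)
```

Macro averaging weights each class equally, which is appropriate here since the adulteration intervals are of comparable interest regardless of how many samples fall in each.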
This study demonstrates that color image analysis combined with advanced machine learning techniques offers a non-destructive, rapid, and reliable method for detecting chicken gizzard adulteration in minced red meat. The developed machine vision system effectively distinguishes adulterated samples and accurately estimates the level of adulteration, particularly when employing nonlinear models such as MLP neural networks. The ability to perform such analysis through images captured even under less controlled lighting conditions and through plastic wrap packaging highlights the practical applicability of the system in real-world food quality control environments. This approach can serve as a foundation for developing automated, software-based quality assurance tools in the food industry, enhancing the detection of food fraud and protecting consumer interests. Future work may focus on expanding the system to detect other types of adulterants, integrating hyperspectral imaging techniques for improved sensitivity, and developing user-friendly interfaces to facilitate adoption by food producers and regulatory agencies.
Mobin Rezazadeh: Writing – original draft, Methodology, Data curation, Software, Formal analysis.
Sajad Kiani: Writing – review and editing, Formal analysis, Investigation, Data curation, Validation.
Mahdi Ghasemi Varnamkhasti: Writing – review and editing, Conceptualization, Supervision, Project administration.
Zahra Izadi: Advising, Methodology, Resources.
Data are available on request from the authors. All data used in this original research are presented throughout the text and in the tables and figures.
The authors declare that they have not engaged in data fabrication, falsification, plagiarism, or misconduct.
This research did not receive any specific grant from funding agencies in the public, commercial, or not-for-profit sectors.
The authors declare no conflict of interest.