Real-Time Solid Waste Sorting Machine Based on Deep Learning
Imane Nedjar*
University of Tlemcen,
Biomedical Engineering Laboratory
Ecole Supérieure en Sciences Appliquées de Tlemcen, ESSA-Tlemcen,
BP 165 RP Bel Horizon, Tlemcen 13000, Algeria
[email protected]
Mohammed M’hamedi
University of Tlemcen,
Faculty of Sciences, Department of Computer Science
Ecole Supérieure en Sciences Appliquées de Tlemcen, ESSA-Tlemcen,
BP 165 RP Bel Horizon, Tlemcen 13000, Algeria
[email protected]
Mokhtaria Bekkaoui
University of Tlemcen,
Manufacturing Engineering Laboratory of Tlemcen
Ecole Supérieure en Sciences Appliquées de Tlemcen, ESSA-Tlemcen,
BP 165 RP Bel Horizon, Tlemcen 13000, Algeria
[email protected]
*Corresponding author
Abstract – The collection and separation of solid waste represent crucial stages in recycling. However, waste collection currently
relies on static trash bins that lack customization to suit specific locations. By integrating artificial intelligence into trash bins, we can
enhance their functionality. This study proposes the implementation of a sorting machine as an intelligent alternative to traditional
trash bins. This machine autonomously segregates waste without human intervention, utilizing deep learning techniques and
an embedded edge device for real-time sorting. Deploying a convolutional neural network model on a Raspberry Pi, the machine
achieves solid waste identification and segregation via image recognition. Performance evaluation conducted on both the Stanford
dataset and a dataset we created showcases the machine's high accuracy in detection and classification. Moreover, the proposed
machine stands out for its simplicity and cost-effectiveness in implementation.
Keywords: waste, deep learning, Raspberry Pi, artificial intelligence, sorting machine
Received: March 15, 2024; Received in revised form: June 12, 2024; Accepted: June 12, 2024
2.2. Machine composition

The machine has two main components: the electronic component box and two physical parts. The top part of the machine handles the processing tasks, including the detection, recognition, and classification of waste. The bottom part houses the containers designated for paper and plastic waste. The design of the prototype machine shown in Fig. 2 was modeled using SolidWorks software.

Fig. 2. Machine's design

The source code for the prototype machine is accessible at the following link: https://github.com/Nedjar-Imane/Sorting-Machine/tree/main

• Electronic component box

The electronic components used in the prototype machine, including the Raspberry Pi, Pi Camera, and servomotors, are housed in the electronic component box for protection. Fig. 3 shows the electronic circuit of the prototype machine.

Fig. 3. Electronic circuit of the prototype machine, created using FreeCAD software

• The upper part of the prototype machine

Window: the upper part of the prototype machine features an opening where students can deposit the waste. An obstacle avoidance sensor (E18-D80NK) is placed at the top of the window to detect incoming objects. A metal-gear servomotor (MG996R) operates the window to ensure smooth and efficient operation.

Sorting board: once the sensor detects an object, the window opens automatically, and the user can throw their waste into the machine. Inside the prototype machine there is a sorting board, which turns to the right for plastic waste and to the left for paper waste.

Box of pens: we developed this prototype machine for educational institutions to encourage students to recycle waste. When a student throws in a specified number of plastic bottles, the box of pens opens, allowing the student to take a pen. A servomotor operates the opening mechanism of the box.

• The lower part of the prototype machine

This part is designed to sort the waste into two containers.

3. PROPOSED METHOD

3.1. Dataset

We collected and organized our dataset, titled 'Plastic and Paper Waste', by taking photos with mobile phones in our homes and at the university (see Fig. 4). The dataset contains 400 images per class, encompassing diverse papers and plastic bottles captured in various positions, states, lighting conditions, and backgrounds.

Fig. 4. Images from our dataset with different backgrounds: (a) paper, (b) plastic bottle

In our experiment, we also used the Stanford dataset [23], which includes images of trash against a white background organized into six classes.

3.2. Convolutional neural network model

CNN-based architectures have achieved outstanding results, particularly in computer vision, where accuracy levels often approach perfection.

Our system is specifically tailored for real-time detection and classification of paper and plastic, using CNNs for these tasks. In this study, we opted for the MobileNet and NASNet-Mobile neural network architectures for classification, chosen for their suitability for mobile and embedded devices. Several recent studies have used these architectures, such as [33, 34], including real-time video applications where processing speed is crucial [35, 36].

• MobileNet

Google's MobileNet architecture [37] is tailored for mobile and embedded vision applications due to its lightweight design. Its efficiency stems from depthwise separable convolutions used in place of full convolutions. MobileNet introduces two parameters, the Width Multiplier (α) and the Resolution Multiplier (ρ), which enhance the architecture's flexibility.

• NASNet-Mobile

Neural Architecture Search Network (NASNet) aims to discover an optimal CNN architecture using reinforcement learning; it is a technique developed at Google Brain for searching through a space of neural network configurations [38]. The optimized version, based on Normal and Reduction Cells, is known as NASNet-Mobile. Normal Cells are convolutional cells that return a feature map of the same dimensions as the input, while Reduction Cells are convolutional cells that reduce the feature map's height and width by a factor of two. These cells are combined to create a complete neural network optimized for a specific task while minimizing the computational resources needed for training and inference.
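The efficiency properties of the two architectures described above can be made concrete with a small back-of-the-envelope sketch (this is not the authors' code, and the layer sizes are hypothetical): a depthwise separable block needs far fewer weights than a full convolution, and each NASNet Reduction Cell halves the feature map's spatial size.

```python
# Sketch: parameter counts and feature-map sizes that motivate MobileNet
# and NASNet-Mobile. All layer sizes below are illustrative assumptions.

def standard_conv_params(k, c_in, c_out):
    """Weights in a standard k x k convolution (bias ignored)."""
    return k * k * c_in * c_out

def depthwise_separable_params(k, c_in, c_out):
    """Weights in a depthwise (k x k per channel) plus pointwise (1 x 1)
    pair, the building block MobileNet uses instead of full convolutions."""
    return k * k * c_in + c_in * c_out

def after_reduction_cells(height, width, n_cells):
    """Spatial size after n NASNet Reduction Cells, each halving H and W."""
    for _ in range(n_cells):
        height, width = height // 2, width // 2
    return height, width
```

For a hypothetical 3×3 layer with 128 input and 128 output channels, the full convolution needs 147,456 weights versus 17,536 for the separable pair, roughly an 8× reduction; MobileNet's width multiplier α shrinks this further by scaling the channel counts (e.g. `depthwise_separable_params(3, int(0.75 * 128), int(0.75 * 128))`).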
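The metrics reported in Table 1 (Cohen's kappa, precision, recall, F1 score) can all be derived from a binary paper/plastic confusion matrix; a minimal pure-Python sketch, with illustrative counts rather than the paper's actual results:

```python
# Sketch of the evaluation metrics in Table 1, computed from the four
# cells of a binary confusion matrix (tp, fp, fn, tn). Example counts
# in the tests are illustrative, not the paper's data.

def precision(tp, fp):
    """Fraction of positive predictions that are correct."""
    return tp / (tp + fp)

def recall(tp, fn):
    """Fraction of actual positives that are found."""
    return tp / (tp + fn)

def f1_score(tp, fp, fn):
    """Harmonic mean of precision and recall."""
    p, r = precision(tp, fp), recall(tp, fn)
    return 2 * p * r / (p + r)

def cohen_kappa(tp, fp, fn, tn):
    """Agreement between predictions and labels, corrected for chance."""
    n = tp + fp + fn + tn
    observed = (tp + tn) / n
    # Chance agreement estimated from the marginal class frequencies.
    expected = ((tp + fp) * (tp + fn) + (fn + tn) * (fp + tn)) / (n * n)
    return (observed - expected) / (1 - expected)
```

With a balanced 200-image test set and 5 errors per class, for example, precision, recall, and F1 are all 0.95 while kappa is 0.90, reflecting kappa's chance correction.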
Table 1. Kappa, Precision, Recall and F1 score obtained for the combined dataset

                 Kappa   Precision        Recall           F1 score
                         Paper  Plastic   Paper  Plastic   Paper  Plastic
MobileNet        0.98    0.99   1.00      1.00   0.99      0.99   0.99
NASNet-Mobile    0.94    0.96   0.98      0.98   0.96      0.97   0.97

We present the CLR schedule obtained in Fig. 9. Applying CLR to our model improved the accuracy obtained while accelerating the training process. Fig. 10 shows that all the metrics have improved, reaching 100%.
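The text does not specify which cyclical learning rate (CLR) variant or bounds were used; as one common choice, Smith's triangular policy oscillates the learning rate linearly between two bounds. A minimal sketch, where base_lr, max_lr, and step_size are illustrative assumptions:

```python
import math

# Sketch of a cyclical learning rate schedule (triangular policy).
# base_lr, max_lr, and step_size are assumed values for illustration;
# the paper does not state its actual CLR settings.

def triangular_clr(iteration, base_lr=1e-4, max_lr=1e-2, step_size=1000):
    """Learning rate that ramps linearly between base_lr and max_lr,
    completing one full up-and-down cycle every 2 * step_size iterations."""
    cycle = math.floor(1 + iteration / (2 * step_size))
    x = abs(iteration / step_size - 2 * cycle + 1)
    return base_lr + (max_lr - base_lr) * max(0.0, 1 - x)
```

Under this policy the learning rate starts at base_lr, peaks at max_lr after step_size iterations, and returns to base_lr after 2 × step_size iterations, which is one way to obtain a schedule of the kind shown in Fig. 9.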
[10] A. Gupta, M. J. Van der Schoor, J. Bräutigam, V. B. Justo, T. F. Umland, D. Goehlich, "Autonomous service robots for urban waste management - multiagent route planning and cooperative operation", IEEE Robotics and Automation Letters, Vol. 7, No. 4, 2022, pp. 8972-8979.

[11] T. Schaffter et al., "Evaluation of combined artificial intelligence and radiologist assessment to interpret screening mammograms", JAMA Network Open, Vol. 3, No. 3, 2020, p. e200265.

[12] B. Richards, D. Tsao, A. Zador, "The application of artificial intelligence to biology and neuroscience", Cell, Vol. 185, No. 15, 2022, pp. 2640-2643.

[13] V. Hocenski, A. Lončarić Božić, N. Perić, D. Klapan, Ž. Hocenski, "Environmental impact estimation of ceramic tile industry using modeling with neural networks", International Journal of Electrical and Computer Engineering Systems, Vol. 13, No. 1, 2022, pp. 29-39.

[14] C. Shi, C. Tan, T. Wang, L. Wang, "A waste classification method based on a multilayer hybrid convolution neural network", Applied Sciences, Vol. 11, No. 18, 2021, p. 8572.

[15] M. Al Duhayyim, T. A. Elfadil Eisa, F. N. Al-Wesabi, A. Abdelmaboud, M. A. Hamza, A. S. Zamani, M. Rizwanullah, R. Marzouk, "Deep reinforcement learning enabled smart city recycling waste object classification", Computers, Materials & Continua, Vol. 71, No. 3, 2022, pp. 5699-5715.

[16] A. Mitra, "Detection of waste materials using deep learning and image processing", California State University San Marcos, USA, Master's thesis, 2020.

[17] M. Malik, S. Sharma, M. Uddin, C. L. Chen, C. M. Wu, P. Soni, S. Chaudhary, "Waste classification for sustainable development using image recognition with deep learning neural network models", Sustainability, Vol. 14, No. 12, 2022, p. 7222.

[18] A. Altikat, A. Gulbe, S. Altikat, "Intelligent solid waste classification using deep convolutional

[19] […] "…ing deep learning", Proceedings of the ACM International Joint Conference on Pervasive and Ubiquitous Computing, Association for Computing Machinery, Heidelberg, Germany, 12-16 September 2016, pp. 940-945.

[20] S. Chaturvedi, B. P. Yadav, N. A. Siddiqui, "An assessment of machine learning integrated autonomous waste detection and sorting of municipal solid waste", Nature Environment & Pollution Technology, Vol. 20, No. 4, 2021, pp. 1515-1525.

[21] D. Rutqvist, D. Kleyko, F. Blomstedt, "An automated machine learning approach for smart waste management systems", IEEE Transactions on Industrial Informatics, Vol. 16, No. 1, 2020, pp. 384-392.

[22] S. Dubey, P. Singh, P. Yadav, K. K. Singh, "Household waste management system using IoT and machine learning", Procedia Computer Science, Vol. 167, 2020, pp. 1950-1959.

[23] M. Yang, G. Thung, "Classification of trash for recyclability status", Stanford University, Stanford, CA, USA, CS229 project report, 2016.

[24] R. Khan, S. Kumar, A. K. Srivastava, N. Dhingra, M. Gupta, N. Bhati, P. Kumari, "Machine learning and IoT-based waste management model", Computational Intelligence and Neuroscience, Vol. 2021, No. 1, 2021, p. 5942574.

[25] M. W. Rahman, R. Islam, A. Hasan, N. I. Bithi, M. M. Hasan, M. M. Rahman, "Intelligent waste management system using deep learning with IoT", Journal of King Saud University - Computer and Information Sciences, Vol. 34, No. 5, 2022, pp. 2072-2087.

[26] A. Noiki, S. A. Afolalu, A. A. Abioye, C. A. Bolu, M. E. Emetere, "Smart waste bin system: a review", Proceedings of the 4th International Conference on Science and Sustainable Development, Ota, Nigeria, 3-5 August 2020, p. 012036.

[27] Y. Zhao, S. Yao, S. Li, S. Hu, H. Shao, T. F. Abdelzaher, "VibeBin: A vibration-based waste bin level detec-