
Method of Localization of Racks with Biomaterial for Robot Grasp Based on Segmented Contour Processing

Carbone G.; Malyshev D.
2023-01-01

Abstract

This paper addresses the problem of moving a rack of test tubes filled with blood to the aliquoting operation area. To solve this problem, a vision system based on the YOLOv8 neural network was used and a gripper was designed. The neural network is trained to perform image segmentation whose purpose is to determine the contour of the rack. To correct defects in the resulting segmentation, an algorithm was developed that produces an accurate bounding box and the orientation of the rack. A software package was developed in Python, comprising a neural-network image segmentation module, an algorithm for determining the contour and orientation of the rack, and a robot control module.
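The abstract describes a three-step pipeline: YOLOv8 segments the rack, the segmented mask is cleaned of defects, and an oriented bounding box yields the grasp pose. Below is a minimal Python sketch of such a pipeline, not the authors' implementation: it assumes the ultralytics YOLOv8 segmentation API and OpenCV, the weights file rack_seg.pt is hypothetical, and morphological closing merely stands in for the paper's own defect-correction algorithm.

    # Minimal sketch of the localization pipeline outlined in the abstract.
    # Assumptions: ultralytics YOLOv8 segmentation API and OpenCV; the
    # weights file "rack_seg.pt" is hypothetical, and morphological closing
    # is only a stand-in for the paper's defect-correction algorithm.
    import cv2
    import numpy as np
    from ultralytics import YOLO

    model = YOLO("rack_seg.pt")  # hypothetical rack-segmentation weights

    def locate_rack(image_bgr: np.ndarray):
        """Return ((cx, cy), (w, h), angle) of the rack's oriented box, or None."""
        result = model(image_bgr)[0]
        if result.masks is None:  # no rack detected
            return None
        # Binary mask of the highest-confidence detection, scaled to image size.
        mask = (result.masks.data[0].cpu().numpy() * 255).astype(np.uint8)
        mask = cv2.resize(mask, (image_bgr.shape[1], image_bgr.shape[0]))
        # Close holes and gaps left by segmentation defects.
        kernel = cv2.getStructuringElement(cv2.MORPH_RECT, (15, 15))
        mask = cv2.morphologyEx(mask, cv2.MORPH_CLOSE, kernel)
        contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                       cv2.CHAIN_APPROX_SIMPLE)
        if not contours:
            return None
        rack = max(contours, key=cv2.contourArea)
        # Oriented bounding box: center and size in pixels, angle in degrees,
        # which together define the grasp pose for the robot.
        return cv2.minAreaRect(rack)

Note that the angle returned by cv2.minAreaRect is only defined up to a 90-degree ambiguity, so a real grasp planner would also use the box's side lengths to fix the rack's orientation.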
ISBN: 978-3-031-45769-2; 978-3-031-45770-8
Keywords: Aliquoting system; Gripper; Object detection; YOLOv8
Files associated with this product:
No files are associated with this product.

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/20.500.11770/362831
Warning: the displayed data have not been validated by the university.

Citations
  • PMC: not available
  • Scopus: 0
  • Web of Science: not available