
REAL-TIME SELECTIVE SPRAYING FOR MORNING GLORY (Ipomoea spp.) CONTROL IN SOYBEAN AND COTTON CROPS USING DEEP LEARNING

ABSTRACT

The cultivation of soybean and cotton is of great importance in the Brazilian economy, with both crops moving billions of reais per year in exports. Weed management is essential for obtaining optimal yields. Among the weeds that have developed herbicide resistance and tolerance are those of the genus Ipomoea (morning glory), which affect soybean and cotton crops throughout their cycle and thereby reduce productivity. In this context, the objective of this work was to develop an embedded system for the selective spraying of morning glory in cotton and soybean crops using real-time object classification and detection algorithms (Faster R-CNN and YOLOv3). The project was developed at the Agricultural Machinery Laboratory of the Federal University of Rondonópolis. The algorithms were trained to detect three classes (soybean, morning glory, and cotton) and were evaluated for precision and sensitivity in the laboratory and in the field. The spray control results obtained with Faster R-CNN demonstrated that real-time object detection algorithms can be used for the selective control of weeds in soybean and cotton crops.

KEYWORDS
Artificial intelligence; machine learning; spray; convolutional neural networks

INTRODUCTION

Weeds are categorized as plants that develop in places unsuited to human objectives, interfering negatively with economic crops in both the productivity and the final quality of harvested products (Vasconcelos et al., 2012).

Losses caused by weeds are widely known to exceed those of any other category of agricultural pest, such as insects, nematodes, diseases, and rodents (Abouziena & Haggag, 2016). In crops such as beans, the decrease in yield reaches 71% (Kozlowski et al., 2002); average losses in soybean vary from 37% (Fleck & Candemil, 1995) to 58% (Rizzardi et al., 2003) and, in some cases, reach total loss of the crop.

Because of their fast growth and their ability to inhabit various environments (Ferreira & Miotto, 2009), these species are found in vegetable crops (Moreira & Bragança, 2011), sugarcane (Correia & Kronka Jr., 2010; Azania et al., 2011), soybean, cotton (Constantin et al., 2011), and cereals. Areas where these species predominate show reduced productivity: for example, Ipomoea hederifolia growing in competition with sugarcane reduced the number of stalks by 34% and productivity by 46% (Silva et al., 2009), while in soybean the reduction varied from 15.53% to 80% depending on morning glory density (Piccinini et al., 2018; Pagnoncelli et al., 2017). Rizzardi et al. (2004) pointed out a substantial increase in the incidence and density of morning glory in recent years, directly affecting soybean crops.

In the area of selective spraying technologies, electro-optical sensors are used to measure the reflectance intensity of spectral bands. Light-dependent resistors (LDRs) have been used to detect the presence of weeds, or only exposed soil, from reflectance in the visible red and near-infrared (NIR) ranges, with 100% accuracy (Viliotti, 2002).

These types of sensors are also employed in commercially available nozzle controllers such as the WeedSeeker® (Trimble Navigation Limited®). The second-generation WeedSeeker2® has two optical sensors equipped with normalized difference vegetation index (NDVI) readers, which increases detection efficiency. Another commercially available sensor is the WEED-IT®, which works based on chlorophyll fluorescence in response to high-intensity near-infrared light (Visser & Timmermans, 1996).
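For reference, the NDVI such sensors compute is a simple per-pixel function of red and near-infrared reflectance. A minimal NumPy sketch (array names and the epsilon guard are illustrative, not part of any commercial sensor's firmware):

    import numpy as np

    def ndvi(nir: np.ndarray, red: np.ndarray) -> np.ndarray:
        # NDVI = (NIR - Red) / (NIR + Red), bounded to [-1, 1];
        # dense green vegetation yields high values, bare soil low values.
        return (nir - red) / (nir + red + 1e-9)  # epsilon avoids division by zero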

Currently, controllers using image sensors and artificial intelligence have been developed and marketed, such as See & Spray® (Blue River Technology), which processes images obtained by cameras used as sensors, detects the presence of weeds in cotton fields, and allows more selective spraying.

Embedded electronics coupled with computer vision pose a challenge to the industry in commercializing products for weed control. The main detection methods are image-based (Bakhshipour & Jafari, 2018; Ferreira et al., 2017; Dyrmann et al., 2016; Lee et al., 2017; Carranza-Rojas et al., 2017), spectrum-based (Shirzadifar et al., 2018; Li et al., 2017), and spectral-image-based, identifying weeds from both ground and aerial photography (Louargant et al., 2018; Lin et al., 2017).

Recent approaches using deep learning have shown satisfactory results in classification, segmentation, object detection, and image-tracking tasks (Zhou et al., 2019). The use of machine vision with deep learning is growing in recent research on selective application technologies. For example, Ferreira et al. (2017) developed software for weed detection in soybean crops using deep convolutional networks and obtained accuracy above 99%.

Considering this, we aimed to evaluate two real-time object detection algorithms (Faster R-CNN and YOLOv3) and to develop an embedded selective spraying system for the control of Ipomoea spp. in soybean and cotton crops.

MATERIAL AND METHODS

This work was conducted in the Agricultural Machines and Mechanization Laboratory of the Federal University of Rondonópolis. The project involved the development of an on-board system for spraying under field conditions, object classification and detection, and valve actuation for spraying.

Faster R-CNN (Ren et al., 2016) is an evolution of Fast R-CNN (Girshick, 2015) that improves the network's computational performance in processing time and detection by replacing the selective search algorithm with a region proposal network. YOLO (You Only Look Once) is a real-time object detection algorithm (Redmon & Farhadi, 2018) based on convolutional neural networks. It detects multiple objects with good generalization ability, performing predictions at multiple locations in the image simultaneously and predicting each object's class in an optimized manner, which allows it to work at high frames-per-second (FPS) rates.
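As an illustration of how such a detector runs in real time, the sketch below loads a Darknet-format YOLOv3-tiny model with OpenCV's DNN module and filters detections by class score. File names and the 0.5 threshold are assumptions for illustration, not values taken from this study:

    import cv2
    import numpy as np

    net = cv2.dnn.readNetFromDarknet("yolov3-tiny.cfg", "yolov3-tiny.weights")
    out_layers = net.getUnconnectedOutLayersNames()

    def detect(frame, conf_threshold=0.5):
        # Resize to the 416 x 416 network input and scale pixels to [0, 1]
        blob = cv2.dnn.blobFromImage(frame, 1 / 255.0, (416, 416), swapRB=True, crop=False)
        net.setInput(blob)
        detections = []
        for output in net.forward(out_layers):
            for row in output:  # row = [cx, cy, w, h, objectness, class scores...]
                scores = row[5:]
                class_id = int(np.argmax(scores))
                if scores[class_id] > conf_threshold:
                    detections.append((class_id, float(scores[class_id]), row[:4]))
        return detections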

An image bank containing images of soybean, cotton, and morning glory leaves, the classes of interest for detection, was created. Images were captured using a semi-professional Nikon P250 camera in areas of the Federal University of Rondonópolis (UFR) and the Mato Grosso Institute of Cotton (IMA) in Rondonópolis (Figure 1). High variability and quantity of images help the learning process of a CNN and should correspond as closely as possible to the real application (Olsen et al., 2019).

FIGURE 1
Photos of leaves of morning glory (1a, 1b, 1c), cotton (2a, 2b, 2c), and soybean (3a, 3b, 3c) at various angles and under various environments and weather conditions, used to form the training and test datasets of the network.

The images were resized to 416 × 416 pixels before training the networks. The image bank contained 1,612 images, 1,214 acquired from crop modules planted at the UFR and 398 obtained at the IMA: 708 images of morning glory leaves, 487 of cotton leaves, and 417 of soybean leaves. Of these, 10% (162 images, 54 per class) were randomly chosen to compose the test set. The images varied in the number of leaves per sample and in viewing angle, forming a dataset with representative variability.
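A stratified random split like the one described (54 test images drawn per class) can be reproduced with a short script; the directory-per-class layout and file extension are assumptions:

    import random
    from pathlib import Path

    random.seed(42)  # fix the seed so the split is reproducible
    for cls in ("morning_glory", "cotton", "soybean"):
        images = sorted(Path("dataset", cls).glob("*.jpg"))
        test = set(random.sample(images, 54))         # 54 test images per class
        train = [p for p in images if p not in test]  # remainder used for training
        print(cls, len(train), "train /", len(test), "test")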

After resizing all images in the bank, manual tagging and labeling was performed with LabelImg, a graphical annotation tool available in Darrenl's (2019) repository (Figure 2).

FIGURE 2
LabelImg interface while labeling and marking the location of a morning glory leaf in the image.
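When exported in YOLO format (LabelImg also supports Pascal VOC XML), each annotation file holds one line per box: the class index followed by the box centre and size, normalized by the image dimensions. A small parser, assuming that export format was used:

    def parse_yolo_label(line: str):
        # "class_id x_center y_center width height", coordinates in [0, 1]
        class_id, cx, cy, w, h = line.split()
        return int(class_id), float(cx), float(cy), float(w), float(h)

    print(parse_yolo_label("0 0.512 0.430 0.210 0.185"))  # hypothetical annotation line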

The algorithms were trained on a computer with a 3.19 GHz i7-8700 processor, 16 GB of RAM, and a GeForce RTX 2070 video card with 8 GB of dedicated memory. Instead of the full YOLOv3 backbone, with its larger number of layers and parameters, we used the smaller and faster YOLOv3-tiny architecture (Alexey, 2019), developed for mid-range hardware. This version was chosen because of the GPU memory and processing capacity of the computers used in this study, achieving the goal at a lower computational cost. The YOLO-tiny network has the advantage of being faster, although it loses accuracy compared with the full YOLO network (Partel et al., 2019).

For training, the number of classes and batches, the filters, and the anchor box values were set according to the purpose of this study. The number of training classes was changed to three, the batch size was set to 64 with 16 subdivisions, and the learning rate was set at 0.001. The filters of the convolutional layers were changed according to [eq. (1)].

Equation 1: Calculation of the number of filters applied to the image in the detection layers of the YOLOv3-tiny network.

(1) filters = 3 × (c + 5)

where c is the number of classes submitted to the network for learning; with c = 3 in this study, the number of filters was set to 24.
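In the Darknet configuration file, this corresponds to editing the convolutional layer immediately before each [yolo] detection layer. The count follows from the three anchor boxes per scale, each predicting four box coordinates, one objectness score, and c class scores:

    def yolo_filters(num_classes: int) -> int:
        # 3 anchors x (4 box coords + 1 objectness + c class scores)
        return 3 * (num_classes + 5)

    print(yolo_filters(3))  # -> 24, the value used in this study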

The Faster R-CNN (Ren et al., 2016) used in this study was the implementation made available by Evan (2019), together with the pre-trained Inception v2 network architecture made available by Pkulzc (2019). The number of classes to be trained was set to three, as for YOLOv3-tiny, and the learning rate was 0.0002. The Faster R-CNN model of this project was developed and executed with the TensorFlow Object Detection API.
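In the TensorFlow Object Detection API, these settings live in the model's pipeline.config file. The fragment below sketches the kind of edit described; field values other than num_classes and the 0.0002 learning rate are placeholders:

    model {
      faster_rcnn {
        num_classes: 3  # soybean, cotton, morning glory
      }
    }
    train_config {
      batch_size: 1
      fine_tune_checkpoint: "faster_rcnn_inception_v2_coco/model.ckpt"  # pre-trained weights
      optimizer {
        momentum_optimizer {
          learning_rate {
            manual_step_learning_rate {
              initial_learning_rate: 0.0002
            }
          }
        }
      }
    }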

The networks were evaluated under field conditions for precision and sensitivity in each class and each algorithm by visually analyzing every video frame of every repetition recorded during the tests. The leaves captured by each camera were counted as correctly detected (true positives, TP), not detected (false negatives, FN), or incorrectly detected (false positives, FP), and the metrics were calculated from these counts.
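Precision and sensitivity (recall) follow directly from these counts; a minimal sketch of the two metrics as used here:

    def precision(tp: int, fp: int) -> float:
        # Fraction of emitted detections that were correct
        return tp / (tp + fp) if tp + fp else 0.0

    def sensitivity(tp: int, fn: int) -> float:
        # Fraction of leaves actually present that were detected
        return tp / (tp + fn) if tp + fn else 0.0

    # Hypothetical counts: 12 correct detections, 3 false detections, 40 missed leaves
    print(precision(12, 3))     # 0.8
    print(sensitivity(12, 40))  # ~0.23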

A prototype autonomous sprayer was developed for the field tests, as shown in Figure 3.

FIGURE 3
A) front view, B) rear view, C) right side view, D) left side view of the prototype selective sprayer developed in conjunction with the Smart Agriculture group.

The developed spray system was composed of a tank with a capacity of 30 L, a pressurization pump, and four sections connected by hydraulic connections and high-pressure hoses. Each section was composed of a solenoid valve and a spray tip.

The pump and reservoir were positioned on the upper part of the chassis, screwed to a wooden base; the nozzles and spray valves, located at the rear of the prototype, were fixed on a galvanized plate spaced 0.63 m from the chassis by 25 × 25 mm steel tube (metalon) bars.

The solenoid valves were placed between the hoses and the tips of each section, controlling the flow of spray mixture through the hose. The spacing between the cameras and the spray bar was 0.10 m, and the sensor support was positioned 0.50 m from the ground, as shown in Figure 4.

FIGURE 4
Selective spraying system built into the prototype sprayer.

The application tips, JACTO JSF yellow nozzles with a flow rate of 0.76 L.min-1 and a 110° spray angle at a working pressure of 310.26 kPa, were installed 0.50 m above the ground, providing a 0.70 m spray swath with 22% overlap.

The on-board system for detection and selective spraying control was composed of: four Logitech C920 HD PRO webcams with maximum resolution of 1080p/30 fps (720p/30 fps) and autofocus; a computer with a Pentium Gold G5400 3.70 GHz processor, 8 GB of RAM, an NVIDIA GeForce GTX 1050 video card with 4 GB of GDDR5 memory, and the Windows 10 operating system; an Arduino Nano prototyping board with an Atmel ATmega328P microcontroller; DVR stainless-steel solenoid valves with working pressure up to 1172.11 kPa and a voltage of 12 V; and a Seaflo pressure pump with a working pressure of 420 kPa and a flow rate of 4.92 L.min-1.

The webcams (sensors) are connected to the computer running the detection algorithm, which sends a signal to the Arduino microcontroller whenever a morning glory leaf is detected. The signal identifies the sensor that detected the weed, and the microcontroller then drives the relay channel where the detection occurred. The valves were thus actuated individually: each solenoid valve was connected to a relay channel, which in turn was connected to one of four digital ports on the Arduino, and was actuated only when that port emitted a 5 V signal. As soon as the program detects a leaf, the computer sends a command to the microcontroller, which signals the relay module to activate the corresponding solenoid valve and initiate spraying.
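A minimal sketch of the host-side logic, assuming a one-byte serial protocol in which the computer sends the index of the camera that detected a morning glory leaf (port name, baud rate, and protocol are illustrative; the Arduino firmware would map each received index to the matching relay pin):

    import serial  # pyserial

    arduino = serial.Serial("COM3", 9600, timeout=1)  # hypothetical serial port

    def actuate_valve(camera_index: int) -> None:
        # One byte identifying the camera; the microcontroller raises the
        # matching relay pin to 5 V, opening that section's solenoid valve.
        arduino.write(bytes([camera_index]))

    actuate_valve(2)  # e.g. weed seen by camera 2 -> spray section 2 opens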

Power for the entire system was supplied by a 12 V direct current (DC) battery with a capacity of 60 Ah. To power the computer, which has a bivolt alternating current supply, a 1000 W voltage inverter was used to transform the battery's 12 V DC into 127 V alternating current (AC).

The average flow rate of the nozzles in this study was 0.64 L.min-1 at a pump working pressure of 400 kPa, obtained using the methodology of Chaim (2019) (eq. (2)). To calculate it, the system was actuated for 1 min and the sprayed volume was collected in a 1 L volumetric cup.

(2) q = V / Ts

Where:

q = flow rate of the system, in L.min-1;

V = volume collected in the graduated cup, in liters, over the system actuation time (Ts), in minutes.

For example, collecting 0.64 L during a 1 min actuation gives q = 0.64 / 1 = 0.64 L.min-1, the average nozzle flow rate reported above.

To evaluate the networks in the field, the methodology of Partel et al. (2019) was adopted, with two test modules. The first module contained two rows of soybean, each 2 m long, with plant spacing of 0.20 m and inter-row spacing of 0.90 m. The second module contained two rows of cotton, each 4 m long, with plant spacing of 0.15 m and inter-row spacing of 0.90 m. The inter-row spacing adopted does not correspond to the recommended spacing for soybean; it was set by the prototype's track width so that the evaluated crop rows would not be crushed during the evaluations.

To obtain a sufficient number of image frames per second, the prototype sprayer was moved during the field evaluations at a maximum speed of 0.5 m.s-1, with pump pressure at 400 kPa and tip flow rate of 0.64 L.min-1. Ten repetitions were performed for each crop in each module. A schematic of the experimental evaluation area is shown in Figure 5.
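As a rough check on this speed (assuming the cameras' nominal 30 fps is sustained by the detection pipeline), at 0.5 m.s-1 the prototype advances about 0.5 / 30 ≈ 0.017 m between consecutive frames, so each plant appears in several successive frames before the nozzle reaches it.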

FIGURE 5
Sketch of the evaluation area of the algorithms in the field.

During the evaluations, the real-time video with the networks' detections was recorded from each camera for each repetition. These videos were saved and later converted into images so that each frame could be analyzed to count correctly detected leaves (TP), non-detected leaves (FN), and incorrectly detected leaves (FP). The data were then tabulated, and precision and sensitivity were calculated for each class, camera, and repetition.

The number of morning glory plants sprayed was also counted for each repetition, algorithm, and module. To determine the percentage of control of the target plants in the area, a person walked behind the prototype and took notes every time a spray nozzle was activated over the desired target.

RESULTS AND DISCUSSION

During the field evaluations on the cotton crop, YOLOv3-tiny correctly detected only one cotton leaf, in only one of the ten repetitions performed. Thus, the average accuracy obtained was 10% (Figure 6).

FIGURE 6
(a) The only cotton leaf detected, (b) leaves not detected, and (c) a leaf incorrectly detected by YOLOv3-tiny during the evaluations.

The error in cotton leaf classification may be related to leaf shape and the angle at which the images were captured: cotton leaves are also cordate and, depending on their stage and angle relative to the camera, present a shape similar to that of a morning glory leaf.

The capture angle also affects detection accuracy, as observed by Quan et al. (2019), who evaluated different object detection algorithms (YOLOv2, Faster R-CNN, and Faster R-CNN with VGG19) for detecting corn plants and weeds at different phenological growth stages as a function of three image capture angles (0°, 30°, and 75°) and observed better results at 75° for all algorithms. The highest values were obtained by Faster R-CNN with VGG19, with accuracy of 98.20% and sensitivity of 97.25%. The authors attributed this difference to the larger image size and view of the corn plants: at 30° and 75° the views were frontal and superior, while at 0°, the angle used in the present study, the view was only superior. The 98.00% accuracy for soybean plants shows that the algorithm sometimes misclassified weeds as crop plants.

Higher precision was observed in weed detection in the cotton crop, at 93%, compared with 76% in the soybean crop (Table 1). Although the shape of cotton leaves is more similar to that of morning glory, the algorithm was confused more often by the similarity between soybean leaflets and morning glory leaves, decreasing the precision of weed detection (Figure 7).

FIGURE 7
Camera images during field repetitions: (a) incorrect and correct soybean leaf detections; (b) and (c) soybean leaves detected and not detected by the YOLOv3-tiny algorithm.
TABLE 1
Accuracy of the YOLOv3-tiny algorithm in detecting the morning glory, soybean, and cotton classes in both modules evaluated.

This may be explained by the features extracted from the images of the two plants during training and learning, which may present more similar points to the algorithm.

Kazmi et al. (2015) extracted 14 vegetation indices from creeping thistle and sugar beet to differentiate the plants and obtained 97.00% accuracy; however, the features extracted during the training and learning of the CNNs in this study are unknown.

With respect to sensitivity, values below 12.00% were observed for morning glory in both modules, as shown in Table 2. These results were below the values obtained with the laboratory test image sets, demonstrating the low capacity of the YOLOv3-tiny algorithm to recover all the objects contained in the image in real time.

TABLE 2
Sensitivity of the YOLOv3-tiny algorithm in real time for the three classes evaluated in the study.

These sensitivity values may have been influenced by the environmental conditions at the time of the test and by the camera used, which was set to automatic zoom and did not keep the images adequately focused. The constant search for focus while moving may have contributed to the low detection of objects and the incorrect classification of some plants (Figure 8). According to Zheng et al. (2017), detection in natural RGB images is difficult because of complex backgrounds, varying illumination, weather, shadow regions, and color similarities. According to Hong et al. (2012), lighting conditions are one of the challenges in machine vision, as reflectance can increase on brightly lit days, distorting image colors, reducing feature differentiation, and making detection difficult.

FIGURE 8
(a) and (b) High light incidence on cotton plants during the evaluations, causing non-detection and incorrect detection of leaves by the algorithms; (c) image blur during the camera's focal adjustment.

Faster R-CNN showed excellent precision for both the cotton and soybean crops. For morning glory, there was a variation of 4.00% between the averages of the two modules, as shown in Table 3.

TABLE 3
Average precision of Faster R-CNN in detecting the three classes in both modules over 10 repetitions.

The accuracies for cotton and morning glory in the cotton module were above 90.00%, whereas accuracy for morning glory in the soybean module was below 88.00%, as shown in Figure 9. Similar results were observed by Olsen et al. (2019) when evaluating the performance of two convolutional neural network architectures in classifying eight weed species, obtaining accuracies ranging from 88.50% to 97.60% across species with the ResNet-50 architecture. The authors attributed the lower accuracy for some weeds to the smaller number of unique visible features available for training, and further pointed out that a high degree of confusion in weed classification is related to similar image features.

FIGURE 9
(a), (b), and (c) Cotton leaf detections; (d), (e), and (f) morning glory leaves; and (g), (h), and (i) soybean leaflets detected by the Faster R-CNN algorithm.

Thus, as pointed out earlier, the position of a leaf or leaflet at the moment of image capture may make it resemble another species to be detected, leading the model to misclassification.

Another important point is the accuracy for the crops of interest, as there was little error in classifying weeds as crops. This matters because the aim is to control weeds: a weed misclassified as crop is not sprayed, whereas detecting crop plants of economic interest as weeds wastes herbicide and consequently releases a larger quantity of product into the environment.

In terms of sensitivity, Faster R-CNN obtained results lower than 26.00% in real-time detection for both modules (Table 4). The low capacity to detect all objects in the image is unimportant for the crops of economic interest, considering that the objective is to spray only the weeds. For the weeds, detecting every leaf is not necessary either; detecting at least one leaf per plant is enough to activate the nozzles and spray the herbicide.

TABLE 4
Average sensitivity of the Faster R-CNN algorithm over the 10 repetitions in the evaluation modules for the three classes.

Comparing the algorithms, Faster R-CNN showed better sensitivity for both modules and all classes; its accuracy, however, was close to that of YOLOv3-tiny, which was superior in the soybean module because it detected fewer soybean leaflets as weeds. The accuracy error in classifying cotton leaves may again be related to leaf shape and capture angle, as cotton leaves are also cordate and, depending on their stage and angle relative to the camera, have a shape similar to that of a morning glory leaf (Table 5).

TABLE 5
Average precision and sensitivity over the 10 field repetitions of the algorithms, for each class to be detected.

Quan et al. (2019) compared the classic Faster R-CNN, Faster R-CNN with the VGG19 architecture, and YOLOv2, and showed that Faster R-CNN was superior to YOLOv2 in accuracy by 10.31% in the detection of corn seedlings at the 6-7 leaf stage. Although YOLOv2 is an earlier version of YOLO, this indicates that Faster R-CNN delivers superior accuracy but slower processing.

Control through spraying is directly related to detection accuracy and sensitivity. In this context, Faster R-CNN showed variations in control across the repetitions in each module. In the cotton module, higher percentages were observed, varying from 67.00% to 87.00% between repetitions. In the soybean module, the lowest percentage occurred in repetition 9, perhaps because several passes of the prototype in the area crushed some morning glory leaves, hindering their detection and consequently their control. The other repetitions, however, showed weed control percentages greater than or equal to 67.00% (Tables 6 and 7).

TABLE 6
Morning glory control by the Faster R-CNN algorithm in the soybean and cotton modules.
TABLE 7
Morning glory control by the YOLOv3-tiny algorithm in the soybean and cotton modules.

For the YOLOv3-tiny algorithm, as shown in Table 7, control of morning glory was below 44% in both modules. This reflects the algorithm's low sensitivity, despite its good detection accuracy.

The average number of weeds sprayed in the cotton module was 3 plants out of 15, representing average control of 19.00%. In the soybean module, control was 20.00%, with an average of 2 plants sprayed out of the 9 present in the module over the 10 repetitions.

The Faster R-CNN algorithm achieved the best average control of morning glory in both modules, controlling 12 of the 15 plants present in the cotton module and spraying 7 of the 9 plants present in the soybean module. The average control was 81% for the cotton module and 78% for soybean (Table 8).

TABLE 8
Average control of morning glory for both algorithms and modules.

CONCLUSIONS

In the field tests performed with the selective sprayer prototype, Faster R-CNN showed better results in the control of the morning glory plants present in both the cotton and soybean modules, even with low sensitivity values. The on-board system developed for the spraying prototype was efficient for the individual activation of each nozzle, applying the product selectively according to the detected target. The Faster R-CNN algorithm demonstrated its applicability as an Ipomoea spp. detection tool for variable-rate herbicide spraying.

REFERENCES

  • Abouziena HF, Haggag WM (2016) Weed control in clean agriculture: a review. Planta Daninha 34(2):377-392.
  • Alexey (2019) Yolo-v3 and Yolo-v2 for Windows and Linux. GitHub repository. Available: https://github.com/AlexeyAB/darknet. Accessed Apr, 2019.
  • Azania CAM, Hirata ACS, Azania AAPM (2011) Biologia e manejo químico de corda-de-viola em cana-de-açúcar. Campinas, IAC. Boletim Técnico 209.
  • Bakhshipour A, Jafari A (2018) Evaluation of support vector machine and artificial neural networks in weed detection using shape features. Computers and Electronics in Agriculture 145:153-160.
  • Chaim A (2019) Boas práticas agrícolas: aplicação de agrotóxicos e meio ambiente. Jaguariúna, Embrapa Meio Ambiente. Available: https://www.embrapa.br/documents/10180/13599347/ID19.pdf. Accessed Sep, 2019.
  • Carranza-Rojas J, Goeau H, Bonnet P, Mata-Montero E, Joly A (2017) Going deeper in the automated identification of herbarium specimens. BMC Evolutionary Biology 17:181.
  • Constantin J, Oliveira Jr RS, Gheno EA (2011) Controle de corda-de-viola com as opções de tratamentos herbicidas disponíveis para a cultura do algodão. In: Congresso Brasileiro de Algodão, COTTON EXPO. São Paulo, Anais…
  • Correia NM, Kronka Jr B (2010) Controle químico de plantas dos gêneros Ipomoea e Merremia em cana-soca. Planta Daninha 28:1143-1152.
  • Darrenl (2019) LabelImg. GitHub repository. Available: https://github.com/tzutalin/labelImg. Accessed Apr, 2019.
  • Dyrmann M, Karstoft H, Midtiby HS (2016) Plant species classification using deep convolutional neural network. Biosystems Engineering 151:72-80. DOI: https://doi.org/10.1016/j.biosystemseng.2016.08.024
  • Evan (2019) TensorFlow-Object-Detection-API-Tutorial-Train-Multiple-Objects-Windows-10. GitHub repository. Available: https://github.com/EdjeElectronics/TensorFlow-Object-Detection-API-Tutorial-Train-Multiple-Objects-Windows-10. Accessed Apr, 2019.
  • Ferreira PPA, Miotto STS (2009) Flora Ilustrada do Rio Grande do Sul: sinopse das espécies de Ipomoea L. (Convolvulaceae) ocorrentes no Rio Grande do Sul, Brasil. Revista Brasileira de Biociências 7(4):440-453.
  • Ferreira A dos S, Freitas DM, da Silva GG, Pistori H, Folhes MT (2017) Weed detection in soybean crops using convnets. Computers and Electronics in Agriculture 143:314-324.
  • Fleck NG, Candemil CRG (1995) Interferência de plantas daninhas na cultura da soja (Glycine max (L.) Merrill). Ciência Rural 25(1):27-32.
  • Girshick R (2015) Fast R-CNN. In: IEEE International Conference on Computer Vision (ICCV). Available: https://arxiv.org/abs/1504.08083. Accessed Aug, 2019.
  • Hong S, Minzan L, Zhang Q (2012) Detection system of smart sprayer: status, challenges, and perspectives. International Journal of Agricultural and Biological Engineering 5(3):10-23.
  • Kazmi W, Garcia-Ruiz F, Nielsen J, Rasmussen J, Andersen HJ (2015) Exploiting affine invariant regions and leaf edge shapes for weed detection. Computers and Electronics in Agriculture 118:290-299. DOI: https://doi.org/10.1016/j.compag.2015.08.023
  • Kozlowski LA, Ronzelli Júnior P, Purissimo C, Daros E, Koehler HS (2002) Período crítico de interferência das plantas daninhas na cultura do feijoeiro-comum em sistema de semeadura direta. Planta Daninha 20(2):213-220.
  • Lee SH, Chan CS, Mayo SJ, Remagnino P (2017) How deep learning extracts and learns leaf features for plant classification. Pattern Recognition 71:1-13.
  • Li L, Wei X, Mao H, Wu S (2017) Design and application of spectrum sensor for weed detection used in winter rape field. Transactions of the Chinese Society of Agricultural Engineering 33:127-133.
  • Lin F, Zhang D, Huang Y, Wang X, Chen X (2017) Detection of corn and weed species by the combination of spectral, shape and textural features. Sustainability 9:1335.
  • Louargant M (2018) Unsupervised classification algorithm for early weed detection in row-crops by combining spatial and spectral information. Remote Sensing 10:761.
  • Moreira HJC, Bragança HBN (2011) Manual de identificação de plantas infestantes: hortifrúti. São Paulo, FMC Agricultural Products.
  • Olsen A, Konovalov DA, Philippa B, Ridd P, Wood JC, Johns J (2019) DeepWeeds: a multiclass weed species image dataset for deep learning. Scientific Reports 9:2058. DOI: https://doi.org/10.1038/s41598-018-38343-3
  • Pagnoncelli FB, Trezzi MM, Vidal RA, Portes AF, Scalcon EL, Machado A (2017) Morning glory species interference on the development and yield of soybeans. Bragantia 76(4):470-479.
  • Partel V, Kakarla SC, Ampatzidis Y (2019) Development and evaluation of a low-cost and smart technology for precision weed management utilizing artificial intelligence. Computers and Electronics in Agriculture 157:339-350.
  • Pkulzc (2019) TensorFlow detection model zoo. GitHub repository. Available: https://github.com/pkulzc/models/blob/master/research/object_detection/g3doc/detection_model_zoo.md. Accessed Apr, 2019.
  • Piccinini F, Machado SLO, Martin TN, Kruse ND, Balbinot A, Guareschi A (2018) Interference of morning glory in soybean yield. Planta Daninha 36. DOI: https://doi.org/10.1590/s0100-83582018360100063
  • Quan L, Feng H, Ly Y, Wang O, Zhang C, Liu JE, Yuan Z (2019) Maize seedling detection under different growth stages and complex field environments based on an improved Faster R-CNN. Biosystems Engineering 184:1-23.
  • Redmon J, Farhadi A (2018) YOLOv3: an incremental improvement. arXiv preprint arXiv:1804.02767. Available: http://arxiv.org/abs/1804.02767. Accessed Nov, 2018.
  • Ren S, He K, Girshick R, Sun J (2016) Faster R-CNN: towards real-time object detection with region proposal networks. Available: https://arxiv.org/abs/1506.01497v3. Accessed Nov, 2018.
  • Rizzardi MA, Fleck NG, Mundstock CM, Bianchi MA (2003) Perdas de rendimento de grãos de soja causadas por interferência de picão-preto e guanxuma. Ciência Rural 33(4):621-627.
  • Rizzardi MA, Roman ES, Borowski DZ, Marcon R (2004) Interferência de populações de Euphorbia heterophylla e Ipomoea ramosissima isoladas ou em misturas sobre a cultura de soja. Planta Daninha 22(1):29-34.
  • Shirzadifar A, Bajwa S, Ahmad S, Howatt K, Nowatzki J (2018) Weed species discrimination based on SIMCA analysis of plant canopy spectral data. Biosystems Engineering 171:143-154. DOI: https://doi.org/10.1016/j.biosystemseng.2018.04.019
  • Silva IAB, Kuva MA, Alves PLCA, Salgado TP (2009) Interferência de uma comunidade de plantas daninhas com predominância de Ipomoea hederifolia na cana-soca. Planta Daninha 27(2):265-272. DOI: http://dx.doi.org/10.1590/S0100-83582009000200008
  • Vasconcelos MCC, Da Silva AFA, Lima RS (2012) Interferência de plantas daninhas sobre plantas cultivadas. Agropecuária Científica no Semiárido 8(1):1-6. Available: http://www.cstr.ufcg.edu.br/acsa/
  • Viliotti CA (2002) Desenvolvimento de um sistema eletrônico para detecção da presença de plantas daninhas e controle da aplicação de herbicidas. Thesis, Viçosa, Universidade Federal de Viçosa.
  • Visser R, Timmermans AJMO (1996) Weed-It: a new selective weed control system. Proc. SPIE 2907, Optics in Agriculture, Forestry, and Biological Processing II. DOI: https://doi.org/10.1117/12.262852
  • Zheng Y, Zhu Q, Huang M, Gui Y, Qin J (2017) Maize and weed classification using color indices with support vector data description in outdoor fields. Computers and Electronics in Agriculture 141:215-222. DOI: https://doi.org/10.1016/j.compag.2017.07.028
  • Zhou T, Ruan S, Canu S (2019) A review: deep learning for medical image segmentation using multi-modality fusion. Array. DOI: https://doi.org/10.1016/j.array.2019.100004

Edited by

Area Editor: Gizele Ingrid Gadotti

Publication Dates

  • Publication in this collection
    29 Apr 2022
  • Date of issue
    2022

History

  • Received
    31 Aug 2021
  • Accepted
    16 Mar 2022