
DEVELOPMENT OF SOFTWARE FOR ANALYSIS OF BEHAVIOR AND WELFARE OF BROILERS

ABSTRACT

The objective of this study was to develop software based on image processing and computer vision techniques for monitoring the feeding/collective behavior of broilers (Cobb) and to validate it against the results obtained from the visual analysis of an expert. The visual analysis was based on the observation and quantification of behaviors during 10-min intervals at each hour of the day, over a period of 24 hours, totaling 1728 frames/day, for males and females. The software was developed using the Hoshen-Kopelman algorithm for labeling clusters, that is, for grouping similar pixels, and was written in the 1999 standard of the C language. After this programming stage, the Hoshen-Kopelman algorithm identified the birds in the regions of interest (feeder and drinker), with removal of the background color to obtain the descriptors, using the same interval as the visual analysis. The software was then validated through linear regression analysis, using the R platform. Based on the correlation analyses, the coefficient of determination (R²) ranged from 0.74 to 0.97 for all software validation events. The number of broilers in the regions of interest showed R² above 0.70 for females and above 0.89 for males, which allowed the characterization of the ingestive behavior of broilers by computer vision. Some factors that interfered with the accuracy of the software were decisive for the result, especially the arrangement of the cameras, uneven lighting, obstruction caused by the lighting system, pendular movement of the feeders and drinkers, and the shade they generated. Despite these interferences, the results allow us to infer that the software identified and quantified the feeding behavior of broilers appropriately.

KEYWORDS
broiler farming; image processing; lighting system; computer vision

INTRODUCTION

The continuous increase in the world population significantly increases the demand for food, in quantity and quality. Thus, not only farmers and ranchers, but also researchers have devoted considerable efforts to a wide variety of techniques to increase food production, with an emphasis on production efficiency and increased return on investment. In this context, information technology has been heavily exploited in this sector, especially in terms of management and controllers that integrate automatic real-time decision-making support, such as precision agriculture and livestock farming (So-In et al., 2014; Zhang et al., 2002).

Recently, the concept of precision has influenced several sectors, especially in the food industry, involving all stages of production, from the acquisition of the product to its distribution (Rehman et al., 2014).

Behavior is a direct reflection of how the animal is coping with its environment. Behavioral indicators are, therefore, among the preferred parameters for assessing welfare. Video-based behavioral analysis can be very time-consuming, and the accuracy and reliability of the results depend on the experience and background of the observers (Barnard et al., 2016).

Behavior makes it possible to characterize and identify welfare through visual observations, which are time-consuming, and the human presence can inhibit the natural behavior of the birds, generating unreliable responses. The use of computer vision allows the development of more efficient and reliable investigation methods, but there is no standard formula for all situations, and variations occur according to the type of analysis (Cordeiro et al., 2011).

Technological resources are essential for birds to express their natural behaviors, yielding more productive and satisfactory results. The sequence of behaviors that occur over a period of time constitutes an important source of information, pointing to the effects of the environment on the birds. It is of great value to be able to apply an appropriate technique to monitor the behavior of birds in their rearing environment and, with this, to perform an accurate analysis of their behavioral pattern (Saltoratto et al., 2013).

No low-cost software capable of evaluating the behavior of broilers is available in the country for farmers and researchers. In this context, the objective of this study was to develop software based on image processing and computer vision techniques for monitoring the feeding/collective behavior of broilers, and to validate it against the results obtained from the visual analysis of an expert.

MATERIAL AND METHODS

The database used in the study was based on an experiment conducted at the Experimental Station of Small Animals (EEPAC/UFRPE), located in the municipality of Carpina, state of Pernambuco, Brazil, at 7.85° S latitude, 35.24° W longitude and 180 m altitude. The climate of the region is characterized as megathermal (As'), with winter precipitation and a dry season from summer to autumn, according to Köppen's classification (Pereira et al., 2013).

The experimental shed was 9.5 m wide by 33.0 m long, with a 2.8 m ceiling height and no lining, with a 0.4-m-high masonry wall closed with polyethylene screen (22-mm opening) along its entire perimeter, associated with blue polypropylene curtains, which were managed according to the thermal needs of the birds during the production cycle. The roof was double-pitched, covered with 6-mm-thick fiber cement roofing, with 1.5-m eaves, a ridge vent, and NE-SW orientation.

The birds were distributed by sex (males and females) in a production pen, divided by polyethylene screen (22-mm opening), with concrete floor and wood shaving bedding. The pen was 1.35 m wide by 2 m long, which allowed the housing of 32 birds (16 males and 16 females).

The lighting system used light-emitting diodes (LED), with wavelength in the range from 400 to 760 nm (visible range, white light). The lighting program adopted was continuous (18L:6D). The rearing pen received four 52-cm-long LED tubes. Each LED tube had 4 watts and 36 LEDs, with 6 cm between lamps, fixed to a PVC pipe and placed 70 cm from the floor, with an average illuminance of 20 lx (Figure 1).

FIGURE 1
Internal view of the shed (A) and arrangement of the LED-based lighting system (B).

Images were acquired using two RGB micro cameras with 3.6-mm lenses, positioned at a height of 2.8 m, longitudinally to the pen (1.35 m wide by 2 m long), so as to allow the recording of the birds (males and females), as shown in Figure 2A.

FIGURE 2
Distribution of the cameras inside the shed for the behavioral monitoring of the birds (A) and TOPWAY® Software used to store the images on the computer (B).

Image recordings were made once a week, for 24 hours, to obtain the videos and subsequently select the frames, during the growth stage of the birds (22 to 42 days of age), totaling 3 weeks of analysis. The images were recorded by the TOPWAY® software and stored on the computer for later analysis (Figure 2).

The variables related to the observed behavioral reactions were quantified based on the behavioral ethogram described in Table 1, according to studies conducted by Pereira et al. (2013).

TABLE 1
Behavioral ethogram for broilers based on the literature consulted.

Visual analysis of the images was performed considering the quantification of the birds within each region of interest. Visual evaluation of the behaviors followed the methodology of Schiassi et al. (2015), with the observation and quantification of behaviors in 10-min intervals at each hour of the day, over a period of 24 hours, totaling 1728 frames/day, for males and females.

To develop the software, the images were initially processed and prepared to count the broilers that expressed the behaviors described in Table 1. The Hoshen-Kopelman algorithm was used for labeling clusters, that is, for grouping similar pixels into connected components. The program was written in the 1999 standard of the C language (C99). After this programming stage, the Hoshen-Kopelman algorithm identified the birds in the region of interest, removing the background color to obtain the descriptors. Birds at the drinker and feeder, performing the activities described in Table 1, were considered to be within the region of interest.
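
As an illustration of this cluster-labeling step, the sketch below shows a minimal Hoshen-Kopelman implementation in C99 based on union-find, operating on a binary mask in which pixels classified as bird are 1 and background pixels are 0. The function and variable names are illustrative assumptions and do not reproduce the actual source code of the software.

```c
#include <stdlib.h>

/* Union-find helpers for Hoshen-Kopelman cluster labeling. */
static int uf_find(int *parent, int x) {
    while (parent[x] != x) {
        parent[x] = parent[parent[x]];   /* path halving */
        x = parent[x];
    }
    return x;
}

static void uf_union(int *parent, int a, int b) {
    int ra = uf_find(parent, a);
    int rb = uf_find(parent, b);
    if (ra != rb) parent[rb] = ra;
}

/* Labels 4-connected clusters of 1-pixels in mask[rows*cols] (row-major).
 * Writes a label (>= 1) for each foreground pixel into labels[] and returns
 * the number of distinct clusters, or -1 on allocation failure. */
int hoshen_kopelman(const unsigned char *mask, int *labels, int rows, int cols) {
    int n = rows * cols;
    int *parent = malloc((size_t)(n + 1) * sizeof *parent);
    if (parent == NULL) return -1;
    int next = 0;                        /* highest provisional label issued */

    for (int r = 0; r < rows; r++) {
        for (int c = 0; c < cols; c++) {
            int i = r * cols + c;
            labels[i] = 0;
            if (!mask[i]) continue;
            int up   = (r > 0) ? labels[i - cols] : 0;  /* neighbor above */
            int left = (c > 0) ? labels[i - 1]    : 0;  /* neighbor to the left */
            if (!up && !left) {          /* start of a new cluster */
                next++;
                parent[next] = next;
                labels[i] = next;
            } else if (up && left) {     /* both neighbors labeled: merge clusters */
                uf_union(parent, up, left);
                labels[i] = uf_find(parent, up);
            } else {                     /* extend the single labeled neighbor */
                labels[i] = up ? up : left;
            }
        }
    }

    /* Second pass: collapse provisional labels to their roots and count clusters. */
    int count = 0;
    for (int k = 1; k <= next; k++)
        if (uf_find(parent, k) == k) count++;
    for (int i = 0; i < n; i++)
        if (labels[i]) labels[i] = uf_find(parent, labels[i]);

    free(parent);
    return count;
}
```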

For the observation and quantification of the behaviors, the same interval of the methodology used in the visual analysis by the expert was adopted. The Chikenlab software was validated through linear regression analysis between the result obtained by the software and the result obtained by the standard visual analysis, using the R platform in RStudio (R Core Team, 2017). The coefficient of determination (R²) of the regression analysis was adopted as the measure of accuracy, with R² = 1 representing perfect agreement.
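
As a schematic of the validation metric (the actual analysis was performed in R), the fragment below, written in C for consistency with the other examples, fits a simple linear regression between paired counts and computes R²; the arrays of expert and software counts are hypothetical.

```c
#include <stdio.h>

/* Simple linear regression y = a + b*x and coefficient of determination R².
 * x: counts from the visual (expert) analysis, y: counts from the software. */
static double r_squared(const double *x, const double *y, int n) {
    double sx = 0, sy = 0, sxx = 0, sxy = 0;
    for (int i = 0; i < n; i++) {
        sx += x[i]; sy += y[i];
        sxx += x[i] * x[i]; sxy += x[i] * y[i];
    }
    double b = (n * sxy - sx * sy) / (n * sxx - sx * sx);  /* slope */
    double a = (sy - b * sx) / n;                          /* intercept */
    double ybar = sy / n, ss_res = 0, ss_tot = 0;
    for (int i = 0; i < n; i++) {
        double fit = a + b * x[i];
        ss_res += (y[i] - fit) * (y[i] - fit);
        ss_tot += (y[i] - ybar) * (y[i] - ybar);
    }
    return 1.0 - ss_res / ss_tot;
}

int main(void) {
    /* Hypothetical paired counts: birds at the feeder per observation window. */
    double expert[]   = {2, 4, 5, 3, 6, 7, 4, 5};
    double software[] = {2, 4, 4, 3, 6, 6, 4, 5};
    printf("R^2 = %.3f\n", r_squared(expert, software, 8));
    return 0;
}
```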

RESULTS AND DISCUSSION

The images were processed on a dark background defined by the bedding. According to Pereira et al. (2013), the bedding, feeders and drinkers are often interpreted as chickens, which may compromise the correct classification of behavior. Image processing was therefore applied to enhance and brighten the images of the birds prior to the analysis, as a step to reduce the number of objects mistakenly identified as birds.

The image sample used for behavior analysis was subjected to processing, through image segmentation, in order to enable the extraction of the areas of interest (Pereira et al., 2013; Saltoratto et al., 2013).

The color representation scale used was from 0 to 255, adopted for the convenience of storing each color value in 1 byte (8 bits). Thus, white is RGB (255, 255, 255); red, RGB (255, 0, 0); green, RGB (0, 255, 0); blue, RGB (0, 0, 255); and black, RGB (0, 0, 0). The hue (H) was used to represent the color in polar coordinates, obtained through the transformation from RGB to HSV, the abbreviation of the color system formed by hue, saturation, and value. HSV is also known as HSB (hue, saturation and brightness) and is expressed by eqs (1), (2) and (3); this filter was chosen because the resulting images are easy to implement and adjust (Tian, 2009).

(1) $V = m$

(2) $S = \dfrac{m - n}{m}$

(3) $H = \begin{cases} 60 \times \left( 6 + \dfrac{G - B}{m - n} \right), & R = m \\[4pt] 60 \times \left( 2 + \dfrac{B - R}{m - n} \right), & G = m \\[4pt] 60 \times \left( 4 + \dfrac{R - G}{m - n} \right), & B = m \end{cases}$

where:

m – max (R, G, B);

n – min (R, G, B).

H ranges from 0 to 360, S from 0 to 1, and V from 0 to 255. The hue was normalized to the scale from 0 to 360, for representation in 1 byte (8 bits), and then transformed to the range from -90 to 90, for continuous visualization in the construction of the histogram. The descriptor color of the broilers was represented by the proportion of green in the hue histogram, which quantified the number of pixels of similar colors.
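
Under the conventions above (H in degrees, S in [0, 1], V in [0, 255]), the transformation of eqs (1) to (3) can be sketched in C as follows; this is a generic RGB-to-HSV conversion and is not taken from the software's source code.

```c
#include <stdio.h>

/* RGB (0-255 per channel) to HSV following eqs (1)-(3):
 * V = max(R,G,B); S = (max - min)/max; H in degrees in [0, 360). */
void rgb_to_hsv(double r, double g, double b, double *h, double *s, double *v) {
    double m = r, n = r;                 /* m = max(R,G,B), n = min(R,G,B) */
    if (g > m) m = g;
    if (b > m) m = b;
    if (g < n) n = g;
    if (b < n) n = b;

    *v = m;                              /* eq. (1) */
    *s = (m > 0.0) ? (m - n) / m : 0.0;  /* eq. (2) */

    if (m == n) { *h = 0.0; return; }    /* achromatic: hue undefined, set to 0 */
    double d = m - n;                    /* eq. (3): branch on the maximum channel */
    if (m == r)      *h = 60.0 * (6.0 + (g - b) / d);
    else if (m == g) *h = 60.0 * (2.0 + (b - r) / d);
    else             *h = 60.0 * (4.0 + (r - g) / d);
    if (*h >= 360.0) *h -= 360.0;        /* wrap into [0, 360) */
}

int main(void) {
    double h, s, v;
    rgb_to_hsv(30.0, 180.0, 60.0, &h, &s, &v);   /* a greenish pixel */
    printf("H = %.1f deg  S = %.2f  V = %.0f\n", h, s, v);
    return 0;
}
```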

Image segmentation was used to define the object under study, through the projection of the RGB coordinate vector, in the color cube, onto the main diagonal, which represents the achromatic shades from black to white.
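
A minimal sketch of this idea, assuming the usual geometry of the RGB cube, is shown below: the pixel vector is decomposed into its projection onto the achromatic diagonal (the mean intensity) and the residual distance from that diagonal, which serves as a simple chromaticity measure for separating colored bird pixels from the grayish bedding. The threshold parameter is illustrative.

```c
#include <math.h>

/* Distance of an RGB vector from the achromatic diagonal (R = G = B) of the
 * color cube. The projection onto the diagonal is the mean intensity; the
 * residual measures how chromatic (colored) the pixel is. */
double chroma_distance(double r, double g, double b) {
    double mean = (r + g + b) / 3.0;              /* projection onto the diagonal */
    double dr = r - mean, dg = g - mean, db = b - mean;
    return sqrt(dr * dr + dg * dg + db * db);     /* distance from the diagonal */
}

/* Illustrative decision: pixels close to the diagonal are treated as background. */
int is_background(double r, double g, double b, double threshold) {
    return chroma_distance(r, g, b) < threshold;
}
```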

The targets of interest reflect different colors due to the position of the sun during the day and the artificial lighting incident on them at night. To correct these effects, the color variation in the target was identified for each situation and objects that are mistakenly identified as birds were removed or minimized, in order to highlight only the areas of interest.

The clusters were labeled using the Hoshen-Kopelman algorithm, which groups pixels of similar color. For this, values of 0 or 1 were automatically assigned to objects that shared certain properties, such as the size of the pixels. In the case of the birds, all similar pixels were rendered in green, as shown in Figure 3.
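
A minimal sketch of this binarization step is given below, assuming a hue window chosen empirically for the bird pixels (the actual limits used in the software are not reported in the text); the resulting 0/1 mask is the input to the cluster-labeling routine shown earlier.

```c
/* Builds the 0/1 mask consumed by the cluster labeling: pixels whose hue falls
 * inside the empirically chosen window are marked 1 (bird), the rest 0.
 * hue[] is in degrees; the window limits are illustrative only. */
void binarize_by_hue(const double *hue, unsigned char *mask, int n,
                     double h_min, double h_max) {
    for (int i = 0; i < n; i++)
        mask[i] = (hue[i] >= h_min && hue[i] <= h_max) ? 1 : 0;
}
```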

FIGURE 3
Quantification of broilers in the region of interest (A), empirical segmentation of the image with identification of the birds (B).

Image analysis is the stage between processing and classification through computer vision, helping to identify the shapes of the objects under study. The regions of interest were identified, which corresponded to the feeder (female/male) and drinker (female/male), totaling 4 regions of interest.

Birds were counted based on the labeling of the clusters, standardized for the same 10-minute interval used in the visual analysis of the behaviors. The software thus allowed identifying and quantifying the number of birds within the regions of interest, feeder and drinker, evaluating the feeding/collective behavior of the broilers throughout the day (Figure 3A).
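
A sketch of this counting step is shown below: given the label image produced by the Hoshen-Kopelman pass and a rectangular region of interest, the number of distinct clusters (birds) touching the region is counted. The rectangular ROI and the function name are illustrative assumptions.

```c
#include <string.h>

/* Counts distinct clusters whose pixels intersect a rectangular region of interest.
 * labels[] comes from the Hoshen-Kopelman pass; seen[] must hold at least
 * max_label + 1 bytes. ROI limits are inclusive pixel coordinates. */
int count_birds_in_roi(const int *labels, int rows, int cols,
                       int r0, int r1, int c0, int c1,
                       unsigned char *seen, int max_label) {
    memset(seen, 0, (size_t)max_label + 1);
    int count = 0;
    for (int r = r0; r <= r1 && r < rows; r++)
        for (int c = c0; c <= c1 && c < cols; c++) {
            int lab = labels[r * cols + c];
            if (lab > 0 && !seen[lab]) {   /* first pixel of this cluster in the ROI */
                seen[lab] = 1;
                count++;
            }
        }
    return count;
}
```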

The image processing method was fundamental to increase the accuracy and reliability of the analysis. The images were processed with the birds on a dark background, defined by the bedding present in the pen. However, feathers and other objects in the environment, such as the pendular movement of the feeders and drinkers and obstructions of the equipment used in the lighting system, were often mistaken for birds, which compromised the exact classification of the behaviors expressed by the birds.

Figure 3B illustrates the empirical segmentation of the image, Hoshen-Kopelman algorithm, with the identification of birds in the region of interest, after removing the background color to obtain the descriptors.

Figure 4 shows the regression analysis of the weekly data, comparing the observation of ingestive behavior by human and by computer vision, with coefficients of determination (R²) ranging from 0.74 to 0.97 for the 3 weeks of analysis.

FIGURE 4
Coefficient of determination (R²) of the weekly correlation analysis for: A (4th week female-Feeder), B (4th week female-Drinker), C (4th week male-Feeder), D (4th week male-Drinker), E (5th week female-Feeder), F (5th week female-Drinker), G (5th week male-Feeder), H (5th week male-Drinker), I (6th week female-Feeder), J (6th week female-Drinker), L (6th week male-Feeder), M (6th week male-Drinker).

The software counted the behaviors expressed by males more efficiently (Figure 4 C, D, G, H, L, M), with R² above 0.89, Mean Square Error below 0.8, Mean Absolute Error below 0.3 and Mean Relative Error of 20%. The factors that interfered with the readings were decisive for the result, especially the arrangement of the cameras, which better captured the area occupied by the birds in the pens containing males (Figure 2A).
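
For reference, these agreement metrics can be computed from the paired counts as in the generic sketch below; the exact definitions adopted in the study (for example, the denominator of the relative error) are not detailed in the text, so this formulation is an assumption.

```c
#include <math.h>

/* Generic error metrics between expert counts (obs) and software counts (pred). */
void error_metrics(const double *obs, const double *pred, int n,
                   double *mse, double *mae, double *mre) {
    double se = 0, ae = 0, re = 0;
    int n_re = 0;                                /* observations usable for MRE */
    for (int i = 0; i < n; i++) {
        double e = pred[i] - obs[i];
        se += e * e;
        ae += fabs(e);
        if (obs[i] != 0.0) {                     /* skip zero counts in relative error */
            re += fabs(e) / obs[i];
            n_re++;
        }
    }
    *mse = se / n;
    *mae = ae / n;
    *mre = (n_re > 0) ? 100.0 * re / n_re : 0.0; /* percent */
}
```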

In the 4th week, female-drinker (Figure 4B) had an R² of 0.77, a result compromised by the position of the camera, which could capture only one side of the drinker. Ideally, the cameras should be positioned perpendicular to the pen, with good image resolution and without the interference of brightness caused by the incidence of solar radiation.

Regarding the period of the day, the software was more efficient in the morning than in the afternoon (Figure 5), which is explained by the fact that the longitudinal axis of the aviary is oriented in the NE/SW direction: the incidence of solar radiation in the pens in the afternoon produced brightness that made it difficult for the software to highlight the target object (broiler), hampering the behavioral identification of the birds. The same difficulty was encountered at night, due to the influence of the artificial lighting on the object of interest.

FIGURE 5
Coefficient of determination (R²) of the correlation analysis by period of the day for: A (Morning, female-Feeder), B (Morning, female-Drinker), C (Morning, male-Feeder), D (Morning, male-Drinker), E (Afternoon, female-Feeder), F (Afternoon, female-Drinker), G (Afternoon, male-Feeder), H (Afternoon, male-Drinker), I (Night, female-Feeder), J (Night, female-Drinker), L (Night, male-Feeder), M (Night, male-Drinker).

The quantification of the birds detected by the software was influenced by interferences caused by the uneven distribution of the lighting in the rearing pen, interference from sunlight, the shade generated by feeders and drinkers, and the arrangement of the cameras, which left some birds out of view. According to Sun (2011), lighting is a fundamental element in the acquisition of images and in the formation of the wave spectrum reflected by the scene, so the definition of the type of lamps to be used is decisive.

The accuracy of the software could be improved with greater control of the lighting in the environment because, in the present study, the longitudinal axis of the aviary is oriented NE/SW in relation to the sun, which increased the brightness within the aviary and, consequently, distorted the video image; the same occurred at night due to the influence of the white LEDs. Thus, in the image processing, the labeling of clusters required finding a filter that better grouped the similar pixels of the object of interest, owing to the variation of solar radiation throughout the day. The same difficulties were encountered by Pereira et al. (2013).

Other factors that would improve the image processing method are: fixation of drinkers and feeders; changes in the characteristics of the micro cameras, such as positioning them perpendicular to the pen and adjusting their height, so that they could better capture the entire region of interest; and reduction of the effects of brightness caused by solar radiation in the afternoon, through structural changes in the environment, such as increasing the roof slope or extending the projection of the eaves.

CONCLUSIONS

The software proved to be a tool capable of quantifying the number of broilers in the regions of interest, thus evidencing their feeding and collective behavior.

Behavior is a reflection of how the animal is coping with its environment, and the search for tools that provide a better evaluation of this parameter will facilitate the real-time monitoring of welfare and, with this, the search for means that contribute to the better development of the birds.

It was possible to identify some limitations in the operation of the software, such as excess lighting and its uneven distribution in the pen, especially at night, which caused greater variation in the labeling of clusters, and the movement of feeders and drinkers, which made it difficult to delimit the regions of interest during the segmentation process.

The cameras were not positioned in such a way as to favor the perfect framing of the images; ideally, they would be positioned in the center of the pen in the longitudinal direction. However, despite the limitations found, the proposed tool allowed identifying and quantifying the feeding behavior of broilers.

REFERENCES

  • Barnard S, Calderara S, Pistocchi S, Cucchiara R, Podaliri-Vulpiani M, Messori S (2016) Quick, accurate, smart: 3D computer vision technology helps assessing confined animals' behaviour. PLoS One 11(7):e0158748. DOI: https://doi.org/10.1371/journal.pone.0158748
  • Cordeiro MB, Tinôco IFF, Mesquita Filho RM, Souza FC (2011) Análise de imagens digitais para a avaliação do comportamento de pintainhos de corte. Revista Engenharia Agrícola 31:418-426. DOI: http://dx.doi.org/10.1590/S0100-69162011000300002
  • Pereira DF, Miyamoto BCB, Maia GDN, Sales T, Magalhães MM, Gates RS (2013) Machine vision to identify broiler breeder behavior. Computers and Electronics in Agriculture 99:194-199. DOI: https://doi.org/10.1016/j.compag.2013.09.012
  • R Core Team (2017) R: A language and environment for statistical computing. Available: https://www.R-project.org/
  • Rehman A, Abbasi AZ, Islam N, Shaikh ZA (2014) A review of wireless sensors and networks' applications in agriculture. Computer Standards & Interfaces 35(2):263-270. DOI: https://doi.org/10.1016/j.csi.2011.03.004
  • Saltoratto AYK, Silva FA, Camargo ACAC, Silva PCG, Souza LFA (2013) Monitoramento de avicultura a partir de técnicas de visão computacional. Colloquium Exactarum 5(2):47-66. DOI: http://dx.doi.org/10.5747/-ColloquiumExactarumv05n2e059p47-66/2013
  • Schiassi L, Yanagi Junior T, Ferraz PFP, Campos AT, Silva GRE, Abreu LHP (2015) Comportamento de frangos de corte submetidos a diferentes ambientes térmicos. Revista Engenharia Agrícola 35(3):390-396. DOI: http://dx.doi.org/10.1590/1809-4430-Eng.Agric.v35n3p390-396/2015
  • So-In C, Poolsanguan S, Rujirakul K (2014) A hybrid mobile environmental and population density management. Computers and Electronics in Agriculture 109:287-301. DOI: https://doi.org/10.1016/j.compag.2014.10.004
  • Sun DW (ed) (2011) Computer vision technology for food quality evaluation. Academic Press, Food Science and Technology, 600p.
  • Tian C (2009) A computer vision-based classification method for pearl quality assessment. International Conference on Computer Technology and Development 2:73-76. DOI: https://doi.org/10.1109/ICCTD.2009.143
  • Zhang N, Wang M, Wang N (2002) Precision agriculture – a worldwide overview. Computers and Electronics in Agriculture 36(2/3):113-132. DOI: https://doi.org/10.1016/S0168-1699(02)00096-0

Publication Dates

  • Publication in this collection
    23 Nov 2020
  • Date of issue
    Nov-Dec 2020

History

  • Received
    25 Aug 2018
  • Accepted
    06 Aug 2020