
Computer vision system approach in colour measurements of foods: Part I. development of methodology

Abstract

The colour assessment ability of a computer vision system is investigated, and the data are compared with colour measurements taken by a conventional colorimeter. Linear and quadratic models are built to improve the currently used methodology for the conversion of RGB colour units to L*a*b* colour space. For this purpose, two innovative ideas are proposed and tested. First, a substantial number of colour tones is generated to cover as many points in the colour space as possible. Secondly, each colour is calibrated separately, whereas in previous research in the literature the colour space is calibrated simultaneously. It is found that the RGB to L*a*b* transformation approach proposed in this study is more logical and more accurate, and that the prediction performance of the quadratic model is superior to that of the linear model.

Keywords:
colour; computer vision system; colorimeter; RGB; L* a* b*

1 Introduction

Colour analysis is an important issue for the food industry during the harvesting, processing and preservation of foods. For most people it may sound simple, but in reality colour analysis is a complex issue. Colour is a cerebral perceptual response to the visible spectrum of light reflected or emitted from an object; the signal is received by the retina of the eye and transmitted by the optic nerve to the brain, which assigns a colour to it (MacDougall, 2002; Wu & Sun, 2013). Moreover, colour is considered a psychophysical concept, related to the physiology of vision, the psychology of the observer and the spectral radiant energy of a light source (Wyszecki & Stiles, 2000). All these factors make it difficult to analyse the colour of foods accurately.

A digital camera registers the colour of every pixel of an image using three colour sensors per pixel, the interpretation of which depends on the colour model being used (Forsyth & Ponce, 2012; Pedreschi et al., 2008). In this field, Red-Green-Blue (RGB) image analysis techniques have gained increasing interest in food colour analysis over the last 20 years (Ulrici et al., 2012). The key point is the design of proper automated methods to extract useful information from RGB images and employ it for calibration, classification and process monitoring (Foca et al., 2011). RGB values cannot be converted to L*a*b* directly; however, methodologies have been published to obtain accurate, device-independent L*a*b* colour units from the device-dependent RGB colour units captured by a digital camera (Leon et al., 2006). Transformation of RGB colour units to L*a*b* colour space was achieved using a limited number of colour cards by Afshari-Jouybari & Farahnaky (2011) and Leon et al. (2006). The main drawback of their work was that they did not produce enough colour tones to cover the colour space adequately.

The primary aim of the present work was to improve the currently used methodology for the conversion of RGB colour units to L*a*b* colour space. For this purpose, two innovative ideas were proposed and tested. First, 120 colour tones were generated to cover as many points of the colour space as possible, compared with the 32 colour tones used by Leon et al. (2006). Secondly, each colour was calibrated separately, whereas in previous research the colour space was calibrated simultaneously (Afshari-Jouybari & Farahnaky, 2011; Leon et al., 2006). Therefore, the RGB to L*a*b* transformation approach proposed in this study is more logical and more accurate. Four colours (red, yellow, green and black-white) were analysed separately because most foods are composed of these colours and their tones. Thirty tones of each colour were generated and printed on cards. The L*a*b* and RGB colour values of these cards were measured with a chroma meter and with the image acquisition system designed in this study, respectively. After the colour analysis, two polynomial models, linear and quadratic, were built for each colour. The predictive power of these models was analysed using statistical measures including the % mean absolute error (|ē|), the standard deviation of the % mean absolute error (σ), the Euclidean distance (Δe*ab), the average root mean square error and the coefficient of determination (R²). The appropriate model for each colour was determined based on the results of the statistical analysis.

2 Materials and methods

The work was carried out in five separate phases: i) a colour map was generated for four colours (red, yellow, green and black-white); ii) digital images of the colour map were taken with an image acquisition system specifically designed in this work; iii) the background of the digital images was separated; iv) the RGB values obtained from the digital images were converted to L*a*b*; and v) a validation process was performed using colour cards of each colour. A schematic diagram of the experimental design is shown in Figure 1. Details of the phases are explained in the following subsections.

Figure 1
Schematic diagram of the experimental design.

2.1 Colour map design and press

Thirty lighter and darker tones of each colour (red, yellow, green and black-white), giving a total of 120 colour cards (30 colour tones × 4 colours) of 9 × 9 cm, were systematically generated with the Adobe Photoshop (CS5.1) software (Adobe Systems Inc., San Jose, CA, USA) in order to transform the RGB values obtained from the image acquisition system to L*a*b* values. The thirty colour tones generated for each colour are shown in Figure 2. Twenty-four colour tones were used for calibration, and the remaining six were used for validation. The generated colour map, with a total of 120 colour cards, was printed on high-quality printing paper (350 g/m²), and the colour cards were stored in oxygen- and water-vapour-proof sealed bags to protect them against environmental conditions and direct sunlight, which could change the colour of the cards. An illustrative generation sketch follows Figure 2.

Figure 2
All generated colour tones for each colour.
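The colour tones themselves were generated manually in Adobe Photoshop. Purely as an illustration of the idea of systematically spanning lighter and darker tones of one base colour, the following Matlab sketch blends a hypothetical base red towards white and towards black in even steps; the base colour, the blending scheme and the spacing are assumptions and do not reproduce the authors' Photoshop procedure.

    % Illustrative sketch only (not the authors' Photoshop procedure):
    % generate 30 tones of a hypothetical base colour by blending towards
    % white (lighter tones) and towards black (darker tones).
    baseRGB = [200 30 30];                  % hypothetical base red, 8-bit RGB
    nTones  = 30;
    tones   = zeros(nTones, 3);
    for k = 1:nTones
        t = (k - 1) / (nTones - 1);         % 0 = lightest, 1 = darkest
        if t < 0.5
            w = 1 - 2*t;                    % blend fraction towards white
            tones(k, :) = (1 - w)*baseRGB + w*[255 255 255];
        else
            w = 2*t - 1;                    % blend fraction towards black
            tones(k, :) = (1 - w)*baseRGB;
        end
    end
    tones = round(tones);                   % 30 x 3 matrix of RGB tones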

2.2 Image acquisition system

The digital image acquisition system (Figure 3) designed and used in this study consists of four main components: a black box, a light source, a digital camera and image processing software. Details of these components are explained below:

Figure 3
Digital image acquisition system.
  • Black box: The black box designed for the colour measurements was constructed from medium-density fibreboard (MDF), with dimensions of 1 m × 1 m × 0.6 m. All the interior surfaces of the box were covered with a matt black cloth to prevent light reflections inside the box and to block the ingress of any outside light (Sáenz et al., 2005).

  • Light source: Illumination was provided by four fluorescent lamps (Philips MASTER TL-D 90 Graphica); each lamp is 60 cm long and rated at 18 W, with a colour temperature of 6500 K and a colour-rendering index (Ra) greater than 95. The four lamps were mounted vertically in a square arrangement at a distance of 45 cm from the samples so as to obtain a uniform light intensity over the samples. The lamps were used together with their commercial luminaires (Philips TCW060), fitted with an electronic ballast (to prevent the stroboscopic effect) and a plastic light diffuser, in order to provide constant and uniform illumination conditions.

  • Digital camera: A Nikon D3000, a 10.2-megapixel DX-format DSLR with a Nikon F mount (Nikon Corporation, Tokyo, Japan), was used to capture the digital images. The camera was placed at a distance of 20 cm from the samples using the elevation system built into the box (Figure 3). Before taking digital images, the white balance of the camera was calibrated using a 20 × 25 cm gray card with 18% reflectance (GC1890, Danes Picta Company, Praha 3, Czech Republic). The digital images were taken in the camera's manual mode. The camera settings used in the colour measurements are summarized in Table 1.

    Table 1
    Settings of the digital camera.

  • Image processing software and hardware: The digital camera was connected to a personal computer (Asus, Intel i7-2600) via a USB cable using remote-control software designed for the Nikon D3000 (SM Tether, v1.5). The angle between the axis of the lens and the sources of illumination was set to 45° (Leon et al., 2006). In order to stabilise the illumination, the light source was switched on at least 30 minutes before the colour measurements (Valous et al., 2009). Image processing, colour analysis and the conversion of RGB colour units to L*a*b* colour space were carried out using the Matlab 7.12.0 (R2011a) software (MathWorks Inc., Natick, MA, USA).

2.3 Image segmentation

All the interior surfaces of the image acquisition system were covered with a matt black cloth so that the background of the digital images was entirely black. However, before converting from the RGB colour system to L*a*b*, the region of interest must be separated from the black background. This separation process is known as image segmentation. First, the digital images were read from the graphics files using the imread command of the Matlab software. Then, the images were filtered with a Gaussian low-pass filter (3 × 3 kernel, sigma 0.5), which pre-smooths noisy images (Mendoza & Aguilera, 2004; Mendoza et al., 2006). The filtered colour images were converted to grayscale intensity images using the rgb2gray command, and image segmentation was performed using the graythresh command of the Matlab software. The segmented images are binary images, so every pixel takes one of two values: pixels with a value of 0 represent the black background, and pixels with a value of 1 represent the object in white. A minimal sketch of these steps is given below.
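The following Matlab sketch strings the segmentation steps described above together. The file name is a placeholder, and the use of fspecial/imfilter for the Gaussian low-pass filter and im2bw for the binarisation is an assumption about the exact commands, since the text only names imread, rgb2gray and graythresh explicitly.

    % Sketch of the segmentation pipeline (file name is hypothetical).
    rgb   = imread('colour_card.jpg');           % read the digital image
    h     = fspecial('gaussian', [3 3], 0.5);    % 3x3 Gaussian low-pass filter, sigma 0.5
    rgbF  = imfilter(rgb, h, 'replicate');       % pre-smooth the noisy image
    gray  = rgb2gray(rgbF);                      % grayscale intensity image
    level = graythresh(gray);                    % Otsu threshold on the grayscale image
    bw    = im2bw(gray, level);                  % binary image: 0 = background, 1 = card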

2.4 Converting RGB colour units to L*a*b* colour space

Digital images of each card in red, green, yellow and black-white, with 30 colour tones per colour, were taken, and each colour card was divided into 5 zones as shown in Figure 4. One L*a*b* colour measurement was taken in each of the 5 zones using a Konica Minolta Chroma Meter (model CR-400, Konica Minolta Inc., Tokyo, Japan). Prior to the colour measurements, the Chroma Meter was calibrated with its white calibration tile (Y = 86.6, x = 0.3188, y = 0.3364). The average of these measurements, representing the whole card, was coded as image 0-0, and the L*a*b* measurements taken from the first, second, third and fourth zones were coded as image 0-1, image 0-2, image 0-3 and image 0-4, respectively. The RGB values of the corresponding zones were calculated with the Matlab software as the mean of all pixels in each zone (a sketch follows Figure 4). The calculated RGB values of these five zones were given the same image codes as the corresponding L*a*b* measurements.

Figure 4
Zones of the colour cards.
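A short Matlab sketch of the zone-averaging step is given below. It assumes the binary mask bw and the image rgb from the segmentation sketch, together with a hypothetical label image zoneMask that marks the five zones of Figure 4 with the values 1 to 5; the exact zone geometry is not reproduced here.

    % Mean RGB value of each of the 5 zones of a card (zoneMask is hypothetical).
    zoneRGB = zeros(5, 3);
    for z = 1:5
        idx = (zoneMask == z) & bw;              % pixels of zone z lying on the card
        for c = 1:3
            chan          = double(rgb(:, :, c));
            zoneRGB(z, c) = mean(chan(idx));     % mean of the R, G or B channel in zone z
        end
    end
    cardRGB = mean(zoneRGB, 1);                  % whole-card value, analogous to "image 0-0"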

Prior to the construction of the linear and quadratic models, the colour cards were separated into two groups. For each colour, the first group of 24 cards (24 cards × 5 zones = 120 colour values) was used to calibrate and construct the models, and the remaining 6 cards (6 cards × 5 zones = 30 colour values) were used to validate them. In the calibration step, the linear and quadratic functions given in Equations 1 and 2, respectively, were used to convert the RGB colour units to L*a*b* colour space.

\[
\begin{bmatrix} \hat{L}^{*} \\ \hat{a}^{*} \\ \hat{b}^{*} \end{bmatrix}
=
\begin{bmatrix} P_{11} & P_{12} & P_{13} \\ P_{21} & P_{22} & P_{23} \\ P_{31} & P_{32} & P_{33} \end{bmatrix}
\begin{bmatrix} R \\ G \\ B \end{bmatrix}
\tag{1}
\]
\[
\begin{bmatrix} \hat{L}^{*} \\ \hat{a}^{*} \\ \hat{b}^{*} \end{bmatrix}
=
\begin{bmatrix}
P_{11} & P_{12} & P_{13} & P_{14} & P_{15} & P_{16} & P_{17} & P_{18} & P_{19} \\
P_{21} & P_{22} & P_{23} & P_{24} & P_{25} & P_{26} & P_{27} & P_{28} & P_{29} \\
P_{31} & P_{32} & P_{33} & P_{34} & P_{35} & P_{36} & P_{37} & P_{38} & P_{39}
\end{bmatrix}
\begin{bmatrix} R \\ G \\ B \\ R^{2} \\ G^{2} \\ B^{2} \\ RG \\ RB \\ GB \end{bmatrix}
\tag{2}
\]

In these equations, R, G and B are the digital colour variables of the samples; $\hat{L}^*$, $\hat{a}^*$ and $\hat{b}^*$ are the estimated L*, a* and b* colour variables; and P11 to P39 are the coefficients, which were calculated with the fminsearch command of the Matlab software, an iterative, derivative-free simplex minimisation routine. A minimal calibration sketch is given below.
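As a minimal sketch of how the coefficients of Equation 2 can be estimated with fminsearch, the code below minimises the sum of squared differences between the chroma-meter L*a*b* values and the model predictions over the 120 calibration values of one colour. The variable names and the sum-of-squared-errors objective are assumptions; the paper does not state the exact cost function passed to fminsearch.

    % RGBcal: 120 x 3 zone-mean RGB values of the 24 calibration cards of one colour
    % LABcal: 120 x 3 corresponding chroma-meter L*a*b* values
    R = RGBcal(:, 1);  G = RGBcal(:, 2);  B = RGBcal(:, 3);
    X = [R, G, B, R.^2, G.^2, B.^2, R.*G, R.*B, G.*B];         % quadratic terms of Equation 2

    sse  = @(p) sum(sum((LABcal - X * reshape(p, 9, 3)).^2));  % sum-of-squared-errors cost
    opts = optimset('MaxFunEvals', 1e5, 'MaxIter', 1e5);
    p    = fminsearch(sse, zeros(27, 1), opts);                % iterative simplex search
    P    = reshape(p, 9, 3);                                   % coefficients (stored transposed
                                                               % relative to Equation 2)
    LABhat = X * P;                                            % predicted L*, a*, b* values

For the linear model of Equation 1, the design matrix reduces to [R, G, B] and nine coefficients are fitted in the same way.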

2.5 Validation process and statistical analysis

After the calibration, Equations 3 to 12 were used for validation. The mean normalized errors (eL, ea and eb) for the L*, a* and b* variables were calculated using Equations 3, 4 and 5, respectively.

\[ e_L = \frac{1}{n}\sum_{i=1}^{n}\left|\frac{L_i^{*}-\hat{L}_i^{*}}{\Delta L}\right| \tag{3} \]
\[ e_a = \frac{1}{n}\sum_{i=1}^{n}\left|\frac{a_i^{*}-\hat{a}_i^{*}}{\Delta a}\right| \tag{4} \]
\[ e_b = \frac{1}{n}\sum_{i=1}^{n}\left|\frac{b_i^{*}-\hat{b}_i^{*}}{\Delta b}\right| \tag{5} \]

where $\hat{L}^*$, $\hat{a}^*$ and $\hat{b}^*$ are the colour values estimated by the model, and n is the number of measurements. L* values range from 0 to 100, and a* and b* values range from -120 to +120, so ΔL = 100 and Δa = Δb = 240.

The percent (%) mean absolute error (|ē|) was calculated with Equation 6, and the standard deviation (σ) of the % mean absolute error was determined with Equation 7:

\[ |\bar{e}| = \frac{e_L + e_a + e_b}{3}\times 100 \tag{6} \]
\[ \sigma = \sqrt{\frac{1}{n}\sum_{i=1}^{n}\left(|\bar{e}|_i - |\bar{e}|\right)^{2}} \tag{7} \]

The Euclidean distance (Δe*ab) between the measured and estimated L*, a* and b* values was given by Equation 8:

\[ \Delta e_{ab}^{*} = \frac{1}{n}\sum_{i=1}^{n}\sqrt{\left(L_i^{*}-\hat{L}_i^{*}\right)^{2}+\left(a_i^{*}-\hat{a}_i^{*}\right)^{2}+\left(b_i^{*}-\hat{b}_i^{*}\right)^{2}} \tag{8} \]

The root mean square errors (RMSEL, RMSEa and RMSEb) for the L*, a* and b* variables were calculated using Equations 9 to 11, and the average root mean square error was obtained with Equation 12 (a computational sketch follows the equations).

\[ RMSE_L = \sqrt{\frac{\sum_{i=1}^{n}\left(L_i^{*}-\hat{L}_i^{*}\right)^{2}}{n}} \tag{9} \]
\[ RMSE_a = \sqrt{\frac{\sum_{i=1}^{n}\left(a_i^{*}-\hat{a}_i^{*}\right)^{2}}{n}} \tag{10} \]
\[ RMSE_b = \sqrt{\frac{\sum_{i=1}^{n}\left(b_i^{*}-\hat{b}_i^{*}\right)^{2}}{n}} \tag{11} \]
\[ \overline{RMSE} = \frac{RMSE_L + RMSE_a + RMSE_b}{3} \tag{12} \]
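The validation metrics of Equations 3 to 12 can be computed directly from the measured and predicted values, as in the Matlab sketch below. LABval and LABhat are assumed to be n × 3 matrices holding the chroma-meter values and the model predictions for the validation zones; the variable names are assumptions.

    % Validation metrics of Equations 3-12 (variable names are assumptions).
    n    = size(LABval, 1);
    err  = LABval - LABhat;                              % per-sample errors in L*, a*, b*
    dLab = [100, 240, 240];                              % ranges: DeltaL = 100, Deltaa = Deltab = 240

    eLab  = mean(abs(err) ./ repmat(dLab, n, 1), 1);     % Equations 3-5: eL, ea, eb
    eMean = mean(eLab) * 100;                            % Equation 6: % mean absolute error

    ePerSample = mean(abs(err) ./ repmat(dLab, n, 1), 2) * 100;
    sigma      = sqrt(mean((ePerSample - eMean).^2));    % Equation 7: std of the % error

    deltaE = mean(sqrt(sum(err.^2, 2)));                 % Equation 8: mean Euclidean distance

    rmse   = sqrt(mean(err.^2, 1));                      % Equations 9-11: RMSE_L, RMSE_a, RMSE_b
    rmseAv = mean(rmse);                                 % Equation 12: average RMSE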

3 Results and discussion

Calibration and validation for each colour (red, yellow, green and black-white) and each model (linear and quadratic) were carried out separately by calculating the % mean absolute error (|ē|), the standard deviation of the % mean absolute error (σ), the Euclidean distance (Δe*ab) and the average root mean square error (Table 2). As can be seen from this table, the quadratic models gave the best prediction performance for all the colours. In calibration, the quadratic models gave |ē|, σ, Δe*ab and average RMSE values lower than 0.74, 1.30, 3.23 and 1.84, respectively; in validation, the corresponding values were lower than 0.58, 0.85, 2.39 and 1.94. This clearly shows that the quadratic models provide reliable prediction performance for the transformation of RGB colour units to L*a*b* colour space for each colour.

Table 2
Results obtained from calibration and validation processes for the colour cards.

Leon et al. (2006) used a quadratic model to transform RGB colour units to L*a*b* colour space and reported |ē| values of 1.22% for calibration and 1.26% for validation. Afshari-Jouybari & Farahnaky (2011) also used a quadratic model, with an |ē| value of 2.9%, and thus did not improve on the quadratic model of Leon et al. (2006) for the transformation of RGB colour units to L*a*b* colour space. Both groups also used neural network models: the neural network model of Leon et al. (2006) had an |ē| value of 0.93%, while that of Afshari-Jouybari & Farahnaky (2011) gave an |ē| value of 1.3%. The |ē| values of the quadratic models used in this study were much smaller than the |ē| values of these neural network models, indicating that the quadratic model developed here performs better than even the neural-network-based models.

Leon et al. (2006) used the quadratic model to transform RGB colour units to L*a*b* colour space and found σ values of 1.42 for calibration and 1.62 for validation. The same quadratic model was used in our work, and the σ value was substantially improved both in calibration (σ ≤ 1.30) and in validation (σ ≤ 0.85). This means that our model consistently predicts L*a*b* values from RGB colour units more precisely than that of Leon et al. (2006).

Afshari-Jouybari & Farahnaky (2011) used quadratic and neural network models to transform RGB colour units to L*a*b* colour space and found average RMSE values of 3.262 for the quadratic model and 1.973 for the neural network model. In our work, the average RMSE values of the quadratic model were below 1.84 for calibration and below 1.32 for validation. These values are much smaller than those found by Afshari-Jouybari & Farahnaky (2011) for the quadratic model, and even for the neural network model, which shows that the prediction performance of our quadratic model is better than that of the neural-network-based models.

The Δe*ab value between two colours corresponds to the colour difference perceived by the human eye (Leon et al., 2006). If this value is high, the colour difference is easily perceived; if it is small, the human eye has difficulty distinguishing the two colours. Therefore, the Δe*ab values were calculated for each colour to quantify the error between the predicted and measured L*a*b* values. Valous et al. (2009) used various polynomial models to transform RGB colour units to L*a*b* colour space; in their work, the Δe*ab values ranged from 4 to 4.6 for polynomial models composed of quadratic and cubic terms. In our work, all the Δe*ab values for the quadratic models were lower than 3.23 for calibration and 2.39 for validation. Thus, the quadratic model used in this work has a smaller error than even the polynomial models with quadratic and cubic terms used by Valous et al. (2009).

Measured L*a*b* values were plotted against estimated L*a*b* values, and R² values together with the root mean square errors (RMSEL, RMSEa and RMSEb) for the L*, a* and b* variables were calculated to assess the relationship between measured and predicted values for the colour cards (Figure 5). The measured and predicted L*, a* and b* values yielded R² values higher than 0.9944 and RMSE values smaller than 1.457, indicating that the quadratic model has very high prediction performance for each of the L*, a* and b* variables. For the quadratic model, the R² values found by Leon et al. (2006) for the L*, a* and b* variables were 0.9843, 0.9851 and 0.9910, respectively, while Afshari-Jouybari & Farahnaky (2011) found R² values of 0.9841, 0.9663 and 0.9742. All the R² values reported in these two studies were lower than the R² values found in this work for the L*, a* and b* variables. The higher R² and smaller RMSE values show a considerable improvement in the prediction performance of the quadratic models used here for the transformation of RGB colour units to L*a*b* colour space compared with the models of Leon et al. (2006) and Afshari-Jouybari & Farahnaky (2011).

Figure 5
Relationship between measured and predicted (a) L*; (b) a*; and (c) b* values for the colour cards, respectively.

Barbin et al. (2016) used a computer vision system for the non-destructive determination of the colour parameters of chicken and compared measured L*a*b* values with predicted L*a*b* values to assess chicken quality rapidly. They worked with poultry meat, which has non-flat and non-homogeneous surfaces that cause strong bright spots and shadows; their approach therefore included an illumination normalisation step to reduce these effects. In our work, the colour values of foods with surfaces that do not cause such brightness and shadow effects were measured, so an illumination normalisation step was not needed. To convert RGB colour units to L*a*b* colour space, Barbin et al. (2016) used a two-step conversion model (RGB → XYZ → L*a*b*): they first converted RGB values to XYZ and then XYZ to L*a*b*. In contrast to this conversion method, a quadratic model was used in the present work to convert RGB colour units directly to L*a*b* colour space.

4 Conclusions

In this study, a computer vision system was developed to measure the colour of foods as an alternative to conventional colorimeters. The data obtained by the computer vision system in the form of RGB colour units were transformed to the conventional L*a*b* colour space using linear and quadratic models. The quadratic model was found to be the appropriate model, giving small error values. The novel approach, in which more colour tones were used and the colours were calibrated independently of each other, reduced the error and significantly improved the prediction capability of the quadratic model.

Acknowledgements

This work was financially supported by Gebze Technical University through Scientific Research Project (BAP) 2014 A-25. Fatih Tarlak thanks The Scientific and Technological Research Council of Turkey (TUBITAK) for granting a PhD scholarship (2211-C), and also thanks Ozgun Yucel (M.Sc.) for his help during this research.

  • Practical Application: The methodological approach proposed in this study could be used to easily convert the RGB colour units to L*a*b* colour space.

References

  • Afshari-Jouybari, H., & Farahnaky, A. (2011). Evaluation of Photoshop software potential for food colorimetry. Journal of Food Engineering, 106(2), 170-175. http://dx.doi.org/10.1016/j.jfoodeng.2011.02.034
  • Barbin, D. F., Mastelini, S. M., Barbon, S. Jr, Campos, G. F. C., Barbon, A. P. A. C., & Shimokomaki, M. (2016). Digital image analyses as an alternative tool for chicken quality assessment. Biosystems Engineering, 144(16), 85-93. http://dx.doi.org/10.1016/j.biosystemseng.2016.01.015
  • Foca, G., Masino, F., Antonelli, A., & Ulrici, A. (2011). Prediction of compositional and sensory characteristics using RGB digital images and multivariate calibration techniques. Analytica Chimica Acta, 706(2), 238-245. http://dx.doi.org/10.1016/j.aca.2011.08.046. PMid:22023857.
  • Forsyth, D. A., & Ponce, J. (2012). Computer vision: a modern approach (2nd ed.). Boston: Pearson.
  • Leon, K., Mery, D., Pedreschi, F., & Leon, J. (2006). Color measurement in L*a*b* units from RGB digital images. Food Research International, 39(10), 1084-1091. http://dx.doi.org/10.1016/j.foodres.2006.03.006
  • MacDougall, D. B. (2002). Colour measurement of food: principles and practice. In D. B. MacDougall (Ed.), Colour in food (pp. 33-63). Abington: Woodhead Publishing. http://dx.doi.org/10.1533/9781855736672.1.33
  • Mendoza, F., & Aguilera, J. M. (2004). Application of image analysis for classification of ripening bananas. Journal of Food Science, 69(9), 471-477. http://dx.doi.org/10.1111/j.1365-2621.2004.tb09932.x
  • Mendoza, F., Dejmek, P., & Aguilera, J. M. (2006). Calibrated color measurements of agricultural foods using image analysis. Postharvest Biology and Technology, 41(3), 285-295. http://dx.doi.org/10.1016/j.postharvbio.2006.04.004
  • Pedreschi, F., Mery, D., & Marique, T. (2008). Quality evaluation and control of potato chips and French fries. In D.-W. Sun (Ed.), Computer vision technology for food quality evaluation (pp. 545-566). Amsterdam: Academic Press. http://dx.doi.org/10.1016/B978-012373642-0.50025-9
  • Sáenz, C., Hernández, B., Beriain, M. J., & Lizaso, G. (2005). Meat color in retail displays with fluorescent illumination. Color Research and Application, 30(4), 304-311. http://dx.doi.org/10.1002/col.20123
  • Ulrici, A., Foca, G., Ielo, M. C., Volpelli, L. A., & Lo Fiego, D. P. (2012). Automated identification and visualization of food defects using RGB imaging: application to the detection of red skin defect of raw hams. Innovative Food Science & Emerging Technologies, 16, 417-426. http://dx.doi.org/10.1016/j.ifset.2012.09.008
  • Valous, N. A., Mendoza, F., Sun, D.-W., & Allen, P. (2009). Colour calibration of a laboratory computer vision system for quality evaluation of pre-sliced hams. Meat Science, 81(1), 132-141. http://dx.doi.org/10.1016/j.meatsci.2008.07.009. PMid:22063973.
  • Wu, D., & Sun, D.-W. (2013). Colour measurements by computer vision for food quality control: a review. Trends in Food Science & Technology, 29(1), 5-20. http://dx.doi.org/10.1016/j.tifs.2012.08.004
  • Wyszecki, G., & Stiles, W. S. (2000). Color science: concepts and methods, quantitative data and formulae (2nd ed.). New York: Wiley.

Publication Dates

  • Publication in this collection
    31 May 2016
  • Date of issue
    Apr-Jun 2016

History

  • Received
    28 Jan 2016
  • Accepted
    09 May 2016