Print version ISSN 0100-8358
Planta daninha vol.28 no.2 Viçosa Apr./June 2010
LITERATURE REVIEW
Augmented reality systems for weed economic thresholds applications
Vidal, N.R.I; Vidal, R.A.II
I Computer Engineer, M.Sc., University of Western Ontario, London, Canada, <Nick@iss.im>
II Agronomist, M.Sc., Ph.D., Professor, Universidade Federal do Rio Grande do Sul, CNPq Researcher, Porto Alegre, RS, Brazil, <Ribas.firstname.lastname@example.org>
Augmented reality (AR) technology has applications in fields as diverse as aeronautics, tourism, medicine, and education. This review summarizes the current status of AR and proposes a new application for it in weed science. The basic algorithmic elements for AR implementation are already available to develop applications in the area of weed economic thresholds. These include image-recognition algorithms to identify and quantify weeds by species, and software for herbicide selection based on weed density. Likewise, all the hardware necessary for AR implementation in weed science is available at a price affordable to the user. Thus, the authors propose that weed science can take a leading role in integrating AR systems into weed economic threshold software, providing better opportunities for science- and computer-based weed control decisions.
Keywords: augmented reality, weed economic thresholds, software, neural algorithms, image recognition.
Weed density may be evenly distributed across a wide area, a condition that requires weed control over the whole field. Advances in information technology and the reduction of computer costs (Sharma et al., 1998) have allowed the development of software to mediate the appropriate weed control decision according to the situation in the area (Bennett et al., 2003). Augmented reality (AR) is the expansion of the real world by inserting virtual elements into it (Sharma et al., 1998; Feiner et al., 1993). In agriculture, the real world is a farmer's field, whereas the virtual elements are images rendered by a computer. Overlaying these virtual elements onto the real world creates a mixed environment (Feiner et al., 1993). The overlay can be accomplished through monitors or projectors (Tatham, 1999; Lourakis & Argyros, 2005). AR technology can be the next development integrated into software that supports weed control decisions. The objectives of this literature review are to synthesize the current status of AR and to propose a new application of it in weed science.
There is a strong focus on visual applications in AR (Uenohara & Kanade, 1995; Klein & Murray, 2010), but AR can also involve other senses, such as hearing. The insertion of a voice recording into a real environment, for example, can be classified as AR (Miner & Caudell, 1998).
An important feature of AR is the level of integration between the real and the virtual. Listening to the radio or watching television, for example, is not AR; but following the instructions of a car navigation system is a good example of AR, because that system is embedded in the actual context of the user (Pundt & Brinkkotter-Runde, 2000).
Three components are necessary to achieve a good level of integration between the real and the virtual world: the input data, or the monitoring of the real world; the base processing of the input against a realistic model of the world; and the output data, or the result of the processing and its integration into the real world (Hollerer et al., 2001) (Figure 1).
For the data input, a series of sensors monitoring the real world is necessary. A car navigation system, for example, needs a GPS to know the exact location of the car and an accelerometer to know its speed. The base system receives the input and processes it using a model of the real world; for the navigation system, this model is a map containing detailed information about the streets and the location of the vehicle. If the navigation system contains an outdated model of the real world and tells the driver to move forward in the wrong lane, its level of integration with the real world is questionable. It is therefore essential to have a fairly accurate model of the real world, which often requires integrating real-time data from different sources.
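The three components can be sketched in code. The fragment below is illustrative only: the class names, the toy map of points of interest, and the coordinate-rounding lookup are invented for this example and do not correspond to any real navigation system.

```python
from dataclasses import dataclass

@dataclass
class SensorReading:
    """Input component: what the sensors report about the real world."""
    latitude: float    # from GPS
    longitude: float
    speed_kmh: float   # e.g., derived from accelerometer data

class WorldModel:
    """Processing component: a toy 'map' keyed by rounded coordinates."""
    def __init__(self, points_of_interest):
        self.poi = points_of_interest

    def lookup(self, lat, lon):
        return self.poi.get((round(lat, 2), round(lon, 2)), [])

def render_overlay(reading: SensorReading, model: WorldModel) -> str:
    """Output component: merge virtual annotations with the user's position."""
    nearby = model.lookup(reading.latitude, reading.longitude)
    labels = ", ".join(nearby) if nearby else "no annotations"
    return f"[{reading.speed_kmh:.0f} km/h] {labels}"

model = WorldModel({(-30.03, -51.23): ["Restaurant", "Gas station"]})
reading = SensorReading(-30.0312, -51.2287, 60.0)
print(render_overlay(reading, model))  # → [60 km/h] Restaurant, Gas station
```

If the `WorldModel` is outdated, the output step faithfully renders wrong annotations, which is exactly the integration failure described above.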
For the presentation of the output data, several technological artifacts are necessary. The car navigation system would require either a small monitor containing a dynamic map, a windshield displaying semi-transparent images, or a pair of contact lenses projecting images into the eyes of the driver (Bojda & Frantis, 2009). A driver in a car equipped with a windshield display will have a totally different experience from one equipped with a small monitor: the windshield display places the output information directly in the driver's field of vision, whereas drivers with a small monitor must divert their attention every time they want to view more information.
The histories of AR and virtual reality share similar origins and concepts, and many prototypes of both technologies have evolved since the Second World War. But the terms virtual reality (Lanier, 1990; Lanier & Biocca, 1992) and augmented reality (Caudell & Mizell, 1992) were coined only in the last decade of the twentieth century. Both virtual reality and AR were defined (Milgram & Kishino, 1994) on a continuum between the real and the virtual world, with AR lying between the two (Figure 2).
Milestones in AR and virtual reality are linked through the technological artifacts that materialized the virtual output. These include the creation of head-up displays by the British air force in World War II (Sadjadi et al., 1996) and the creation of head-mounted displays (Sutherland, 1966). The first head-up display projected the radar image onto the windshield of the plane, allowing the pilot to visualize other planes ahead even at night (Sadjadi et al., 1996). Today the head-up display is quite common in commercial aircraft and is slowly entering the car market. The first head-mounted display, in turn, consisted of a heavy helmet containing a monitor that showed the user a virtual environment that changed according to the orientation of the head (Sutherland, 1966). Today the helmet has given way to glasses, and in the future it is likely to be replaced by contact lenses (Sheedy & Bergstrom, 2002).
Although head-up and head-mounted displays have decreased markedly in size and price, they remain out of reach of the vast majority of the population. Recently, however, AR has gained opportunities for new uses thanks to the development of mobile devices such as cell phones. Modern cell phones come equipped with a display, camera, GPS, and accelerometer, which allows the creation of applications in many areas of research (Rohs et al., 2009). The potential uses of AR increase further when the cell phone is connected to the Internet.
For instance, when the user holds the cell phone with the screen facing himself and the camera facing the landscape ahead, AR technology can show information about that landscape on the display. This information is fetched from the Internet according to the position indicated by the phone's GPS. As the user turns 360 degrees, the information is updated according to the viewer's orientation with the help of the accelerometer. The information gathered and displayed can vary depending on the selections made by the user: tourist information, prices of property for sale, or the location of nearby restaurants. Much of this information is already available on the Internet and can be accessed through Google Earth® or Google Maps®, for instance.
The drawback of using a cell phone is that integration with the real world is limited, because the virtual information is presented on a very small screen. Glasses or contact lenses can exploit the full field of vision of the user. However, many technological challenges remain, among them the energy required to operate such a small system and the difficulty of focusing on an image displayed less than 10 cm from the eyes (Sheedy & Bergstrom, 2002; Morris & Parvis, 2008).
Potential uses of Augmented Reality in weed economic thresholds
The potential use of AR with weed economic thresholds is very encouraging, because this technology could greatly facilitate the agronomist's weed control decisions. All the basic requirements, including algorithms, software, and hardware, are available for the development of an AR support system.
With the help of a cell phone, the agronomist may point the camera at the sampling area of the field to visualize the infestation (Figure 3). The image can be stored at the base computer or processed in real time for an instant decision about the weed control method. Several processing steps can be performed on the image: through image-recognition software, weed species can be identified and weed density by species can be determined. Depending on the type and quantity of weeds, a software system similar to HADSS (Herbicide Application Decision Support System; Bennett et al., 2003) can then suggest herbicide options according to profit maximization. Environmental data, as well as crop grain prices and herbicide costs, can be collected automatically from the internet. The herbicide suggestions can go much further than a mere list: they may help the agronomist literally foresee the impact of each decision on the weed infestation at different times in the future, including its effect on crop yield (as in HADSS), on the next season's weed seed bank, and on the next year's herbicide rotation needs.
The following paragraphs detail each step of Figure 3, divided into three groups: data input, or the monitoring of the field; base processing of the input using a realistic model of the field, including data integration from other databases; and the output, or the result of the processing and the feedback to the agronomist.
Data input can begin at crop emergence, when scouting is performed to monitor the field. This scouting can be a random assessment made by a person, or it can be done remotely using cameras attached to unmanned aerial vehicles. This initial monitoring evaluates the homogeneity and geographic spread of the infestation.
While unmanned aerial vehicles are today generally used for military purposes, as their size and price decrease they can be used in agriculture. There are already prototypes the size of an insect. They will be especially useful for monitoring, since they offer a bird's-eye view with much greater detail than that provided by satellites. In addition to cameras, these vehicles may be equipped with other sensors to collect data such as temperature, relative humidity, and soil moisture. Alternatively, the sensors may be spread through the field, sending data to a central database in real time or storing them for future use.
Scouting can also be performed by a person using the cell phone camera to collect several samples, shot from various angles and distances. The GPS in the cell phone records the position of each sample, and the images can be sent through the Internet for processing at the base computer or processed locally to provide real-time estimates of weed density per species in the area.
This processing by image-recognition software would simplify today's procedure, because the scouter would no longer need to count weeds manually. What is usually done in practice today is a rough estimate based on weed identification and counts taken randomly from a limited number of samples in the area. Computer-mediated scouting increases accuracy because a large number of samples can be processed in a short period of time.
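The aggregation step behind such an estimate is simple: sum the per-species counts over all sample frames and divide by the total scouted area. The sketch below is illustrative; the species names, counts, and frame size are invented sample data.

```python
def weed_density(samples, frame_area_m2):
    """Average per-species counts from many sample frames into
    a density (plants per m2).

    samples: list of dicts mapping species name -> count in one frame.
    """
    totals = {}
    for frame in samples:
        for species, count in frame.items():
            totals[species] = totals.get(species, 0) + count
    scouted_area = frame_area_m2 * len(samples)  # total area examined
    return {sp: n / scouted_area for sp, n in totals.items()}

# Three invented 0.25 m2 frames photographed by the scouter:
samples = [{"Bidens pilosa": 3, "Digitaria": 5},
           {"Bidens pilosa": 1},
           {"Digitaria": 4}]
print(weed_density(samples, frame_area_m2=0.25))
```

With many automatically processed frames, the same arithmetic yields a far more reliable density estimate than a handful of manual counts.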
Image recognition, or pattern recognition, is an area of information technology that includes several algorithms and techniques (Astrand & Baerveldt, 2002; Brown & Noble, 2005; Hague et al., 2006). One of these techniques is neural networks (Figure 4), whose development was inspired by the human brain. As each species has a characteristic foliage type and shape, a neural network can be developed to identify these characteristics and thus classify the different weeds.
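A minimal sketch of that idea follows, assuming two invented leaf-shape descriptors as input features. Real weed-recognition systems train multi-layer networks on image data; this single trainable unit (a logistic neuron fitted by gradient descent) only illustrates the learning principle, and all feature values are fabricated.

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def train(data, epochs=2000, lr=0.5):
    """Fit one logistic neuron by stochastic gradient descent on log-loss."""
    w = [0.0, 0.0]
    b = 0.0
    for _ in range(epochs):
        for (x1, x2), label in data:
            p = sigmoid(w[0] * x1 + w[1] * x2 + b)
            err = label - p          # gradient of the log-likelihood
            w[0] += lr * err * x1
            w[1] += lr * err * x2
            b += lr * err
    return w, b

# Invented features: (leaf length/width ratio, margin roughness)
# label 1 = grass-like weed, label 0 = broadleaf weed
data = [((8.0, 0.1), 1), ((7.5, 0.2), 1), ((1.2, 0.8), 0), ((1.0, 0.9), 0)]
w, b = train(data)

def classify(x1, x2):
    return int(sigmoid(w[0] * x1 + w[1] * x2 + b) > 0.5)

print(classify(7.0, 0.15))  # a narrow, smooth leaf is classified as grass (1)
```

The same training loop, with more units, layers, and pixel-derived features, underlies the weed classifiers cited above.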
Once the weeds are identified and quantified, software procedures similar to HADSS can list the control options for the user (Bennett et al., 2003). This type of software contains a database of the different herbicides and their efficacy against each weed species. Other information required by the system, such as soil moisture, herbicide costs, and crop grain prices, can be retrieved automatically from the internet to make the calculations more realistic. For instance, the weather forecast can be consulted to identify the herbicide whose best performance matches the expected environmental conditions.
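The profit-maximization step could look like the sketch below. It uses the rectangular-hyperbola yield-loss model common in the weed-competition literature, though not necessarily the exact model inside HADSS, and all efficacy and price figures are invented for illustration.

```python
def yield_loss_fraction(density, i=0.01, a=0.40):
    """Rectangular hyperbola: loss rises with weed density, saturating
    at a maximum loss fraction `a`; `i` is loss per plant at low density."""
    return (i * density) / (1.0 + i * density / a)

def best_treatment(density, treatments, attainable_value):
    """Pick the option maximizing crop value minus weed loss and cost.

    treatments: dict name -> (efficacy fraction, cost per ha).
    attainable_value: weed-free crop value per ha (e.g., from grain prices).
    """
    options = []
    for name, (efficacy, cost) in treatments.items():
        survivors = density * (1.0 - efficacy)
        loss = yield_loss_fraction(survivors) * attainable_value
        options.append((attainable_value - loss - cost, name))
    untreated = attainable_value * (1.0 - yield_loss_fraction(density))
    options.append((untreated, "no treatment"))
    return max(options)  # tuple comparison: highest net return wins

treatments = {"herbicide A": (0.95, 30.0), "herbicide B": (0.80, 12.0)}
net, choice = best_treatment(density=50.0, treatments=treatments,
                             attainable_value=1000.0)
print(choice, round(net, 2))  # → herbicide A 946.47
```

Live grain prices and herbicide costs fetched from the internet would simply refresh `attainable_value` and the cost entries before each run.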
The main feature of AR is the possibility of real-time processing, visualization, and selection of the control option directly in the field using the monitor. Before making the weed control decision, the user could simulate and visualize the consequences of each option for the weed seed bank and for the likelihood of future herbicide needs and costs. For instance, if one herbicide option allows a few plants of a species that is costly to control in next year's crop to survive, an alternative option could be selected.
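A toy projection of this reasoning, with invented values for seed production, emergence, and seed-bank decay, shows how small differences in control efficacy compound across seasons:

```python
def project_seed_bank(seed_bank, years, control_efficacy,
                      seeds_per_plant=500, emergence=0.05,
                      annual_decay=0.30):
    """Project a weed seed bank (seeds per m2) over several seasons.

    Each season a fraction of seeds emerges; escapes from control
    replenish the bank, while ungerminated seeds partly decay.
    All parameter values are illustrative, not measured.
    """
    history = [round(seed_bank)]
    for _ in range(years):
        emerged = seed_bank * emergence
        survivors = emerged * (1.0 - control_efficacy)
        new_seeds = survivors * seeds_per_plant
        seed_bank = (seed_bank - emerged) * (1.0 - annual_decay) + new_seeds
        history.append(round(seed_bank))
    return history

# 99% vs 90% control of the same infestation diverges sharply:
print(project_seed_bank(1000, 5, control_efficacy=0.99))
print(project_seed_bank(1000, 5, control_efficacy=0.90))
```

Rendering such trajectories on the field view is exactly the kind of "foreseeing" that the AR overlay could offer before the user commits to a treatment.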
The sophistication of the system is limited only by the imagination, and the AR technology could be integrated with other existing software. For instance, after the user selects a herbicide, the choice can be sent directly to the distributor through the internet; the distributor's software can receive this information and remove the item from stock. Another use arises when all the farmers in a region use the system. In this case, the collective data can serve at least four purposes. First, to forecast the crop production for the year. Second, to estimate the yield loss due to weeds, thereby demonstrating the need for funds, research, and better tools for weed control. Third, to help predict the weed seed bank for the next year and help industry predict herbicide demand for the next season. Fourth, to serve as a powerful marketing tool to evaluate current and future market share and to advertise herbicides directly to the farmer.
AR technology is already in use in sectors such as the military and aviation (Mizell, 2001; Bojda & Frantis, 2009), tourism (Miyashita et al., 2008), entertainment (Thomas et al., 2002; Cheok et al., 2004), education (Arvanitis et al., 2009), and medicine (Milgram et al., 1997; Oostema et al., 2008). In this paper, it is proposed that all the basic requirements are available for the development of AR systems in agriculture. Weed science can take a leading role in integrating AR systems into weed economic threshold software, thus providing better opportunities for science- and computer-based weed control decisions.
The authors thank CNPq for support, and Dr. Nelson Kruse (UFSM, Santa Maria, RS, Brazil) and Dr. Gordon Vail (USA) for suggestions on an early draft of this paper.
ARVANITIS, T. N. Human factors and qualitative pedagogical evaluation of a mobile augmented reality system for science education used by learners with physical disabilities. Personal Ubiquitous Comp., v. 13, n. 3, p. 243-250, 2009.
ASTRAND, B.; BAERVELDT, A. J. An agricultural mobile robot with vision-based perception for mechanical weed control. Autonomous Robots, v. 13, n. 1, p. 21-35, 2002.
BENNETT, A. C. et al. HADSS (TM), pocket HERB (TM), and WebHADSS (TM): Decision aids for field crops. Weed Technol., v. 17, n. 2, p. 412-420, 2003.
BOJDA, P.; FRANTIS, P. Multipurpose visualization system. IEEE Aerospace Electr. Syst. Magaz., v. 24, n. 4, p. 4-8, 2009.
BROWN, R. B.; NOBLE, S. D. Site-specific weed management: sensing requirements-what do we need to see? Weed Sci., v. 53, p. 252-258, 2005.
CAUDELL, T.; MIZELL, D. Augmented reality: an application of heads-up display technology to manual manufacturing processes. In: INTERNATIONAL CONFERENCE ON SYSTEM SCIENCES, Hawaii, 1992. Proceedings. Hawaii: 1992. p. 659-669.
CHEOK, A. D. et al. Human pacman: a mobile wide-area entertainment system based on physical, social, and ubiquitous computing. Personal Ubiquitous Comp., v. 8, n. 2, p. 71-81, 2004.
FEINER, S.; MACINTYRE, B.; SELIGMANN, D. Knowledge-based augmented reality. Comm. ACM, v. 36, n. 7, p. 52-63, 1993.
HAGUE, T. et al. Automated crop and weed monitoring in widely spaced cereals. Precision Agric., v. 7, n. 1, p. 21-32, 2006.
HOLLERER, T. et al. User interface management techniques for collaborative mobile augmented reality. Comp. Graphics-UK, v. 25, n. 5, p. 799-810, 2001.
KLEIN, G.; MURRAY, D. W. Simulating low-cost cameras for augmented reality compositing. IEEE Trans. Visualization Comp. Graphics, v. 18, n. 3, p. 369-380, 2010.
LANIER, J. Virtual realities. In: ASIS ANNUAL MEETING, 27., 1990, Toronto, ON. Proceedings. Toronto, ON: 1990. p. 360.
LANIER, J.; BIOCCA, F. An insider view of the future of virtual reality. J. Comm., v. 42, n. 4, p. 150-172, 1992.
LOURAKIS, M. I. A.; ARGYROS, A. A. Efficient, causal camera tracking in unprepared environments. Comp. Vision Image Understanding, v. 99, p. 259-290, 2005.
MILGRAM, P.; KISHINO, F. A taxonomy of mixed reality visual displays. IEEE Trans. Infor. Syst., v. E77D, n. 12, p. 1321-1329, 1994.
MILGRAM, P.; KIM, M.; DRAKE, J. Augmented reality tools for microsurgery. J. Sport Exercise Psychol., v. 19, p. s6, 1997.
MINER, N.; CAUDELL, T. Computational requirements and synchronization issues for virtual acoustic displays. Presence Teleop. Virtual Environ., v. 7, n. 4, p. 396-409, 1998.
MIZELL, D. Boeing's wire bundle assembly project. In: BARFIELD, W.; CAUDELL, T. (Eds.) Fundamentals of wearable computers and augmented reality. London: Lawrence Erlbaum, 2001. p. 447-467.
MIYASHITA, T. et al. An augmented reality museum guide. In: INTERNATIONAL SYMPOSIUM ON MIXED AND AUGMENTED REALITY, 2008, Cambridge, UK. Proceedings. Cambridge, UK: 2008. p. 103-106.
MORRIS, C. J.; PARVIS, B. A. Micro-scale metal contacts for capillary force-driven self-assembly. J. Micromech. Microeng., v. 18, n. 1, p. 1-10, 2008.
OOSTEMA, J. A.; ABDEL, M. P.; GOULD, J. C. Time-efficient laparoscopic skills assessment using an augmented reality simulator. Surg. Endoscopy Interv. Techn., v. 22, n. 12, p. 2621-2624, 2008.
PUNDT, H.; BRINKKOTTER-RUNDE, K. Visualization of spatial data for field based GIS. Comp. Geosci., v. 26, n. 1, p. 51-56, 2000.
ROHS, M. et al. Impact of item density on the utility of visual context in magic lens interactions. Personal Ubiquitous Comp., v. 13, n. 8, p. 633-646, 2009.
SADJADI, F. et al. Enhanced vision for adverse weather aircraft landing. Intern. J. Infrared Millimeter Waves, v. 17, n. 1, p. 1-50, 1996.
SHARMA, R.; PAVLOVIC, V. I.; HUANG, T. S. Toward multimodal human-computer interface. Proc. IEEE, v. 86, n. 5, p. 853-869, 1998.
SHEEDY, J.; BERGSTROM, N. Performance and comfort on near-eye computer displays. Optometry Vision Sci., v. 79, n. 5, p. 306-312, 2002.
SUTHERLAND, I. E. Computer inputs and outputs. Sci. Am., v. 215, n. 3, p. 86-88, 1966.
TATHAM, E. W. Getting the best of both real and virtual worlds. Comm. ACM, v. 42, n. 9, p. 96-98, 1999.
THOMAS, B. et al. First person indoor/outdoor augmented reality application: ARQuake. Personal Ubiquitous Comp., v. 6, n. 1, p. 75-86, 2002.
UENOHARA, M.; KANADE, T. Vision-based object registration for real-time image overlay. Comp. Biol. Medicine, v. 25, n. 2, p. 249-260, 1995.
Received for publication on 10.2.2010 and in revised form on 15.6.2010.