
DESIGN OF A CONTROL SYSTEM FOR A SAFFLOWER PICKING ROBOT AND RESEARCH ON MULTISENSOR FUSION POSITIONING

ABSTRACT

This paper presents the design of a safflower picking robot control system and focuses on a navigation control subsystem based on multisensor fusion. A navigation subsystem, an identification and positioning subsystem, a picking subsystem, and a leveling subsystem are designed. The hardware and software of the navigation subsystem are designed in detail, and a multisensor fusion positioning method based on extended Kalman filter (EKF) technology is proposed. The accuracy and stability levels of different combined navigation methods are compared. To test the effectiveness and accuracy of the proposed method, an outdoor test was carried out. The test results show that the outdoor fusion positioning error of the robot is less than 8 cm, and when the satellite signal is lost, the navigation control subsystem can still provide high positioning accuracy. The accuracy of the integrated wheel odometer + IMU + DGNSS positioning method is approximately 52% higher than that of the wheel odometer alone, approximately 29% higher than that of the wheel odometer + IMU, and approximately 11% higher than that of the IMU + DGNSS.

picking robot; control system; extended Kalman; fusion; positioning

INTRODUCTION

Safflower, historically known as the "red and blue flower", is a special economic crop used for medicinal and oil products (State Pharmacopoeia Committee, 2010). Mechanized harvesting of safflower is difficult because the crop blooms in multiple batches, the stalk heights differ, and the flower bulbs are unevenly distributed (Cao et al., 2018; Azimi et al., 2012). Safflower picking robots can not only improve agricultural productivity, reduce operating costs and alleviate labor shortages, but also change the agricultural production model to realize large-scale, diversified and precise agricultural production (Wang et al., 2019; Judd & Knasinski, 1990). Therefore, the study of safflower picking robots is important for the automatic harvesting of safflower in fields and for upgrading the safflower industry.

One of the key issues for safflower picking robots in picking operations is positioning and navigation. The working environment of safflower picking robots is poor and is characterized by uncertainty and inhomogeneity (Ge et al., 2015; Ichiura et al., 2020). To achieve precise operation and safe autonomous movement in such an environment, a safflower picking robot must have strong anti-interference and scene-perception capabilities, and a single sensor can hardly meet the positioning requirements (Fountas et al., 2020).

At present, the common positioning methods for field robots are the differential global navigation satellite system (DGNSS), inertial measurement units (IMUs) and wheel odometers (Bonadies & Gadsden, 2019). DGNSS can provide all-day, all-weather absolute position and heading information, but the loss of DGNSS signals due to extreme environments or occlusions limits its application in complex farmland environments (Gao et al., 2018; Mahmud et al., 2020). An IMU is a relative dead-reckoning system with gyroscopes, accelerometers, and geomagnetic field sensors as sensitive elements and does not depend on external information. An IMU can obtain high-frequency relative position and heading information through trajectory estimation, but it cannot eliminate the cumulative error caused by temperature drift and integration (Alatise & Hancke, 2020). A wheel odometer obtains the relative position of the vehicle body by monitoring the speed of the driving wheels (Khablov, 2021). It has a low cost and fast response, but its accumulated error grows due to ground bumps and wheel slippage (Youssef et al., 2021).

Gao et al. (2022) proposed an improved position estimation algorithm for agricultural mobile robots based on multisensor fusion and an autoencoder neural network. An extended Kalman filter is used to fuse the positioning data of a DGNSS, an IMU and dual rotary encoders, and an autoencoder and radial basis function (ARBF)-based neural network is combined with the filter to further improve fusion positioning accuracy. Although this algorithm yields significantly improved navigation accuracy, its computational complexity is high, it demands high processing performance from the navigation controller, and the work does not compare the navigation performance of different sensor combinations. Jiang et al. (2022) proposed a positioning system fusing a GNSS, an IMU, and vision that was applied to autonomous vehicles; its extended Kalman filter effectively improves vehicle positioning accuracy.

In this paper, aiming at the difficulty of mechanized safflower harvesting and the low positioning accuracy and poor stability of single-sensor methods, a safflower picking robot control system is designed and built. Combining the characteristics of the DGNSS, IMU and wheel odometer, a fusion localization algorithm based on the extended Kalman filter (EKF) is proposed. The positioning accuracy and stability levels of different EKF-based combinations, including IMU + DGNSS, wheel odometer + IMU, and wheel odometer + IMU + DGNSS, are compared. This algorithm provides a feasible solution to the localization problem of safflower picking robots in the field.

HARDWARE DESIGN OF THE CONTROL SYSTEM

Overall design of the control system

As shown in the structural block diagram in Fig. 1, the safflower picking robot consists of five parts: an automatic navigation subsystem, a power subsystem, a safflower recognition and positioning subsystem, a picking subsystem, and a leveling subsystem. The automatic navigation subsystem includes an industrial computer terminal, a DGNSS mobile station, IMU sensors, ultrasonic sensors, and servo differential drives. The power subsystem includes a 48 V lithium iron phosphate battery, a power conversion module, and a BMS communication module. The safflower recognition and positioning subsystem includes a binocular stereo vision camera, a safflower recognition and positioning subcontroller, and a camera translation device. The picking subsystem includes a safflower conveying device, a safflower shearing device, a parallel mechanical arm, and a mechanical arm servo drive. The vehicle body leveling subsystem includes an inclination sensor, a cable sensor, and an electric push rod. The robot uses the 48 V lithium battery to power the entire machine and supplies the different electrical components through a DC power conversion module.

FIGURE 1
Block diagram of the safflower picking robot.

The complete machine diagram of the safflower picking robot is shown in Fig. 2. The operation process is as follows. The robot is powered on, the control system is initialized, and the industrial computer terminal receives the scheduling instructions issued by the scheduling system. The navigation subsystem controls the robot to move to the safflower waiting area according to the scheduling instructions. The leveling subsystem monitors the posture of the picking platform using data from the inclination and cable displacement sensors and controls the electric push rods inside the support legs to raise or lower the platform and adjust its posture. After leveling is completed, the safflower recognition and positioning subsystem uses the binocular camera to identify and locate a safflower, sends the three-dimensional coordinates of the safflower to be picked to the picking controller, and the picking operation starts. The picking controller first drives the parallel mechanical arm to move the safflower shearing device to the position of the safflower to be picked. The cutting device then cuts the safflower filament, and the cut filament enters the storage box through the safflower conveying device, where it is collected and temporarily stored.

FIGURE 2
Structure and picture of the safflower picking robot

(1) universal wheel, (2) support leg, (3) emergency stop, (4) sound and light integrated indicator, (5) rear DGNSS antenna, (6) radio antenna, (7) box of safflower picking robot, (8) linear actuator, (9) suspension bracket, (10) driving wheel, (11) ultrasonic sensor, (12) 48 V lithium battery, and (13) driving motor


Design of each subsystem

Navigation subsystem design

The safflower picking robot obtains path information for the vehicle body in the field according to the dispatching instructions (Chen et al., 2021). First, the industrial computer terminal obtains the current longitude and latitude of the robot using the DGNSS and controls the robot to move to the starting position of the prepicking area. Then, according to the fused DGNSS and IMU information, the vehicle body angle and posture are obtained, the robot posture is adjusted so that the heading is consistent with the forward direction of the preset path, and the robot starts to move. During movement, the industrial computer terminal fuses the DGNSS, IMU and wheel odometer data and calculates the real-time position of the robot. The robot is then controlled to drive along the preset path by comparing the location information with the preset path information. The industrial computer terminal sends the position of the safflower picking robot to the dispatching system for real-time display through a wireless AP.

The safflower picking robot monitors obstacles in front of the vehicle using ultrasonic sensors and stops and alarms when the monitoring distance is less than a set value. The industrial computer terminal communicates with the motor driver through the CAN bus and drives the servo motors. The servo motors drive the drive wheels through a reducer and chain drive to perform differential motion to enable the robot to travel straight and steer. The gear ratio of the reducer is 30:1, and the chain drive reduction ratio is 2:1. Table 1 shows the technical parameters of the main components of the navigation system.

TABLE 1
Main device technical parameters.

The DGNSS mobile station is a P3DU vehicle-mounted mobile station (Shanghai Huace Navigation Technology Co., Ltd.). The DGNSS base station is a B5UA-CHOWYA station (Shanghai Huace Navigation Technology Co., Ltd.). The industrial computer terminal is an APQ-E6 industrial computer (Chengdu Apqi Technology Co., Ltd.). The IMU is an HWT905 (Shenzhen Witt Intelligent Technology Co., Ltd.). The servo motors are 80LCB075C motors (Beijing Hollysys Electric Co., Ltd.). The motor drivers are LS20530DG drivers (Beijing Hollysys Electric Co., Ltd.).
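For illustration, the sketch below maps a commanded body velocity to left and right motor speeds through the 30:1 reducer and 2:1 chain drive described above. The wheel radius and track width are assumed placeholder values, not specifications from the paper:

```python
import math

WHEEL_RADIUS = 0.15   # m, assumed placeholder
TRACK_WIDTH = 0.80    # m, assumed distance between the driving wheels
REDUCER_RATIO = 30.0  # reducer ratio, from the text
CHAIN_RATIO = 2.0     # chain drive ratio, from the text

def body_to_motor_rpm(v: float, w: float) -> tuple[float, float]:
    """Map body linear velocity v (m/s) and yaw rate w (rad/s)
    to left/right servo motor speeds in rpm."""
    v_l = v - w * TRACK_WIDTH / 2.0   # left wheel rim speed, m/s
    v_r = v + w * TRACK_WIDTH / 2.0   # right wheel rim speed, m/s
    wheel_rpm_l = v_l / (2 * math.pi * WHEEL_RADIUS) * 60.0
    wheel_rpm_r = v_r / (2 * math.pi * WHEEL_RADIUS) * 60.0
    k = REDUCER_RATIO * CHAIN_RATIO   # total reduction of 60:1
    return wheel_rpm_l * k, wheel_rpm_r * k
```

Setting w = 0 yields equal wheel speeds (straight travel); a nonzero w produces the differential motion used for steering.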

Design of the safflower recognition and location subsystem

The safflower recognition and positioning subsystem includes a binocular stereo vision camera, a safflower recognition and positioning controller, and a camera translation device. Based on the principle of parallax, the binocular camera uses left-eye and right-eye lenses at different positions to obtain two images of the same safflower area to be picked (Yang et al., 2017). The safflower recognition and positioning controller calculates the position deviation between the corresponding safflower feature points in the two images and obtains the three-dimensional coordinates of the safflower to be picked (Luo et al., 2016). The safflower picking robot straddles three rows of safflower during picking operations. Because of the limited camera field of view, the positioning accuracy for safflowers at the edge of the field of view cannot meet the picking requirements. Therefore, a camera translation device is added to improve the identification and positioning accuracy of edge safflowers.

The camera translation device includes a stepping motor and a ball screw slide, on which the binocular camera is installed. The safflower recognition and positioning controller drives the stepping motor to translate the ball screw slide and, with it, the binocular camera.

Picking subsystem design

The schematic diagram of the picking subsystem is shown in Fig. 3. The picking subsystem includes a picking controller, a safflower conveying device, a safflower shearing device, a parallel mechanical arm, and a mechanical arm servo drive. The picking subsystem uses the parallel robotic arm to drive an end effector (the safflower shearing device) in three-dimensional movement. The parallel robotic arm includes three active arms and three slave arms (Yang et al., 2021). The picking controller obtains the three-dimensional coordinates of the safflower to be picked from the safflower recognition and positioning controller and calculates the rotation angle of each servo axis through inverse kinematics (Yin et al., 2018). Each servo axis drives an active arm, which in turn drives a slave arm; the three slave arms are hinged to the end effector to realize the three-dimensional movement of the safflower shearing device.

FIGURE 3
Schematic diagram of the picking subsystem.

The schematic diagram of the safflower shearing device is shown in Fig. 4. The safflower shearing device includes a flower ball guide groove and a crank slider shearing mechanism. The device is moved by the parallel manipulator to a position above the safflower to be picked and then moves slowly downward. The safflower enters the shearing device through the ball guide groove, and the safflower filament is cut by the crank slider shearing mechanism. The cut safflower then enters the collection box through the safflower conveying device, which is composed of a suction fan and a conveying hose. The suction fan generates negative pressure in the collection box, and under this negative pressure the filament enters the box through the conveying hose. The safflower filaments are picked and collected through the above process.

FIGURE 4
Schematic diagram of the safflower shearing device.

(1) Mobile platform, (2) conveyor hose mounting groove, (3) DC geared motor, (4) end cover, (5) hall element, (6) safflower shear chamber, (7) crank slider shearing mechanism, (8) crank disc, (9) baffle plate, (10) moving knife, and (11) fixed knife


Design of the picking platform leveling subsystem

The leveling subsystem of the picking platform is shown in Fig. 5. It includes an inclination sensor, a rope displacement sensor, and linear actuators. The parallel robotic arm and binocular camera are installed on the picking platform. Uneven field ground easily tilts the picking platform, preventing the safflower picking robot from effectively locating and picking safflowers (Baeten et al., 2008; Li et al., 2016). Therefore, a leveling subsystem is needed to adjust the picking platform in real time, keeping the platform level over the area to be picked and improving the accuracy of safflower positioning and picking.

FIGURE 5
Physical image of the leveling subsystem of the picking platform.

(1) Inclination sensor, (2) linear actuator, and (3) rope displacement sensor


The leveling subsystem monitors the inclination of the platform through an inclination sensor installed in the middle of the picking platform. According to the inclination information, the controller raises or lowers the linear actuators (Fig. 2 (8)) on the inner sides of the four supporting legs to adjust the posture of the picking platform. The controller monitors the elongation of each linear actuator through the rope displacement sensor to prevent the actuator from exceeding its range during adjustment.

NAVIGATION SUBSYSTEM SOFTWARE DESIGN

Navigation path planning

After the safflower picking robot is powered on and initialized, it waits for the dispatch terminal to send the coordinates of the online point and the heading to that point. The robot controls the differential wheels of the vehicle to travel to the online point according to the instructions from the dispatch terminal. After adjusting the posture, the robot waits again for the dispatch terminal to send the movement task chain.

In Chinese mechanized safflower planting, wide-narrow row planting is the most common mode: the wide row spacing is 450 mm, the narrow row spacing is 200 mm, and the plant height is 450-750 mm (Jiao, 2019). As shown in Fig. 6, the trajectory line with arrows represents the preset navigation path of the safflower picking robot, point 0 is the online point, and points 1-9 are task chain nodes (turning points).

FIGURE 6
Field movement task chain of the safflower picking robot.
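A back-and-forth task chain of the kind shown in Fig. 6 can be generated programmatically. The sketch below produces the turning points of such a path starting from an online point at the origin; the pass count, pass length, and pass spacing are illustrative parameters, not values fixed by the paper:

```python
def task_chain(n_passes: int, pass_length: float, spacing: float):
    """Return (x, y) task-chain nodes for a serpentine field path.
    Node 0 is the online point; the remaining nodes are turning points."""
    pts = [(0.0, 0.0)]               # node 0: online point
    x = 0.0
    for i in range(n_passes):
        # End of the i-th picking pass (alternating direction).
        y_end = pass_length if i % 2 == 0 else 0.0
        pts.append((x, y_end))
        if i < n_passes - 1:
            x += spacing             # headland move to the next pass
            pts.append((x, y_end))
    return pts
```

With five passes this yields ten nodes (0-9), matching the node count of Fig. 6; the navigation subsystem would then track the segments between consecutive nodes.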

Positioning and navigation frame of the safflower picking robot

The process framework of multisensor fusion positioning and navigation is shown in Fig. 7. Traditional Kalman filtering is only suitable for linear systems with Gaussian noise, but the field navigation environment of the safflower picking robot is nonlinear (Zhang et al., 2021; Silva & Cruz, 2016). For the complex field navigation environment in which the safflower picking robot operates, this paper adopts the EKF algorithm to fuse the navigation sensor data.

FIGURE 7
Flow diagram of the fusion of the positioning navigation.

This paper uses the EKF algorithm to fuse the positioning information provided by the wheel odometer, IMU, and DGNSS differential navigation module. The EKF consists of two parts: a prediction step and an update step. First, the positioning information (D_Odo,X, D_Odo,Y, θ_Odo) provided by the wheel odometer is used in the prediction step, and the position information (X_GNSS, Y_GNSS, θ_GNSS) provided by the DGNSS is used as the observation in the update step. Second, the fused wheel odometer and DGNSS positioning information is used as the prediction, and the heading θ_IMU provided by the IMU is used as the observation in a second update step. The localization result (X_EKF, Y_EKF, θ_EKF) is output by the EKF multisensor fusion algorithm.
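The two-stage correction described above can be sketched as a generic EKF measurement update applied twice per cycle: once with the DGNSS pose and once with the IMU heading. The function names, state layout, and noise values below are illustrative assumptions, not the authors' implementation; the prediction step is supplied externally.

```python
import numpy as np

def ekf_update(x, P, z, H, R):
    """Generic EKF measurement update (applied twice per fusion cycle)."""
    S = H @ P @ H.T + R                      # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)           # Kalman gain
    x_new = x + K @ (z - H @ x)              # posterior state
    P_new = (np.eye(len(x)) - K @ H) @ P     # posterior covariance
    return x_new, P_new

def correct(x_pred, P_pred, z_gnss, R_gnss, theta_imu, var_theta_imu):
    """Two-stage correction of the predicted pose [X, Y, theta]:
    DGNSS pose update first, then IMU heading update."""
    # Stage 1: the DGNSS observes the full state directly, so H = I3.
    x, P = ekf_update(x_pred, P_pred, z_gnss, np.eye(3), R_gnss)
    # Stage 2: the IMU observes only the heading component.
    H_th = np.array([[0.0, 0.0, 1.0]])
    x, P = ekf_update(x, P, np.array([theta_imu]), H_th,
                      np.array([[var_theta_imu]]))
    return x, P  # -> (X_EKF, Y_EKF, theta_EKF) and its covariance
```

Heading wrap-around at ±π is omitted for brevity; a real implementation would normalize the heading innovation before applying the gain.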

During navigation, the difference between the fused positioning information (X_EKF, Y_EKF, θ_EKF) and the target path (X_T, Y_T, θ_T) is used as input to a fuzzy PID controller. The output is the rotational velocity pair (V_l, V_r) of the left and right driving wheels, which realizes path tracking and heading adjustment of the safflower picking robot.
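A minimal stand-in for the path-tracking controller described above is sketched below. It uses a plain PID on the heading error and splits the resulting yaw-rate command into left and right wheel speeds; the fuzzy gain-scheduling stage is elided, and all gains and geometry are made-up placeholders.

```python
class PathTrackingPID:
    """Heading-error PID producing differential wheel speeds (V_l, V_r)."""

    def __init__(self, kp=1.0, ki=0.0, kd=0.2,
                 v_nominal=0.2, half_track=0.4):
        self.kp, self.ki, self.kd = kp, ki, kd   # placeholder gains
        self.v = v_nominal        # m/s, cruise speed from the test section
        self.b = half_track       # m, assumed half wheel track
        self.i_err = 0.0
        self.prev_err = 0.0

    def step(self, heading_err: float, dt: float) -> tuple[float, float]:
        """heading_err = theta_T - theta_EKF (rad); returns (V_l, V_r) in m/s."""
        self.i_err += heading_err * dt
        d_err = (heading_err - self.prev_err) / dt
        self.prev_err = heading_err
        w = self.kp * heading_err + self.ki * self.i_err + self.kd * d_err
        # Positive w steers toward the target: speed up the right wheel.
        return self.v - w * self.b, self.v + w * self.b
```

A fuzzy PID would additionally adjust kp, ki, and kd online from the error magnitude and its rate of change; that stage is omitted here.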

EKF multisensor fusion algorithm

It is assumed that the working area of the picking robot is an ideal horizontal two-dimensional environment and that the state vector of the system is the position and heading of the robot. The position and heading of the picking robot at time k is x_k = [X_k, Y_k, θ_k]^T, where X_k and Y_k denote the position of the geometric center of the picking robot and θ_k represents its heading.

State prediction

According to the EKF algorithm, once the motion model of the mobile robot is accurately provided, the position and heading at time k+1 can be expressed as

$$x_{k+1} = f(x_k, u_{k+1}) + w_{k+1} \tag{1}$$

Where:

x_{k+1} is the state variable at time k+1;

u_{k+1} is the control quantity at time k+1;

f(x_k, u_{k+1}) is the functional relation between the state variable x_{k+1} and the state variable x_k, and

w_{k+1} is the motion noise at time k+1.

Here, the wheel odometer provides the control input u_{k+1} = [D_{Odo,X}, D_{Odo,Y}, θ_{Odo}]^T for the prediction step.

(1) Prediction of the position and heading at time k+1 based on the motion model of the picking robot.

$$\hat{x}^{-}_{k+1} = \hat{x}_{k} + \begin{pmatrix} \cos\theta_k & -\sin\theta_k & 0 \\ \sin\theta_k & \cos\theta_k & 0 \\ 0 & 0 & 1 \end{pmatrix} \begin{pmatrix} D_{Odo,X} \\ D_{Odo,Y} \\ \theta_{Odo} \end{pmatrix} dt \tag{2}$$

Where:

x̂⁻_{k+1} is the position and heading at time k+1 predicted by the motion model; the symbol "^" denotes an estimated value and the superscript "−" denotes a predicted (prior) value.

(2) The covariance matrix of the prior estimate of the predicted state vector is

$$P^{-}_{k+1} = \nabla f_x P_k \nabla f_x^{T} + \nabla f_w Q_{k+1} \nabla f_w^{T} \tag{3}$$

Where:

P_k and Q_{k+1} are the covariance matrices of the state x_k and the motion noise w_{k+1}, respectively, and

∇f_x and ∇f_w are the Jacobian matrices of the motion model with respect to the state and the motion noise, respectively:

$$\nabla f_x = \begin{pmatrix} 1 & 0 & (-D_{Odo,X}\sin\theta_k - D_{Odo,Y}\cos\theta_k)\,dt \\ 0 & 1 & (D_{Odo,X}\cos\theta_k - D_{Odo,Y}\sin\theta_k)\,dt \\ 0 & 0 & 1 \end{pmatrix}$$

$$\nabla f_w = \begin{pmatrix} dt\cos\theta_k & -dt\sin\theta_k & 0 \\ dt\sin\theta_k & dt\cos\theta_k & 0 \\ 0 & 0 & 1 \end{pmatrix}$$

When measuring with a wheel odometer, typical motion-noise values are w_x = w_y = 0.01 m/min and w_θ = 1°/min (Hu & Wu, 2020), so

$$Q_{k+1} = \begin{pmatrix} 0.01 & 0 & 0 \\ 0 & 0.01 & 0 \\ 0 & 0 & 1 \end{pmatrix}$$
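The prediction step of eqs. (2)-(3), the motion model plus covariance propagation through the Jacobians, can be sketched in NumPy as follows; the variable names and the default noise matrix follow the text, while everything else is an illustrative assumption:

```python
import numpy as np

def ekf_predict(x, P, u, dt, Q=np.diag([0.01, 0.01, 1.0])):
    """Prediction step of eqs. (2)-(3). x = [X, Y, theta] is the pose,
    u = [D_X, D_Y, theta_rate] is the wheel-odometer control input
    expressed in the body frame, and Q is the motion-noise covariance."""
    X, Y, th = x
    dx, dy, dth = u
    c, s = np.cos(th), np.sin(th)
    # Eq. (2): rotate the body-frame displacement into the world frame.
    x_pred = np.array([X + (dx * c - dy * s) * dt,
                       Y + (dx * s + dy * c) * dt,
                       th + dth * dt])
    # Jacobian of the motion model with respect to the state.
    Fx = np.array([[1.0, 0.0, (-dx * s - dy * c) * dt],
                   [0.0, 1.0, ( dx * c - dy * s) * dt],
                   [0.0, 0.0, 1.0]])
    # Jacobian of the motion model with respect to the motion noise.
    Fw = np.array([[dt * c, -dt * s, 0.0],
                   [dt * s,  dt * c, 0.0],
                   [0.0, 0.0, 1.0]])
    # Eq. (3): propagate the covariance.
    P_pred = Fx @ P @ Fx.T + Fw @ Q @ Fw.T
    return x_pred, P_pred
```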

Measurement update

Using the position information (X_{k+1,GNSS}, Y_{k+1,GNSS}, θ_{k+1,GNSS}) provided by the DGNSS, the observation model at this time is

$$Z_{k+1} = \begin{pmatrix} X_{k+1,GNSS} \\ Y_{k+1,GNSS} \\ \theta_{k+1,GNSS} \end{pmatrix} + v_{k+1} \tag{4}$$

Where:

Z_{k+1} is the system observation at time k+1, and

v_{k+1} is the observation noise at time k+1.

(1) Calculate the Kalman gain.

$$K_{k+1} = P^{-}_{k+1} \nabla h_x^{T} \left( \nabla h_x P^{-}_{k+1} \nabla h_x^{T} + R_{k+1} \right)^{-1} \tag{5}$$

Where:

∇h_x is the Jacobian matrix of the measurement model; from the measurement model it is known that ∇h_x is the identity matrix. R_{k+1} is the covariance matrix of the observation model; its entries are generally given by the manufacturer's stated accuracy or evaluated statistically from experiments.

$$R_{k+1} = \begin{pmatrix} \sigma_x & 0 & 0 \\ 0 & \sigma_y & 0 \\ 0 & 0 & \sigma_\theta \end{pmatrix} \tag{6}$$

Where:

σ_x is the variance of the observation noise of the DGNSS output with respect to position X;

σ_y is the variance of the observation noise of the DGNSS output with respect to position Y, and

σ_θ is the variance of the observation noise of the DGNSS output with respect to the heading θ.

The Kalman gain is

$$K_{k+1} = \begin{pmatrix} \dfrac{P^{-}_{k+1,x}}{P^{-}_{k+1,x}+\sigma_x} & 0 & 0 \\ 0 & \dfrac{P^{-}_{k+1,y}}{P^{-}_{k+1,y}+\sigma_y} & 0 \\ 0 & 0 & \dfrac{P^{-}_{k+1,\theta}}{P^{-}_{k+1,\theta}+\sigma_\theta} \end{pmatrix} \tag{7}$$

(2) Determine the posterior of the state variables.

$$\hat{x}_{k+1} = \hat{x}^{-}_{k+1} + K_{k+1}\left( Z_{k+1} - \hat{x}^{-}_{k+1} \right) \tag{8}$$

(3) Update the covariance matrix to the covariance matrix of the posterior estimated value of the state variable.

$$P_{k+1} = \left( I_3 - K_{k+1} \right) P^{-}_{k+1} \tag{9}$$

In the formula, I3 is the third-order identity matrix.

The data fusion result x̂_{k+1} of the wheel odometer and DGNSS is then used as the prediction, and the heading θ_IMU obtained by the IMU is used as the observation for a second update step. The above data fusion process is repeated to obtain x_{k,EKF} = [X_EKF, Y_EKF, θ_EKF]^T.

TEST AND RESULT ANALYSIS

Test evaluation

To analyze and verify the accuracy and stability of the designed fusion positioning system, a positioning accuracy test was carried out. Test equipment: safflower picking robot, DGNSS mobile station P3DU, DGNSS base station B5UA-CHOWYA, industrial computer terminal APQ-E6, IMU sensor HWT905, servo motors 80LCB075C, servo motor drivers LS20530DG, notebook computer, STM32F105 microcontroller, J-LINK V9 emulator, and Guangcheng Technology CAN recorder. Test time: April 8-16, 2022. Test location: Xinjiang Agricultural University. The test site is shown in Fig. 8.

FIGURE 8
Outdoor test site.

Positioning accuracy test

The movement area selected was a 5 m × 5 m square plot whose vertices were defined as o, a, b, and c. The picking robot was placed at the coordinate origin o (0,0) m in manual control mode, three target points a (5,0) m, b (5,5) m, and c (0,5) m were set, and the vehicle speed was set to 0.2 m/s. The picking robot passed through coordinate points o, a, b, and c in turn along the reference trajectory and returned to the origin o. When the vehicle body reached the 2.5-3.5 m segment of section a-b, the positioning antenna of the DGNSS base station was artificially blocked. The CAN recorder separately recorded the positioning tracks of the wheel odometer, DGNSS, wheel odometer + IMU fusion, IMU + DGNSS fusion, and wheel odometer + IMU + DGNSS fusion methods. The positioning trajectories of the different positioning methods are shown in Fig. 9.

FIGURE 9
Comparison of the positioning trajectories of the different positioning methods.

Fig. 9 shows that the wheel odometer positioning trajectory drifts relative to the reference trajectory. The DGNSS positioning trajectory fluctuates around the reference trajectory, indicating extensive noise in the DGNSS positioning but no drift; DGNSS positioning accuracy is also susceptible to occlusion. Compared with the wheel odometer alone, the wheel odometer + IMU fusion trajectory is closer to the reference trajectory. The wheel odometer + IMU + DGNSS fusion trajectory is closer still, and compared with the IMU + DGNSS fusion trajectory, it fluctuates less and has less positioning noise. The wheel odometer + IMU + DGNSS fusion can also mitigate the abnormal positioning caused by a temporary loss of the DGNSS signal.

In Fig. 9, the positioning trajectory based on the wheel odometer + IMU + DGNSS fusion positioning algorithm is smoother and closer to the reference trajectory than the positioning trajectory obtained by other positioning methods. The accuracy of each positioning method is shown in Fig. 10.

FIGURE 10
Accuracy of each positioning method.

The standard deviation is used to calculate the accuracy of the positioning methods.

$$\delta = \sqrt{\frac{1}{n}\sum_{i=1}^{n}\left( z_i - z_0 \right)^2} \tag{10}$$

Where:

δ is the positioning accuracy, m;

z_i is the coordinate of a measured point on the trajectory of each positioning method, and

z_0 is the coordinate of the corresponding point on the reference trajectory.
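As a worked illustration of eq. (10), the sketch below computes the accuracy metric for a measured trajectory against the reference trajectory; the array shapes and names are assumptions:

```python
import numpy as np

def positioning_accuracy(traj, ref):
    """Eq. (10): root-mean-square deviation between a measured trajectory
    and the reference trajectory, both given as (n, 2) coordinate arrays
    sampled at the same points."""
    d2 = np.sum((np.asarray(traj) - np.asarray(ref)) ** 2, axis=1)
    return float(np.sqrt(np.mean(d2)))
```

For example, a trajectory offset from the reference by a constant 0.03 m in X yields δ = 0.03 m.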

The outdoor fusion positioning error of the safflower picking robot is less than 8 cm, and when the satellite signal is lost, the control system can still provide high positioning accuracy. The accuracy of the multisensor fusion-based positioning method is approximately 52% higher than that of the wheel odometer positioning method, approximately 29% higher than that of the wheel odometer + IMU, and approximately 11% higher than that of the IMU + DGNSS. This shows that the multisensor fusion positioning method proposed in this paper achieves high positioning accuracy and stability.

CONCLUSIONS

This paper proposes a safflower picking robot control system and completes the overall design of the system and the design of each subsystem. The hardware and software of the automatic navigation system of the safflower picking robot were designed and built in detail. A movement path for the safflower picking robot in a field was planned, and a fusion positioning algorithm of a wheel odometer + DGNSS + IMU based on extended Kalman fusion was proposed. The positioning accuracy and stability levels of different positioning combinations based on the extended Kalman fusion algorithm were compared.

To evaluate the feasibility and positioning accuracy of the proposed positioning method, a fusion positioning test of the safflower picking machine was conducted. The test results show that the outdoor fusion positioning accuracy of the robot is less than 8 cm, and when the satellite signal is lost, the control system can still provide high positioning accuracy. The final positioning results obtained by the multisensor fusion positioning method are improved by approximately 52% compared with the odometer positioning accuracy, by approximately 29% compared with the wheel odometer + IMU fusion accuracy, and by approximately 11% compared with the IMU + DGNSS accuracy. Therefore, multisensor fusion helps the safflower picking robot obtain more accurate positioning and better positioning stability and lays the foundation for the field navigation of the safflower picking robot.

ACKNOWLEDGEMENTS

This study was supported by the National Natural Science Foundation of China (no. 31901417). Funding was provided by the Ministry of Science and Technology of China. The author also thanks Associate Professor Zhang Zhenguo for their assistance with this study.

REFERENCES

  • Alatise MB, Hancke GP (2020) A review on challenges of autonomous mobile robot and sensor fusion methods. IEEE Access, 8: 39830-39846.
  • Azimi S, Chegini G, Kianmehr MH, Heidari A (2012) Design and construction of a harvesting safflower petals machine. Mechanics & Industry 13(5): 301-305.
  • Baeten J, Donné K, Boedrij S, Beckers W, Claesen E (2008) Autonomous fruit picking machine: a robotic apple harvester. In: Laugier C, Siegwart R (eds). Field and service robotics. Berlin, Springer, p 531-539.
  • Bonadies S, Gadsden SA (2019) An overview of autonomous crop row navigation strategies for unmanned ground vehicles. Engineering in Agriculture, Environment and Food 12(1): 24-31.
  • Cao WB, Sun WL, Niu C, Jiao HB, Chen BB (2018) Research on comb clip safflower picking device based on ANSYS / LS-DYNA. Transactions of the Chinese Society for Agricultural Machinery 49 (11): 123−131.
  • Chen F, Ge Y, Zhang L, Qi Z, Zeng H (2021) Design and experiment of the strip-collected pre-positioning mechanism for safflower picking robots. Transactions of the Chinese Society for Agricultural Machinery 37(15): 10-19.
  • Fountas S, Mylonas N, Malounas I, Rodias E, Hellmann Santos C, Pekkeriet E (2020) Agricultural robotics for field operations. Sensors 20(9): 2672.
  • Gao P, Lee H, Fan LF, Jeon CW, Yun C, Kim HJ, Wang WX, Liang GT, Chen YF, Zhang Z, Han XZ (2022) Improved position estimation algorithm of agricultural mobile robots based on multisensor fusion and autoencoder neural network. Sensors 22(4).
  • Gao X, Li J, Fan L, Zhou Q, Yin K, Wang J, Wang Z (2018) Review of wheeled mobile robots’ navigation problems and application prospects in agriculture. IEEE Access 6: 49248-49268.
  • Ge Y, Zhang L, Gu J, Fu W, Zhu R, Zhang H (2015) Parameter optimization and experiment of dual roller harvesting device for safflower. Transactions of the Chinese Society of Agricultural Engineering 31(21): 35-42.
  • Hu F, Wu G (2020) Distributed error correction of EKF algorithm in multi-sensor fusion localization model. IEEE Access 8: 93211-93218.
  • Ichiura S, Yoshihiro H, Sato K, Onodera R, Katahira M (2020) Safflower Production Management ECOSYSTEM with AI harvester. In: ASABE Annual International Virtual Meeting. American Society of Agricultural and Biological Engineers, p 1.
  • Jiang HT, Li T, Song D, Shi C (2022) An effective integrity monitoring scheme for GNSS/INS/Vision integration based on error state EKF model. IEEE Sensors Journal 22(7): 7063-7073.
  • Jiao HB (2019) Research on curved clamping comb-type safflower picking device and drive control system. Thesis, Shi He Zi University.
  • Judd RP, Knasinski AB (1990) A technique to calibrate industrial robots with experimental verification. IEEE Transactions on Robotics and Automation 6(1): 20−30.
  • Khablov DV (2021) Autonomous navigation for control of unmanned vehicles based on odometers and radar sensor. In: International Conference Management of large-scale system development (MLSD). Moscow, Russian Federation.
  • Li J, Karkee M, Zhang Q, Xiao K, Feng T (2016) Characterizing apple picking patterns for robotic harvesting. Computers and Electronics in Agriculture 127: 633-640.
  • Luo L, Tang Y, Zou X, Ye M, Feng W, Li G (2016) Vision-based extraction of spatial information in grape clusters for harvesting robots. Biosystems Engineering 151: 90-104.
  • Mahmud MSA, Abidin MSZ, Emmanuel AA, Hasan HS (2020) Robotics and automation in agriculture: present and future applications. Applications of Modelling and Simulation 4: 130-140.
  • Silva AL da, Cruz JJ da (2016) Fuzzy adaptive extended Kalman filter for UAV INS/GPS data fusion. Journal of the Brazilian Society of Mechanical Sciences and Engineering 38(6): 1671-1688.
  • State Pharmacopoeia Committee (2010) Chinese pharmacopoeia. Beijing, Medical Science and Technology Press, p 141−142.
  • Wang R, Zheng Z, Zhu G, Gao L, Cui B (2019) Research status of harvesting machines for medicinal flowers. In: ASABE Annual International Meeting. Boston, American Society of Agricultural and Biological Engineers, p 1.
  • Yin XL, Li B, Zhao XH (2018) Kinematic analysis of a redundant actuated five degrees of freedom parallel mechanism. In: IEEE International Conference on Robotics and Biomimetics (ROBIO). Kuala Lumpur.
  • Yang H, Chen L, Ma Z, Chen M, Zhong Y, Deng F, Li M (2021) Computer vision-based high-quality tea automatic plucking robot using Delta parallel manipulator. Computers and Electronics in Agriculture 181: 105946.
  • Yang Y, Meng X, Gao M (2017) Vision system of mobile robot combining binocular and depth cameras. Journal of Sensors 2017:1-11.
  • Youssef AA, Al-Subaie N, El-Sheimy N, Elhabiby M (2021) Accelerometer-based wheel odometer for kinematics determination. Sensors 21(4): 1327.
  • Zhang SL, Tan XQ, Wu QW (2021) Indoor mobile robot localization based on multi-sensor fusion technology. Transducer and Microsystem Technologies 40(08): 53-56.

Edited by

Area Editor: Teresa Cristina Tarlé Pissarra

Publication Dates

  • Publication in this collection
    21 Apr 2023
  • Date of issue
    2023

History

  • Received
    29 Dec 2021
  • Accepted
    20 Mar 2023
Associação Brasileira de Engenharia Agrícola SBEA - Associação Brasileira de Engenharia Agrícola, Departamento de Engenharia e Ciências Exatas FCAV/UNESP, Prof. Paulo Donato Castellane, km 5, 14884.900 | Jaboticabal - SP, Tel./Fax: +55 16 3209 7619 - Jaboticabal - SP - Brazil
E-mail: revistasbea@sbea.org.br