Abstract

Previous collision detection algorithms for the robot body rely on the surface geometry of the colliding object and do not allow deformation during the collision. To solve this problem, a new robot-body collision detection algorithm is proposed that self-constrains using the force information of a six-axis force/torque sensor at the base; it does not rely on geometric information about the colliding object's surface, and it allows deformation during the collision. For sensor data preprocessing, a gravity and dynamic force compensation algorithm for the base six-axis force/torque sensor is proposed to ensure that the sensor reading remains zero while the robot is working; the robot is then considered to have collided with the outside world when the sensor reading exceeds a set threshold. A precision factor is also proposed to analyze the influence of the collision force and the collision distance on the accuracy of the algorithm. Finally, the new algorithm proposed in this paper is compared with the traditional algorithm that relies on the geometric information of the colliding body's surface. The experimental results indicate that the accuracy of the proposed collision point detection algorithm is close to that of the traditional method, but it neither relies on the surface geometry of the colliding body nor places any requirement on whether deformation occurs during contact. From the calculation of the precision factor, it can be concluded that the collision distance is the most important factor affecting the accuracy of the algorithm, followed by the magnitude of the collision force.
The results show that this method can effectively detect collision points on the robot body, with a maximum error of 8.712% at the farthest point of the robot, which lays a foundation for subsequent research on human-machine collaboration with small collaborative robots.

1. Introduction

With the development of robot technology, a variety of sensors such as vision, touch, and force have been applied to robot systems. Force perception is an important capability for intelligent robots interacting with the external environment, especially for operations such as grasping, contour detection, obstacle avoidance, human-robot interaction, and force feedback control [1–3]. The perception of force information also plays a vital role in the hexarotor researched by Ibrahim et al. [4] and the new fully mobile robot and control system researched by Kilin et al. [5]; in both cases it improves the robustness of control. Six-axis force/torque sensors are widely used in industrial robot collision detection and feedback control [6–8]. In smarter robot systems, six-axis force/torque sensors are used to implement functions such as tactile sensing, safety control, and collision detection [9–14]. Collision detection and feedback control are thus two key functions of force perception, and collision position detection is an important part of collision detection.

There has been much research on collision detection algorithms using six-axis force/torque sensors. Bicchi et al. [15] proposed a solution for contact position detection based on a six-axis force/torque sensor as early as 1990, enabling practical devices that provide simple, relevant contact information in robotic applications. Kazanzides et al. [16] proposed a method for detecting collision points by linear regression in a surgical robot system, but did not give a specific calculation process. In 1996, Zhou et al. [17] proposed a mathematical model for determining the contact position between a fingertip and the object to be grasped or manipulated using measurement data from a force/torque sensor installed at the fingertip, and analyzed the errors. A real-time selection algorithm was used to implement contact point measurement based on the six-axis force/torque sensor, and the experimental results show that the measurements remain stable even under large noise and interference. Leng et al. [18, 19] used constraint equations and compatibility solution analysis to achieve collision point detection; they also described gravity compensation and dynamic compensation and proved the theory effective in a gravity environment through experiments. The geometric and natural constraints of the collision process have also been analyzed by many scholars, and gravity compensation methods have likewise been applied to robot collision detection [10, 20].

Although the methods mentioned above provide a theoretical basis for detecting robot collision points, they still have obvious shortcomings. Because they rely on geometric constraints, their range of application is limited: the colliding object must be regular in shape or geometrically modelable (such as a probe of known length, a sphere, an ellipsoid, or a plane), and the collision surface must be smooth and must not deform. Since most real collision environments involve deformation, methods that rely on geometric constraints to calculate the collision point have great limitations. In Figure 1, robot collisions can be divided into three types according to the number of contacts on the collision surface: single-contact collision (Figure 1(a)), flat or uniform surface-contact collision (Figure 1(b)), and multicontact collision (Figure 1(c)). Single-point collisions can further be divided into structured collisions (where the colliding body's surface can be modeled) and unstructured collisions, as shown in (e) and (d) in Figure 1. As illustrated in Figure 1(f), deformation is also an important issue to consider during a collision. In fact, uniform surface-contact collisions and multicontact collisions can be regarded as special combinations of single-contact collisions. Therefore, this article focuses on the analysis of single-contact collision and deformation in an unstructured environment.

In this paper, a robot collision point detection system based on a six-axis force/torque sensor is proposed, and experimental studies are carried out. We propose an error factor used to search for the optimal solution in order to improve detection accuracy. Compared with traditional methods that rely on the geometric information of the colliding body's surface, the proposed method places no requirements on the surface of the colliding object and retains good accuracy when the surface deforms. The proposed algorithm thus overcomes the limitations of traditional geometric-constraint methods, which require exact surface information and do not allow deformation. This work promotes the application of force perception in robot systems and lays a foundation for research on multicontact detection.

The remainder of this paper is organized as follows. Collision point detection algorithms and minimum error search strategies are proposed in Section 2. Section 3 proposes gravity compensation and dynamic force compensation algorithms. Section 4 conducts experimental research on the proposed collision point detection algorithm and verifies the correctness and accuracy of the algorithm. Finally, the paper is summarized in Section 5.

2. Collision Point Detection Algorithm

2.1. Mathematical Model for Collision Point Detection

A six-axis force/torque sensor is installed at the base of the robot so that collision points over the entire robot body can be measured, as shown in Figure 2.

A six-axis force/torque sensor resolves force and torque into components along three coordinate axes. Assuming that the collision force and moment are $F = [F_x, F_y, F_z]^T$ and $M = [M_x, M_y, M_z]^T$, the relationship between the force, the moment, and the contact position can be expressed as
$$M = r \times F, \tag{1}$$
where $r = [x, y, z]^T$ is the position vector from the sensor to the contact point. Equation (1) can be expressed in matrix form as
$$\begin{bmatrix} M_x \\ M_y \\ M_z \end{bmatrix} = \begin{bmatrix} 0 & -z & y \\ z & 0 & -x \\ -y & x & 0 \end{bmatrix} \begin{bmatrix} F_x \\ F_y \\ F_z \end{bmatrix}. \tag{2}$$

The equation system has infinitely many solutions for $r$, so additional constraints are needed to determine a unique contact position.

In fact, when $F$ and $M$ are constant, the solutions of Eq. (2) are distributed along a spatial straight line. This line is usually called the external force vector line, denoted $l$. Its parametric form can be expressed as
$$r(\lambda) = r_0 + \lambda F,$$
where $r_0$ is any particular solution of Eq. (2).

The external force vector line is determined by the collision force, and $F$ is the direction vector of the line. In general, because the direction of the colliding object's motion and the direction of the collision force change, two external force vector lines obtained at different times intersect, and the intersection point is the collision contact point. Therefore, the position of the contact point can be obtained by solving for the intersection of the two external force vector lines:
$$P = l_1 \cap l_2.$$
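The construction above can be sketched numerically. The following Python snippet is an illustrative sketch, not code from the paper: the function names and the synthetic readings are our own. It builds each external force vector line from a force/torque pair via the minimum-norm particular solution $r_0 = (F \times M)/\|F\|^2$ (valid because $M \perp F$), then recovers the contact point as the midpoint of the closest points of the two lines:

```python
import numpy as np

def force_vector_line(F, M):
    """Return (r0, d): a point on the external force vector line and its
    unit direction. All solutions of M = r x F form the line r0 + lam*d."""
    F, M = np.asarray(F, float), np.asarray(M, float)
    r0 = np.cross(F, M) / np.dot(F, F)   # minimum-norm particular solution
    return r0, F / np.linalg.norm(F)

def closest_points(p1, d1, p2, d2):
    """Closest points between two (possibly skew) lines p_i + t_i*d_i."""
    w = p1 - p2
    a, b, c = d1 @ d1, d1 @ d2, d2 @ d2
    d, e = d1 @ w, d2 @ w
    denom = a * c - b * b                # nonzero if lines not parallel
    t1 = (b * e - c * d) / denom
    t2 = (a * e - b * d) / denom
    return p1 + t1 * d1, p2 + t2 * d2

# Synthetic check: a contact at r_true seen in two successive force frames.
r_true = np.array([0.10, 0.25, 0.40])          # metres (made-up values)
F1 = np.array([3.0, -1.0, 2.0])                # newtons (made-up values)
F2 = np.array([1.0, 2.5, -0.5])
M1, M2 = np.cross(r_true, F1), np.cross(r_true, F2)

l1, l2 = force_vector_line(F1, M1), force_vector_line(F2, M2)
q1, q2 = closest_points(*l1, *l2)
contact = 0.5 * (q1 + q2)                      # midpoint of closest points
print(np.round(contact, 6))                    # recovers r_true
```

With noise-free data the two lines intersect exactly and the midpoint coincides with the true contact point; with real sensor noise the lines become skew, which motivates the projection method of Section 2.2.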

2.2. Projection Method for Collision Points

Under ideal, error-free conditions, the external force vector lines calculated from two adjacent frames of sensor data, shown as $l_1$ and $l_2$ in Figure 3(a), intersect at a point $P$ in space. However, because of sensor measurement error, the two external force vector lines may not intersect, as shown by $l_1'$ and $l_2'$ in Figure 3(a). Therefore, a projection method is proposed in this paper to solve for the contact coordinates. Although $l_1'$ and $l_2'$ do not intersect at any point in space, their projections onto a suitable plane intersect at a point $P'$, as shown in Figure 3(b).

The projected coordinates of the contact point in the three coordinate planes can be obtained from this projected intersection, and the three-dimensional coordinates of the contact point can then be recovered by combining them with the original external force vector line equations, as shown in Figure 4.

The projection onto the $xOy$ plane, the projection onto the $yOz$ plane, and the projection onto the $xOz$ plane are each denoted accordingly. Then, we can get

$P'$ is the intersection of the projection lines $l_1'$ and $l_2'$, and $P_1$ and $P_2$ are the points on $l_1'$ and $l_2'$ corresponding to the projected intersection point. $P_1$ and $P_2$ can be solved by the following formulas, where $\kappa$ is a determination factor: $\kappa$ is used to select an optimal projection plane from the three coordinate planes, and the specific rules are as follows, where $\theta$ represents the angle between the projected lines, which can be calculated with the following formula:

The solving formulas for $P_1$ and $P_2$ can be expressed as Eq. (9) and Eq. (10) once the final projection plane has been selected by the above calculation.

As it is impossible to determine which of the two points the real collision point is closer to, the contact point is temporarily defined as the midpoint of $P_1$ and $P_2$:
$$P_c = \frac{P_1 + P_2}{2}. \tag{11}$$
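The projection step can be sketched as follows. This is an illustrative Python sketch under our own assumptions: the plane-selection score below uses the magnitude of the 2D cross product of the projected directions as a stand-in for the paper's angle-based determination factor (the two criteria rank planes the same way for unit directions), and all names and numbers are invented for the demo:

```python
import numpy as np

def project_intersect(p1, d1, p2, d2, drop_axis):
    """Project both lines onto the coordinate plane obtained by dropping
    `drop_axis` (0=x, 1=y, 2=z), intersect the 2D projections, and lift the
    intersection back onto each original 3D line (points P1 and P2)."""
    keep = [i for i in range(3) if i != drop_axis]
    a1, b1 = p1[keep], d1[keep]
    a2, b2 = p2[keep], d2[keep]
    # Solve a1 + t1*b1 = a2 + t2*b2 in the projection plane.
    A = np.column_stack([b1, -b2])
    t1, t2 = np.linalg.solve(A, a2 - a1)
    return p1 + t1 * d1, p2 + t2 * d2

def best_plane(d1, d2):
    """Pick the plane where the projected lines are farthest from parallel
    (largest |2D cross product| of the projected unit directions)."""
    scores = []
    for drop in range(3):
        keep = [i for i in range(3) if i != drop]
        u, v = d1[keep], d2[keep]
        scores.append(abs(u[0] * v[1] - u[1] * v[0]))
    return int(np.argmax(scores))

# Two slightly skew lines that nearly pass through r_true (sensor noise).
r_true = np.array([0.10, 0.25, 0.40])
d1 = np.array([3.0, -1.0, 2.0]); d1 /= np.linalg.norm(d1)
d2 = np.array([1.0, 2.5, -0.5]); d2 /= np.linalg.norm(d2)
p1 = r_true - 0.3 * d1 + np.array([0.0, 0.0, 1e-4])  # small offset = noise
p2 = r_true - 0.2 * d2

plane = best_plane(d1, d2)
P1, P2 = project_intersect(p1, d1, p2, d2, plane)
contact = 0.5 * (P1 + P2)      # midpoint, as in Eq. (11)
print(np.round(contact, 4))    # close to r_true
```

Selecting the plane where the projections are least parallel keeps the 2×2 system well conditioned, which is the practical purpose of the determination factor.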

2.3. Least Error Search

This paper proposes a method to search for the optimal solution among the multiple sets of data recorded during a collision, in order to further reduce the error and improve the robustness of contact position detection. Assume that the external force vector lines generated from multiple sets of force sensor data during the collision are as shown in Figure 5; the contact point set obtained by applying the projection method to each pair of adjacent data sets is shown in Figure 6, where the purple point represents the real contact position and the orange point is the optimal solution found by the minimum error search.

If $r_c$ is a result of the collision point calculation, then $e$ is introduced as a precision factor to reduce the calculation error. Substituting $r_c$ into Eq. (2) leaves a residual, which can be expressed as follows: then

Assuming $e$ is computed for every candidate point, the point with the smallest $e$ among the multiple sets of data is the optimal result of collision point detection. Finally, the improved contact position detection method of the system can be summarized by the flow shown in Figure 7.
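The minimum-error search can be sketched in a few lines. This is our own illustrative sketch: we take the precision factor to be the residual norm of Eq. (2), $e = \|M - r_c \times F\|$, summed over all recorded sensor frames, which is one natural reading of the definition above; the data are synthetic:

```python
import numpy as np

def precision_factor(r, F, M):
    """Residual of Eq. (2): how well candidate point r explains (F, M)."""
    return np.linalg.norm(M - np.cross(r, F))

def best_candidate(candidates, frames):
    """Among candidate contact points, pick the one with the smallest total
    residual over all recorded (F, M) sensor frames."""
    def total_error(r):
        return sum(precision_factor(r, F, M) for F, M in frames)
    return min(candidates, key=total_error)

# Synthetic collision: 5 noisy sensor frames for a contact at r_true.
r_true = np.array([0.10, 0.25, 0.40])
rng = np.random.default_rng(0)
frames = []
for _ in range(5):
    F = rng.normal(size=3)
    frames.append((F, np.cross(r_true, F) + rng.normal(scale=1e-3, size=3)))

# Candidate set: noisy projection-method results plus the true point.
candidates = [r_true + rng.normal(scale=0.01, size=3) for _ in range(8)]
candidates.append(r_true)
print(np.round(best_candidate(candidates, frames), 3))
```

Scoring every candidate against every frame, rather than only the pair of frames that produced it, is what makes the search robust to a single noisy measurement.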

3. Dynamic Force Compensation Algorithm

3.1. Gravity Compensation Algorithm

When the six-axis force/torque sensor is installed at the base of the robot, the influence of the gravity of the robot body and of the end load on the sensor during motion must first be eliminated, and dynamic force compensation must be performed so that the sensor reading does not change while the robot moves. After dynamic force compensation, the reading remains zero whenever the robot is in motion and not subjected to external forces; when the sensor reading exceeds the threshold, it indicates that the robot has collided unexpectedly with the outside world. Robot gravity compensation is shown in Figure 8.

The D-H parameters are used to establish the joint coordinate systems of the robot. The general homogeneous transformation matrix between adjacent links can be expressed as follows:

Equation (14) can be expressed as follows, where ${}^{i-1}_{i}R$ is the rotation matrix of coordinate system $\{i\}$ relative to coordinate system $\{i-1\}$, which can be expressed as follows: ${}^{i-1}_{i}R$ is a unit orthogonal matrix, satisfying ${}^{i-1}_{i}R^{T}\,{}^{i-1}_{i}R = I$.
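A D-H link transform of this form can be sketched as follows. This is an illustrative sketch assuming the standard (distal) D-H convention; the paper does not state which convention it uses, and the parameter values below are arbitrary:

```python
import numpy as np

def dh_transform(alpha, a, d, theta):
    """Homogeneous transform between adjacent link frames built from the
    four D-H parameters (standard/distal convention assumed here)."""
    ca, sa = np.cos(alpha), np.sin(alpha)
    ct, st = np.cos(theta), np.sin(theta)
    return np.array([
        [ct, -st * ca,  st * sa, a * ct],
        [st,  ct * ca, -ct * sa, a * st],
        [0.0,      sa,       ca,      d],
        [0.0,     0.0,      0.0,    1.0],
    ])

T = dh_transform(alpha=np.pi / 2, a=0.1, d=0.2, theta=0.3)
R = T[:3, :3]
# The rotation block is a unit orthogonal matrix: R^T R = I, det(R) = 1.
print(np.allclose(R.T @ R, np.eye(3)), np.isclose(np.linalg.det(R), 1.0))
```

Chaining such matrices link by link, as in Eq. (14) and the equation that follows, yields the pose of any frame relative to the base.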

The pose transformation matrices between adjacent links are obtained in turn by substituting the parameters of each link of the robot into Eq. (14). The transformation matrix of coordinate system $\{i\}$ relative to coordinate system $\{0\}$ can then be expressed as
$${}^{0}_{i}T = {}^{0}_{1}T\,{}^{1}_{2}T \cdots {}^{i-1}_{i}T.$$

The position vector of the center of mass of link $i$ relative to joint coordinate system $\{i\}$ is as follows:

If the gravity of each link of the robot is $G_i$, then the gravity vector of each link expressed in the base coordinate system is as follows:

The components of each link's gravity at its center of mass change with the robot's posture. From Eq. (18), the rotation matrix of joint coordinate system $\{i\}$ relative to the base coordinate system is ${}^{0}_{i}R$. The centroid coordinate system of each link has the same orientation as coordinate system $\{i\}$. Therefore, the gravity vector of each link expressed in the centroid coordinate system can be expressed as follows:

The homogeneous transformation matrix between the centroid coordinate system of link $i$ and joint coordinate system $\{i\}$ can be written as

The homogeneous transformation matrix of the centroid coordinate system of each link with respect to the $\{0\}$ system can be expressed as follows: The force vector and moment vector produced on the $\{0\}$ system by the robot's own gravity in the stationary state can then be expressed as follows:

The force sensor coordinate system has the same attitude as the base coordinate system $\{0\}$ and is offset along the $z$-axis; we can then obtain the transformation of the gravity force and moment vectors between coordinate system $\{0\}$ and the sensor coordinate system, which can be expressed as follows:
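The net effect of the gravity terms above can be sketched numerically. This is an illustrative sketch under our own simplifications: the link centers of mass are assumed to be already expressed in the base frame (i.e., the chain of transforms above has been applied), and the masses, positions, and sensor offset are invented for the demo:

```python
import numpy as np

def gravity_wrench_at_sensor(masses, com_positions, h, g=9.81):
    """Total force/moment that link gravity produces at the base sensor.

    masses[i]        : mass of link i (kg)
    com_positions[i] : centre of mass of link i, expressed in base frame {0}
    h                : z-offset between the sensor frame and frame {0}
    """
    Fg = np.zeros(3)
    Mg = np.zeros(3)
    gravity = np.array([0.0, 0.0, -g])
    for m, p in zip(masses, com_positions):
        Fi = m * gravity                 # gravity vector of link i in {0}
        Fg += Fi
        Mg += np.cross(p, Fi)            # moment of link gravity about {0}
    # Shift the wrench from {0} to the sensor frame (same attitude,
    # offset along z): force is unchanged, moment picks up an offset term.
    offset = np.array([0.0, 0.0, h])
    Ms = Mg + np.cross(offset, Fg)
    return Fg, Ms

masses = [2.0, 1.5, 1.0]                                  # made-up values
coms = [np.array([0.10, 0.00, 0.30]),
        np.array([0.20, 0.10, 0.55]),
        np.array([0.25, 0.15, 0.80])]
Fg, Ms = gravity_wrench_at_sensor(masses, coms, h=0.05)
print(np.round(Fg, 3), np.round(Ms, 3))
```

Subtracting this posture-dependent wrench from the raw reading is what keeps the static sensor output at zero as the robot changes pose.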

3.2. Dynamic Force Compensation Algorithm

Compensating the sensor for gravity alone is obviously not enough during robot movement; the inertial forces and Coriolis forces generated by the motion must also be considered. Therefore, a robot dynamic force compensation algorithm is proposed in this section.

We assume that the mass of each link of the robot is $m_i$, the position vector of the centroid of link $i$ with respect to coordinate system $\{i\}$ is ${}^{i}r_{ci}$, the displacement of joint $i$ is $q_i$, its velocity is $\dot{q}_i$, and its acceleration is $\ddot{q}_i$. Then we can get the following, where $\omega_i$ is the angular velocity of link $i$, $\dot{\omega}_i$ is the angular acceleration of link $i$, $\dot{v}_i$ is the linear acceleration of link $i$, and $\dot{v}_{ci}$ is the linear acceleration of the center of mass of link $i$. The schematic diagram of dynamic force compensation for the sensor is shown in Figure 9. We can then obtain the force/torque relationship between the links, which can be expressed as follows, where $f_i$ is the force exerted by link $i-1$ on link $i$; $f_{i+1}$ is the force exerted by link $i$ on link $i+1$; $n_i$ is the reaction moment of link $i-1$ on link $i$; $n_{i+1}$ is the reaction moment of link $i$ on link $i+1$; ${}^{i}r_{ci}$ is the position vector from the origin of the coordinate system attached to link $i$ to its center of mass; $I_{ci}$ is the inertia tensor of link $i$ about its center of mass; and the remaining term is the Coriolis force term.

For an $n$-degree-of-freedom robot, the value of $f_{n+1}$ is related to the end load of the robot; $f_{n+1} = 0$ and $n_{n+1} = 0$ when the end is unloaded. From Eqs. (27)–(32), $f_1$ and $n_1$ can be calculated. Substituting them into Eqs. (24)–(26) yields the specific dynamic force compensation values required by the six-axis force/torque sensor at the base. If the readings of the six-axis force/torque sensor at the base are $F_s$ and $M_s$ when the robot is in a given posture, the force/torque information after gravity compensation can be expressed as follows:

After dynamic force compensation, the six-axis force/torque sensor reading remains zero during motion as long as no external force acts on the robot. Once the reading of the six-axis force/torque sensor at the base exceeds the set threshold, the robot can be considered to have collided with the outside world. The threshold value needs to be determined according to the characteristics of each robot system. The collision between the robot and the outside world is a continuous process, so each frame of data read by the base sensor, after dynamic force compensation, can be used as input to the collision point detection algorithm.

4. Simulation Experiment

4.1. Simulation Verification Experiment

A three-degree-of-freedom robot was built for simulation experiments to verify the correctness and effectiveness of the proposed algorithm, as shown in Figure 10. The links and joints of the manipulator are centrally symmetric to reduce the amount of calculation, and the center of gravity of each link lies on its own central axis. The structural parameters of each link of the three-degree-of-freedom robot are shown in Table 1, and the material is set to an alloy. The link dimensions are labeled in Figure 10.

The data of the six-axis force/torque sensor at the base change continuously over time. The data collected by the sensor are preprocessed and fed into the proposed collision point detection algorithm. The robot model is imported into ADAMS, and two sets of simulation experiments are designed to verify the proposed dynamic force compensation algorithm and collision point detection algorithm. The magnitude, direction, and position of the collision force applied in these two sets of experiments are known, and the data in the experimental results are expressed in the sensor coordinate system. The first set of experiments was designed to verify the proposed gravity compensation algorithm and collision point detection algorithm; a collision force increasing in equal increments was applied at the same collision point. The robot was set to be stationary in this set of experiments, so the joint velocities and accelerations are zero. The results of the experiment are shown in Table 2, where the collision direction is obtained by calculating the difference of the sensor coordinate system data before and after the force is applied in ADAMS. The calculated force and calculated position are obtained from Eq. (24) and Eq. (11) together with a series of coordinate transformations.

In order to verify the effectiveness and accuracy of the dynamic force compensation algorithm and the collision point detection algorithm in experiment 2, the following assumption was added on the basis of experiment 1: the robot moves with prescribed joint displacements, velocities, and accelerations. The robot moves according to these set parameters and performs collision tests at different points with the same collision force when running to the 3rd second. Specific experimental data are shown in Table 3.

4.2. Error Calculation Method

The errors in the above tables are calculated to verify the effectiveness of the proposed algorithm. The error metrics are defined as follows:

Force direction error: where $u$ is the applied collision force direction, and $\hat{u}$ is the calculated force direction.

Relative error of force in all directions:

Force error:

In Eq. (36), $F_j$ and $\hat{F}_j$ are the components of the measured force and the calculated force, respectively.

Position error: where $P$ is the known collision position, and $\hat{P}$ is the calculated collision position.

Relative error of position in all directions:

Position error (%):
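The error metrics above can be sketched in code. This is an illustrative sketch under our own assumptions: the exact equation forms are reconstructed (the direction error as the angle between unit force vectors, the percentage position error as the Euclidean error normalised by a reference reach), the zero-denominator rule follows Section 4.3.2, and all numeric values are invented:

```python
import numpy as np

def direction_error_deg(F_true, F_calc):
    """Angle between the applied and calculated force directions."""
    u = F_true / np.linalg.norm(F_true)
    v = F_calc / np.linalg.norm(F_calc)
    return np.degrees(np.arccos(np.clip(u @ v, -1.0, 1.0)))

def relative_error(true_vec, calc_vec):
    """Per-axis relative error in percent; components whose true value is
    zero are reported as 0, as done for Eq. (38) in Section 4.3.2."""
    t, c = np.asarray(true_vec, float), np.asarray(calc_vec, float)
    out = np.zeros_like(t)
    nz = t != 0
    out[nz] = 100.0 * np.abs(c[nz] - t[nz]) / np.abs(t[nz])
    return out

def position_error_pct(P_true, P_calc, reach):
    """Euclidean position error normalised by a reference reach, in %."""
    P_true, P_calc = np.asarray(P_true, float), np.asarray(P_calc, float)
    return 100.0 * np.linalg.norm(P_calc - P_true) / reach

F_true, F_calc = np.array([10.0, 0.0, -5.0]), np.array([9.8, 0.1, -5.1])
P_true, P_calc = np.array([0.30, 0.20, 0.40]), np.array([0.31, 0.19, 0.41])
print(direction_error_deg(F_true, F_calc))
print(relative_error(F_true, F_calc))
print(position_error_pct(P_true, P_calc, reach=0.59861))
```

Clipping the dot product before `arccos` guards against round-off pushing it fractionally outside $[-1, 1]$ for nearly identical directions.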

4.3. Analysis of Results

Two sets of experiments were designed to demonstrate the effectiveness and accuracy of the proposed algorithm. Experiments 1 and 2 take the magnitude of the collision force and the collision distance, respectively, as independent variables. Figures 11 and 12 show the relative and absolute errors of the force in each direction and of the collision point position in experiment 1 as the collision force increases. Figures 13 and 14 show the relative and absolute errors of the force in each direction and of the collision point position in experiment 2 as the collision distance increases. Figure 15 shows the overall error of the force and of the collision point location.

4.3.1. Error Analysis of Experiment 1 Results

In Figures 11 and 12, when the collision point is fixed, the absolute and relative errors of the force in all directions stay within a reasonable range and show no obvious trend as the force increases. However, the absolute and relative position errors in all directions clearly decrease with increasing force. Therefore, when the proposed method is used to detect collision points, the position accuracy obtained at the minimum detectable collision force represents the worst case and must still satisfy the application requirements.

4.3.2. Error Analysis of Experiment 2 Results

Figures 13 and 14 show the force and position errors of the system in different directions under the same external collision force applied at different collision points. If a denominator in Eq. (38) is 0, the corresponding result is output as 0. The value computed from the fifth set of data in Table 3 is eliminated as a singular value because it differs greatly from the other results. It can be seen that, when the collision force is constant, the absolute and relative errors of the force components in all directions fluctuate within a certain range as the collision point moves. However, the absolute and relative errors of the collision position in each direction continue to increase as the distance to the collision point increases. Therefore, when using this method for collision point detection, the accuracy at the robot's farthest collision point must still meet the requirements.

4.3.3. The Overall Error Analysis of the Experiment

The relative error results of the force and collision points in the two sets of experiments calculated by Eq. (36) and Eq. (39) are shown in Figure 15.

From Figures 12, 14, and 15, it can be concluded that the error of the force components in each direction fluctuates within 5%, and the maximum relative error of the resultant force is 4.8952%; the accuracy of the proposed dynamic force compensation algorithm therefore meets the requirements. However, Figures 14 and 15 show that the error keeps increasing as the distance to the collision point increases. The relative error of the collision point coordinates reaches its maximum value of 8.712% at the position 598.61 mm farthest from the force sensor.

4.4. Error Source Analysis

As seen in Section 4.3, both the magnitude of the force and the distance from the collision position to the sensor affect the calculation accuracy. An accuracy factor is proposed to quantify the impact of the collision force and the collision distance on the accuracy of the proposed collision point detection algorithm, which can be expressed as follows:

The data of experiments 1 and 2 are substituted into Eqs. (40) and (41), and the specific calculation results are shown in Table 4.

The sensitivity of the data to each variable is reflected by the size of the accuracy factor. In experiment 1, collision point detection was performed with the magnitude of the force as the variable; in experiment 2, the position of the collision force was the variable. Table 4 shows that the accuracy factor values in experiment 1 are all much smaller than those in experiment 2. Therefore, the collision position affects the accuracy of the algorithm far more than the force does; the collision location is the primary factor affecting accuracy. Hence, when using this algorithm for single-external-force collision point detection, the accuracy at the farthest part of the robot's workspace must be considered first.

5. Conclusion and Future Work

Detecting collisions enables the robot to interact with and adapt to its environment and makes human-robot interaction safe. Collision sensing includes the collision position, the collision direction, and the magnitude of the force. Most existing methods use geometric constraints; their disadvantages are that they cannot be applied to unstructured environments and cannot cope with deformation of the colliding object. To address these problems, a robot collision point detection algorithm for a single point under a single external force is proposed in this paper, together with gravity compensation and dynamic force compensation algorithms for the six-axis force/torque sensor at the base; simulation experiments are then performed to verify the algorithms. Compared with traditional algorithms [18, 19] that use the geometric information of the collision surface to calculate the collision position, the algorithm proposed in this paper achieves a minimum error of only 1.5470%, and its maximum error is close to that of the traditional methods. Its advantage over the traditional methods is that it does not rely on the geometric information of the colliding body's surface, and deformation has no effect on the result. The experimental results prove the effectiveness of the proposed algorithm and show that the accuracy of the proposed gravity and dynamic force compensation algorithm does not change with the force or the position of the collision point. The accuracy of the collision point detection algorithm, however, is clearly affected by the magnitude of the collision force and by the collision position. When applying this algorithm, the accuracy at the farthest part of the robot's workspace should be considered first.

The application of the proposed algorithm in robot force drag teaching and in medical collaborative robots remains to be explored, along two directions: (1) how to use contact information as a basis for decision-making to achieve task division in human-machine collaboration and (2) how to use the trajectory of the contact point to realize force teaching over complex curved surfaces of the robot. Follow-up experimental research on these two application directions will be carried out to increase the practicality of the proposed algorithm.

Data Availability

The raw/processed data required to reproduce these findings cannot be shared at this time as the data also forms part of an ongoing study.

Conflicts of Interest

The authors declare that there is no conflict of interests regarding the publication of this paper.

Acknowledgments

This work is supported by the National Natural Science Foundation of China (No. 51505124), the Science and Technology Research Project of Hebei Province (ZD2020151), the Foster Fund Projects of North China University of Science and Technology (No. JP201505), and the Natural Science Foundation of Hebei Province (E2016209312).