Pose estimation and behavior classification of broiler chickens based on deep neural networks

https://doi.org/10.1016/j.compag.2020.105863

Abstract

Poultry behavior is an important indicator for diagnosing poultry diseases. Accurate pose estimation is the basis of poultry behavior analysis and provides a foundation for poultry disease early-warning methods. On large-scale poultry farms, a farmer or veterinarian usually watches the pose of the broiler chickens to determine whether they are sick; when a bird's posture is abnormal, the breeders can address the problem promptly. Accurate tracking of birds enables better estimation of their posture. In this paper, pose estimation based on a deep neural network (DNN) is applied to analyze broiler chicken behavior for the first time. First, a pose skeleton is constructed from feature points on the broiler chicken and used to track specific body parts. A naive Bayesian model (NBM) is then used to classify the poses of broiler chickens. Preliminary tests showed that, by comparing the classified postures, we could identify chickens in standing, walking, running, eating, resting, and preening states. The test precision of behavior recognition is 0.7511 (standing), 0.5135 (walking), 0.6270 (running), 0.9361 (eating), 0.9623 (resting), and 0.9258 (preening). Our research provides a noninvasive method for broiler chicken behavior analysis that can support future behavior analysis in broiler chicken farming.

Introduction

In many studies on animal health and welfare, quantitative behavioral analysis plays a vital role (Koolhaas and Van Reenen, 2016, Rushen et al., 2011). Detailed information about animal behavior helps production managers better evaluate animal welfare (Janczak and Riber, 2015, Peden et al., 2018). Pose estimation is an integral part of behavioral analysis (Li et al., 2019). In some animal posture recognition studies, additional markers are placed on the animal to improve recognition precision (Maghsoudi et al., 2017). Although markers allow animal poses to be extracted with high precision, such systems are expensive and, being invasive, disturb the animals. Video cameras have the advantage of low invasiveness, which reduces the impact on the animal's life; among video-based approaches, skeleton fitting and motion contour models are an excellent alternative to physical markers. Although modern high-speed video cameras can accurately record animal behavior, manually analyzing these recordings requires large amounts of time and labor (Egnor and Kristin, 2016), and manual observations are subjective.

With the continuous advancement of computer vision and deep learning technology, research methods for animal studies continue to improve. Uhlmann et al. (2017) proposed an automatic tracking tool called FlyLimbTracker, which is used to track flies and analyze their behaviors. Zhuang and Zhang (2019) classified healthy and suspected sick chickens by using the object detection algorithm IFSSD. Fang et al. (2020) used the Tbroiler tracking algorithm to track a single chicken in a flock and analyzed related indices, such as the overlap rate (OR) and pixel error (PE). Lin et al. (2018) used a deep convolutional neural network (CNN) to track broiler chickens and combined heat stress with the broiler chicken activity index as a new predictor of the heat stress response of broiler chickens.

Previously, the postures of animals such as bats and mice were estimated with traditional image-processing algorithms (Mikhail et al., 2014, Salem et al., 2019, Shen et al., 2017). DNNs are now commonly used in animal behavior classification (Ravbar et al., 2019, Stern et al., 2015). Mathis et al. (2018) and Pereira et al. (2019) successfully applied CNNs to animal pose estimation for the first time, and their work provided the basis for our research. In general, training a deep CNN requires a large number of labeled images (Graving et al., 2019, Günel et al., 2019). DeepLabCut uses ResNet-50 as the initial network for pretraining and overcomes this problem through transfer learning (He et al., 2016). By labeling a small number of images, we can accurately track specific parts of the target chicken's body, such as the legs, neck, and head.
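As a rough illustration of this transfer-learning workflow, the sketch below walks through a typical DeepLabCut project for labeling a small set of frames and then tracking body parts in new videos; the project name, experimenter, and video paths are hypothetical placeholders, and the settings do not reproduce the authors' exact configuration.

```python
# A minimal sketch of a DeepLabCut transfer-learning workflow; names and paths
# are placeholders, not the authors' configuration.
import deeplabcut

# Create a project and register the training videos.
config_path = deeplabcut.create_new_project(
    "broiler-pose", "lab", ["videos/pen01.mp4"], copy_videos=True
)

# Extract a small number of frames and label the ten feature points by hand.
deeplabcut.extract_frames(config_path, mode="automatic")
deeplabcut.label_frames(config_path)

# Build the training set and fine-tune the pretrained ResNet-50 backbone.
deeplabcut.create_training_dataset(config_path)
deeplabcut.train_network(config_path)
deeplabcut.evaluate_network(config_path)

# Track the labeled body parts in a new video.
deeplabcut.analyze_videos(config_path, ["videos/pen02.mp4"])
```

Because the ResNet-50 backbone is pretrained, only a comparatively small set of labeled frames is typically needed before tracking generalizes to new footage.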

In contrast to previous broiler chicken detection methods, this study focuses on six common behaviors of broiler chickens and detects them through pose estimation. The main objective of this article is to combine a broiler chicken pose estimation network with a classification network to detect broiler chicken behaviors. By combining a feature-point-based neural network with a classification network based on the naive Bayesian model (NBM), we analyze six behaviors of broiler chickens.
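The sketch below illustrates, under assumptions not taken from the paper, how flattened skeleton keypoints could be fed to a Gaussian naive Bayes classifier to predict one of the six behaviors; the feature encoding, array shapes, and random training data are placeholders rather than the authors' pipeline.

```python
# A minimal sketch, assuming each pose skeleton is flattened into a feature vector
# of its ten (x, y) keypoints; data and encoding are placeholders.
import numpy as np
from sklearn.naive_bayes import GaussianNB

BEHAVIORS = ["standing", "walking", "running", "eating", "resting", "preening"]

def skeleton_to_features(keypoints: np.ndarray) -> np.ndarray:
    """Flatten a (10, 2) array of keypoint coordinates into one feature vector."""
    return keypoints.reshape(-1)

# Placeholder training data: 600 skeletons (100 per behavior), 20 features each.
rng = np.random.default_rng(0)
X_train = rng.random((600, 20))
y_train = np.repeat(np.arange(len(BEHAVIORS)), 100)

clf = GaussianNB()
clf.fit(X_train, y_train)

# Classify a newly tracked pose skeleton.
new_skeleton = rng.random((10, 2))
label = BEHAVIORS[int(clf.predict([skeleton_to_features(new_skeleton)])[0])]
print(label)
```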

This paper proposes a user-defined broiler chicken pose skeleton and a behavior classification method that use DeepLabCut as the basis of the tracking framework. With this model, we monitor and track specific parts of the chicken's body in a dynamic environment. Broiler chickens were tracked by pose estimation, and their behaviors were classified.

Section snippets

The pose skeleton model of a broiler chicken

First, we established a simple map of the broiler chicken pose skeleton. As shown in Fig. 1, the scheme mainly includes ten custom feature points and the broiler chicken pose skeleton. Point 1 is the center point of the chicken, defined as the center of the largest inscribed circle within the contour boundary of the broiler chicken (Zhuang et al., 2018). The tail point is the point at which the longest line between the center point and the tail area of the broiler chicken intersects the outline of the…
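As a minimal sketch of this center-point definition, the snippet below locates the center of the largest inscribed circle of a binary silhouette using a distance transform; the segmentation mask and toy ellipse are assumptions, not the authors' implementation.

```python
# A minimal sketch: the pixel farthest from the background is the center of the
# largest inscribed circle of the silhouette. The mask below is a toy example.
import cv2
import numpy as np

def inscribed_circle_center(mask: np.ndarray) -> tuple:
    """Return the (x, y) pixel farthest from the contour boundary of a binary mask."""
    # Distance of each foreground pixel to the nearest background pixel.
    dist = cv2.distanceTransform((mask > 0).astype(np.uint8), cv2.DIST_L2, 5)
    y, x = np.unravel_index(np.argmax(dist), dist.shape)
    return int(x), int(y)

# Toy elliptical silhouette standing in for a segmented broiler chicken.
mask = np.zeros((240, 320), dtype=np.uint8)
cv2.ellipse(mask, (160, 120), (90, 50), 0, 0, 360, 255, -1)
center_x, center_y = inscribed_circle_center(mask)
print(center_x, center_y)
```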

Broiler chicken pose estimation results

The premise of behavior classification is accurately estimating the chicken's posture, and thus we must analyze the tracking results for the chicken's feature points. Fig. 7 shows the training loss of the deep neural network.

Behavior classification results

After data expansion, each of the six behaviors includes 100 pose skeletons. The skeleton feature points of the chicken carry different weights in different behaviors. We obtained Fig. 8 by counting the different feature points of the…
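For reference, per-behavior test precision such as the values reported in the abstract can be computed from held-out predictions as in the sketch below; the label arrays here are placeholders, not the study's data.

```python
# A minimal sketch of computing per-behavior precision; arrays are placeholders.
from sklearn.metrics import precision_score

BEHAVIORS = ["standing", "walking", "running", "eating", "resting", "preening"]

# y_true holds the annotated behaviors, y_pred the NBM predictions (placeholders).
y_true = [0, 1, 2, 3, 4, 5, 0, 1, 3, 4]
y_pred = [0, 1, 1, 3, 4, 5, 0, 2, 3, 4]

per_class = precision_score(
    y_true, y_pred, labels=list(range(len(BEHAVIORS))), average=None, zero_division=0
)
for name, value in zip(BEHAVIORS, per_class):
    print(f"{name}: {value:.4f}")
```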

Conclusions

This study shows that we can further analyze chicken behavior by using deep learning to track specific points on the chicken's body. Through a preliminary exploration and analysis of six behavioral postures of chickens, we filled a gap in using skeleton feature points to infer chicken behaviors. In addition, the pose estimation and behavior classification methods based on deep neural networks proposed in this research have certain reference…

CRediT authorship contribution statement

Cheng Fang: Conceptualization, Methodology, Software, Writing - original draft. Tiemin Zhang: Supervision, Conceptualization. Haikun Zheng: Visualization, Software. Junduan Huang: Validation, Writing - review & editing. Kaixuan Cuan: Methodology, Writing - review & editing.

Declaration of Competing Interest

The authors declare that they have no known competing financial interests or personal relationships that could have appeared to influence the work reported in this paper.

Acknowledgments

Funding: This work was supported by the National Key Research and Development Plan of China [grant No. 2018YFD0500705] and the Guangdong Province Special Fund for Modern Agricultural Industry Common Key Technology R&D Innovation Team of China [grant No. 2019KJ129].

References (31)

  • X. Zhuang et al. Detection of sick broilers by digital image processing and deep learning. Biosyst. Eng. (2019)
  • S.E.R. Egnor et al. Computational analysis of behavior. Annu. Rev. Neurosci. (2016)
  • J.M. Graving et al. Fast and robust animal pose estimation. bioRxiv (2019)
  • S. Günel et al. DeepFly3D: A deep learning-based approach for 3D limb and appendage tracking in tethered adult Drosophila. bioRxiv (2019)
  • K. He et al. Deep Residual Learning for Image Recognition (2016)