Digipig: Monitoring pig bodies, heads and tails

28-02-2022
Identifying individual pigs when they are packed together can sometimes be a tough job. Photo: Bert Jansen

Researchers from various countries in Europe have been working on a project called Digipig in an attempt to make machine learning more suitable for the automatic detection and monitoring of individual pig behaviour.

The team was formed by researchers from the Norwegian University of Life Sciences, located in Ås; the University of Maribor, Slovenia; and Saxion University of Applied Sciences in Enschede, the Netherlands. They described the Digipig system as an automated monitoring system for pig body, head, and tail detection for future behavioural study applications. They published their findings in the peer-reviewed journal Agriculture in December 2021.

The reason for the research is related to the fact that currently, it is not possible for farmers to monitor their pigs 24/7. With the implementation of an automatic monitoring system, it would be possible to gather more information about positive and negative pig behaviour, to provide optimal conditions in real time at the farm level, to reduce workload and costs, and to improve pig welfare.

The team wrote that detecting individual pigs and their body parts using deep learning-based computer vision has great potential as a welfare assessment tool to define positive and negative affective states in individual pigs.

Over 7,500 individual pig postures

The dataset the team worked with consisted of 583 images, with an average of 13 pigs per image, for a total of 7,579 individual pig postures visible in the images. The dataset was then annotated.
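As a purely illustrative sketch (not the team's actual pipeline), an annotated detection dataset of this kind is often stored as COCO-style JSON and loaded with standard tooling. The paths below are placeholders, and the annotation format the Digipig team used is not specified in this article.

```python
# Hypothetical example only: loading an annotated pig-image dataset,
# assuming COCO-style JSON annotations (requires torchvision and pycocotools).
# "images/" and "annotations.json" are placeholder paths, not the team's files.
from torchvision.datasets import CocoDetection
import torchvision.transforms as T

dataset = CocoDetection(
    root="images/",              # folder containing the top-view pen images
    annFile="annotations.json",  # per-pig labels for body, head/ears and tail
    transform=T.ToTensor(),
)

image, annotations = dataset[0]
print(image.shape, len(annotations))  # one image tensor and its annotation list
```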

The study consisted of 2 separate parts. The aim of the 1st part was to recognise individual pigs (in lying and standing positions) in groups, as well as their body parts (head/ears and tail), by using machine-learning algorithms (a feature pyramid network). The model recognised each individual pig’s body with a precision of 96% – virtually the same as human-level detection. Pig tails were recognised with a precision of 77% and pig heads with a precision of 66%. This part of the study, the researchers found, was relatively time-consuming.
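As a rough, hedged sketch of the kind of detector described here, the snippet below configures an off-the-shelf Faster R-CNN with a feature pyramid network (FPN) backbone from torchvision for three illustrative classes (body, head/ears, tail). It is not the authors' implementation; the class set and backbone are assumptions.

```python
# A minimal sketch, not the Digipig code: an FPN-based detector configured
# for three assumed classes (pig body, head/ears, tail) plus background.
import torch
import torchvision
from torchvision.models.detection.faster_rcnn import FastRCNNPredictor

NUM_CLASSES = 4  # background + body + head/ears + tail (illustrative labelling)

# Faster R-CNN with a ResNet-50 + feature pyramid network backbone.
model = torchvision.models.detection.fasterrcnn_resnet50_fpn(weights="DEFAULT")

# Swap the classification head so it predicts the pig-specific classes.
in_features = model.roi_heads.box_predictor.cls_score.in_features
model.roi_heads.box_predictor = FastRCNNPredictor(in_features, NUM_CLASSES)

model.eval()
with torch.no_grad():
    frame = torch.rand(3, 480, 640)    # dummy stand-in for a top-view pen image
    detections = model([frame])[0]     # boxes, labels and scores per detected part
    print(detections["boxes"].shape, detections["labels"].shape)
```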

In the 2nd part of the study, the team focused solely on improving the detection of tail posture (tail straight or curled) during activity (standing/moving around) using a neural network analysis. The model recognised tail postures with a precision of 90%.
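The tail-posture step can be pictured as a small two-class image classifier run on cropped tail regions. The sketch below uses a ResNet-18 backbone purely for illustration; the network architecture actually used in the study is not detailed in this article.

```python
# Illustrative sketch only: a two-class classifier for tail posture
# ("straight" vs "curled") applied to cropped tail regions.
import torch
import torch.nn as nn
import torchvision

model = torchvision.models.resnet18(weights="DEFAULT")
model.fc = nn.Linear(model.fc.in_features, 2)  # outputs: straight / curled

model.eval()
with torch.no_grad():
    tail_crop = torch.rand(1, 3, 224, 224)     # dummy cropped tail image
    logits = model(tail_crop)
    posture = ["straight", "curled"][logits.argmax(dim=1).item()]
    print(posture)
```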

Lower precision for ear and tail detection

The researchers were able to distinguish individuals in groups of 12-15 pigs, which is the most common group size on Norwegian and other European farms. As both ears and tails are relatively small compared to the rest of the body, detecting them sometimes proved difficult even for human observers, which explains the lower precision of head detection compared with tail posture detection, the researchers said. Furthermore, pigs were observed to prefer lying close to pen-mates or, even more frequently, partly on top of them, which made it harder to identify tails or heads.

In their conclusion, the researchers wrote: “Our new method can be further explored in detecting behavioural sequences, group synchrony as well as quantifying positive welfare (play, exploration, tail curled and wagging). Most of these behaviours are associated with certain body postures and by defining such golden standards with high, human-level precision, we could improve the welfare status of the pig. This means that the current classical approaches of gathering data based on manual observation in real time or the manual analysis of recorded animal behaviours in research will be expensive in terms of both time and labour and will be replaced by a cheaper, real-time digital monitoring system.

“Furthermore, with the implementation of a digital system, we will be able to gather more information about pig behaviour (positive and negative), and thus have better control over them and be able to provide optimal conditions in real time at the farm level.”

The original article was authored by M. Ocepek, Norwegian University of Life Sciences, Ås, Norway; A. Žnidar, M. Lavrič and D. Škorjanc, University of Maribor, Slovenia; and I.L. Andersen, Saxion University of Applied Sciences, Enschede, the Netherlands.

Samaneh Azarpajouh, author and veterinarian
