Evaluation of sampling frequency, window size and sensor position for classification of sheep behaviour.
ABSTRACT: Automated behavioural classification through sensors has the potential to improve animal health and welfare. Sensor position, sampling frequency and the window size used to segment the signal all have a major impact on classification accuracy in activity recognition and on the sensor's energy needs, yet no study in precision livestock farming has evaluated the effect of all these factors simultaneously. The aim of this study was to evaluate the effects of position (ear and collar), sampling frequency (8, 16 and 32 Hz) of a triaxial accelerometer and gyroscope sensor, and window size (3, 5 and 7 s) on the classification of important behaviours in sheep such as lying, standing and walking. Behaviours were classified using a random forest approach with 44 features. The best performance for classifying walking, standing and lying in sheep (accuracy 95%, F-score 91%-97%) was obtained with the 32 Hz/7 s and 32 Hz/5 s combinations for both ear and collar sensors, although results obtained with 16 Hz and a 7 s window were comparable, with accuracy of 91%-93% and F-score of 88%-95%. Energy efficiency was best at a 7 s window. This suggests that sampling at 16 Hz with a 7 s window would offer benefits in a real-time behavioural monitoring system for sheep due to reduced energy needs.
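The segment-and-classify pipeline described above can be sketched as follows. The per-axis summary statistics and the synthetic signals are illustrative stand-ins only, not the study's 44-feature set or real sheep data:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

def window_features(signal, fs, win_s):
    """Segment an (n_samples, 3) triaxial trace into fixed-length windows
    and compute simple per-axis summaries (mean, std, min, max).
    Illustrative only -- the study extracted 44 features per window."""
    win = int(fs * win_s)
    feats = []
    for i in range(signal.shape[0] // win):
        seg = signal[i * win:(i + 1) * win]
        feats.append(np.concatenate([seg.mean(0), seg.std(0), seg.min(0), seg.max(0)]))
    return np.asarray(feats)

rng = np.random.default_rng(0)
fs, win_s = 16, 7                       # 16 Hz sampling, 7 s windows
t = np.arange(fs * 70) / fs
# Synthetic stand-ins: near-still "lying" vs. an oscillatory "walking" signal.
lying = rng.normal(0.0, 0.05, size=(fs * 70, 3))
walking = 0.5 * np.sin(2 * np.pi * 2 * t)[:, None] + rng.normal(0, 0.1, (fs * 70, 3))
X = np.vstack([window_features(lying, fs, win_s), window_features(walking, fs, win_s)])
y = np.array([0] * 10 + [1] * 10)       # 0 = lying, 1 = walking
clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, y)
print(X.shape, clf.score(X, y))
```

Halving the sampling frequency halves the samples per window, which is where the energy saving at 16 Hz comes from without much loss in feature quality.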
Project description:Lameness in sheep is the biggest health and welfare concern in sheep-producing countries. Best practice for managing lameness relies on rapid treatment, yet there are no objective measures for lameness detection. Accelerometers and gyroscopes have been widely used in human activity studies and their use is becoming increasingly common in livestock. In this study, we used 23 datasets (10 non-lame and 13 lame sheep) from an accelerometer- and gyroscope-based ear sensor with a sampling frequency of 16 Hz to develop and compare algorithms that can differentiate lameness within three different activities (walking, standing and lying). We show for the first time that features extracted from accelerometer and gyroscope signals can differentiate between lame and non-lame sheep while standing, walking and lying. The random forest algorithm performed best for classifying lameness, with an accuracy of 84.91% within lying, 81.15% within standing and 76.83% within walking, and overall correctly classified over 80% of sheep within activities. Both accelerometer- and gyroscope-based features ranked among the top 10 features for classification. Our results suggest that these novel behavioural differences between lame and non-lame sheep across all three activities could be used to develop an automated system for lameness detection.
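Ranking features by importance, as in the top-10 analysis above, is straightforward with a fitted random forest. The feature names and the synthetic labels below are hypothetical, chosen only to show the mechanics:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(1)
# Hypothetical feature names; only "acc_std" and "gyro_range" carry signal
# in this toy data, standing in for lameness-discriminative features.
names = ["acc_mean", "acc_std", "gyro_mean", "gyro_range", "acc_kurtosis", "gyro_entropy"]
n = 200
y = rng.integers(0, 2, n)               # 0 = non-lame, 1 = lame (synthetic)
X = rng.normal(size=(n, len(names)))
X[:, 1] += 2.0 * y                      # informative feature
X[:, 3] += 1.5 * y                      # informative feature
clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X, y)
order = np.argsort(clf.feature_importances_)[::-1]
top = [names[i] for i in order[:2]]
print(top)
```

Because importances are computed per tree and averaged, both accelerometer- and gyroscope-derived features can surface in the ranking, as the study reports.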
Project description:The miniaturization and affordability of new technology are driving a biologging revolution in wildlife ecology through the use of animal-borne data-logging devices. Among many new biologging technologies, accelerometers are emerging as key tools for continuously recording animal behavior. Yet a critical but under-acknowledged consideration in biologging is the trade-off between sampling rate and sampling duration, created by battery- (or memory-) related sampling constraints. This is especially acute among small animals, causing most researchers to sample at high rates for very limited durations. Here, we show that high accuracy in behavioral classification is achievable when pairing low-frequency acceleration recordings with temperature. We conducted 84 hr of direct behavioral observations on 67 free-ranging red squirrels (200-300 g) that were fitted with accelerometers (2 g) recording tri-axial acceleration and temperature at 1 Hz. We then used a random forest algorithm and a manually created decision tree, with variable sampling window lengths, to associate observed behavior with logger-recorded acceleration and temperature. Finally, we assessed the accuracy of these different classifications using an additional 60 hr of behavioral observations not used in the initial classification. The accuracy of the manually created decision tree classification varied from 70.6% to 91.6% depending on the complexity of the tree, with accuracy increasing as complexity decreased. Short-duration behavior like running had lower accuracy than long-duration behavior like feeding. The random forest algorithm offered similarly high overall accuracy, but the manual decision tree afforded the flexibility to create a hierarchical tree and to adjust sampling window length for behavioral states with varying durations.
Low-frequency biologging of acceleration and temperature allows accurate behavioral classification of small animals over multi-month sampling durations. Nevertheless, low sampling rates impose several important limitations, especially for assessing the classification accuracy of short-duration behavior.
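A manual decision tree of the kind described reduces to a few threshold rules on an activity summary and the logger temperature. The thresholds, the ODBA-style summary, and the state labels below are hypothetical, not the study's calibrated values:

```python
import numpy as np

def odba(acc):
    """Mean overall dynamic body acceleration of an (n, 3) window:
    absolute mean-subtracted acceleration summed across axes."""
    return np.abs(acc - acc.mean(axis=0)).sum(axis=1).mean()

def classify(activity, temp, temp_nest=36.0, activity_thresh=0.5):
    """Two-node manual decision tree (hypothetical thresholds): logger
    temperature separates in-nest rest from out-of-nest states, and the
    activity level separates moving from stationary."""
    if temp >= temp_nest and activity < activity_thresh:
        return "nest_rest"
    return "moving" if activity >= activity_thresh else "stationary"

window = np.zeros((60, 3))              # a perfectly still 60-sample window at 1 Hz
print(odba(window), classify(odba(window), 37.0))
```

The appeal of the manual tree is visible here: each node's threshold and window length can be tuned independently for behaviors of different durations, which a monolithic classifier does not allow.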
Project description:Behaviors are important indicators for assessing the health and well-being of dairy cows. The aim of this study was to develop and validate an ensemble classifier for automatically measuring and distinguishing several behavior patterns of dairy cows from accelerometer data and location data. The ensemble classifier consists of two parts, our new Multi-BP-AdaBoost algorithm and a data fusion method based on D-S evidence theory. We identify seven behavior patterns: feeding, lying, standing, lying down, standing up, normal walking, and active walking. Accuracy, sensitivity, and precision were used to validate classification performance. The Multi-BP-AdaBoost algorithm performed well when identifying lying (92% accuracy, 93% sensitivity, 82% precision), lying down (99%, 82%, 86%), standing up (99%, 74%, 85%), normal walking (97%, 92%, 86%), and active walking (99%, 94%, 89%). Its results were poor for feeding (80%, 52%, 55%) and standing (80%, 46%, 58%), which are difficult to differentiate using a leg-mounted sensor. Position data made it possible to differentiate feeding and standing. The D-S evidence fusion method for combining accelerometer data and location data in classification was used to fuse two pieces of basic behavior-related evidence into a single estimation model. With this addition, the sensitivity and precision of the two difficult behaviors increased by approximately 20 percentage points. In conclusion, the classification results indicate that the ensemble classifier effectively recognizes various behavior patterns in dairy cows. However, further work is needed to study the robustness of the feature and model by increasing the number of cows enrolled in the trial.
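The D-S fusion step can be illustrated with Dempster's rule of combination for two basic mass assignments. The masses below are invented for illustration, not taken from the paper:

```python
from itertools import product

def dempster_combine(m1, m2):
    """Combine two basic mass assignments (dicts mapping frozensets of
    hypotheses to masses) with Dempster's rule, normalising out the
    mass assigned to conflicting (empty-intersection) pairs."""
    combined = {}
    conflict = 0.0
    for (a, wa), (b, wb) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + wa * wb
        else:
            conflict += wa * wb
    k = 1.0 - conflict
    return {s: w / k for s, w in combined.items()}

# Illustrative masses: accelerometer evidence mildly favours "feeding",
# location evidence (cow at the feed bunk) favours it more strongly.
F, S = frozenset({"feeding"}), frozenset({"standing"})
m_acc = {F: 0.55, S: 0.35, F | S: 0.10}
m_loc = {F: 0.70, S: 0.20, F | S: 0.10}
fused = dempster_combine(m_acc, m_loc)
print(round(fused[F], 3), round(fused[S], 3))
```

The fused belief in "feeding" exceeds either source alone, which is the mechanism behind the roughly 20-point gains reported for the two hard-to-separate behaviors.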
Project description:During the past ten years, dynamic functional connectivity (FC) has been extensively studied using the sliding-window method. A fixed window-size is usually selected heuristically, since no consensus exists yet on choice of the optimal window-size. Furthermore, without a known ground-truth, the validity of the computed dynamic FC remains unclear and questionable. In this study, we computed single-scale time-dependent (SSTD) window-sizes for the sliding-window method. SSTD window-sizes were based on the frequency content at every time point of a time series and were computed without any prior information. Therefore, they were time-dependent and data-driven. Using simulated sinusoidal time series with frequency shifts, we demonstrated that SSTD window-sizes captured the time-dependent period (inverse of frequency) information at every time point. We further validated the dynamic FC values computed with SSTD window-sizes with both a classification analysis using fMRI data with a low sampling rate and a regression analysis using fMRI data with a high sampling rate. Specifically, we achieved both a higher classification accuracy in predicting cognitive impairment status in fighters and a larger explained behavioral variance in healthy young adults when using dynamic FC matrices computed with SSTD window-sizes as features, as compared to using dynamic FC matrices computed with the conventional fixed window-sizes. Overall, our study computed and validated SSTD window-sizes in the sliding-window method for dynamic FC analysis. Our results demonstrate that dynamic FC matrices computed with SSTD window-sizes can capture more temporal dynamic information related to behavior and cognitive function.
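The fixed sliding-window computation that the SSTD window-sizes are compared against can be sketched as follows, on synthetic two-region time series whose coupling changes halfway through; the window and step sizes are arbitrary choices, not the study's:

```python
import numpy as np

def sliding_window_fc(ts, win, step):
    """Dynamic functional connectivity with a fixed sliding window:
    ts is (time, regions); returns one correlation matrix per window."""
    mats = []
    for start in range(0, ts.shape[0] - win + 1, step):
        mats.append(np.corrcoef(ts[start:start + win].T))
    return np.stack(mats)

rng = np.random.default_rng(0)
t = np.arange(400)
# Two synthetic "regions" that share a signal in the first half only.
shared = np.sin(2 * np.pi * t / 40)
r1 = shared + 0.2 * rng.normal(size=400)
r2 = np.where(t < 200, shared, rng.normal(size=400)) + 0.2 * rng.normal(size=400)
fc = sliding_window_fc(np.column_stack([r1, r2]), win=60, step=20)
print(fc.shape, round(fc[0, 0, 1], 2), round(fc[-1, 0, 1], 2))
```

A fixed window like this smears any dynamics faster than the window length, which is precisely the limitation that motivates data-driven, time-dependent window sizes.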
Project description:A minimalistic optical sensing device for the indoor localization is proposed to estimate the relative position between the sensor and active markers using amplitude modulated infrared light. The innovative insect-based sensor can measure azimuth and elevation angles with respect to two small and cheap active infrared light emitting diodes (LEDs) flickering at two different frequencies. In comparison to a previous lensless visual sensor that we proposed for proximal localization (less than 30 cm), we implemented: (i) a minimalistic sensor in terms of small size (10 cm 3 ), light weight (6 g) and low power consumption (0.4 W); (ii) an Arduino-compatible demodulator for fast analog signal processing requiring low computational resources; and (iii) an indoor positioning system for a mobile robotic application. Our results confirmed that the proposed sensor was able to estimate the position at a distance of 2 m with an accuracy as small as 2-cm at a sampling frequency of 100 Hz. Our sensor can be also suitable to be implemented in a position feedback loop for indoor robotic applications in GPS-denied environment.
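Separating the two LEDs by their flicker frequencies amounts to synchronous demodulation. This is a minimal digital sketch with hypothetical carrier frequencies and amplitudes, not the sensor's actual analog Arduino-based demodulator:

```python
import numpy as np

def demodulate(signal, t, freqs):
    """Estimate the amplitude of each carrier in a photosensor trace by
    synchronous (lock-in) demodulation: correlate with quadrature
    references and combine the in-phase and quadrature components."""
    amps = {}
    for f in freqs:
        i = np.mean(signal * np.cos(2 * np.pi * f * t))
        q = np.mean(signal * np.sin(2 * np.pi * f * t))
        amps[f] = 2 * np.hypot(i, q)
    return amps

fs = 1000.0
t = np.arange(0, 1.0, 1 / fs)
# Two LEDs flickering at hypothetical carriers of 123 Hz and 217 Hz.
sig = 0.8 * np.sin(2 * np.pi * 123 * t) + 0.3 * np.sin(2 * np.pi * 217 * t)
amps = demodulate(sig, t, [123.0, 217.0])
print(round(amps[123.0], 2), round(amps[217.0], 2))
```

Because the two carriers are orthogonal over the integration window, each LED's contribution is recovered independently, which is what lets a single photosensor localize against two markers at once.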
Project description:The commercially available collar device MooMonitor+ was evaluated with regard to accuracy and application potential for measuring grazing behavior. These automated measurements are crucial, as cows' feed-intake behavior at pasture is an important parameter of animal performance, health and welfare, as well as an indicator of feed availability. Compared to laborious and time-consuming visual observation, the continuous and automated measurement of grazing behavior may support and improve the grazing management of dairy cows on pasture. Therefore, two experiments and a literature analysis were conducted to evaluate the MooMonitor+ under grazing conditions. The first experiment compared the automated measurement of the sensor against visual observation. In the second experiment, the MooMonitor+ was compared to a noseband sensor (RumiWatch), which also allows continuous measurement of grazing behavior. The first experiment, on n = 12 cows, revealed that the automated sensor MooMonitor+ and visual observation were highly correlated, as indicated by a Spearman's rank correlation coefficient (rs) of 0.94 and a concordance correlation coefficient (CCC) of 0.97 for grazing time. An rs of 0.97 and a CCC of 0.98 were observed for rumination time. In the second experiment, with n = 12 cows over 24-h periods, a high correlation between the MooMonitor+ and the RumiWatch was observed for grazing time, as indicated by an rs of 0.91 and a CCC of 0.97. Similarly, a high correlation was observed for rumination time, with an rs of 0.96 and a CCC of 0.99. While a higher level of agreement between the MooMonitor+ and both visual observation and the RumiWatch was observed for rumination time than for grazing time, the overall results showed a high level of accuracy of the collar device in measuring grazing and rumination times. Therefore, the collar device can be applied to monitor cow behavior at pasture on farms.
With regards to the application potential of the collar device, it may not only be used on commercial farms but can also be applied to research questions when a data resolution of 15 min is sufficient. Thus, at farm level, the farmer can get an accurate and continuous measurement of grazing behavior of each individual cow and may then use those data for decision-making to optimize the animal management.
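The two agreement statistics used above can be computed as follows; the grazing-time values are invented for illustration and `ccc` is a plain implementation of Lin's coefficient, not a library call:

```python
import numpy as np
from scipy.stats import spearmanr

def ccc(x, y):
    """Lin's concordance correlation coefficient: agreement between two
    measurement methods, penalising both scatter and systematic bias."""
    mx, my = np.mean(x), np.mean(y)
    cov = np.mean((x - mx) * (y - my))
    return 2 * cov / (np.var(x) + np.var(y) + (mx - my) ** 2)

# Hypothetical grazing time (min/day) from the collar and from observation.
sensor = np.array([410.0, 395.0, 460.0, 300.0, 520.0, 480.0, 350.0, 430.0])
observer = sensor + np.array([5.0, -8.0, 4.0, -3.0, 10.0, -6.0, 2.0, -4.0])
rs, _ = spearmanr(sensor, observer)
print(round(rs, 2), round(ccc(sensor, observer), 2))
```

Reporting both statistics matters because Spearman's rs only checks rank agreement, while the CCC also penalises any systematic offset between device and observer.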
Project description:A brain-computer interface (BCI) is a channel of communication that transforms brain activity into specific commands for manipulating a personal computer or other home or electrical devices. In other words, a BCI is an alternative way of interacting with the environment by using brain activity instead of muscles and nerves. For that reason, BCI systems are of high clinical value for targeted populations suffering from neurological disorders. In this paper, we present a new processing approach in three publicly available BCI data sets: (a) a well-known multi-class (N = 6) coded-modulated Visual Evoked potential (c-VEP)-based BCI system for able-bodied and disabled subjects; (b) a multi-class (N = 32) c-VEP with slow and fast stimulus presentation; and (c) a steady-state Visual Evoked potential (SSVEP) multi-class (N = 5) flickering BCI system. Estimating cross-frequency coupling (CFC), namely δ-θ [δ: (0.5-4 Hz), θ: (4-8 Hz)] phase-to-amplitude coupling (PAC), within sensor and across experimental time, we achieved high classification accuracy and Information Transfer Rates (ITR) in the three data sets. Our approach outperformed the originally presented ITR on the three data sets. The bit rates obtained for both the disabled and able-bodied subjects reached the fastest reported level of 324 bits/min with the PAC estimator. Additionally, our approach outperformed alternative signal features such as the relative power (29.73 bits/min) and raw time series analysis (24.93 bits/min), as well as the originally reported bit rates of 10-25 bits/min. In the second data set, we achieved an average ITR of 124.40 ± 11.68 bits/min for the slow 60 Hz and an average ITR of 233.99 ± 15.75 bits/min for the fast 120 Hz stimulus presentation. In the third data set, we achieved an average ITR of 106.44 ± 8.94 bits/min. The current methodology outperforms all previous methodologies applied to each of the three freely available BCI data sets.
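A mean-vector-length estimate of δ-θ PAC can be sketched as below on synthetic signals; this is a generic Canolty-style MVL estimator, and the paper's exact PAC estimator and filter parameters may differ:

```python
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

def pac_mvl(sig, fs, lo, hi):
    """Mean-vector-length phase-amplitude coupling: instantaneous phase
    from the low band, amplitude envelope from the high band."""
    def band(s, f1, f2):
        b, a = butter(3, [f1 / (fs / 2), f2 / (fs / 2)], btype="band")
        return filtfilt(b, a, s)
    phase = np.angle(hilbert(band(sig, *lo)))
    amp = np.abs(hilbert(band(sig, *hi)))
    return np.abs(np.mean(amp * np.exp(1j * phase)))

fs = 128.0
t = np.arange(0, 20, 1 / fs)
rng = np.random.default_rng(0)
delta = np.sin(2 * np.pi * 2 * t)       # 2 Hz delta-band phase driver
noise = 0.1 * rng.normal(size=t.size)
# Theta amplitude modulated by delta phase -> genuine delta-theta PAC.
coupled = delta + (1 + delta) * np.sin(2 * np.pi * 6 * t) + noise
uncoupled = delta + np.sin(2 * np.pi * 6 * t) + noise
pac_c = pac_mvl(coupled, fs, (0.5, 4), (4, 8))
pac_u = pac_mvl(uncoupled, fs, (0.5, 4), (4, 8))
print(round(pac_c, 3), round(pac_u, 3))
```

Computing such PAC values per sensor and per time segment yields the feature vectors that feed the classifier in this approach.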
Project description:Trial-by-trial texture classification analysis, and the identification of salient texture-related EEG features during active touch that are minimally influenced by movement type and frequency, are the main contributions of this work. A total of twelve healthy subjects were recruited. Each subject was instructed to use the fingertip of their dominant hand's index finger to rub or tap three textured surfaces (smooth flat, medium rough, and rough) at three levels of movement frequency (approximately 2, 1 and 0.5 Hz). EEG and force data were collected synchronously during each touch condition. A systematic feature selection process was performed to select temporal and spectral EEG features that contribute to texture classification but have low contribution towards movement type and frequency classification. A tenfold cross-validation was used to train two 3-class Support Vector Machine classifiers (one each for texture and movement-frequency classification) and one 2-class classifier (movement type). Our results showed that the total power in the mu (8-15 Hz) and beta (16-30 Hz) frequency bands showed high accuracy in discriminating among textures with different levels of roughness (average accuracy > 84%) but lower contribution towards movement type (average accuracy < 65%) and frequency (average accuracy < 58%) classification.
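Band-power features of the kind selected above can be computed from a Welch spectral estimate; the synthetic "EEG" below is a stand-in, and the summation over band bins is a simple proxy for total band power:

```python
import numpy as np
from scipy.signal import welch

def band_power(sig, fs, f1, f2):
    """Total power in [f1, f2] Hz from a Welch periodogram estimate
    (sum of PSD bins in the band)."""
    f, pxx = welch(sig, fs=fs, nperseg=256)
    return pxx[(f >= f1) & (f <= f2)].sum()

fs = 250.0
t = np.arange(0, 4, 1 / fs)
rng = np.random.default_rng(0)
# Synthetic "EEG": a 10 Hz mu rhythm over broadband noise.
eeg = 2.0 * np.sin(2 * np.pi * 10 * t) + rng.normal(size=t.size)
mu = band_power(eeg, fs, 8, 15)         # mu band, contains the 10 Hz rhythm
beta = band_power(eeg, fs, 16, 30)      # beta band, noise only here
print(round(mu, 2), round(beta, 2))
```

One such scalar per band per trial is enough to feed an SVM, which keeps the trial-by-trial classifier small and interpretable.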
Project description:High gamma band (>50 Hz) activity is a key oscillatory phenomenon of brain activation. However, no non-invasive method has been established to detect language-related high gamma band activity. We used a 160-channel whole-head magnetoencephalography (MEG) system equipped with superconducting quantum interference device (SQUID) gradiometers to non-invasively investigate neuromagnetic activities during silent reading and verb generation tasks in 15 healthy participants. Individual data were divided into alpha (8-13 Hz), beta (13-25 Hz), low gamma (25-50 Hz), and high gamma (50-100 Hz) bands and analysed with the beamformer method, using a consecutively moving time window. Group analysis was performed to delineate common areas of brain activation. In the verb generation task, transient power increases in the high gamma band appeared in the left middle frontal gyrus (MFG) in the 550-750 ms post-stimulus window. We set a virtual sensor on the left MFG for time-frequency analysis, and high gamma event-related synchronization (ERS) induced by the verb generation task was demonstrated at 650 ms. In contrast, ERS in the high gamma band was not detected in the silent reading task. Thus, our study successfully measured language-related prefrontal high gamma band activity non-invasively.
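A high-gamma ERS time course at a virtual sensor can be sketched by band-pass filtering, taking the Hilbert envelope, and expressing it relative to a pre-stimulus baseline. The synthetic burst at a nominal 650 ms and all filter parameters are illustrative, not the MEG pipeline's actual settings:

```python
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

fs = 1000.0
t = np.arange(-0.2, 1.0, 1 / fs)        # time relative to stimulus (s)
rng = np.random.default_rng(0)
sig = 0.5 * rng.normal(size=t.size)     # background "sensor" noise
# Inject a transient high-gamma (70 Hz) burst around 650 ms post-stimulus.
sig += (np.abs(t - 0.65) < 0.05) * 2.0 * np.sin(2 * np.pi * 70 * t)
b, a = butter(4, [50 / (fs / 2), 100 / (fs / 2)], btype="band")
env = np.abs(hilbert(filtfilt(b, a, sig)))       # high-gamma envelope
baseline = env[t < 0].mean()                     # pre-stimulus level
ers_pct = (env - baseline) / baseline * 100.0    # ERS as % change
peak_s = t[np.argmax(ers_pct)]
print(round(peak_s, 2))
```

The envelope peak falls inside the injected burst window, mirroring how the transient 550-750 ms power increase was localised in time at the left MFG virtual sensor.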
Project description:Rapid profiling of the biomolecular components of milk can be useful for food quality assessment and for food fraud detection. Differences in commercial value and availability of milk from specific species are often the reasons for the illicit and fraudulent sale of milk whose species origin is wrongly declared. In this study, a fast, MS-based speciation method is presented to distinguish sheep milk from goat milk and to characterise sheep colostrum at different phases. Using liquid atmospheric pressure (AP)-matrix-assisted laser desorption/ionisation (MALDI) MS, it was possible to classify samples of goat and sheep milk with 100% accuracy in one minute of data acquisition per sample. Moreover, an accuracy of 98% was achieved in classifying pure sheep milk samples and sheep milk samples containing 10% goat milk. Evaluating colostrum quality and postnatal stages represents another possible application of this technology. Classification of sheep colostrum samples collected within 6 hours after parturition and 48 hours later was achieved with an accuracy of 84.4%. Our data show that substantial changes in the lipid profile can account for the accurate classification of colostrum collected at the early and late time points. Applied to the analysis of protein orthologs of different species, this method can, as in this case, allow unequivocal speciation analysis.