Project description: Cell communication is primarily regulated by secreted proteins, whose inhomogeneous secretion often indicates physiological disorder. Parallel monitoring of innate protein-secretion kinetics from individual cells is thus crucial for unraveling systemic malfunctions. Here, we report a label-free, high-throughput method for parallel, in vitro, real-time analysis of specific single-cell signaling using hyperspectral photonic crystal resonant technology. Heterogeneity in physiological thrombopoietin expression from individual HepG2 liver cells in response to platelet desialylation was quantified, demonstrating how mapping real-time protein secretion can provide a simple yet powerful approach for studying complex physiological systems that regulate protein production at single-cell resolution.
Project description: Hyperspectral imaging (HSI) is a relatively new medical imaging modality that exploits an area of diagnostic potential formerly untouched. Although exploratory translational and clinical studies exist, no surgical HSI datasets are openly accessible to the general scientific community. To address this bottleneck, this publication releases HeiPorSPECTRAL (https://www.heiporspectral.org; https://doi.org/10.5281/zenodo.7737674), the first annotated, high-quality, standardized surgical HSI dataset. It comprises 5,758 spectral images acquired with the TIVITA® Tissue camera and annotated with 20 physiological porcine organs, with 8 pigs per organ distributed over a total of 11 pigs. Each HSI image has a resolution of 480 × 640 pixels acquired over the 500-1000 nm wavelength range. The acquisition protocol was designed so that the variability of organ spectra as a function of several parameters, including the camera angle and the individual animal, can be assessed. A comprehensive technical validation confirmed the quality of both the raw data and the annotations. We envision reuse within the scope of this dataset, as well as its reuse as baseline data for future research questions beyond it. Measurement(s): spectral reflectance. Technology type(s): hyperspectral imaging. Sample characteristic (organism): Sus scrofa.
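As a minimal illustration of how the per-organ spectral variability described above might be summarized, the sketch below assumes an HSI cube has already been loaded into a NumPy array of shape (480, 640, n_bands) together with a boolean annotation mask for one organ; it is not the dataset's official loader, and the names (median_organ_spectrum, cube, organ_mask, load_heiporspectral_image) are placeholders.

```python
# Minimal sketch, not the dataset's official loader: assumes an HSI cube has already
# been read into a NumPy array of shape (480, 640, n_bands) with reflectance values,
# together with a boolean annotation mask marking the pixels of one organ.
import numpy as np

def median_organ_spectrum(cube: np.ndarray, organ_mask: np.ndarray) -> np.ndarray:
    """Median reflectance spectrum over all annotated pixels of one organ."""
    assert cube.shape[:2] == organ_mask.shape      # spatial dimensions must match
    pixels = cube[organ_mask]                      # (n_annotated_pixels, n_bands)
    return np.median(pixels, axis=0)               # one value per spectral band

# Hypothetical usage with a user-supplied loader for the 500-1000 nm cubes:
# cube, organ_mask, wavelengths = load_heiporspectral_image(...)
# spectrum = median_organ_spectrum(cube, organ_mask)
```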
Project description: Background: The early and specific detection of abiotic and biotic stresses, particularly their combinations, is a major challenge for maintaining and increasing plant productivity in sustainable agriculture under changing environmental conditions. Optical imaging techniques enable cost-efficient and non-destructive quantification of plant stress states. Monomodal detection of certain stressors is usually based on non-specific/indirect features and is therefore commonly limited in its cross-specificity to other stressors. The fusion of multi-domain sensor systems can provide more discriminative features for machine learning models and potentially synergistic information that increases cross-specificity in plant disease detection when image data are fused at the pixel level. Results: In this study, we demonstrate successful multi-modal image registration of RGB, hyperspectral (HSI), and chlorophyll fluorescence (ChlF) kinetics data at the pixel level for high-throughput phenotyping of A. thaliana grown in multi-well plates and an assay with detached leaf discs of Rosa × hybrida inoculated with the black spot disease-inducing fungus Diplocarpon rosae. We showcase the effects of (i) reference image selection, (ii) different registration methods, and (iii) frame selection on the performance of image registration via affine transform. In addition, we developed a combined approach in which the registration method is selected per file based on normalized cross-correlation (NCC), resulting in a robust and accurate approach at the cost of additional computational time. Since the image data encompass multiple objects, the initial coarse image registration using a global transformation matrix exhibited heterogeneity across different image regions. By applying an additional fine registration to the object-separated image data, we achieved a high overlap ratio. Specifically, for the A. thaliana test set, the overlap ratios (ORConvex) were 98.0 ± 2.3% for RGB-to-ChlF and 96.6 ± 4.2% for HSI-to-ChlF. For the Rosa × hybrida test set, the values were 98.9 ± 0.5% for RGB-to-ChlF and 98.3 ± 1.3% for HSI-to-ChlF. Conclusion: The presented multi-modal imaging pipeline enables high-throughput, high-dimensional phenotyping of different plant species with respect to various biotic and abiotic stressors. This paves the way for in-depth studies investigating the correlative relationships within the multi-domain data or the performance enhancement of machine learning models via multi-modal image fusion.
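The NCC-based selection of a registration method mentioned above can be sketched as follows. This illustrates the general idea rather than the authors' pipeline: each candidate method is represented by the affine matrix it estimated, the moving image is warped with every candidate, and the one whose warped result best correlates with the fixed image is kept. The helper names (ncc, select_best_registration, candidate_affines) are hypothetical.

```python
# Sketch of NCC-based selection among candidate affine registrations (not the authors' code).
import numpy as np
from scipy.ndimage import affine_transform

def ncc(a: np.ndarray, b: np.ndarray) -> float:
    """Normalized cross-correlation of two equally sized grayscale images."""
    a = (a - a.mean()) / (a.std() + 1e-12)
    b = (b - b.mean()) / (b.std() + 1e-12)
    return float(np.mean(a * b))

def select_best_registration(fixed, moving, candidate_affines):
    """Return (best_index, warped_image) for the candidate affine with the highest NCC.

    Each candidate is a 3x3 homogeneous matrix mapping fixed-image (output) coordinates
    to moving-image (input) coordinates, the convention scipy's affine_transform expects.
    """
    best = (-np.inf, None, None)
    for i, T in enumerate(candidate_affines):
        warped = affine_transform(moving, T[:2, :2], offset=T[:2, 2],
                                  output_shape=fixed.shape)
        score = ncc(fixed, warped)
        if score > best[0]:
            best = (score, i, warped)
    return best[1], best[2]
```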
Project description: Hyperspectral cameras onboard unmanned aerial vehicles (UAVs) have recently emerged as a means of monitoring crop traits at the sub-field scale. Different physical, statistical, and hybrid methods for crop trait retrieval have been developed. However, spectra collected from UAVs can be confounded by various issues, including illumination variation throughout the crop growing season, whose effect on retrieval performance is not well understood at present. In this study, four retrieval methods are compared in terms of retrieving the leaf area index (LAI), fractional vegetation cover (fCover), and canopy chlorophyll content (CCC) of potato plants over an agricultural field on six dates during the growing season. We analyzed (1) the standard look-up table method (LUTstd), (2) an improved (regularized) LUT method that involves variable correlation (LUTreg), (3) hybrid methods, and (4) random forest regression without (RF) and with (RFexp) the exposure time as an additional explanatory variable. The Soil-Leaf-Canopy (SLC) model was used with the LUT-based inversion and hybrid methods, while the statistical modelling methods (RF and RFexp) relied entirely on in situ data. The results revealed that RFexp was the best-performing method, yielding the highest accuracies in terms of the normalized root mean square error (NRMSE) for LAI (5.36%), fCover (5.87%), and CCC (15.01%). RFexp was able to reduce the effects of illumination variability and cloud shadows. LUTreg outperformed the other two retrieval methods (the hybrid methods and LUTstd), with an NRMSE of 9.18% for LAI, 10.46% for fCover, and 12.16% for CCC. Conversely, LUTreg led to lower accuracies than those derived from RF for LAI (5.51%) and fCover (6.23%), but not for CCC (16.21%). Therefore, the machine learning approaches, in particular RF, appear to be the most promising retrieval methods for application to UAV-based hyperspectral data.
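As a rough illustration of the RFexp idea and the NRMSE metric reported above, the sketch below trains a scikit-learn random forest on band reflectances with the exposure time appended as an extra explanatory variable. Normalizing the RMSE by the observed range and the placeholder names (X_bands, exposure, y_lai, fit_rf) are assumptions for illustration, not details taken from the study.

```python
# Illustrative sketch only: random forest trait retrieval with an optional exposure-time
# feature ("RFexp"), evaluated with a range-normalized RMSE in percent.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split

def nrmse(y_true, y_pred):
    """Root mean square error normalized by the observed range, in percent."""
    y_true, y_pred = np.asarray(y_true), np.asarray(y_pred)
    rmse = np.sqrt(np.mean((y_true - y_pred) ** 2))
    return 100.0 * rmse / (y_true.max() - y_true.min())

def fit_rf(X_bands, y, exposure=None):
    """Train RF on band reflectances; append exposure time as a feature if provided."""
    X = np.column_stack([X_bands, exposure]) if exposure is not None else np.asarray(X_bands)
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
    model = RandomForestRegressor(n_estimators=500, random_state=0).fit(X_tr, y_tr)
    return model, nrmse(y_te, model.predict(X_te))

# e.g. model_exp, err_exp = fit_rf(X_bands, y_lai, exposure)   # the RFexp variant
```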
Project description: Various rice diseases threaten the growth of rice, so rapid and accurate disease detection is of great importance for precise disease prevention and control. Hyperspectral imaging (HSI) was performed to detect rice leaf diseases in four different varieties of rice. Considering that developing a classifier for each rice variety costs considerable time and effort, deep transfer learning was introduced for the first time to rice disease detection across different rice varieties. Three deep transfer learning methods were adapted for 12 transfer tasks, namely fine-tuning, deep CORrelation ALignment (CORAL), and deep domain confusion (DDC). A self-designed convolutional neural network (CNN) was used as the base network for the deep transfer learning methods. Fine-tuning achieved the best transferable performance, with an accuracy of over 88% on the test set of the target domain in the majority of transfer tasks. Deep CORAL obtained an accuracy of over 80% in four of the transfer tasks, which was superior to DDC. A multi-task transfer strategy was also explored with good results, indicating the potential of both pair-wise and multi-task transfers. A saliency map was used to visualize the key wavelength ranges captured by the CNN with and without transfer learning; the results indicated that these wavelength ranges overlapped to some extent. Overall, the results suggest that deep transfer learning methods can perform rice disease detection across different rice varieties. Hyperspectral imaging, in combination with deep transfer learning, is a promising approach for efficient and cost-saving field detection of rice diseases across different rice varieties.
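The Deep CORAL alignment referenced above can be summarized by its loss term. The PyTorch sketch below shows the standard formulation (squared Frobenius distance between source- and target-domain feature covariances, scaled by 1/(4d²)) rather than the study's exact network or training setup; the variable names (feats_src, feats_tgt, lambda_coral) are placeholders.

```python
# Minimal PyTorch sketch of the standard Deep CORAL loss (not the study's exact model):
# it aligns second-order statistics of source and target features and is added to the
# ordinary classification loss during training.
import torch

def coral_loss(source: torch.Tensor, target: torch.Tensor) -> torch.Tensor:
    """Squared Frobenius distance between source and target feature covariances."""
    d = source.size(1)
    def cov(x):
        x = x - x.mean(dim=0, keepdim=True)
        return x.t() @ x / (x.size(0) - 1)
    return ((cov(source) - cov(target)) ** 2).sum() / (4.0 * d * d)

# Typical use inside a training step (feats_* are penultimate-layer activations):
# loss = cls_criterion(logits_src, labels_src) + lambda_coral * coral_loss(feats_src, feats_tgt)
```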
Project description: With the progress of genetic sequencing technology, plant genomics has developed rapidly and in turn triggered the progress of plant phenomics. In this study, a high-throughput hyperspectral imaging system (HHIS) was developed to obtain 1,540 hyperspectral indices at the whole-plant level during the tillering, heading, and ripening stages. These indices were used to quantify traditional agronomic traits and to explore genetic variation. We performed a genome-wide association study (GWAS) of these indices and of traditional agronomic traits in a global rice collection of 529 accessions. With a genome-level suggestive P-value threshold, 989 loci were identified. Of the 1,540 indices, we detected 502 significant indices (designated hyper-traits) that exhibited phenotypic and genetic relationships with traditional agronomic traits and had high heritability. Many hyper-trait-associated loci could not be detected using traditional agronomic traits. For example, we identified a candidate gene controlling chlorophyll content (Chl). This gene, which was not identified in the GWAS based on Chl itself, was significantly associated with a chlorophyll-related hyper-trait and was demonstrated to control Chl. Moreover, our study demonstrates that the red edge (680-760 nm) is vital for phenotypic and genetic insights in rice research. Thus, the combination of HHIS and GWAS provides a novel platform for the dissection of complex traits and for crop breeding.
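As a small illustration of the kind of spectral index used as a hyper-trait, particularly in the red-edge region highlighted above, the sketch below computes a normalized-difference index from two bands of a whole-plant mean spectrum. The chosen bands (720 nm and 790 nm) and the function names are illustrative assumptions, not one of the study's 1,540 indices.

```python
# Illustrative sketch of one simple red-edge-based spectral index (not from the study).
import numpy as np

def band_reflectance(wavelengths, spectrum, target_nm):
    """Reflectance at the band nearest to the requested wavelength (nm)."""
    return spectrum[np.argmin(np.abs(np.asarray(wavelengths) - target_nm))]

def red_edge_ndi(wavelengths, spectrum, nir_nm=790.0, red_edge_nm=720.0):
    """Normalized difference of a NIR band and a red-edge band: (NIR - RE) / (NIR + RE)."""
    nir = band_reflectance(wavelengths, spectrum, nir_nm)
    re = band_reflectance(wavelengths, spectrum, red_edge_nm)
    return (nir - re) / (nir + re + 1e-12)
```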
Project description: Microscopy and omics are complementary approaches for probing the molecular state of cells in health and disease, combining granularity with scalability. While important advances have been achieved over the last decade in each area, integrating imaging- and sequencing-based assays on the same cell has proven challenging. In this study, a new approach called HyperSeq, which combines hyperspectral autofluorescence imaging with transcriptomics on the same cell, is demonstrated. HyperSeq was applied to Michigan Cancer Foundation 7 (MCF-7) breast cancer cells and identified a subpopulation of cells exhibiting bright autofluorescence rings at the plasma membrane in optical channel 13 (excitation 431 nm, emission 594 nm). Correlating the presence of a ring with gene expression in the same cell indicated that ringed cells are more likely to express hallmark genes of apoptosis and gene silencing and less likely to express genes associated with ATP production. Further, correlating cell morphology with gene expression suggested that multiple members of the spliceosome were upregulated in larger cells. A number of genes, although evenly expressed across cell sizes, exhibited higher usage of specific exons in larger or smaller cells. Finally, correlation between gene expression and fluorescence within the spectral range of reduced nicotinamide adenine dinucleotide (NADH) provided preliminary insight into the metabolic states of cells. These observations link a cell's optical spectrum to its internal molecular state, demonstrating the utility of HyperSeq for studying cell biology at single-cell resolution by integrating spectral, morphological, and transcriptomic analyses into a single, streamlined workflow.
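The image-to-transcriptome association described above (ring presence versus gene expression) could, in principle, be tested per gene with a rank-based test and false-discovery-rate control. The sketch below illustrates that generic analysis and is not the authors' statistical pipeline; expr, has_ring, and gene_names are hypothetical inputs.

```python
# Generic sketch of associating a binary imaging feature with per-cell gene expression
# (not the HyperSeq authors' pipeline): Mann-Whitney U test per gene plus BH-FDR control.
import numpy as np
from scipy.stats import mannwhitneyu
from statsmodels.stats.multitest import multipletests

def ring_associated_genes(expr: np.ndarray, has_ring: np.ndarray, gene_names, alpha=0.05):
    """expr: (n_cells, n_genes) expression matrix; has_ring: boolean vector per cell."""
    pvals = []
    for g in range(expr.shape[1]):
        ring, no_ring = expr[has_ring, g], expr[~has_ring, g]
        pvals.append(mannwhitneyu(ring, no_ring, alternative="two-sided").pvalue)
    reject, qvals, _, _ = multipletests(pvals, alpha=alpha, method="fdr_bh")
    return [(gene_names[i], qvals[i]) for i in np.flatnonzero(reject)]
```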
Project description: Bakanae disease, caused by Fusarium fujikuroi, poses a significant threat to rice production and has been observed in most rice-growing regions. The symptoms caused by different pathogens may vary and include, for example, elongated and weak stems, slender and yellow leaves, and dwarfism. Bakanae disease often leads to necrosis of diseased seedlings and may infect large areas of a field through the transmission of conidia. Therefore, early disease surveillance plays a crucial role in securing rice production. Traditional monitoring methods are time-consuming and labor-intensive and cannot be applied broadly. In this study, a combination of hyperspectral imaging technology and deep learning algorithms was used to achieve in situ detection of rice seedlings infected with bakanae disease. Phenotypic data were obtained on the 9th, 15th, and 21st day after infection to characterize physiological and biochemical responses, which helps deepen research into the disease mechanism. Hyperspectral data were acquired over the same infection periods, and a deep learning model named Rice Bakanae Disease-Visual Geometry Group (RBD-VGG) was established by leveraging hyperspectral imaging technology and deep learning algorithms. Based on this model, an average accuracy of 92.2% was achieved on the 21st day of infection, and an accuracy of 79.4% as early as the 9th day. Universal characteristic wavelengths were extracted to increase the feasibility of using portable spectral equipment for field surveillance. Collectively, the model offers an efficient and non-destructive surveillance methodology for monitoring bakanae disease, thereby providing an avenue for disease prevention and control. Supplementary information: The online version contains supplementary material available at 10.1007/s42994-024-00169-1.
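For readers curious what a VGG-style spectral classifier might look like, the PyTorch sketch below defines a small 1D CNN over reflectance spectra for healthy-versus-infected classification. It is explicitly not the RBD-VGG architecture from the study, whose layer configuration is not given in the description above; the class name and all sizes are illustrative.

```python
# Illustrative sketch only: a tiny VGG-style 1D CNN over per-pixel (or per-plant mean)
# reflectance spectra. NOT the RBD-VGG model from the study.
import torch
import torch.nn as nn

class TinySpectralVGG(nn.Module):
    def __init__(self, n_bands: int, n_classes: int = 2):
        super().__init__()
        def block(c_in, c_out):  # two 1D convolutions followed by pooling, VGG-style
            return nn.Sequential(
                nn.Conv1d(c_in, c_out, 3, padding=1), nn.ReLU(),
                nn.Conv1d(c_out, c_out, 3, padding=1), nn.ReLU(),
                nn.MaxPool1d(2),
            )
        self.features = nn.Sequential(block(1, 16), block(16, 32))
        self.classifier = nn.Sequential(
            nn.Flatten(),
            nn.Linear(32 * (n_bands // 4), 64), nn.ReLU(),
            nn.Linear(64, n_classes),
        )

    def forward(self, x):  # x: (batch, n_bands) reflectance spectra
        return self.classifier(self.features(x.unsqueeze(1)))
```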