Project description: Nature conservation and fisheries management often focus on particular seabed features that are considered vulnerable or important to commercial species. As a result, individual seabed types are protected in isolation, without any understanding of how the mixture of seabed types within the landscape affects ecosystem functions. Here we undertook predictive seabed modelling within a coastal marine protected area using observations from underwater stereo-video camera deployments and environmental information (depth, wave fetch, maximum tidal speeds, distance from coast and underlying geology). We analysed the effect of predicted substratum type, extent and heterogeneity (the diversity of substrata) within a radius of 1500 m around each camera deployment on juvenile gadoid relative abundance. The predicted substratum model performed well, with wave fetch and depth being the most influential predictor variables. Gadus morhua (Atlantic cod) were associated with more rugose substrata (algal gravel-pebble and seagrass) and more heterogeneous landscapes than Melanogrammus aeglefinus (haddock) or Merlangius merlangus (whiting), which were associated with sand and mud. An increase in M. merlangus relative abundance was observed with increasing substratum extent. These results reveal that landscape effects, and not just individual seabed types, should be considered when protecting the seabed for fish. The landscape approach used in this study therefore has important implications for marine protected area management, fisheries management and monitoring advice concerning demersal fish populations.
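A minimal sketch (Python, not the study's own code) of how substratum heterogeneity around a deployment could be quantified, here as the Shannon diversity of predicted substratum classes falling within the 1500 m buffer; the class labels and cell values are hypothetical placeholders.

```python
# Illustrative sketch: substratum heterogeneity as Shannon diversity of
# predicted substratum classes within 1500 m of a camera deployment.
import numpy as np
from collections import Counter

def shannon_diversity(substratum_classes):
    """H' = -sum(p_i * ln(p_i)) over the class proportions."""
    counts = Counter(substratum_classes)
    total = sum(counts.values())
    proportions = np.array([c / total for c in counts.values()])
    return float(-np.sum(proportions * np.log(proportions)))

# Hypothetical predicted classes for grid cells within one 1500 m buffer.
cells_within_buffer = ["sand", "sand", "mud", "algal_gravel_pebble",
                       "seagrass", "sand", "algal_gravel_pebble"]
print(shannon_diversity(cells_within_buffer))
```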
Project description: Information about lake morphometry (e.g., depth, volume, size) aids understanding of the physical and ecological dynamics of lakes, yet is often not readily available. The data needed to calculate measures of lake morphometry, particularly lake depth, are usually collected on a lake-by-lake basis and are difficult to obtain across broad regions. To bridge the gap between studies of individual lakes, where detailed data exist, and regional studies, where useful data on lake depth are unavailable, we developed a method to predict maximum lake depth from the slope of the topography surrounding a lake. We use the National Elevation Dataset and the National Hydrography Dataset Plus to estimate the percent slope of the land surrounding lakes and use this information to predict maximum lake depth. We also use field-measured maximum lake depths from the US EPA's National Lakes Assessment to empirically adjust and cross-validate our predictions. We were able to predict maximum depth for ∼28,000 lakes in the Northeastern United States, with average cross-validated RMSEs of 5.95 m and 5.09 m and average correlations of 0.82 and 0.69 for Hydrologic Unit Code Regions 01 and 02, respectively. The depth predictions and the scripts are openly available as supplements to this manuscript.
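A minimal sketch, under the assumption that maximum depth is approximated by extrapolating the surrounding percent slope toward the deepest point of the lake and then empirically scaling against field-measured depths; the numbers and the single-multiplier calibration are illustrative assumptions, not the published algorithm.

```python
# Illustrative sketch: geometric depth estimate from surrounding slope,
# with an empirical correction factor fit to field-measured maximum depths.
import numpy as np

def predict_max_depth(percent_slope, dist_to_deepest_point_m, correction=1.0):
    """Extend the shoreline slope toward the deepest point to estimate depth."""
    return correction * (percent_slope / 100.0) * dist_to_deepest_point_m

# Hypothetical calibration data: slope (%), distance (m), observed depth (m).
slopes = np.array([4.0, 7.5, 2.0])
dists = np.array([300.0, 150.0, 500.0])
observed = np.array([9.0, 12.5, 8.0])

raw = (slopes / 100.0) * dists
correction = float(np.sum(raw * observed) / np.sum(raw ** 2))  # least squares through origin
rmse = float(np.sqrt(np.mean((correction * raw - observed) ** 2)))
print(correction, rmse)
```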
Project description: This report examines how sensing of substrate topography can be used to modulate T cell activation, a key coordinating step in the adaptive immune response. Inspired by the native T cell-antigen presenting cell interface, micrometer-scale pits of varying depth are fabricated into planar substrates. Primary CD4+ T cells extend actin-rich protrusions into the micropits. T cell activation, reflected in secretion of the cytokines interleukin-2 and interferon gamma, is sensitive to the micropit depth. Surprisingly, arrays of micropits 4 μm deep enhance activation compared to flat substrates, but deeper micropits are less effective at increasing cell response, revealing a biphasic dependence of activation on feature dimensions. Inhibition of cell contractility abrogates the enhanced activation associated with the micropits. In conclusion, this report demonstrates that 3D, microscale topography can be used to enhance T cell activation, an ability that can most directly be applied to improve the production of these cells for immunotherapy.
Project description: Soil surface roughness controls how water ponds on and flows over soil surfaces. It is a crucial parameter for erosion and runoff studies. Surface roughness has traditionally been measured using manual techniques that are simple but laborious. Newer technologies have been proposed that are less laborious but require expensive equipment and considerable expertise. New depth-camera technologies might provide a useful alternative. We tested the ability of one such camera to measure soil surface roughness. The camera's accuracy was good but decreased with camera-soil distance (0.3% at 750 mm and 0.5% at 1500 mm); however, it was very precise (< 0.5 mm for elevation and < 0.05 mm for random roughness). Similarly, the error of the surface-area estimate increased with camera-soil distance (0.56% at 750 mm and 2.3% at 1500 mm). We describe the workflow to produce high-resolution digital elevation models from the initial images and the conditions under which the camera does not work well (e.g. extremes of lighting, inappropriate post-processing options). The camera was reliable, required little additional technology and was practical to use in the field. We propose that depth cameras are a simple and inexpensive alternative to existing techniques. • We tested a commercially available 3D depth camera. • The camera gave highly accurate and precise soil surface measurements. • The camera provides an inexpensive alternative to existing techniques.
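A minimal sketch of one common definition of random roughness, the standard deviation of elevations after removing a best-fit plane from the depth-camera DEM; the synthetic DEM, units and detrending choice are assumptions, not the paper's workflow.

```python
# Illustrative sketch: random roughness from a depth-camera DEM, computed
# as the standard deviation of elevations after planar detrending.
import numpy as np

def random_roughness(dem):
    """Std. dev. of elevations after removing a least-squares plane."""
    ny, nx = dem.shape
    x, y = np.meshgrid(np.arange(nx), np.arange(ny))
    A = np.column_stack([x.ravel(), y.ravel(), np.ones(dem.size)])
    coeffs, *_ = np.linalg.lstsq(A, dem.ravel(), rcond=None)
    residuals = dem.ravel() - A @ coeffs
    return float(residuals.std())

# Synthetic DEM (mm): a gentle tilt plus random surface variation.
rng = np.random.default_rng(0)
dem_mm = 0.02 * np.arange(200) + rng.normal(0, 2.5, (200, 200))
print(random_roughness(dem_mm))
```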
Project description: Clinical trials with longitudinal outcomes typically include missing data due to missed assessments or structural missingness of outcomes after intercurrent events handled with a hypothetical strategy. Approaches based on Bayesian random multiple imputation and Rubin's rules for pooling results across multiply imputed data sets are increasingly used to align the analysis of these trials with the targeted estimand. We propose and justify deterministic conditional mean imputation combined with the jackknife for inference as an alternative approach. The method is applicable to imputation under a missing-at-random assumption as well as to reference-based imputation approaches. In an application and a simulation study, we demonstrate that it provides treatment effect estimates consistent with the Bayesian approach and reliable frequentist inference with accurate standard error estimation and type I error control. A further advantage of the method is that it does not rely on random sampling and is therefore replicable and unaffected by Monte Carlo error.
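A minimal sketch of the jackknife step, assuming a placeholder impute-then-analyse pipeline (group-mean imputation and a difference in means) that merely stands in for the conditional mean imputation and estimand-aligned analysis described above; names and data are illustrative only.

```python
# Illustrative sketch: leave-one-out jackknife standard error for a treatment
# effect estimated after a deterministic imputation step. estimate_effect()
# is a placeholder for the full impute-then-analyse pipeline, re-run on each
# jackknife sample.
import numpy as np

def estimate_effect(y, treat):
    """Placeholder: impute missing outcomes by observed group means, then
    take the treated-minus-control difference in means."""
    y = y.copy()
    for g in (0, 1):
        m = (treat == g)
        y[m & np.isnan(y)] = np.nanmean(y[m])  # deterministic (mean) imputation
    return y[treat == 1].mean() - y[treat == 0].mean()

def jackknife_se(y, treat):
    n = len(y)
    loo = np.array([estimate_effect(np.delete(y, i), np.delete(treat, i))
                    for i in range(n)])
    return float(np.sqrt((n - 1) / n * np.sum((loo - loo.mean()) ** 2)))

rng = np.random.default_rng(1)
treat = np.repeat([0, 1], 50)
y = rng.normal(0.5 * treat, 1.0)
y[rng.random(100) < 0.2] = np.nan  # simulate missing outcomes
print(estimate_effect(y, treat), jackknife_se(y, treat))
```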
Project description: Purpose: To investigate whether topography-guided laser in situ keratomileusis (LASIK) based on the topographic astigmatism measured with the Topolyzer leads to a better refractive outcome than treatment of the manifest refractive astigmatism in cases of myopic astigmatism. Methods: This was a prospective, non-masked, randomized study (block randomization) of postoperative vision and refraction in patients with myopic astigmatism who had LASIK using the Contoura Vision software. Patients were divided into three groups according to treatment strategy: treating the manifest astigmatism in the first group, treating the topographic astigmatism with compensation of the spherical power in the second group, and treating the topographic astigmatism without changing the spherical power in the third group. The study was conducted at Kasr Alainy Hospital and Dar Eloyon Hospital. Results: The postoperative uncorrected distance visual acuity (UDVA) in each group was better than the preoperative corrected distance visual acuity (CDVA) (58% (n=35) had UDVA better than 20/20 and gained 1 line or more); however, eyes treated for the topographic astigmatism without changing the spherical power showed statistically the best results (75% (n=15) had UDVA better than 20/20). The residual anterior corneal astigmatism was also lowest in this group (mean 0.47 vs 0.95 and 0.59). No significant difference was noted in the residual refractive astigmatism, although it too was lowest in that group. Conclusion: Topography-guided LASIK is a safe and effective ablation profile for the treatment of myopic astigmatism. Treating according to the topographic astigmatism gives the best outcome in terms of vision and residual astigmatism.
Project description: Water depth is critical for vessel navigation at sea. Currently, most vessels navigate using electronic navigational charts. In coastal areas, and especially near shallow water, the dynamic change of the water level is very important to safe navigation. Ships calculate the change of water level from up-to-date tide tables to obtain the dynamic water depth in the channels. However, the water-level change caused by tidal and non-tidal components may reach several meters in some seas, which can push the dynamic depth below the safety depth and easily lead to vessel grounding accidents. Channels are regularly dredged to maintain navigational depth; without such dredging, offshore non-channel areas are where ships most commonly run aground. The dynamic chart depth model studied in this article provides real-time depth to support ship navigation in non-channel areas. The model combines the chart depth and the dynamic water levels on the same reference datum. The chart depth is taken from the electronic navigational chart. The dynamic water levels are constructed from simulated tidal levels and a continuous series of non-tidal residuals. We then designed a deviation correction method, consisting of a datum offset correction and a residual water level correction, to reduce the discrepancy between the simulated tidal level and the actual water level. Finally, by merging the corrected dynamic water levels with the electronic navigational chart depth, we obtained the dynamic chart depth model of the study region.
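A minimal sketch of the assumed bookkeeping, not the article's implementation: the real-time dynamic depth is the charted depth plus the water level above chart datum, where the water level combines the simulated tide and the non-tidal residual after a datum offset correction. All values below are hypothetical.

```python
# Illustrative sketch: real-time dynamic depth referenced to chart datum.
def dynamic_depth(chart_depth_m, simulated_tide_m, nontidal_residual_m,
                  datum_offset_m=0.0):
    """Charted depth plus the corrected water level above chart datum."""
    water_level = simulated_tide_m + nontidal_residual_m - datum_offset_m
    return chart_depth_m + water_level

# Hypothetical example: 4.2 m charted depth, 1.8 m simulated tide,
# -0.3 m non-tidal residual, 0.1 m datum offset between model and chart.
print(dynamic_depth(4.2, 1.8, -0.3, 0.1))  # -> 5.6 m available depth
```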
Project description: Objective: To investigate the correlation between corneal biomechanical properties and topographic parameters using machine learning networks for automatic severity diagnosis and reference benchmark construction. Methods: This was a retrospective study involving 31 eyes from 31 patients with keratoconus. Two clustering approaches were used (i.e., shape-based and feature-based). The shape-based method used a keratoconus benchmark validated for indicating the severity of keratoconus. The feature-based method extracted essential features for the clustering analysis. Results: There were strong correlations between the symmetric modes and keratoconus severity and between the asymmetric modes and the location of the weak centroid. The Pearson product-moment correlation coefficient (PPMC) between the symmetric mode and normality was 0.92, and between the asymmetric mode and the weak centroid value it was 0.75. Conclusion: This study confirmed that there is a relationship between the keratoconus signs obtained from topography and the corneal dynamic behaviour captured by the Corvis ST device. Further studies are required to gather more patient data and establish a more extensive database for validation.
Project description: Secondary ion mass spectrometry using the argon cluster primary ion beam enables molecular compositional depth profiling of organic thin films with minimal loss of chemical information or changes in sputter rate. However, for depth profiles of thicker organic films (> 10 μm of sputtered depth) we have observed the rapid formation of micron-scale topography in the shape of pillars that significantly affect both the linearity of the sputter yield and depth resolution. To minimize distortions in the 3D reconstruction of the sample due to this topography, a step-wise, staggered sample rotation was employed. By using polymer spheres embedded in an organic film, it was possible to measure the depth resolution at the film-sphere interface as a function of sputtered depth and observe when possible distortions in the 3D image occurred. In this way, it was possible to quantitatively measure the effect of micron-scale topography and sample rotation on the quality of the depth profile.
Project description: The ability of climate models to simulate 20th-century global mean sea level (GMSL) and regional sea-level change has been demonstrated. However, the Intergovernmental Panel on Climate Change (IPCC) Fifth Assessment Report (AR5) and Special Report on the Ocean and Cryosphere in a Changing Climate (SROCC) sea-level projections have not been rigorously evaluated against observed GMSL and coastal sea level from a global network of tide gauges, because the short overlapping period (2007-2018) and natural variability make the detection of trends and accelerations challenging. Here, we critically evaluate these projections with satellite and tide-gauge observations. The observed trends from GMSL and from the regional weighted mean at tide-gauge stations confirm the projections under three Representative Concentration Pathway (RCP) scenarios at the 90% confidence level during 2007-2018. The central values of the observed GMSL (1993-2018) and regional weighted mean (1970-2018) accelerations are larger than the projections for RCP2.6 and lie between (or even above) those for RCP4.5 and RCP8.5 over 2007-2032, but are not yet statistically different from any scenario. While the confirmation of the projected trends gives us confidence in the current understanding of near-future sea-level change, it leaves open questions concerning late-21st-century non-linear accelerations from ice-sheet contributions.
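A minimal sketch of a standard way (assumed here, not taken from the paper) to estimate a trend and acceleration from an annual GMSL series: fit a quadratic in time and take the acceleration as twice the quadratic coefficient. The series below is synthetic.

```python
# Illustrative sketch: trend and acceleration from a quadratic fit to GMSL.
import numpy as np

years = np.arange(1993, 2019)
gmsl_mm = (3.0 * (years - 1993) + 0.04 * (years - 1993) ** 2
           + np.random.default_rng(2).normal(0, 2, years.size))  # synthetic

t = years - years.mean()                 # centre time to reduce collinearity
c2, c1, c0 = np.polyfit(t, gmsl_mm, deg=2)
trend_mm_per_yr = c1                     # rate at the mid-point of the record
accel_mm_per_yr2 = 2.0 * c2              # acceleration
print(trend_mm_per_yr, accel_mm_per_yr2)
```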