Project description: Social media dominate today's information ecosystem and provide valuable information for social research. Market researchers, social scientists, policymakers, government entities, public health researchers, and practitioners recognize the potential for social data to inspire innovation, support products and services, characterize public opinion, and guide decisions. The appeal of mining these rich datasets is clear. However, the potential for data misuse underscores an equally fundamental flaw in this research: there are no procedural standards and little transparency. Transparency across the processes of collecting and analyzing social media data is often limited by proprietary algorithms. Spurious findings and biases introduced by artificial intelligence (AI) demonstrate the challenges this lack of transparency poses for research. Social media research remains a virtual "wild west," with no clear standards for reporting data retrieval, preprocessing steps, analytic methods, or interpretation. Use of emerging generative AI technologies to augment social media analytics can undermine the validity and replicability of findings, potentially turning this research into a "black box" enterprise. Clear guidance for social media analyses and reporting is needed to assure the quality of the resulting research. In this article, we propose criteria for evaluating the quality of studies using social media data, grounded in established scientific practice. We offer clear documentation guidelines to ensure that social data are used properly and transparently in research and applications, and we propose a checklist of disclosure elements that meets minimal reporting standards. These criteria will make it possible for scholars and practitioners to assess the quality, credibility, and comparability of research findings based on digital data.
Project description: Introduction: Reproducibility is critical to diagnostic accuracy and treatment implementation. Alongside clinical reproducibility, research reproducibility establishes whether replication efforts using identical study materials and methodologies allow researchers to arrive at similar results and conclusions. In this study, we evaluate the nephrology literature for common indicators of transparent and reproducible research. Methods: We searched the National Library of Medicine catalog to identify 36 MEDLINE-indexed, English-language nephrology journals and randomly sampled 300 publications published between January 1, 2014, and December 31, 2018. Results: Our search yielded 28,835 publications, from which we drew the random sample of 300. Of the 300 publications, 152 (50.7%) were publicly available, whereas 143 (47.7%) were restricted by a paywall and 5 (1.7%) were inaccessible. Of the remaining 295 publications, 123 were excluded because they lacked the empirical data necessary for reproducibility. Of the 172 publications with empirical data, 43 (25%) reported data availability statements and 4 (2.3%) provided analysis scripts. Of the 71 publications analyzed for preregistration and protocol availability, 0 (0.0%) provided links to a protocol and 8 (11.3%) were preregistered. Conclusion: Our study found that reproducible and transparent research practices are infrequently used by the nephrology research community. Greater efforts to promote these practices should be made by both funders and journals. With such efforts, an open science culture may eventually become the norm rather than the exception.
Project description: Background: The landscape of available psychosocial services within pediatric nephrology care is poorly characterized. However, the effects of kidney disease on emotional health and health-related quality of life are well documented, as is the impact of social determinants of health on kidney disease outcomes. The objectives of this study were to assess pediatric nephrologists' perceptions of available psychosocial services and to elucidate inequities in access to psychosocial care. Methods: A web-based survey was distributed to members of the Pediatric Nephrology Research Consortium (PNRC), and quantitative analyses were performed. Results: We received responses from 49 of the 90 PNRC centers. With regard to dedicated services, social work was most commonly available (45.5-100%), followed by pediatric psychology (0-57.1%) and neuropsychology (0-14.3%); no centers had embedded psychiatry. Availability of psychosocial providers was positively associated with nephrology division size: as center size increased, so did access to various psychosocial providers. Notably, the majority of respondents indicated that the perceived need for psychosocial support exceeds what is currently available, even at centers with higher levels of current support. Conclusions: Within the US, there is wide variability in the availability of psychosocial services within pediatric nephrology centers, despite a well-documented necessity for the provision of holistic care. Much work remains to better understand the variation in funding for psychosocial services and in the utilization of psychosocial professionals in the pediatric nephrology clinic, and to inform key best practices for addressing the psychosocial needs of patients with kidney disease.
Project description:We assessed adherence to five transparency practices-data sharing, code sharing, conflict of interest disclosure, funding disclosure, and protocol registration-in articles in dental journals. We searched and exported the full text of all research articles from PubMed-indexed dental journals available in the Europe PubMed Central database until the end of 2021. We programmatically assessed their adherence to the five transparency practices using a validated and automated tool. Journal- and article-related information was retrieved from ScimagoJR and Journal Citation Reports. Of all 329,784 articles published in PubMed-indexed dental journals, 10,659 (3.2%) were available to download. Of those, 77% included a conflict of interest disclosure, and 62% included a funding disclosure. Seven percent of the articles had a registered protocol. Data sharing (2.0%) and code sharing (0.1%) were rarer. Sixteen percent of articles did not adhere to any of the five transparency practices, 29% adhered to one, 48% adhered to two, 7.0% adhered to three, 0.3% adhered to four, and no article adhered to all five practices. Adherence to transparency practices increased over time; however, data and code sharing especially remained rare. Coordinated efforts involving all stakeholders are needed to change current transparency practices in dental research.
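Automated adherence checks like the one described above typically scan an article's full text for cue phrases associated with each transparency practice. The sketch below illustrates that general idea in Python; the practice names and regular expressions are illustrative assumptions of mine, not the validated tool the study actually used:

```python
import re

# Hypothetical cue-phrase patterns, one per transparency practice.
# A validated tool would use a far richer, evaluated pattern set.
PATTERNS = {
    "coi_disclosure": re.compile(r"conflicts? of interest|competing interests", re.I),
    "funding": re.compile(r"funded by|funding|grant (no\.|number)", re.I),
    "registration": re.compile(r"registered (at|in|with)|clinicaltrials\.gov|PROSPERO", re.I),
    "data_sharing": re.compile(r"data (are|is) available|data availability", re.I),
    "code_sharing": re.compile(r"code (is|are) available|analysis scripts? (is|are) available", re.I),
}

def detect_practices(full_text):
    """Return the subset of practices whose cue phrases appear in the text."""
    return {name for name, pat in PATTERNS.items() if pat.search(full_text)}
```

For example, an article whose text contains "The authors declare no conflict of interest" and "funded by grant number 12" would be flagged for COI and funding disclosure but none of the other three practices.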
Project description: More than one published paper is often derived from analyses of the same cohort of individuals, to make full use of the collected information. Preplanned study outcomes are generally recorded in open databases, while exhaustive information on methodological aspects is provided in the submitted articles.
Project description: Objective: To annotate a corpus of randomized controlled trial (RCT) publications with the checklist items of the CONSORT reporting guidelines and to use the corpus to develop text mining methods for RCT appraisal. Methods: We annotated a corpus of 50 RCT articles at the sentence level using 37 fine-grained CONSORT checklist items. A subset (31 articles) was double-annotated and adjudicated, while 19 were annotated by a single annotator and reconciled by another. We calculated inter-annotator agreement at the article and section level using MASI (Measuring Agreement on Set-Valued Items) and at the CONSORT item level using Krippendorff's α. We experimented with two rule-based methods (phrase-based and section header-based) and two supervised learning approaches (support vector machine and BioBERT-based neural network classifiers) for recognizing 17 methodology-related items in the RCT Methods sections. Results: We created CONSORT-TM, consisting of 10,709 sentences, 4,845 (45%) of which were annotated with 5,246 labels. A median of 28 CONSORT items (out of a possible 37) were annotated per article. Agreement was moderate at the article and section levels (average MASI: 0.60 and 0.64, respectively). Agreement varied considerably among individual checklist items (Krippendorff's α = 0.06-0.96). The model based on BioBERT performed best overall for recognizing methodology-related items (micro-precision: 0.82, micro-recall: 0.63, micro-F1: 0.71). Combining models using majority vote and label aggregation further improved precision and recall, respectively. Conclusion: Our annotated corpus, CONSORT-TM, contains more fine-grained information than earlier RCT corpora. The low frequency of some CONSORT items made it difficult to train effective text mining models to recognize them.
For the items commonly reported, CONSORT-TM can serve as a testbed for text mining methods that assess RCT transparency, rigor, and reliability, and support methods for peer review and authoring assistance. Minor modifications to the annotation scheme and a larger corpus could facilitate improved text mining models. CONSORT-TM is publicly available at https://github.com/kilicogluh/CONSORT-TM.
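The micro-averaged scores and majority-vote model combination described above can be sketched in a few lines of Python. This is an illustrative re-implementation under my own assumption about the data layout (one set of CONSORT labels per sentence), not the authors' code:

```python
from collections import Counter

def micro_prf(gold, pred):
    """Micro-averaged precision, recall, and F1 for multi-label items.

    gold, pred: lists of label sets, one set per sentence. Micro-averaging
    pools true positives and errors across all labels before dividing.
    """
    tp = fp = fn = 0
    for g, p in zip(gold, pred):
        tp += len(g & p)   # labels predicted and correct
        fp += len(p - g)   # labels predicted but wrong
        fn += len(g - p)   # labels missed
    prec = tp / (tp + fp) if tp + fp else 0.0
    rec = tp / (tp + fn) if tp + fn else 0.0
    f1 = 2 * prec * rec / (prec + rec) if prec + rec else 0.0
    return prec, rec, f1

def majority_vote(predictions):
    """Combine several models' per-sentence label sets: keep a label
    only if a strict majority of the models predicted it."""
    n = len(predictions)
    combined = []
    for per_model in zip(*predictions):  # the models' label sets for one sentence
        votes = Counter(lbl for labels in per_model for lbl in labels)
        combined.append({lbl for lbl, c in votes.items() if c > n / 2})
    return combined
```

Requiring a strict majority tends to raise precision at the cost of recall, consistent with the trade-off the abstract reports for majority voting versus label aggregation.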
Project description: Background: We aimed to assess adherence to five transparency practices (data availability, code availability, protocol registration, conflict of interest (COI) disclosure, and funding disclosure) in open access Coronavirus disease 2019 (COVID-19) related articles. Methods: We searched and exported all open access COVID-19-related articles from PubMed-indexed journals in the Europe PubMed Central database published from January 2020 to June 9, 2022. With a validated and automated tool, we detected transparency practices in three paper types: research articles, randomized controlled trials (RCTs), and reviews. Basic journal- and article-related information was retrieved from the database. We used R for the descriptive analyses. Results: The total number of articles was 258,678, of which we were able to retrieve the full texts of 186,157 (72%) from the database. Over half of the papers (55.7%, n = 103,732) were research articles, 10.9% (n = 20,229) were review articles, and less than one percent (n = 1,202) were RCTs. Approximately nine-tenths of articles (in all three paper types) had a statement disclosing COI. Funding disclosure (83.9%, 95% CI: 81.7-85.8) and protocol registration (53.5%, 95% CI: 50.7-56.3) were more frequent in RCTs than in reviews or research articles. Reviews shared data (2.5%, 95% CI: 2.3-2.8) and code (0.4%, 95% CI: 0.4-0.5) less frequently than RCTs or research articles. Articles published in 2022 had the highest adherence to all five transparency practices. Most of the reviews (62%) and research articles (58%) adhered to two transparency practices, whereas almost half of the RCTs (47%) adhered to three practices.
There were journal- and publisher-related differences in all five practices, and articles that did not adhere to transparency practices were more likely to be published in the lowest-impact journals and were less likely to be cited. Conclusion: While most articles were freely available and included a COI disclosure, adherence to the other transparency practices was far from acceptable. A much stronger commitment to open science practices, particularly protocol registration and data and code sharing, is needed from all stakeholders.
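The 95% confidence intervals quoted for these proportions are consistent with the standard normal-approximation (Wald) interval. As a rough check, the sketch below reproduces the protocol-registration interval for RCTs, assuming an underlying count of 643 of 1,202 (my inference from the reported 53.5%, not a figure stated in the study):

```python
import math

def wald_ci(successes, n, z=1.96):
    """Normal-approximation (Wald) confidence interval for a proportion.
    z=1.96 gives the conventional 95% interval."""
    p = successes / n
    half = z * math.sqrt(p * (1 - p) / n)
    return p - half, p + half

# Assumed count: 53.5% of 1,202 RCTs with a registered protocol.
lo, hi = wald_ci(643, 1202)
print(f"{lo:.1%} - {hi:.1%}")  # prints 50.7% - 56.3%, matching the reported CI
```

The Wald interval is adequate here because the proportion is near 0.5 and n is large; for the very small proportions (e.g., code sharing at 0.4%), a Wilson or exact interval would be a better choice.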
Project description:Medical biobanks often struggle to obtain sustainable funding. Commercialization of specimens is one solution, but disclosure of commercial interests to potential contributors can be dissuasive. Recent revisions to the federal human subjects research regulations will soon mandate such commercialization disclosure in some circumstances, which raises questions about implications for practice. In this nationally representative, probability-based survey sample of the US adult population, we found that 67 percent of participants agreed that clear notification of potential commercialization of biospecimens is warranted, but only 23 percent were comfortable with such use. Sixty-two percent believed that profits should be used only to support future research, and 41 percent supported sharing profits with the public. We also considered other factors related to disclosure in our analysis and argue for a "disclosure plus" standard: informing potential contributors that their biospecimens might be accessed by commercial organizations and explaining how profits would be used to both enhance transparency and facilitate contributors' altruistic motivations.
Project description:Advances in medical care and biomedical research depend on the participation of human subjects. Poor patient enrollment in research has limited past clinical and translational research endeavors in nephrology. Simultaneously, patients and their caregivers are seeking better diagnostic, monitoring, and therapeutic approaches to improve or restore kidney and overall health. This manuscript will discuss a framework and strategies to optimize patient enrollment within nephrology research and provide examples of success from existing nephrology research programs.
Project description:Accumulating evidence suggests that many findings in psychological science and cognitive neuroscience may prove difficult to reproduce; statistical power in brain imaging studies is low and has not improved recently; software errors in analysis tools are common and can go undetected for many years; and, a few large-scale studies notwithstanding, open sharing of data, code, and materials remain the rare exception. At the same time, there is a renewed focus on reproducibility, transparency, and openness as essential core values in cognitive neuroscience. The emergence and rapid growth of data archives, meta-analytic tools, software pipelines, and research groups devoted to improved methodology reflect this new sensibility. We review evidence that the field has begun to embrace new open research practices and illustrate how these can begin to address problems of reproducibility, statistical power, and transparency in ways that will ultimately accelerate discovery.