bims-skolko Biomed News
on Scholarly communication
Issue of 2019‒07‒07
23 papers selected by
Thomas Krichel
Open Library Society


  1. Elife. 2019 Jul 02. pii: e43094. [Epub ahead of print] 8
      Concerns have been expressed about the robustness of experimental findings in several areas of science, but these matters have not been evaluated at scale. Here we identify a large sample of published drug-gene interaction claims curated in the Comparative Toxicogenomics Database (for example, benzo(a)pyrene decreases expression of SLC22A3) and evaluate these claims by connecting them with high-throughput experiments from the LINCS L1000 program. Our sample included 60,159 supporting findings and 4253 opposing findings about 51,292 drug-gene interaction claims in 3363 scientific articles. We show that claims reported in a single paper replicate 19.0% (95% confidence interval [CI], 16.9-21.2%) more frequently than expected, while claims reported in multiple papers replicate 45.5% (95% CI, 21.8-74.2%) more frequently than expected. We also analyze the subsample of interactions with two or more published findings (2493 claims; 6272 supporting findings; 339 opposing findings; 1282 research articles), and show that centralized scientific communities, which use similar methods and involve shared authors who contribute to many articles, propagate less replicable claims than decentralized communities, which use more diverse methods and contain more independent teams. Our findings suggest that policies fostering decentralized collaboration will increase the robustness of scientific findings in biomedical research. (An illustrative replication-rate calculation follows this entry.)
    Keywords:  collaboration networks; computational biology; computational social science; meta-research; replication; robust knowledge; sociology of science; systems biology
    DOI:  https://doi.org/10.7554/eLife.43094
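    A brief worked sketch of the kind of replication-rate calculation summarized above, using invented counts and a made-up null expectation; the Wilson interval from statsmodels stands in for whatever interval method the authors used, and nothing below reproduces the paper's actual data or pipeline:
      # Hypothetical replication-rate calculation with a 95% confidence interval.
      # All counts and the expected rate below are invented for illustration.
      from statsmodels.stats.proportion import proportion_confint

      replicated = 1200      # hypothetical: claims that reproduced in the high-throughput data
      total_claims = 5000    # hypothetical: claims tested
      expected_rate = 0.20   # hypothetical: replication rate expected by chance

      observed_rate = replicated / total_claims
      ci_low, ci_high = proportion_confint(replicated, total_claims, alpha=0.05, method="wilson")

      # Relative excess over expectation, i.e. "X% more frequently than expected".
      excess = (observed_rate - expected_rate) / expected_rate
      print(f"observed {observed_rate:.3f} (95% CI {ci_low:.3f}-{ci_high:.3f}), "
            f"{100 * excess:.1f}% above expectation")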
  2. Int J Ment Health Nurs. 2019 Jun 30.
      The impact of published research is sometimes measured by the number of citations an individual article accumulates. However, the time from publication to citation can be extensive. Years may pass before authors are able to measure the impact of their publication. Social media provides individuals and organizations with a powerful medium for sharing information. The power of social media is sometimes harnessed to share scholarly works, especially journal article citations and quotes. A non-traditional bibliometric is required to understand the impact social media has on disseminating scholarly works and research. The International Journal of Mental Health Nursing (IJMHN) appointed a social media editor as of 1 January 2017 to implement a strategy to increase the impact and reach of the journal's articles. To measure the impact of the IJMHN social media strategy, quantitative data for the eighteen months prior to the social media editor's start date and the eighteen months after that date (i.e., from 1 July 2015 to 30 June 2018) were acquired and analysed. Quantitative evidence demonstrates the effectiveness of one journal's social media strategy in increasing the reach and readership of the articles it publishes. This information may be of interest to those considering where to publish their research, those wanting to amplify the reach of their research, those who fund research, and journal editors and boards.
    Keywords:  Twitter; editor; journal; social media; strategy
    DOI:  https://doi.org/10.1111/inm.12600
  3. Nurs Outlook. 2019 May 11. pii: S0029-6554(19)30189-7. [Epub ahead of print]
      BACKGROUND: Nursing journals from predatory publication outlets may look authentic and seem to be a credible source of information. However, further inspection may reveal otherwise.
    PURPOSE: The purpose of this study was to analyze publication and dissemination patterns of articles published in known predatory nursing journals.
    METHOD: Using Scopus, reference lists were searched for citations from seven identified predatory nursing journals. Bibliographic information and subsequent citation information were then collected and analyzed.
    FINDINGS: A total of 814 citations of articles published in predatory nursing journals were identified. Further analysis indicated that these articles were cited in 141 nonpredatory nursing journals of various types.
    DISCUSSION: Predatory nursing journals persist, although fewer may now be in existence. Education and information may help authors and reviewers identify predatory journals, thereby discouraging submissions to these publications and prompting hesitancy among authors to cite articles published in them.
    Keywords:  Citation analysis; Knowledge dissemination; Nursing literature; Open access; Predatory nursing journals; Publications
    DOI:  https://doi.org/10.1016/j.outlook.2019.05.001
  4. Nature. 2019 Jul;571(7763): 95-98
      The overwhelming majority of scientific knowledge is published as text, which is difficult to analyse by either traditional statistical analysis or modern machine learning methods. By contrast, the main source of machine-interpretable data for the materials research community has come from structured property databases [1,2], which encompass only a small fraction of the knowledge present in the research literature. Beyond property values, publications contain valuable knowledge regarding the connections and relationships between data items as interpreted by the authors. To improve the identification and use of this knowledge, several studies have focused on the retrieval of information from scientific literature using supervised natural language processing [3-10], which requires large hand-labelled datasets for training. Here we show that materials science knowledge present in the published literature can be efficiently encoded as information-dense word embeddings [11-13] (vector representations of words) without human labelling or supervision. Without any explicit insertion of chemical knowledge, these embeddings capture complex materials science concepts such as the underlying structure of the periodic table and structure-property relationships in materials. Furthermore, we demonstrate that an unsupervised method can recommend materials for functional applications several years before their discovery. This suggests that latent knowledge regarding future discoveries is to a large extent embedded in past publications. Our findings highlight the possibility of extracting knowledge and relationships from the massive body of scientific literature in a collective manner, and point towards a generalized approach to the mining of scientific literature. (A minimal word-embedding sketch follows this entry.)
    DOI:  https://doi.org/10.1038/s41586-019-1335-8
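    The unsupervised encoding described above can be illustrated, at toy scale, with an off-the-shelf word-embedding model; the sketch below assumes gensim's Word2Vec, a tiny invented corpus, and arbitrary hyperparameters, and does not reproduce the paper's own pipeline:
      # Minimal sketch: learn word vectors from tokenized text without labels,
      # then query the embedding space for related terms.
      # The corpus and settings are illustrative assumptions only.
      from gensim.models import Word2Vec

      corpus = [
          ["bi2te3", "is", "a", "promising", "thermoelectric", "material"],
          ["thermoelectric", "performance", "depends", "on", "the", "seebeck", "coefficient"],
          ["bi2te3", "shows", "a", "high", "seebeck", "coefficient"],
      ]

      model = Word2Vec(corpus, vector_size=50, window=5, min_count=1, sg=1, epochs=200)

      # Nearest neighbours in the embedding space hint at related concepts.
      print(model.wv.most_similar("thermoelectric", topn=3))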
  5. Nature. 2019 Jul;571(7764): 147
      
    Keywords:  Publishing; Research management
    DOI:  https://doi.org/10.1038/d41586-019-02084-8
  6. J Investig Med. 2019 Jun 26. pii: jim-2019-001009. [Epub ahead of print]
      The journal impact factor (IF) is the leading method of scholarly assessment in today's research world. An important question is whether this is still a constructive method. For a specific journal, the IF is the number of citations received in a year by publications from the previous 2 years, divided by the total number of citable publications in those years (the citation window). Although this simplicity is an advantage of the method, complications arise when answers to questions such as 'What is included in the citation window?' or 'What makes a good journal impact factor?' remain ambiguous. In this review, we discuss whether the IF should still be considered the gold standard of scholarly assessment in view of the many recent changes and the emergence of new publication models, and we outline its advantages and disadvantages. The advantages of the IF include promoting the author while giving readers an indication of the extent of review. Its disadvantages include reflecting the journal's quality more than the author's work, the fact that it cannot be compared across research disciplines, and the struggles it faces in the world of open access. Alternatives to the IF have been emerging, such as the SCImago Journal & Country Rank, the Source Normalized Impact per Paper and the Eigenfactor Score, among others; however, all alternatives proposed thus far have their own limitations. In conclusion, although the IF has its drawbacks, until better alternatives are proposed it remains one of the most effective methods for assessing scholarly activity. (A worked impact-factor example follows this entry.)
    Keywords:  education, medical; evidence-based medicine; science
    DOI:  https://doi.org/10.1136/jim-2019-001009
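    Because the two-year impact factor described above is a simple ratio, a worked example with invented figures makes the citation window explicit (the numbers below are hypothetical and do not correspond to any real journal):
      # Worked example of the two-year journal impact factor (hypothetical numbers).
      # IF(2019) = citations received in 2019 to items published in 2017-2018,
      #            divided by the number of citable items published in 2017-2018.
      citations_2019_to_2017_2018 = 1450   # hypothetical citation count
      citable_items_2017 = 210             # hypothetical citable publications in 2017
      citable_items_2018 = 240             # hypothetical citable publications in 2018

      impact_factor_2019 = citations_2019_to_2017_2018 / (citable_items_2017 + citable_items_2018)
      print(f"2019 impact factor: {impact_factor_2019:.2f}")   # 1450 / 450 = 3.22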
  7. Mol Cell Proteomics. 2019 Jul 04. pii: mcp.IP119.001567. [Epub ahead of print]
      To truly achieve reproducible research, reproducible analytics must be a principal research goal. Biological discovery is not the only deliverable; reproducibility is an essential part of our research.
    Keywords:  Algorithms; Bioinformatics; Computational Biology; Data evaluation; Data standards; open science; reproducibility; transparency
    DOI:  https://doi.org/10.1074/mcp.IP119.001567
  8. J Postgrad Med. 2019 Jul 03.
      Medical student journals (MSJs) are a cluster of entirely student-led periodicals that publish student-authored articles. A recent review showed that MSJs characteristically employ a student-friendly but feeble peer review process, which is largely associated with the poor quality of published articles. Herein, as a graduate medical student, I call on fellow medical students to make an informed decision and refrain from submitting their research work to MSJs for four primary reasons: 1) an opaque peer-review process, 2) a lack of MEDLINE® indexing, 3) the absence of official journal impact factor scores, and 4) poor article visibility and exposure to the scientific community. Furthermore, I encourage students to take advantage of the existing opportunities provided by professional MEDLINE®-indexed journals for disseminating their research work. These opportunities include: 1) explicit calls welcoming student-authored contributions, and 2) designated 'student contribution corners'. Lastly, I succinctly highlight the joint duties of medical schools, undergraduate research committees, institutional review boards and mentors in publishing student-authored research in professional journals rather than in MSJs.
    Keywords:  Publication; medical student journals; research
    DOI:  https://doi.org/10.4103/jpgm.JPGM_278_19
  9. Genome Biol. 2019 Jul 02. 20(1): 128
      As the scale of genomic and health-related data explodes and our understanding of these data matures, the privacy of the individuals behind the data is increasingly at stake. Traditional approaches to protect privacy have fundamental limitations. Here we discuss emerging privacy-enhancing technologies that can enable broader data sharing and collaboration in genomics research.
    DOI:  https://doi.org/10.1186/s13059-019-1741-0
  10. Br J Surg. 2019 Jul;106(8): 963-964
      
    DOI:  https://doi.org/10.1002/bjs.11285
  11. Brief Bioinform. 2019 Jun 29. pii: bbz044. [Epub ahead of print]
      Compelling research has recently shown that cancer is so heterogeneous that single research centres cannot produce enough data to fit prognostic and predictive models of sufficient accuracy. Data sharing in precision oncology is therefore of utmost importance. The Findable, Accessible, Interoperable and Reusable (FAIR) Data Principles have been developed to define good practices in data sharing. Motivated by the ambition of applying the FAIR Data Principles to our own clinical precision oncology implementations and research, we have performed a systematic literature review of potentially relevant initiatives. For clinical data, we suggest using the Genomic Data Commons model as a reference as it provides a field-tested and well-documented solution. Regarding the classification of diagnoses, of morphology and topography, and of drugs, we chose to follow World Health Organization standards, i.e. the ICD-10, ICD-O-3 and Anatomical Therapeutic Chemical (ATC) classifications, respectively. For the bioinformatics pipeline, the Genome Analysis ToolKit Best Practices using Docker containers offer a coherent solution and have therefore been selected. Regarding the naming of variants, we follow the Human Genome Variation Society's standard. For the IT infrastructure, we have built a centralized solution to participate in data sharing through federated solutions such as the Beacon Networks. (An illustrative coding example follows this entry.)
    Keywords:  FAIR Data Principles; data sharing; genomics; precision oncology; standards
    DOI:  https://doi.org/10.1093/bib/bbz044
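    To make the standards named above concrete, the sketch below shows a single fictitious precision-oncology record carrying ICD-10, ICD-O-3, ATC and HGVS-formatted fields; the record structure and every value are assumptions for illustration and are not taken from the authors' implementation:
      # Fictitious record showing WHO and HGVS coding conventions side by side.
      # All field values are examples chosen for illustration.
      from dataclasses import dataclass

      @dataclass
      class TumorRecord:
          icd10_diagnosis: str    # WHO ICD-10 diagnosis code
          icdo3_morphology: str   # WHO ICD-O-3 morphology code
          icdo3_topography: str   # WHO ICD-O-3 topography code
          atc_drug: str           # WHO ATC drug classification code
          hgvs_variant: str       # variant named per the HGVS nomenclature

      record = TumorRecord(
          icd10_diagnosis="C34.1",               # malignant neoplasm, upper lobe of lung
          icdo3_morphology="8140/3",             # adenocarcinoma, malignant
          icdo3_topography="C34.1",              # upper lobe of lung
          atc_drug="L01XA01",                    # cisplatin
          hgvs_variant="NM_005228.5:c.2573T>G",  # EGFR L858R in HGVS coding-DNA syntax
      )
      print(record)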
  12. Nature. 2019 Jul;571(7763): 7
      
    Keywords:  Institutions; Publishing; Research management
    DOI:  https://doi.org/10.1038/d41586-019-02023-7
  13. Epilepsy Behav. 2019 Jun 25. pii: S1525-5050(19)30315-4. [Epub ahead of print] 97: 174-181
      INTRODUCTION: Psychological interventions hold promise for the epilepsy population and continue to be trialed to determine their efficacy. Such interventions present opportunities for variance in delivery. Therefore, to accurately interpret a trial's estimate of effect, information on implementation fidelity (IF) is required. We present a novel 3-part study. Part 1 systematically rated trials for the extent to which they reported assessing whether the intervention was delivered as intended (adherence) and with what sort of skill (competence). Part 2 identified barriers to assessing and reporting fidelity, as perceived by trialists. Part 3 determined what journals publishing epilepsy trials are doing to support the reporting of IF.
    METHODS: Articles for 50 randomized controlled trials (RCTs)/quasi-RCTs of psychological interventions identified by Cochrane searches were rated using the Psychotherapy Outcome Study Methodology Rating Form's fidelity items. The 45 corresponding authors of the 50 trials were invited to complete the 'Barriers to Treatment Integrity Implementation Survey'. The 'Instructions to Authors' of the 17 journals publishing the trials were reviewed for endorsement of popular reporting guidelines that refer to fidelity (the Consolidated Standards of Reporting Trials (CONSORT) statement or the Journal Article Reporting Standards [JARS]), and the journals were asked how they enforced compliance.
    RESULTS: Part 1: 15 (30%) trials reported assessing for adherence, but only 2 (4.3%) gave the result. Four (8.5%) reported assessing for competence, and only 1 (2.1%) gave the result. Part 2: 22 trialists (mostly chief investigators) responded. They identified 'lack of theory and specific guidelines on treatment integrity procedures', 'time, cost, and labor demands', and 'lack of editorial requirement' as "strong barriers". Part 3: Most journals (15, 88.2%) endorsed CONSORT or JARS, but only 5 enforced compliance.
    CONCLUSIONS: Most trials of psychological interventions for epilepsy are not reported transparently when it comes to IF. The barriers trialists identify do not appear insurmountable. Addressing them could ultimately help the field better understand how best to support the population with epilepsy.
    Keywords:  Epilepsy; Fidelity; Psychological; Reporting; Treatment; Trials
    DOI:  https://doi.org/10.1016/j.yebeh.2019.05.041
  14. Cell Syst. 2019 Jun 26. pii: S2405-4712(19)30197-8. [Epub ahead of print] 8(6): 477-478
      
    DOI:  https://doi.org/10.1016/j.cels.2019.06.002
  15. Perspect Psychol Sci. 2019 Jul 01. 1745691619850561
      Most scientific research is conducted by small teams of investigators who together formulate hypotheses, collect data, conduct analyses, and report novel findings. These teams operate independently as vertically integrated silos. Here we argue that scientific research that is horizontally distributed can provide substantial complementary value, aiming to maximize available resources, promote inclusiveness and transparency, and increase rigor and reliability. This alternative approach enables researchers to tackle ambitious projects that would not be possible under the standard model. Crowdsourced scientific initiatives vary in the degree of communication between project members, from largely independent work curated by a coordination team to crowd collaboration on shared activities. The potential benefits and challenges of large-scale collaboration span the entire research process: ideation, study design, data collection, data analysis, reporting, and peer review. Complementing traditional small science with crowdsourced approaches can accelerate the progress of science and improve the quality of scientific research.
    Keywords:  collaboration; crowdsourcing; metascience; methodology; teams
    DOI:  https://doi.org/10.1177/1745691619850561
  16. Commun Biol. 2019;2: 239
      As of January 1st 2019, authors submitting manuscripts to Communications Biology can choose to publish the reviewer reports and author replies with their articles. The first articles with associated reviewer reports have now been published, representing an important step in our broader journey toward greater openness.
    Keywords:  Peer review; Research data
    DOI:  https://doi.org/10.1038/s42003-019-0489-0
  17. J Med Libr Assoc. 2019 Jul;107(3): 341-351
      Objective: This study explores the variety of information formats used and audiences targeted by public health faculty in the process of disseminating research.
    Methods: The authors conducted semi-structured interviews with twelve faculty members in the School of Public Health at the University of Illinois at Chicago, asking them about their research practices, habits, and preferences.
    Results: Faculty scholars disseminate their research findings in a variety of formats intended for multiple audiences, including not only their peers in academia, but also public health practitioners, policymakers, government and other agencies, and community partners.
    Conclusion: Librarians who serve public health faculty should bear in mind the diversity of faculty's information needs when designing and improving library services and resources, particularly those related to research dissemination and knowledge translation. Promising areas for growth in health sciences libraries include supporting data visualization, measuring the impact of non-scholarly publications, and promoting institutional repositories for dissemination of research.
    DOI:  https://doi.org/10.5195/jmla.2019.524
  18. Lab Invest. 2019 Jul;99(7): 910-911
      
    DOI:  https://doi.org/10.1038/s41374-019-0287-9