Friday, September 23, 2016

B - Self-citation rates higher for men

Singh Chawla D. Self-citation rates higher for men. Nature 2016;535:212

Men cite their own papers 56% more than women do on average, according to an analysis of 1.5 million studies published between 1779 and 2011. The analysis looked at papers across disciplines in the digital library JSTOR and found that over the past two decades men's self-citation rate has risen to 70% more than women's, despite the growing number of women in academia. According to the study authors, men view their abilities more positively than women do and face fewer societal penalties for self-promotion.

B - PhD thesis: being more open

Burrough-Boenisch J. PhD thesis: being more open about PhD papers. Nature 2016;536:274
(doi: 10.1038/536274b)

In the Netherlands, a PhD thesis (often a compilation of the candidate's published papers) is published before the viva voce exam with an ISBN identifier and is later posted online. Advantages over the traditional monograph thesis include: it is quick and easy to write; feedback from the papers' reviewers can be instructive; and students attain a presence in the international science community before graduation. The author of this Letter also suggests that the thesis itself could contain a statement of all assistance received.

B - Predatory journals

Beall J. Best practices for scholarly authors in the age of predatory journals. Annals of The Royal College of Surgeons of England 2016;98(2):77-79
(doi: 10.1308/rcsann.2016.0056)

The author discusses one recent phenomenon that has arisen from the open access movement: that of ‘predatory publishers’. These are individuals or companies that use the open access financial system (author pays, rather than library subscribes) to defraud authors and readers by promising reputable publishing platforms but delivering nothing of the sort. They frequently have imaginary editorial boards, do not operate any peer review or quality control, and are unclear about payment requirements. The author maintains a blog that names publishers and journals he has identified as predatory, known as Beall's lists.

B - Gold OA sustainability

Mellon Foundation. Pay it forward: investigating a sustainable model of open access article processing charges for large North American research institutions. 185p.

A major study conducted by the University of California, Davis, and the California Digital Library, the Pay-It-Forward project addressed the financial ramifications of open access for the types of research institutions whose affiliated scholars generate a preponderance of the scholarly literature. It investigated the financial sustainability of the gold OA model, in which journal publishers charge authors an article processing charge (APC) to generate revenue instead of subscriptions. The project collected data on journal budgets and expenditures, publishing costs and APCs, attitudes towards gold OA among publishers and authors at various career stages, and authorship patterns at the participating institutions.

B - Data exchange standards for peer review

Paglione LD, Lawrence RN. Data exchange standards to support and acknowledge peer-review activity. Learned Publishing 2015;28(4):309-316
(doi: 10.1087/20150411)

A Working Group on Peer Review Service, facilitated by CASRAI, was created to develop a data model and citation standard for peer-review activity that can be used to support both existing and new review models. Standardized citation structures for reviews can enable the inclusion of peer-review activity in personal recognition and evaluation processes.

Wednesday, August 03, 2016

N - invited reproducibility paper

The journal Information Systems has introduced a new article type: the invited reproducibility paper. Directly addressing the lack of reproducibility in science, the journal, published by Elsevier, is inviting authors to co-author a report of a verified reproduced experiment. All code and data are made available on Mendeley Data. You can read more on the Elsevier Connect blog.

Tuesday, August 02, 2016

N - Springer Nature research data policies

Springer Nature is introducing a set of standardised research data policies, aiming to have "the most comprehensive and inclusive research data policy of any large publisher". Seeking where possible to harmonise policies across many journals, while recognising the different data-sharing needs and expectations of different communities, Springer Nature has opted for a modular set of policies and an implementation strategy. There are four main types of policy: (1) data sharing encouraged; (2) evidence of data sharing encouraged; (3) statements of data sharing required; (4) data sharing and peer review of data required. The policies are explained on the SpringerOpen Blog.

N - Dutch research misconduct and reproducibility funds

The Dutch government has committed €8 million to explore research misconduct and reproduce key studies. As reported by Times Higher Education, all researchers in the Netherlands will be questioned about their possible involvement in research misconduct or 'sloppy science', and a fund will be set up for replication of research that has influenced policy or gained media attention.

N - How Can I Share It?

How Can I Share It? is an initiative of the International Association of Scientific, Technical and Medical Publishers (STM), launched in May 2016. A long-standing STM working group has been exploring the effects of scholarly collaboration networks (SCNs), such as ResearchGate, Mendeley, ReadCube and many others. The working group developed a set of voluntary principles for article sharing, endorsed by many publishers and SCNs, and the new site aims to provide practical information on all aspects of sharing articles.

N - Badges for books

Altmetric has enabled Badges for Books, for displaying how much attention a published book and its individual chapters have received. The badges are linked to ISBNs and record mentions in mainstream media, policy documents, reference managers, blogs, social media, and peer review platforms. The service launched on the Routledge Handbooks Online platform.

N - WAME professionalism code of conduct

The World Association of Medical Editors (WAME) has developed a professional code of conduct for medical journal editors. The code of conduct covers six areas: research integrity; personal development; policies and behaviour; editorial independence; best practice; and relevance. The code was created following discussions at WAME's 2015 International Conference for Medical Journal Editors.

Friday, June 24, 2016

B - Scientists' participation in public debates

Woolston C. Scientists are cautious about public outreach. Nature February 2015

Scientists think that they should actively participate in public debates about science and technology - but many have misgivings about doing so, according to a survey of nearly 4,000 US researchers. Of the respondents, 87% said that scientists should “take an active role in public policy debates about science and technology”, and just over half said that they had talked about their research with reporters. However, 52% said that oversimplification of science in news reports was a major problem, and respondents also expressed mixed feelings about news and social media.

B - Writing for lay audiences

Salita JT. Writing for lay audiences: a challenge for scientists. Medical Writing 2015;24(4):183-189
(doi: 10.1179/2047480615Z.000000000320)

Writing for lay audiences, especially lay summaries, is needed to increase health and science literacy, but this kind of writing can be difficult for scientists. The article describes why it can be so difficult and gives some advice on how scientists can cope with the challenge and how institutions and organisations can help.

B - Medical journalism

Whelan J. Medical journalism: another way to write about science. Medical Writing 2015;24(4):219-221
(doi: 10.1179/2047480615Z.000000000327)

True journalism differs from public relations and uncritically reproducing press releases. It involves doing background research into the context surrounding the finding being reported, seeking comments from independent experts, and highlighting the negative as well as positive aspects. In this article, the author pulls together information for medical writers interested in journalism or science writing.

Wednesday, June 22, 2016

B - Replicating psychology studies

Bohannon J. Many psychology papers fail replication test. Science 2015;349(6251):910-911
(doi: 10.1126/science.349.6251.910)

In the Open Science Collaboration, 270 psychologists from around the world signed up to replicate studies; they did not receive any funding. The group selected the studies to be replicated based on the feasibility of the experiment, choosing from those published in 2008 in three journals. Of the 100 prominent papers analyzed, only 39% could be replicated unambiguously. The results lend support to the idea that scientists and journal editors are biased—consciously or not—in what they publish.

B - Sex and gender equity in research: SAGER guidelines

Heidari S, Babor TF, De Castro P, et al. Sex and gender equity in research: rationale for the SAGER guidelines and recommended use. Research Integrity and Peer Review 2016;1:2
(doi: 10.1186/s41073-016-0007-6)

This article describes the rationale for an international set of guidelines to encourage a more systematic approach to the reporting of sex and gender in research across disciplines. The Sex and Gender Equity in Research (SAGER) guidelines are designed primarily to guide authors in preparing their manuscripts, but they are also useful for editors, as gatekeepers of science, to integrate assessment of sex and gender into all manuscripts as an integral part of the editorial process.

B - Rewarding reviewers

Warne V. Rewarding reviewers - sense or sensibility? A Wiley study explained. Learned Publishing 2016;29(1):41-50

In July 2015, Wiley surveyed over 170,000 researchers in order to explore peer reviewing experience, attitudes towards recognition and reward for reviewers, and training requirements. Results show that while reviewers choose to review in order to give back to the community, there is more perceived benefit in interacting with the community of a top-ranking journal than a low-ranking one. Seventy-seven per cent expressed an interest in receiving reviewer training. Reviewers strongly believe that reviewing is inadequately acknowledged at present and should carry more weight in their institutions' evaluation processes.

B - What makes a good policy paper

Whitty CJM. What makes an academic paper useful for health policy? BMC Medicine 2015;13:301
(doi: 10.1186/s12916-015-0544-8)

Getting relevant science and research into policy is essential. There are several barriers, but the easiest to reduce is making papers more relevant and accessible to policymakers. Opinion pieces backed up by footnotes are generally unusable for policy. Objective, rigorous, simply written original papers from multiple disciplines with data can be very helpful.

B - Post-publication peer review

Teixeira da Silva JA, Dobránszki J. Problems with traditional science publishing and finding a wider niche for post-publication peer review. Accountability in Research 2015;22(1):22-40
(doi: 10.1080/08989621.2014.899909)

Errors in the literature, incorrect findings, fraudulent data, poorly written scientific reports, and studies that cannot be reproduced are not only a burden on taxpayers' money; they also diminish public trust in science and its findings. There is therefore every need to fortify the validity of the data in the science literature. One way to address the problem is through post-publication peer review, an efficient complement to traditional peer review that allows for the continuous improvement and strengthening of the quality of science publishing.

B - OA and knowledge translation

Adisesh A, Whiting A. Power to the people - open access publishing and knowledge translation. Occupational Medicine 2016;66:264-265
(doi: 10.1093/occmed/kqv191)

This Editorial attempts to demystify the rights and wrongs of self-archiving and explains some of the issues around open access (OA) publishing. There are essentially three major publication options for authors: no cost for publication in a subscription-based journal; OA journal publication where there may be an article processing charge (APC) paid by or on behalf of the authors; and publication in a hybrid journal where a subscription journal provides the option for OA publication upon payment of an APC. Occupational Medicine recognized the need for open access as early as 2007, when it became a ‘hybrid’ journal.

B - Rule violations

Gächter S, Schulz JF. Intrinsic honesty and the prevalence of rule violations across societies. Nature 2016;531:496-499
(doi: 10.1038/nature17160)

The authors present cross-societal experiments from 23 countries around the world that demonstrate a robust link between the prevalence of rule violations and intrinsic honesty. They developed an index of the ‘prevalence of rule violations’ (PRV). Their results suggest that institutions and cultural values influence PRV, which in turn impacts people's intrinsic honesty and rule following.

Friday, June 17, 2016

B - Sharing clinical trial data

Taichman DB, Backus J, Baethge C, et al. Sharing clinical trial data: a proposal from the International Committee of Medical Journal Editors. JAMA 2016;315(5):467-468

The International Committee of Medical Journal Editors (ICMJE) believes that there is an ethical obligation to responsibly share data generated by interventional clinical trials because participants have put themselves at risk. As a condition of consideration for publication of a clinical trial report in ICMJE member journals, the ICMJE proposes to require authors to share with others the deidentified individual-patient data (IPD) underlying the results presented in the article (including tables, figures, and appendices or supplementary material) no later than 6 months after publication. The ICMJE also proposes to require that authors include a plan for data sharing as a component of clinical trial registration.

B - Statistical reporting errors in psychology

Nuijten MB, Hartgerink CHJ, van Assen MALM, et al. The prevalence of statistical reporting errors in psychology (1985-2013). Behavior Research Methods 2015:1-22
(doi: 10.3758/s13428-015-0664-2)

This study documents reporting errors in a sample of over 250,000 p-values reported in eight major psychology journals between 1985 and 2013 under the null-hypothesis significance testing (NHST) paradigm. Half of all papers contained at least one p-value that was inconsistent with its reported test statistic and degrees of freedom, and one in eight papers contained a grossly inconsistent p-value that may have affected the statistical conclusion. This could indicate a systematic bias in favor of significant results.
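The kind of consistency check behind such studies can be sketched in a few lines: recompute the p-value from the reported test statistic and flag reported p-values that disagree beyond rounding. This is a minimal illustration assuming a two-tailed z test for simplicity (the study itself also covers t, F, chi-square and r tests); the function names are hypothetical, not the authors'.

```python
import math

def p_from_z(z: float) -> float:
    """Two-tailed p-value for a z statistic, via the complementary error function."""
    return math.erfc(abs(z) / math.sqrt(2.0))

def is_consistent(z: float, reported_p: float, decimals: int = 2) -> bool:
    """Check whether a reported p-value matches the recomputed one within rounding."""
    return round(p_from_z(z), decimals) == round(reported_p, decimals)

# A result reported as "z = 1.96, p = .05" is consistent within rounding,
# whereas "z = 1.96, p = .03" would be flagged as an inconsistency.
print(is_consistent(1.96, 0.05))
print(is_consistent(1.96, 0.03))
```

A full checker would parse statistics out of article text and handle the other test families, but the core comparison is no more than this.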

B - Public registry of competing interests

Dunn AG. Set up a public registry of competing interests. Nature 2016;533(7601):9.
(doi: 10.1038/533009a)

According to the author, the publishing system for disclosing competing interests is still fragmented, inconsistent and inaccessible. About half of the studies involving researchers who hold relevant competing interests fail to declare them; the common causes are inconsistent requirements across journals and negligence. To solve these problems, the research community should establish a public registry of competing interests, i.e. an online database of interests declared by researchers, to precisely determine the association between competing interests and the potential for bias.

B - Reviewer fatigue?

Breuning M, Backstrom J, Brannon J, et al. Reviewer fatigue? Why scholars decline to review their peers' work. PS: Political Science & Politics 2015;48(4):595-600.
(doi: 10.1017/S1049096515000827)

The double-blind peer review process is central to publishing in academic journals, but it relies heavily on the voluntary efforts of anonymous reviewers. Journal editors have increasingly become concerned that scholars feel overburdened with requests to review manuscripts and experience “reviewer fatigue”. The authors of this article empirically investigated the rate at which scholars accept or decline to review for the American Political Science Review, as well as the reasons they gave for declining: almost three-quarters of those who responded to requests agreed to review, and reviewer fatigue was only one of many reasons given, alongside busy professional and personal lives.


B - OA publishing trend analysis

Poltronieri E, Bravo E, Curti M, et al. Open access publishing trend analysis: statistics beyond the perception. Information Research 2016;21(2), paper 712.

This analysis tracked the number of OA journals acquiring an impact factor and investigated the distribution of subject categories of these journals over the period 2010-2012. Results showed a growth in OA scholarly publishing, with a prevalence of journals in medicine and the biological sciences.

Thursday, June 16, 2016

B - Gender analysis in health system research

Morgan R, George A, Ssali S, et al. How to do (or not to do)… gender analysis in health systems research. Health Policy and Planning 2016:1-10
(doi: 10.1093/heapol/czw037)

The article outlines what gender analysis is and how it can be incorporated into health systems research (HSR) content, process and outcomes. It recommends exploring whether and how gender power relations affect females and males in health systems, through the use of sex-disaggregated data, gender frameworks and questions. It also examines gender in the HSR process, by reflecting on how the research process itself is imbued with power relations, and in HSR outcomes, by considering how power relations can be transformed progressively, or at least not exacerbated.

B - The Flesch Reading Ease measure

Hartley J. Is time up for the Flesch measure of reading ease? Scientometrics 2016;107(3):1523-26
(doi: 10.1007/s11192-016-1920-7)

The Flesch Reading Ease measure is widely used to assess the difficulty of text in various disciplines, including scientometrics. This paper argues that the measure is now outdated, used inappropriately, and unreliable, and that it is time to abandon the notion of one measure and one computer programme being suitable for all purposes. Different computer-based programmes might have greater validity than the Flesch, but they would probably still fail to take into account the other variables that affect readability.
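For context, the Flesch Reading Ease score is a fixed linear formula over average sentence length and average syllables per word, which is part of why one-size-fits-all use is questionable. A minimal sketch follows; the vowel-group syllable counter is a naive heuristic assumed for illustration, not part of Flesch's original method, which counted syllables exactly.

```python
import re

def count_syllables(word: str) -> int:
    # Naive heuristic: count groups of consecutive vowels (min. one per word).
    return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

def flesch_reading_ease(text: str) -> float:
    # Flesch (1948): higher scores mean easier text (90+ very easy, below 30 very hard).
    sentences = max(1, len(re.findall(r"[.!?]+", text)))
    words = re.findall(r"[A-Za-z']+", text)
    n_words = max(1, len(words))
    syllables = sum(count_syllables(w) for w in words)
    return 206.835 - 1.015 * (n_words / sentences) - 84.6 * (syllables / n_words)

# Short, monosyllabic sentences score very high (the scale can exceed 100).
print(round(flesch_reading_ease("The cat sat on the mat."), 1))
```

Because the formula sees only sentence length and syllable counts, two texts of equal "difficulty" by this score can differ enormously in vocabulary familiarity, structure and coherence, which is the paper's core objection.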

B - Open access impact

Tennant JP, Waldner F, Jacques DC, et al. The academic, economic and societal impacts of Open Access: an evidence-based review. F1000Research 2016;5:632
(doi: 10.12688/f1000research.8460.1)

This review aims to be a resource for current knowledge on the impacts of Open Access by synthesizing important research in three major areas of impact: academic, economic and societal. The evidence points to a favorable impact of OA on the scholarly literature through increased dissemination and reuse. Access to the research literature is key for innovative enterprises and a range of governmental and non-governmental services, and it has the potential to save publishers and research funders considerable amounts of financial resources. Furthermore, OA contributes to advancing citizen science initiatives and supports researchers in developing countries.