Dive guides monitor sharks on coral reefs at a level of reliability similar to telemetry.
Shark data collected by citizen scientists may be as reliable as data collected using automated tools, according to results published April 23, 2014, in the open access journal PLOS ONE by Gabriel Vianna from The University of Western Australia and colleagues.
Shark populations are declining globally, and scientists lack the data to estimate the conservation status of many shark species. Citizen science may be a useful and cost-effective means of increasing knowledge of shark populations on coral reefs, but scientists do not yet know enough about how data collected by untrained observers compare with results from traditional research methods.

To better understand the reliability of datasets collected through citizen science initiatives, the researchers compared reef shark sightings counted by experienced dive guides (the citizen scientists) with data collected from tagged reef sharks using an automated tracking tool (acoustic telemetry). Sixty-two dive guides collected data during more than 2,300 dives using standardized research protocols, reporting the dive site, date, species, counts, estimated depth, current, visibility, and number of tourist divers in the group. Both datasets were collected at coral reefs on the Pacific island of Palau over a period of five years.
I have recently developed a report titled “7 All-Time Highly Cited Papers in Top 49 Publications of the World”.
The report presents 7 highly cited papers from each of the world’s top 49 publications, as ranked by Google Scholar according to h-index. It also covers 7 additional journals with impact factors above 28 that do not appear among those top 49 publications, such as “Cancer Journal for Clinicians”, listing 7 highly cited papers from each journal.
Research Finds Discrepancies Between Trial Results Reported on Clinical Trial Registry and in High-Impact Journals
Chicago – Among clinical trials published in high-impact journals during a one-year period that also reported results on a public clinical trial registry (ClinicalTrials.gov), nearly all had at least 1 discrepancy between the 2 sources in the study group, intervention, or results reported, including discrepancies in the studies’ designated primary end points, according to a study in the March 12 issue of JAMA.
The 2007 Food and Drug Administration (FDA) Amendments Act expanded requirements for ClinicalTrials.gov, mandating results reporting within 12 months of trial completion for all FDA-regulated medical products. “To our knowledge, no studies have examined reporting and accuracy of trial results information. Accordingly, we compared trial information and results reported on ClinicalTrials.gov with corresponding peer-reviewed publications,” write Jessica E. Becker, A.B., of the Yale University School of Medicine, New Haven, Conn., and colleagues.
Chicago – An analysis of research on peer review finds that studies aimed at improving methods of peer review and reporting of biomedical research are underrepresented and lack dedicated funding, according to a study in the March 12 issue of JAMA.
Mario Malicki, M.D., M.A., of the University of Split School of Medicine, Split, Croatia, and colleagues analyzed research presented at the International Congress on Peer Review and Biomedical Publication (PRC) since 1989. The first PRC was organized to “subject the editorial review process to some of the rigorous scrutiny that editors and reviewers demand of the scientists whose work they are assessing.” The researchers collected data on authorship, time to publication, declared funding sources, article availability, and citation counts in Web of Science. The analysis included 614 abstracts.
A lack of diversity across the scientific community represents a large loss of potential talent to the UK, according to the chair of the Royal Society’s Equality and Diversity Network (EDAN), Professor Edward Hinds FRS. The comment comes as the Royal Society publishes a report that gives the fullest picture yet of the scientific workforce in relation to diversity.
Approximately 20 per cent of people in the UK workforce need scientific knowledge and training to do their current jobs. A picture of the UK scientific workforce, published today (7 March), sets out to analyse and understand the composition of the scientific workforce in terms of gender, disability, ethnicity, and socio-economic background.
Professor Edward Hinds FRS said:
“With diversity comes a mix of ideas, skills and approaches. If the UK’s scientific workforce is not diverse, we are bound to be missing out on some great talent. At a time when the UK is seeking to use its scientific capabilities to help improve lives and rebuild the economy, it is more important than ever that we ensure the best scientists can flourish.”