What editors, reviewers, researchers and librarians need to know about the PRESS, MECIR, PRISMA and AMSTAR instruments for improving the methodological quality of information searches for articles

The question that people involved in scientific information and publishing keep asking is: "What can we do to further improve the quality of scientific publications?"

The published text is the end product of scientific work, and it deserves to be reported properly and in detail.
Evaluative instruments through which syntheses and synopses of evidence are produced add rigor and methodological quality to published studies at every stage, so that the final product has reliable and reproducible results.
Therefore, in answer to the initial question, we can survey the instruments available to aid in searching for information. Searching for information is an important methodological stage in any scientific investigation, and not just in studies that aim to produce a synthesis of the evidence.
The structured tools that are used in assessments and in producing certain types of study, such as systematic reviews, health technology assessments, scoping reviews, rapid systematic reviews, overviews and integrative reviews, can serve as instruments that guide editors, reviewers, researchers and librarians. One such instrument was specifically created to guide librarians in evaluating and conducting high-sensitivity search strategies.
Four instruments fall into this category:
• MECIR - Methodological Expectations of Cochrane Intervention Reviews;
• PRISMA - Preferred Reporting Items for Systematic Reviews and Meta-Analyses;
• AMSTAR - Assessing the Methodological Quality of Systematic Reviews;
• PRESS - Peer Review of Electronic Search Strategies.1-9
In Table 1, we present these four instruments for conducting section-by-section assessments and analyses, specifically regarding searching for information and developing a search strategy. From this, it can be seen that the PRESS and MECIR instruments provide more detail for conducting searches than PRISMA and AMSTAR do, including detailed guidance for this stage and greater rigor.1-9

MECIR
The librarian of the Cochrane Collaboration, who holds the title of Cochrane Information Specialist (CIS), has the task of designing and implementing search strategies. This involves the entire process of defining the question, identifying the vocabulary that covers this question, transcribing the question into a search strategy, selecting the databases, transcribing the strategy for all the databases selected (mandatory, specialized and recommended databases), testing the performance of the strategy, adjusting it and running it in all the databases selected for the question. The librarian also assists in saving the results and guiding their management through automated systems for selecting records and identifying duplicated studies.1,2 The CIS has to ensure that the research methods are documented in accordance with the MECIR standards, which also serve as a compass for the CIS in conducting the whole process.1,2 Involvement of this specialist adds significantly to improving the reporting of the research methods, and also to evaluating the general quality of the development process and presentation of the review.

Table 1. Instruments used for conducting section-by-section syntheses of evidence and assessing their quality, in order to evaluate search strategies and select databases.1-9

MECIR - Methodological Expectations of Cochrane Intervention Reviews
• C32. Structuring search strategies for bibliographic databases: the search strategy should be structured around the main concepts of the review, using appropriate elements of PICO (problem, intervention, comparison and outcome) and the study design. In structuring the strategy, sensitivity should be maximized while seeking reasonable precision, and correct use of the operators "AND" and "OR" should be ensured - Mandatory
• C33. Developing search strategies for bibliographic databases: appropriate controlled vocabulary should be identified (for example, MeSH or Emtree, including "exploded" terms), along with free-text terms (for example, considering spelling variations, synonyms, acronyms, truncation and proximity) - Mandatory
• C34. Using search filters: specially designed and tested search filters should be used when appropriate, including the highly sensitive Cochrane search strategies for identifying randomized clinical trials in MEDLINE. However, filters should not be used in pre-filtered databases: for example, randomized trial filters should not be used in CENTRAL, and systematic review filters should not be used in DARE - Highly desirable
• C35. Restricting database searches: the use of any restrictions in search strategies regarding publication date or publication format needs to be justified - Mandatory
• C36. Documenting the search process: the search process should be documented in sufficient detail to ensure that it can be reported correctly in the review - Mandatory
• C37. Rerunning searches: the searches in all relevant databases should be rerun within the last 12 months before the review is published or updated, to check for any results from potentially eligible studies - Mandatory
• C38. Incorporating findings from rerun searches: any studies identified through rerunning or updating the search within the last 12 months before the review is published or updated should be incorporated in full - Highly desirable

PRISMA - Preferred Reporting Items for Systematic Reviews and Meta-Analyses - http://www.prisma-statement.org/
This is a checklist of the main recommendations and items to be included when reporting a systematic review. Only the items relating to information searches are presented here:
• Item 7. Information sources: describe all the information sources used in the search (for example, databases with dates of coverage, or contact with authors to identify additional studies) and the date of the last search.
• Item 8. Search: present the complete electronic search strategy for at least one database, including any limits used, so that it can be repeated.
• Detailed description of the flow of information through the different phases of the systematic review (PRISMA flow diagram).

AMSTAR 2 - Assessing the Methodological Quality of Systematic Reviews - https://amstar.ca/Publications.php
This is a critical appraisal tool used to evaluate the quality of systematic reviews of randomized studies and, in this version 2, also of non-randomized healthcare intervention studies.
Question 4. Did the review authors use a comprehensive literature search strategy?
• They searched at least two databases (relevant to the research question);
• They supplied keywords and/or search strategies;
• They justified any publication restrictions (for example, language);
• They investigated the reference lists or bibliographies of the included studies;
• They investigated trial and study registers;
• They included or consulted specialists in the field;
• They investigated the grey literature where relevant;
• They conducted the search within 24 months before concluding the review.

PRESS 2015 - Guidelines and recommendations for librarians' practices8
Here, we highlight the recommendations for librarians, in addition to those in Table 3, which shows the simplified PRESS checklist.
1. Translation of the research question: Ideally, the primary search strategy should be submitted to peer review to ensure conceptual precision. The research question, normally formatted in accordance with some variation of PICO, together with details of how the search was informed by the reference interview, should be sent with the search strategy.
2. Boolean and proximity operators: Assess whether the elements of the research question were combined correctly using Boolean and/or proximity operators. Review the search for errors in Boolean operators: for example, OR may have been accidentally replaced by AND (or vice versa), or AND may have been used to link phrases or words (for example, as a conjunction) instead of as a Boolean operator. Note that where NOT was used, there is a possibility of unintentional exclusions, and another device (for example, a subject heading, check tag or limit) may produce an equivalent result. Check that any nesting within parentheses is logical and has been applied where necessary. Also note whether using a proximity operator (adjacent, near, within) instead of AND might increase precision. If proximity operators have been used, consider whether the width chosen is wide enough to capture all the foreseen instances of the search terms, which may vary depending on whether the database recognizes stop words, and whether it is too wide. If there are restrictions (for example, human populations or elderly populations), check whether an appropriate construction was used.
3. Subject headings (database-specific): Assess whether the selection of subject headings is broad enough for recall to be optimized. Examine the following elements: missing or incorrect headings, relevance or irrelevance of terms, and correct use of explosion to include narrower relevant terms. Consider using floating subheadings: in most cases, this is preferable to subheadings attached to specific subject headings (for example, in MEDLINE, "Neck Pain/ and su.fs." instead of "Neck Pain/su"). Note that subject headings and subheadings are specific to each database.
4. Text word searching (free text): Assess whether concepts that lack adequate subject-heading coverage are well represented by free-text terms, and whether additional synonyms, antonyms (opposites) and related terms are needed. Free-text terms are normally used to cover concepts for which database subject headings are absent. Consider whether the free-text elements might be too narrow or too broad, whether the terms are relevant, and whether synonyms and antonyms have been included.
5. Spelling, syntax and line numbers: Assess the correctness of spelling and syntax and whether the search lines have been implemented correctly. Review the search strategy for misspelled words and system syntax errors, which are not easily found through spellcheckers. Check each line number and each combination of line numbers to ensure that the logic of the search has been implemented correctly.
6. Limits and filters: Assess whether the limits used (including filters) are appropriate and have been correctly applied. Review the search strategy to see whether any limits that are not relevant to the eligible study designs or to the clinical question were applied, since this could introduce bias. Check whether the methodological search filters were applied correctly: for example, ensure that a systematic review of economic evaluations is not restricted to clinical trials.
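The guidance above on structuring strategies around the main PICO concepts, with synonyms combined using "OR" within each concept and the concept blocks combined using "AND", can be illustrated with a short sketch. The function and the example terms below are hypothetical illustrations, not part of any of the instruments:

```python
# Sketch: assembling a PICO-structured Boolean search string.
# Synonyms for one concept are joined with OR; the concept blocks
# are then joined with AND, with parentheses making the nesting explicit.

def build_query(concepts):
    """concepts: list of lists; each inner list holds synonyms for one concept."""
    blocks = []
    for synonyms in concepts:
        # Quote multi-word terms so they are searched as phrases.
        terms = ['"%s"' % t if " " in t else t for t in synonyms]
        blocks.append("(" + " OR ".join(terms) + ")")
    return " AND ".join(blocks)

# Hypothetical example: problem AND intervention AND outcome.
query = build_query([
    ["neck pain", "cervicalgia"],     # P: problem
    ["acupuncture", "dry needling"],  # I: intervention
    ["pain relief", "analgesia"],     # O: outcome
])
print(query)
# ("neck pain" OR cervicalgia) AND (acupuncture OR "dry needling") AND ("pain relief" OR analgesia)
```

The same nested structure would then be adapted to each database's own syntax (field tags, truncation and proximity operators), as MECIR requires when transcribing the strategy across databases.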

Information specialists' involvement in traditional research tasks is always recommendable as a central methodological tenet for producing high-quality systematic reviews. However, these professionals' expertise is increasingly being applied in new ways.

Series (2014)
• How to increase value and reduce waste when research priorities are set.12
• Increasing value and reducing waste in research design, conduct, and analysis.13
• Increasing value and reducing waste in biomedical research regulation and management.14
• Increasing value and reducing waste: addressing inaccessible research.15
• Reducing waste from incomplete or unusable reports of biomedical research.

The PRESS instrument provides descriptions of six elements for use as guidelines for librarians' practices. Moreover, for editors, it can serve as an instrument for general methodological assessment of reviews. It is important that editors and reviewers adopt or establish peer review strategies for evaluating articles submitted for publication that involve input from a specialist librarian.9 Ideally, this entire review of the search process should take place at the start of the research, so as to avoid perpetuating errors, not just at the end of the study but throughout its course. There is no doubt that as soon as peer review practices for search strategies are implemented by editors and everyone involved in publication processes, authors will start to conduct searches with adequate criteria from the outset.

Limits and filters
Questions from the simplified PRESS checklist (Table 3):
• Have all the limits and filters been used appropriately, and are they relevant to the research question?
• Have all the limits and filters been used appropriately, and are they relevant to the database?
• Are any potentially useful limits or filters missing?
• Are the limits or filters too broad or too narrow?
• Could any limits or filters be added or removed?
• Have the sources of the filters used been cited?

PRESS is thus an instrument of quality and rigor for use in all research. It is also important to note that PRESS will shortly be available in Portuguese.
There is a clear need to improve the adequacy of search strategies for systematic reviews and for reviews in general. The presence of a search specialist with experience in developing strategies throughout the research process has become essential for ensuring the transparency and reproducibility of research methods, thus benefiting the quality of the reviews produced.
It is important that the reviewer using the search strategy and the information specialist who designed it should be supported by a national forum for search specialists and should have access to teams that can review their strategies. Furthermore, they should also use the PRESS checklist, which summarizes the main potential errors made in search strategies.9 All efforts towards improving the quality of research and reviews are valid.
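Some of the mechanical errors that the PRESS checklist asks reviewers to look for, such as unbalanced parentheses and Boolean operators typed in lower case (which many interfaces treat as ordinary text), can also be caught with simple automated checks. The sketch below is a hypothetical illustration of that idea, not part of PRESS itself:

```python
# Sketch: simple mechanical checks on one line of a search strategy,
# in the spirit of PRESS elements 2 and 5 (operators; spelling and syntax).

def lint_search_line(line):
    problems = []
    # Unbalanced parentheses break the intended nesting of the logic.
    if line.count("(") != line.count(")"):
        problems.append("unbalanced parentheses")
    # Flag lower-case 'and'/'or'/'not' outside quoted phrases, since
    # many interfaces only recognize upper-case Boolean operators.
    outside = " ".join(part for i, part in enumerate(line.split('"')) if i % 2 == 0)
    for word in outside.split():
        if word in ("and", "or", "not"):
            problems.append("lower-case operator: " + word)
    return problems

print(lint_search_line('("neck pain" OR cervicalgia) and acupuncture'))
# ['lower-case operator: and']
```

Checks like these complement, but do not replace, peer review by an information specialist: the conceptual questions in the checklist (term coverage, recall, relevance of limits) still require human judgment.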
With the material that is made available, along with these tools and instruments, the next step is to work out a route through which editors can better assess the search strategies that are submitted for publication.