
Evidence-Based Practice – Steps, Levels

2. Assembling and evaluating the evidence

Once a clinical practice question has been selected, the next step is to search for and assemble research evidence on the topic. In doing a literature review as the background for a new study, the central goal is to discover where the gaps are and how best to advance knowledge. Cochrane reviews are an important resource; they have been found to be more rigorous than reviews published in journals. Another critical resource for integrative reviews is the Agency for Healthcare Research and Quality (AHRQ).

Research-based evidence sources include the CINAHL and MEDLINE databases, the Cochrane Library (www.cochrane.org), the American College of Physicians PIER, the National Guideline Clearinghouse (www.guideline.gov), Turning Research Into Practice (www.tripdatabase.com), professional association guidelines/standards of care, and expert opinion/clinical expertise (clinical articles, web searches).

3. Critically Appraising the Article

In determining the implementation potential of an innovation in a particular setting, several issues should be considered, particularly the transferability of the innovation, the feasibility of implementing it, and its cost-benefit ratio.

If the implementation assessment suggests that there might be problems in testing the innovation in that particular practice setting, the team can either identify a new problem and begin the process anew or consider adapting the plan to improve the implementation potential (e.g., seeking external resources if cost were the inhibiting factor).

4. Integrating the evidence with one's clinical expertise

If the implementation criteria are met, the team can design and pilot the innovation. Based on the Iowa model, the following activities can be involved:

  • Developing an evaluation plan (identifying outcomes to be achieved, determining how many clients to involve in the pilot, deciding when and how often to take measurements)
  • Collecting baseline data relating to those outcomes to develop a counterfactual against which outcomes would be assessed.
  • Developing a written EBP guideline based on the synthesis of the evidence, preferably a guideline that is clear and user-friendly and that uses such devices as flow charts and decision trees
  • Training the relevant staff in the use of the new guideline and, if necessary, “marketing” the innovation to users so that it is given a fair test
  • Trying the guideline out on one or more units or with a sample of clients.

5. Evaluating the change

The last step in EBP is evaluation of the pilot project in terms of both process (e.g., how was the innovation received, to what extent were the guidelines actually followed, and what implementation problems were encountered?) and outcomes (client outcomes and cost-effectiveness).

A variety of research designs can be used in the evaluation, of course, with the most rigorous being an experimental design. In most cases, however, a less formal evaluation will be more practical: comparing outcomes data or hospital records collected before and after the innovation and gathering information about patient and staff satisfaction. Qualitative and mixed-methods research designs can also contribute to evaluating an innovation, yielding valuable information on feasibility and participant burden.

Evaluation data should be gathered over a sufficiently long period to allow for a true test of a “mature innovation”. The end result of this process is a decision about whether to adopt the innovation, to modify it for ongoing use, or to revert to prior practices.

Levels of Evidence Based Practice