OT Digest

Evidence-Based Practice vs. Practice-Based Evidence

Episode Summary

This episode is a livestream recorded as part of the OT Graphically Library membership. In this episode, I discuss the difference between two terms: evidence-based practice and practice-based evidence. I walk through an example of what practice-based evidence can look like, and share resources on specific assessments that can help track and measure specific interventions.

Episode Notes

Email me at katie@otgraphically.com with any questions. 

Learn more about the OT Graphically Library Membership Here.

References:

Eppley, K., Azano, A. P., Brenner, D. G., & Shannon, P. (2018). What counts as evidence in rural schools? Evidence-based practice and practice-based evidence for diverse settings. The Rural Educator, 39(2).

Green, L. W., & Allegrante, J. P. (2020). Practice-based evidence and the need for more diverse methods and sources in epidemiology, public health and health promotion. American Journal of Health Promotion, 34(8), 946-948.

https://www.psychologytoday.com/us/blog/the-digital-doctor/202009/evidence-based-practice-practice-based-evidence

Swisher, A. K. (2010). Practice-based evidence. Cardiopulmonary Physical Therapy Journal, 21(2), 4.

Episode Transcription

Hi everybody. I hope you're doing well. Happy Tuesday, wherever you are in the world. I am really excited to talk to you today about practice-based evidence versus evidence-based practice, which is what we talk about a lot here: how can we take evidence and put it into practice? So this is kind of the reverse of that: how do you, in your practice, make sure that you are tracking data in an evidence-based way?

I just wanted to give some definitions really quick, because they are very similar terms and that can be really confusing. Evidence-based practice is practice supported by scientific, supposedly true knowledge, usually generated as an outcome of randomized controlled trials. Practice-based evidence is the opposite of that, where you take into consideration the context of your practice, put everything together, and track what is going on.

You know: I'm going to try this single intervention with this patient or client, then use tracking measures to see if it worked, and if not, change it up. That is, in general, the difference. The way I think of it in my brain is starting at the paper versus starting in the practice.

 

The reason this has become an issue more recently is that there's a really heavy focus on evidence-based practice, and I would argue that maybe I am part of that too. That's something I'm learning, so this is me learning alongside you as well. Basically, because there's such a strong emphasis on it, it has become this thing, especially in schools, where it guides your policy, and with insurance companies: if you don't do evidence-based practice, it doesn't count. So it's become really intense. It's a great thing, but it can also be very limiting and not capture the whole story.

 

It kind of ignores the context. The benefit of evidence-based practice is that you can isolate something and say, when I did this, this changed that, and be pretty confident that that was true. That's why so many scientists advocate for it, and are not as pumped about this new trend of practice-based evidence. You really know what's causing the change; you know that what you're doing is making this impact and this outcome. One of the negatives is that it doesn't take into account the local situation, or the client's personal situation or environment. It doesn't, how I would describe it, include the messy parts of being a human.

 

There are some people who argue, then let's just throw out evidence-based practice, because that's not working, but I would argue there's a way to lean into it. It's something called the key ingredients. A lot of times you try an intervention from one of our articles, for example; you do the steps, and you're like, man, I just can't get this to work in my setting. The next step, instead of turning away from it, I would argue, is to lean in: reach out to the author and ask them, what are the key ingredients? What are the things that absolutely have to be done for this to be considered the CO-OP approach, for this to be considered Ayres Sensory Integration, for this to be considered graded motor imagery? And what can I adapt? Because they know what you would be able to do to adapt it, but that's tricky. So the idea is: we have that, but then let's also capture this practice-based evidence.

 

So let's gather the data from all clinics across the world. It's very client-centered: we can see what clients need, and then we're getting the information on what works in the clinic. And it's a lot easier; it's not like you're putting a square peg in a round hole. It's a lot easier to implement into your practice, and ideally it reduces how long it takes to put evidence into practice, which right now is actually 14 years. It used to be 17 years when I first started this, so that was kind of cool to learn recently. It's coming down, but 14 years is still a long time.

 

The downside to doing practice-based evidence only is that you might not get the same results you would see in a study. You can't say, I found this study, I'm going to do it the same way they did, and based on their results I can expect mine to be similar. You don't necessarily know that. And also, you don't know if what you're doing is actually causing the change you're seeing. You can assume, and you can track it, but there are a lot of other variables at play. You know, people are getting all sorts of other interventions at the same time, so that can be tricky.

 

When you remove that scientific context, you're hoping that it's what you're doing, but you can't say for sure. That is one of the few cons of this. But basically, the way I've been trying to do it, and this is something I've been practicing in my own practice and really focusing on, because I think it's where I get frustrated when trying to implement something, is to first focus on the client: what are their needs, what is important to them, and what do they value? Focus on that first; everything else doesn't matter. Okay, we want to work on putting our socks on. I was hoping I wouldn't pick that one, but I guess I did.

So we want to work on putting our socks on. Okay, that's really important because it helps us get out the door quicker. We need to get places quicker, we're busy, and it's important to us to be on time. If being on time isn't important to us, we don't necessarily need to get our socks on quicker, but for this family's situation, it is.

 

And then you go and select those interventions from the evidence that you know about. I know about video modeling; I know about adapting the environment, maybe having some kind of adaptive equipment, depending on whether this is adults versus pediatrics; or using the CO-OP approach.

I feel like that's my favorite one; if you haven't heard, I say it all the time, but I feel like that intervention solves a lot of our OT problems. And then you plan for how you're going to implement it. I think this is the part we don't stop and think about: how are we going to implement this?

 

And how are we going to track it? We're good at this; we do this so well in our heads, but putting it on paper is the next step, right? So I want this child to put on their socks. What are the five steps before that? And what is my measure of success? Is there an assessment that tracks improved success? There are: there are things like the PEDI, where you could say the caregiver is giving less and less support, and you could track that each week.

 

You could do something like the goal attainment scale. I would say the PEDI is probably not great for that, because I think you can't do it every week, but the goal attainment scale is a good example where you can track: okay, this week we were at a zero, and next week we're at a one, so you can easily see that they moved up a level. So then, okay: I'm going to use the goal attainment scale, and I'm also going to use the PEDI right at the beginning of our intervention and at the end of our three months, or however long you're seeing the child. You have that plan ahead of time; that's how I'm going to track my data. A lot of times I feel like I'm doing that mid-intervention, and that's where it gets tricky, because by the time I'm thinking about it, it's kind of a day late and a dollar short. But those are the two assessments you'll use to help you track what you're doing. And then you implement the intervention, which is, you know, one of the ones I've said before. Let's say you use video modeling.

 

I don't know that one as well; I need to look into how evidence-based it is too, because that has swung different ways more recently. But if you do this right, you want to see: the evidence says one thing, but let's see if it works in practice. You do the PEDI with the family at the beginning. You do the goal attainment scale every single week, saying: okay, we did a video model, we practiced our socks. How did it go? What do we rank? I usually just put it literally in an Excel sheet or on paper: I got a negative one this week, and it's all on the same page, so you can see it all in one place.

 

I didn't want to give a really specific sheet for this, like a template, which I can make, but honestly, it's just a table that has enough spaces. You have your date; your intervention, if you want, which is going to be the same; maybe any other notes you need; and then just what they scored on the goal attainment scale, so you're not doing much extra documentation. And then you look over time, maybe at six weeks: okay, where are we at? Are we still at zero, or negative one, and we haven't really made that jump toward our goal?
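As a rough sketch of what that one-page tracking sheet could look like in practice, here is a minimal example. The column layout follows what's described above (date, intervention, goal attainment scale score, notes); the specific dates, scores, and the `gas_trend` helper are my own illustration, not from the episode.

```python
# Minimal weekly tracking sheet: date, intervention, GAS score, notes.
# Goal attainment scale scores conventionally run -2..+2, with 0 being
# the expected outcome level.
rows = [
    ("2024-01-08", "video modeling", -1, "needed full verbal cues"),
    ("2024-01-15", "video modeling", -1, ""),
    ("2024-01-22", "video modeling",  0, "put socks on with setup only"),
]

def gas_trend(rows):
    """Change in GAS score from the first week to the latest week."""
    scores = [score for _, _, score, _ in rows]
    return scores[-1] - scores[0]

# Print the one-page view, then check progress at review time.
for date, intervention, score, notes in rows:
    print(f"{date}  {intervention:<15} GAS {score:+d}  {notes}")

if gas_trend(rows) <= 0:
    print("No movement up the scale yet; consider adjusting the intervention.")
```

The point is just to keep everything on one page so the week-to-week trend is visible at a glance; a paper table with the same columns works exactly the same way.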

 

Okay, maybe this intervention is not the one that I need; this is not working, let's adjust it. But that information is so important, and this is something I don't necessarily have the answer to, but giving it back to the author of an article would be so helpful: to say, hey, I did this in the clinic, I did it like you said in the article as much as I could, I used this intervention, and these are the results I got. I think that's very powerful and very valuable.

 

So, one quote I like to think about is, I believe, from Albert Einstein: "Not everything that can be counted counts, and not everything that counts can be counted."

 

This is kind of the big idea behind this: if it's not important to the family and you're counting it, it doesn't matter, because it's not important to them. Also, if you're counting it for your own reasons but the insurance company doesn't care, that also doesn't count, because you won't get paid for what you're doing. Obviously that's frustrating, absolutely frustrating. I just think it's a really important thing to keep in mind that really focusing on what the client wants and needs first can help shrink all of that down. And I haven't been paying attention, but I think there's a chat here.

 

So if anybody wants to share anything: this is my first time doing a livestream this way, and it's been upgraded a little since the last time I did it. But yeah, I think the key is planning ahead, right when you start: these are the assessments I'm going to track with each week. I really like the goal attainment scale; it's pretty easy. Sometimes the COPM is actually really great for this too. That one is, I feel, harder to do every week, but anything you can track to get a number over time is really helpful. I know there's a great app for the goal attainment scale, so you could put it into an app, but honestly, sometimes just charting on one piece of paper and keeping it simple is best.

 

So I hope that was helpful. I'm going to keep learning more about this as part of our library, and this will be part of our course as well, so I'm going to add it to our course. I think it's important to know that this is kind of a newer topic, and people are realizing: look, if we do this more, we can use this information to keep implementing the things that are working and stop implementing the things that aren't. At the end of the day, that's what's important.

 

Well, I hope you're all doing great. It was so great to connect, and if you're watching later, definitely share your thoughts, other ideas, or other resources. If you would like me to make that chart, I can, because I know sometimes just having somebody else do it is helpful. And then maybe I could share a list of other things besides the goal attainment scale that could be used for that weekly tracking.

 

So thanks so much, and we'll be in touch soon. Bye, everyone.