
Leading Change in Continuing Education
Listen. Learn. Earn Continuing Education Units.
Get this course and more with an SLP Nerdcast Membership
MEMBERSHIP INCLUDES
- Unlimited access to 100+ courses for ASHA CEUs: All SLP Nerdcast Memberships get you unlimited access to courses for ASHA CEUs that go in your ASHA Registry and can count towards an ACE Award
- Access to conferences, live events and exclusive content: All SLP Nerdcast Memberships get access to live events and exclusive content, including two annual conferences, SLP Linked and LEAHP.
- Unlimited Access to our Resource Library: Upgrade to our All Access Membership and get unlimited access to our Resource Library that includes therapy materials, course handouts, and resources you need to save time.
"Thank you for making this excellent, research-based learning opportunity that is both extremely accessible and affordable. This is the best kind of PD: it’s one hour at a time so I can learn and then have time to synthesize and apply. It provides information I can apply to my practice immediately; and I can listen and learn while I drive, fold laundry, etc. thanks for the research and resources!"
-Johanna H.

Choose the Membership that's Right for You
Meet your Instructors

Speaker Disclosures
References & Resources
Korsten, J. (2002). Meaningful data: Making sense of + and -. Perspectives on Augmentative and Alternative Communication, 11(3), 10-13
Olswang, L. B. & Bain, B. (1994). Monitoring children’s treatment progress. American Journal of Speech-Language Pathology, 3(3), 55-66
Walz, J. (2013). Technology to support data collection and management in the public schools. Perspectives on School-Based Issues, 14(1), 10-14
Course Details
Course Disclosure
- Financial and In-Kind support was not provided for this course. Learn more about corporate sponsorship opportunities at www.slpnerdcast.com/corporate-sponsorship
Disclaimer
- The contents of this course are not meant to replace clinical advice. SLP Nerdcast hosts and guests do not endorse specific products or procedures unless otherwise specified.
Additional Information
- All certificates of attendance and course completion dates are processed using Coordinated Universal Time (UTC). UTC is 5 hours ahead of Eastern Standard Time (EST) and 8 hours ahead of Pacific Time (PT). If you are using SLP Nerdcast courses to meet a deadline (such as the ASHA Certification Maintenance deadline) please be aware of this time difference. Your certificates and course completion dates will reflect UTC not your personal time zone.
- Closed captioning and transcripts are available for all courses. If you need additional course accommodations please email [email protected]
- Refunds are not offered for digital products, downloads, or services
- Certificates of attendance are only awarded to participants who complete course requirements
- Please email [email protected] for course complaints
Episode Summary provided by Tanna Neufeld, MS, CCC-SLP, Contributing Editor
Audio File Editing provided by Caitlin Akier, MA, CCC-SLP/L, Contributing Editor
Promotional Contribution provided by Paige Biglin, MS, CCC-SLP, Contributing Editor
Web Editing provided by Sinead Rogazzo, MS, CCC-SLP, Contributing Editor
Transcript
[00:00:00]
Intro
Kate Grandbois: [00:00:00] Welcome to SLP Nerdcast. I'm Kate. And I'm Amy. And we appreciate you tuning in. In our podcast, we will review and provide commentary on resources, literature, and discuss issues related to the field of
speech language pathology. You can use this podcast for ASHA Professional Development. For more information about us and certification maintenance hours, go to our website, www.slpnerdcast.com.
SLP Nerdcast is brought to you in part by listeners like you. You can support our work by going to our website or social media pages and contributing. You can also find permanent products, notes and other handouts, including a handout for this episode. Some items are free, others are not, but everything is always affordable.
Visit our website www.slpnerdcast.com to submit a call for [00:01:00] papers to come on the show and present with us. Contact us anytime on Facebook, Instagram, or at [email protected]. We love hearing from our listeners and we can't wait to learn what you have to teach us.
Amy Wonkka: Just a quick disclaimer, the contents of this episode are not meant to replace clinical advice.
SLP Nerdcast, its hosts, and its guests do not represent or endorse specific products or procedures mentioned during our episodes, unless otherwise stated. We are not PhDs, but we do research our material. We do our best to provide a thorough review and fair representation of each topic that we tackle.
That being said, it is always likely that there is an article we've missed or another perspective that isn't shared. If you have something to add to the conversation, please email us. We would love to hear from you.
Kate Grandbois: Before we get started, today's episode's financial and non-financial disclosures: um, I am the owner and founder of Grandbois Therapy and Consulting, LLC and co-founder of SLP Nerdcast.
Amy Wonkka is an employee of a public school system and co-founder of [00:02:00] SLP Nerdcast. Uh, we are both members of ASHA SIG 12 and both serve on the AAC Advisory Group for Massachusetts Advocates for Children. I am a member of the Berkshire Association for Behavior Analysis and Therapy, Mass ABA, the Association for Behavior Analysis International, and the corresponding Speech Language Pathology and Applied Behavior Analysis Special Interest Group.
Amy Wonkka: All
Kate Grandbois: right. What are we talking about?
Amy Wonkka: Today we're talking about the riveting topic of data collection. This is gonna be so
Kate Grandbois: super fun. It's
Amy Wonkka: gonna be good. Who
Kate Grandbois: doesn't love lovers? I love this.
Amy Wonkka: I actually love
Kate Grandbois: this. Before we started recording, we ended up having a whole discussion about the isosceles triangle.
Mm-hmm. Which is not the kind of data that we take as speech pathologists. No, but did we find out that Isosceles was a man? No, but we found
Amy Wonkka: out that there was a theorem.
Kate Grandbois: I'll, I'll look into it more. Okay. In case you're curious about this, what would you call that? A, a teaser? Teaser for more information? Geo... about geometry.
Geometry podcast. Never coming. Okay, so, so data collection in speech and language pathology. So [00:03:00] as SLPs, we don't get a ton of training in data collection. Um, Amy and I both went to BCBA school and we learned too much about data collection through that process. You can
Amy Wonkka: never learn too much about data collection.
You
Kate Grandbois: can never learn too much about data collection. But I think that is one way that, um, that education really contributed to our practices as speech and language pathologists, for sure. I think you can totally agree with that. Um, so it prompted us to wanna dig a little bit deeper into what it means to be an SLP, um, who takes data and what kind of data you should take and why you should take data and all of that stuff.
Um, after all, we are a hard science and we rely on evidence-based practice, which implies that all of our clinical work should be based on data at the end of the day, I think. So that's probably a fair assumption. I think so. Um, so to give a little structure to our, uh, episode today, first we're gonna talk a little bit about background information.[00:04:00]
Why do we need to take data? Kate? Why do we need to take data? What components should
Amy Wonkka: we consider when we take the data? Good question. What are some hurdles to collecting data and some strategies to overcome those hurdles?
Kate Grandbois: There's a lot of hurdles out there. There are, number one being that it's boring and that you don't like doing it.
I like doing it. Definitely disagree. You like doing it. Um, but what kind of data should you collect and why? There's a lot of different kinds of data collection out there.
Amy Wonkka: Back question.
Kate Grandbois: Valid question. So first background information. Why do we need to take data and what components should we consider when taking it?
So first and foremost, we need to take data because it's ethical. How is it ethical? So interestingly, and I was a little surprised to see this, I reread our code of ethics, the ASHA Code of Ethics. Mm-hmm. And the word data is not in the code of ethics. Are there synonyms for the word data? Measurement is not in there either.
Hmm. Evidence is in there. Okay. But I was a little surprised by that. How do you demonstrate your evidence [00:05:00] without data though? You ask your friend Amy? I dunno, that's a really good question. Everybody who works with me is like, oh yeah, she always says that she's just gonna go and ask Amy. Um, but interestingly, the word data is not in our code of ethics.
However, there are several components of our ethical code that are related to taking data. So principle number one of our code of ethics, Principle One A, I don't know if they number them like that, but Principle One A states: Individuals shall provide all clinical services and scientific activities competently.
Hmm. Now, I don't know how you could be a clinician or therapist of any kind and provide competent clinical services without taking data on the outcomes of your treatment.
Amy Wonkka: I agree. I feel like any place I've ever worked requires goals, objectives, and if you're dealing with third party funding, you're reporting on those at regular intervals.
Kate Grandbois: Plus we all were taught very, you know, [00:06:00] concretely that the goals and objectives you write have to be measurable. Mm-hmm. And how do you measure something mm-hmm. Without data. Right. And
Amy Wonkka: who doesn't have the components of a soap note, like embedded deep, deep in their brain? I never really had to use a soap note.
You didn't have to do, oh, we wrote so many soap notes. I didn't write
Kate Grandbois: a lot of soap notes. Really? No, I didn't. All right. Um, there you go. I guess that's S-O: subjective, objective. What's the A? S-O-A-P, isn't that how you spell soap? Assessment,
Amy Wonkka: progress Assessment. I, I dunno. I guess
Kate Grandbois: it's not that. Can somebody please email us or maybe we just need to do a little bit of Googling about that.
But anyway, so it is definitely related to our ethical code, principle number one, for providing competent clinical services. Um, the second component of ethics, Principle One M: individuals who hold the Certificate of Clinical Competence shall use independent and evidence-based clinical judgment, keeping paramount the best interest [00:07:00] of those being served.
And again, I don't think that you can comply with this component of the ethical code without taking data and looking at your data, analyzing your data, having your data drive your clinical decision making, altering your treatments based on the data that you take. I don't think you can do any of those components without, I mean, I don't think you can do any evidence-based work or make good clinical judgments without those components, which again, rely on data.
Amy Wonkka: Also, they connect with the SOAP note. Okay. Which is subjective... Did you go down a Google hole? I did. I always do. This is, this is how I roll. So: subjective, objective, assessment and plan
Kate Grandbois: plan. Most people who are listening probably already knew that They
Amy Wonkka: did. They did. But now we know. But I feel like those are pieces that you are incorporating in every session.
Kate Grandbois: Yeah, no, that's very true. Very true. You're collecting
Amy Wonkka: subjective data. Objective data, assessing the efficacy and planning for the future. Yeah. There you go. There you go.
Kate Grandbois: So I wanted to talk a little bit [00:08:00] about the word evidence. So the word evidence does show up in our ethical code. Um, and I think evidence-based practice can mean two very different things.
So the first piece is that there is research or literature showing that a treatment with a specific population is effective, there is low risk, um, in terms of harm, and you're applying that treatment to your client because you are aware of those risks, harms, and benefits. Yes. Uh, and as we had discussed in a previous episode, there are different tiers of evidence.
Amy Wonkka: Mm-hmm.
Kate Grandbois: So there's a high and stringent tier of what we would consider evidence-based practice. Do you wanna talk a little bit about that, since I know that you can speak to it much better than I can? So, when
Amy Wonkka: we think about evidence-based practice, you know, in medicine the gold standard is the double-blind placebo controlled trial.
And that happens infrequently, rarely, never in the field of speech language pathology, right? So when we think about less stringent measures, that would be studies such as a single [00:09:00] subject design with reversal, right? So it's still an experimentally controlled condition, an A-B-A-B design; that would be more stringent than, say, a case study: I wrote about what I did.
Kate Grandbois: Right?
Amy Wonkka: So there are all of these different levels of evidence, and all of those things can be helpful when reviewed, and can drive your practice as a clinician. They can also be integrated with your clinical expertise,
Kate Grandbois: right? So we're just looping back to the word evidence and, since that's in our ethical code, what that means.
We have this research and literature component, but there's also the evidence that your treatment is working.
Amy Wonkka: Mm-hmm.
Kate Grandbois: So it's that, you know, our field has been, for however many years, very reliant on these group design studies. Very reliant on a lot of different research structures in the literature, but there is this
idea, I guess not a change, but an idea, that you can consider your treatment a single subject [00:10:00] design. So you have your client, you take your baseline measurement, you apply a treatment, and then you see the outcomes of that treatment based on your data.
So any way you slice it, you can't use evidence-based practice without your data. So summary, reason number one to take data is because it's ethical. Um, we did a little bit of, I guess, research diving. Mm-hmm. In preparing for this episode,
Amy Wonkka: Kate found an awesome, awesome article.
There
Kate Grandbois: is a great article by, I'm gonna mess up her last name or his last name, Olswang, Olswang and Bain. So this is an article, O-L-S-W-A-N-G. It's an article by Olswang and Bain. That is a really, really, really great summary. Um, of. Of data collection. I, I think if we could say it's our seminal article. Is that the right, is that the right word to use?
Amy Wonkka: I mean, I don't think we get to pick [00:11:00] what's a seminal article in the literature for this. For this episode? Yes, for this episode. For this episode. All swing. Thank you. If you, if you
Kate Grandbois: read every episode, we're gonna try and give you the, the quick and dirty to cut the corners. If you read nothing, if you have any interest in reading anything, but you only have time to read one thing, this is the one, this is the one that you need to read.
Amy Wonkka: Mm-hmm.
Kate Grandbois: Um, the name of the article is Monitoring Children's Treatment Progress, and it's published in the American Journal of Speech-Language Pathology. It reviews a lot of what we're gonna talk about today. It's extremely thorough. Um, and it's definitely worth your time. If nothing else, I would recommend just having a copy of it because it has an appendix mm-hmm.
That has a great list of different kinds of data collection, why you would use those different kinds of data collection, and definitions for each. Um, so there will be a link to this article on our website and information about it in the handout for this episode. And if you're an
Amy Wonkka: ASHA member, this
Kate Grandbois: is like free.
Yeah. This
Amy Wonkka: is
Kate Grandbois: a free download. Yep. Awesome. Yep. It's a really great article. Um, [00:12:00] so one of the biggest takeaways for me after reading this article was that a reason why we take data is accountability. So not just for justifying why certain treatment decisions were made, um, but making it an integral part of making future decisions.
So data connects
Amy Wonkka: back to the soap note.
Kate Grandbois: I was gonna say, it takes us back to the collaboration piece of our, you know, we did an episode on collaboration a while ago, and you know, if you're having a disagreement with a parent or another professional that you're working with, it is really hard to argue with a number.
Amy Wonkka: Mm-hmm
Kate Grandbois: If they don't think that your treatment is effective, or if they don't think that you, you know, have the best interest of the client in mind, using your data for accountability of why you made the decisions that you made is a really nice jumping off point for a collaborative and effective conversation.
Amy Wonkka: For
Kate Grandbois: sure.
Amy Wonkka: And thinking about stakeholders, all of us who are working as speech [00:13:00] pathologists are working with and incorporating the needs of all of these different stakeholders. So, you know, thinking back to when we were outpatient, being outpatient, you are often working with third party funding agencies.
And so you do need to justify, that's, that's also a piece of the ethical code, I can't quote which one, but you are providing services that are warranted. Mm-hmm. And effective. So when you are providing these services, you do need to justify, like, this is, number one, worth the client's time. Mm-hmm. You're actually making a difference.
Mm-hmm. And worth whoever the funding source is, their payment mm-hmm. for the service. Mm-hmm. It's a, it's a valid, valid service. I have
Kate Grandbois: big feelings about that too, because I've, I have encountered many situations over the years where a clinician keeps an individual in services when the services are not effective.
And, and I think when you don't have data to justify it, or you don't have that level of accountability for why you're providing the services, then you're wasting that [00:14:00] family or that client's time and money. Mm-hmm. And I think that is a violation of, of ethical code, if you either don't have the data or the documentation to back up what you're doing
Amy Wonkka: mm-hmm.
Kate Grandbois: Or you, you know, you're taking bad data and the data doesn't reflect what your goals and treatment outcomes are, and you're just keeping the client on because the family, you know, the family thinks that more speech is better. Mm-hmm. Or the family thinks that, um, they should be able to do these things, even if it's not, you know, a reasonable and attainable goal, and you keep them on and you don't have the data to support it.
I think that that's a, that's a big no-no, it makes me really mad.
Amy Wonkka: Well, nobody has infinite time. Right? Right. So when you are providing a service, whether that's something the family is driving to, or you know, something that's occurring in a school day or something like that, it's coming from somewhere.
The time that you buy for speech and language therapy mm-hmm. Is paid for by the omission of something else. And whether that is [00:15:00] chilling at home in front of your tv, in your sweatpants. Mm-hmm. Or, I love sweatpants. I love sweatpants too, so, so much. Um, you know, whether it's that time or it's, you know, recess or, you know, it's, it's coming from somewhere.
So it's very important that it be effective. Mm-hmm.
Kate Grandbois: Yep. Absolutely. Um, so why do we take data? Um, you know, this article did this really nice job of describing the importance of accountability. Um, we need to take data because it's ethical. In summary, we need to take data because it is an evidence-based practice.
We need to take data for accountability. Um, but what components do we really need to consider when we take the data? So I don't think that, I think it's pretty commonly known that not all data is good data. So there is this concept of good data versus bad data, and I wanted to try and tease that apart just [00:16:00] quickly.
So let's take this scenario for a second. Let's say that you're taking data on the number of times that your client produces an accurate articulation sound. So let's say an accurate /s/ sound, um, and you're taking data on the production of that sound in the medial word position.
You're in the classroom, it's super noisy, your client is really distracted. You're doing the best you can, and you realize after the session that you missed a bunch of his productions, or it was so loud that you were, you weren't a hundred percent sure that it was the right production. The question then becomes, is that data going to effectively guide your treatment and your clinical decision making?
Was that accurate data? Was that data measuring what it was supposed to measure? And is it data that is worth moving forward with?
Amy Wonkka: Well, I mean, I feel like the answer is no.
Kate Grandbois: Well, I painted the scenario, you did, just to, just to say that not just because you're making tally marks on your note [00:17:00] sheet, or just because you're counting in your head because your hands are full carrying a million different materials,
doesn't necessarily mean that it's good data or bad data. There are a lot of different variables to consider when trying to classify whether or not it's data that you should move forward with. Mm-hmm. And whether or not it's something you should continue. Um, overall good data, quote unquote good data, should have three components.
And again, referring to the Olswang and Bain article, they really do a nice job of breaking these three components down.
Amy Wonkka: What are the three components, Kate?
Kate Grandbois: Three components. So first, data should be accurate. Accurate? What? Yeah. Data should be accurate. Go figure. Shocker. And it's not always, because it is hard, it is logistically hard to take data sometimes, depending on the setting that you're in.
So accurate. We ask ourselves, what is accurate data? Accurate data is when the observed values match the true values. So I'm gonna use the example of a scale, uh, a scale in your kitchen [00:18:00] or a, a scale that you would use to step on and show you your weight. Mm-hmm. But let's say, for the sake of this discussion, it's a kitchen scale so we don't have to talk about our weight.
It's a kitchen scale in the kitchen, and you want to find out how much this bag of potatoes weighs. Happens all
Amy Wonkka: the time.
Kate Grandbois: I always wanna know how much my potatoes weigh. Me too, always. So if you put the potatoes on a scale and it says that they weigh one and a half pounds, that measurement is accurate
if the potatoes actually weigh one and a half pounds. So the value measured by the scale is the true value. That is accuracy.
Amy Wonkka: So one thing this makes me think of is articulation. Okay? So we talk a lot about articulation with respect to scope of practice and collaboration. And I think that this is one piece that comes up frequently, right?
So as a speech language pathologist, I had to go to school and take phonetics and phonology and spend all this time transcribing
Kate Grandbois: IPA secret magic [00:19:00] language. Yeah, secret
Amy Wonkka: magic language. But if you don't know the secret magic language and you're collecting data, perhaps on that medial /s/ production, perhaps somebody who doesn't know the magic IPA would not be collecting accurate data.
Kate Grandbois: Okay. Interesting. With that,
Amy Wonkka: right?
Kate Grandbois: I think that makes sense. Yes. Yeah. So number one, data has to be accurate. If it's gonna be good data, it needs to be accurate data. Number two, data needs to be reliable. So reliable data is when the same values are determined over repeated measurements. So let's use the same analogy: you put the potatoes on the scale
'cause you're really dying to know how much these potatoes weigh every day. This is a ridiculous analogy, but we're gonna go with it. And the scale says that the potatoes weigh one and a half pounds. And you come back tomorrow, and you are just so curious about the potatoes,
Amy Wonkka: so curious, and their
Kate Grandbois: change over time, that you put them back on the scale.
If the scale now says that the potatoes weigh two pounds, where yesterday it was one and a half pounds, your scale is not [00:20:00] reliable. Mm-hmm. So reliability of measurement is when the same values are determined over repeated measurements. And I think that this happens with a lot of different skills that we measure, in terms of observer drift.
So observer drift is when the observer has moved away from the definition of what it is that they're measuring. So let's say, can you think of an example? Intelligibility. Okay. Intelligibility example.
Amy Wonkka: So, so intelligibility is a huge example of observer drift because as you become a more and more familiar listener, yes, you better understand the context of what the individual is saying.
Mm-hmm. And you're more familiar with their idiosyncratic production of speech. So as a listener, often what happens is you'll hear people say, oh, I really understand him now. I had a lot of concerns at the beginning of the school year, didn't understand much. And you used your secret magic
Kate Grandbois: language of IPA and you transcribed exactly what they said.
Mm-hmm. Because you didn't understand anything.
Amy Wonkka: But halfway through the year, speech is way better. But if I don't use my magic language of [00:21:00] IPA to actually collect the data and get, get accurate data, perhaps my observer drift leads me to believe that the intelligibility has improved, when in fact what has improved is my ability as a listener.
Kate Grandbois: Right. So if you as a listener have changed your operational definition mm-hmm. of what it is that you're measuring, isn't that right? Then you are no longer a reliable measurement. Your data is no longer a reliable measurement.
Amy Wonkka: I mean, probably if you're, if you've drifted, you're not adhering to your initial operational definition, slash didn't have one to begin with.
Kate Grandbois: Right. Okay. So number two, data has to be reliable.
Amy Wonkka: Mm-hmm.
Kate Grandbois: Number three, data has to be valid. So validity is something that I think confuses me a little bit. Mm-hmm. And something that I had to keep coming back to. Um, valid data is how well something measures what it's supposed to measure. So, continuing with the potato analogy is a little challenging, but we're gonna go with it.
We're gonna go with it. So let's say you put the potatoes on the scale because you wanted to know [00:22:00] the weight of the potatoes in pounds, right? So the scale is measuring what it is supposed to measure, but all of a sudden, this is a weird scale, and instead it tells you that the potatoes have a certain density. That is unexpected, and that is not a valid measurement of weight in pounds,
'cause it does not measure what it is supposed to measure. Um, so to recap, in order for data to be good data, it needs to be accurate, it needs to be reliable, and it needs to be valid. Um, so thinking about it from a big picture, in order to get good data, that is, data with all three of these components, accurate, reliable, and valid, you have to have a data collection system that is easy to use.
The feasibility of your data collection system is really important. We're gonna talk about this in a little bit more depth later in the episode when we [00:23:00] get into logistical components and hurdles and how we can mm-hmm. How we can get over those hurdles. But when you're choosing a data collection system, you really need to choose one that yields good data, but also is a realistic option within your setting.
Amy Wonkka: Mm-hmm.
Kate Grandbois: And that could be, that could mean any number of things it could have to do with your environment, how cluttered things are, what materials you need to have, how loud it is, how many communication partners you're dealing with. There could be so many different variables to consider in terms of feasibility.
Um, this brought us to another article by Jane Korsten called, um, Meaningful Data: Making Sense of + and -. This was published in Perspectives on Augmentative and Alternative Communication; the article is listed on our site. Um, and this article helped summarize things to consider when choosing a data collection system, to make it more feasible for your setting.
The article really had this designed for assistive technology, but I think it could be [00:24:00] applied to anything. Also, as a side note, when I was doing this literature search, I was shocked by how little information came up right away about data collection in and of itself. So our field in general, I think we value data for sure.
Mm-hmm. But I had to do a little bit of searching. There were like two or three articles that came up right away that were very specific to different disorders and disabilities, but there was not a ton on just data collection in general. So the articles that we have listed on our website, I think, are a reasonable representation of what we have available at our fingertips.
I mean, I
Amy Wonkka: think a lot of this is also covered in textbooks for us. Yeah, so I remember back to my clinical practicum course and we had a couple of different treatment manuals for speech-language pathology, maybe by Froma Roth. Is that correct? Are
Kate Grandbois: you literally remembering that off the top of your head? I think so.
Yeah. It's
Amy Wonkka: like in like a teal. There's like a, a teal and cover. What's wrong with you? I dunno. Oh my god. It's a good book. But, so there are a couple of different Yeah, great book. Great [00:25:00] book.
Kate Grandbois: I have no idea. I've never read it.
Amy Wonkka: No. Well, maybe you didn't have it, that's why you don't remember it. I, I, yeah, I didn't have it.
There was also Clinical Management in Speech-Language Pathology, maybe dark blue cover. I, nope, red, red, square, I have no idea. White spiral binding. Anyway, there are books out there that you may have had in your clinical practicum class that cover this. Dust off
Kate Grandbois: your graduate school textbooks.
Amy Wonkka: People don't sell those things back.
You hang on to them for 20, 30 years. They're still helpful. Oh God. Um, but they cover writing manageable goals and objectives and measurable goals and objectives, I guess manageable and measurable. Okay. But I think that that's where a lot of that comes into play as well, is in our coursework,
Kate Grandbois: Uhhuh.
Amy Wonkka: And it's less, fair point,
it's, it's less article
Kate Grandbois: heavy. Yep. No, fair point. Fair point. So anyway, this Jane Korsten article was really geared towards, um, assistive technology, but I think it listed a really nice, um, set of things, criteria to think about when dining, when dining, I'm hungry, when designing your data [00:26:00] collection system, um, for your setting.
So some of the things to consider include: what is the goal that you're trying, what is the overall language goal that you're considering? What are the minimum performance criteria for the task? So, you know, it's great that Johnny said the word eat 10 times, when he used to say it zero. Mm-hmm. That's great.
That's huge. But if he said it two times and now he's saying it four times, is that a big deal? So what is the minimal performance criteria that you're looking for? Mm-hmm. What are the possible obstacles to success? This is very much rooted in the environment and the stakeholders and communication partners.
What can be measured to demonstrate success? So, um, I talk about this a lot in my setting with vocabulary. So I am so thrilled that Johnny can say his colors at the desk. That means nothing to Johnny unless he can ask for the yellow, the yellow jellybean. Mm-hmm. Because he cares about jelly beans and [00:27:00] he needs to use a modifier because it is functional and it is going to have a positive impact on his life.
Mm-hmm. And improve his quality of life. So what can be measured to demonstrate success can have a lot to do with the setting and the context in which you are measuring it or the condition in which you're measuring it,
Amy Wonkka: and probably also what's meaningful for the person.
Kate Grandbois: Right. So what is socially significant Yes, for the individual.
Um, how can the data be framed in order to identify whether the criteria was even achieved? So again, identifying and defining some of those conditional criteria to make sure that the data that you're collecting is something that you care about. Um, when the criteria is not met, will the data provide the information needed to make changes in either the tools or the strategy?
So even a fail can be a win. So you didn't, Johnny, you know, your target was to have Johnny say a target word 10 times and he didn't do it, but he did say it one time in this one setting. So maybe [00:28:00]... Or he did say this one word in the gym. Maybe there's something about the setting of the gym that we really need to think about,
'cause he loves to play sports and he loves to play ball, and that was a really motivating activity. So you didn't meet the goal, but maybe you learned something else about the client.
Amy Wonkka: Well, along the way, you also learn, even if he doesn't say it at all, then you learn that with this either individual intervention or treatment package, we didn't see the results that we wanted, right?
So we need to revisit it, which gets at a later point.
Kate Grandbois: I sometimes really like when a fail becomes a win, because I think it's a nice way to reframe the power of data collection mm-hmm. for the team. So yes, this didn't happen, but these are all the things that we learned, right? Because the process is cyclical.
You know, you go, you go through the motions of taking the data, revisiting it, analyzing it, and then taking what you learned and starting again. You don't throw the baby out with the bath water. You, you sharpen the pencil.
Amy Wonkka: Well, and in assistive technology, we talk about that all the time, right? The process is ongoing because all of those different, because [00:29:00] people aren't static.
Right? And when we think about communication or we think about educational access, we're looking at that as a process. And that's an ongoing process. Your environment changes. The tasks change, right? All of these different changes are going to affect your progress. Sometimes we try something and come back and revisit it in two years and those strategies that didn't work before do work now because there have been substantial changes.
Mm-hmm. Definitely,
Kate Grandbois: um, who is gonna collect the data and when?
Amy Wonkka: Mm-hmm.
Kate Grandbois: Where are they gonna collect the data? How often are they gonna collect the data? Um, these are all things that have a lot, that can have a lot of restrictions based on the setting that you're in.
Amy Wonkka: I feel like also one thing that I found with data collection is sometimes, oftentimes, more robust is actually not better,
Mm-hmm. because of all of those variables, who will collect it. Because if your best chance at getting really nice, accurate, and reliable data [00:30:00] is a tiny 15 minute burst of data collection, because realistically, feasibly, that's all you're going to get, mm-hmm, you want that to be your criteria so that you can compare your data over time.
Mm-hmm. If I one time take a language sample and it's a 30 minute language sample, and then again take a 15 minute language sample, those... we're back to your potatoes. Now I have the density; it's telling me different things.
Kate Grandbois: Yep, yep. Um, and I think finally one thing that will change often, depending on the variables that you're working with, is who will analyze, summarize, and share the data.
And I think the data analysis can be really important just to exactly what you just said. If somebody isn't thinking about these qualities of data and they say, oh, I have a 30 minute language sample, and then I took another one that was 15 minutes, they're not necessarily going to be analyzing the data and extracting the powerful and meaningful components to it.
Mm-hmm. If that makes sense.
Amy Wonkka: I also think back in the day... this, this podcast is a nice opportunity to reflect on things that, uh, we would do differently in the [00:31:00] future. Personal growth. Personal growth. But when I first got outta school, I feel like I did a lot of data collection and not enough analysis.
So I find that that's a helpful interval even to set for yourself. Mm-hmm. So whether that's, you know, you have a consultation every month, or you write your progress notes for your third party funding agency every 60 days, that's your marker. Mm-hmm. Progress reports, people who are in a school environment, perhaps you report quarterly progress.
Those markers are helpful because they set the stage for you to do that. And if you're working in an employment environment that doesn't set those markers for you, I would encourage you to set them for yourself so that you have defined points at which you tell yourself, alright, I'm gonna review. This data and use it to inform my decisions and plan for my future treatment choices for this individual.
Because just collecting data itself isn't magic, it doesn't do anything. Mm-hmm. If I just collect it and put it in a pile and put that pile in my desk, you might feel
Kate Grandbois: good about [00:32:00] yourself. Correct. But it's not gonna do anything. Oh, I took my data. I feel really good.
Amy Wonkka: You're helping the folks who made post-it notes.
You know, I'm so ethical because I took my data. Correct. But without the analysis component, it's not right. It's not informing your practice. It's useless. It's totally useless. Yeah.
Kate Grandbois: Um, and so the last point, I think, that was made really well by Jane Korsten, um, in this Jane Korsten article, Meaningful Data: Making Sense of + and -, is the different privacy restrictions that can come along with different ways of collecting data.
Mm-hmm. So it's 2020, hashtag 2020, there's technology everywhere, everywhere. We live in the future. Everywhere. Everywhere. And there's all these apps now.
Amy Wonkka: Mm-hmm. For
Kate Grandbois: different ways for you to collect your data, which, you know, adds a lot of, um, yeah, accessibility to easy data collection. Sometimes, like, you know, you don't have to carry your clipboard and your pencil and all these other things if you know you have your phone on you, or you know, you're, I, I don't know, using some other technology, iPads in your setting, or what have you.
Or maybe you are taking data with an AAC user and you're using one [00:33:00] of the internal data logging components. But there can be a lot of privacy restrictions from this. I'm waving my red
Amy Wonkka: flag here. I'm waving it.
Kate Grandbois: She's physically waving, her arms
Amy Wonkka: physically waving. Well, there's no flag. There's no flag. The flag is imagined.
The flag is implied. The flag is implied. Um, when, when we think about that, it's a really good point, because depending upon the environment within which you work, you have HIPAA. Mm-hmm. You have FERPA. Mm-hmm. You have the Student Privacy Act. Mm-hmm. And there are a lot of considerations. So, you know, as a, as a dinosaur person, I totally still rock a post-it note and a paper data sheet.
Mm-hmm. Because the one thing I know is that I can make that as free from identifiers as possible. Mm-hmm. Put it in my personal file and, you know, that's, that's not going out into the internet anywhere. So while the, into the
Kate Grandbois: vast obscure internet,
Amy Wonkka: yes.
Kate Grandbois: It's crazy where the robots can do
Amy Wonkka: anything. The robots, the robots can do anything with that stuff.
So when, when you think about your data collection, [00:34:00] be excited about the tools, but also be informed and make sure that you are meeting the privacy requirements of your particular setting, because it's going to vary depending upon where you are.
Kate Grandbois: And if you aren't sure or you don't feel informed about what those things are, then self-advocate and talk to your administration.
Mm-hmm. Because nothing is worse than being in the hot seat and accidentally breaking a law and then realizing that you did something totally wrong. Mm-hmm. So there's a lot of apps out there that claim that they're so great, but don't do it unless you have explicit permission mm-hmm. To do so and talk to your administrators regardless of your setting.
Um, so that's a sort of a nice segue into our second point of the day, which is what are some hurdles to collecting data and what are some strategies to overcome those hurdles? Mm-hmm. Um. The first point I have listed here is that it's bulky. So the technology sort of has a nice appeal sometimes because it can feel less bulky.
Amy Wonkka: Mm-hmm.
Kate Grandbois: In terms of, you know, I think every speech pathologist has had the experience where [00:35:00] they're sitting at their treatment table with little Johnny, and he's so cute. Um, but he starts like eating the, mouthing the materials on the table, or he's crawling under the table and refusing to get out. Mm-hmm.
And you've got your notes out or your SOAP note or whatever it is, and you've got your little tally marks, and then all of a sudden the pencil's on the ground and, you know, it can be another difficult thing to physically manage. Um, and that is where I think technology can seem really appealing.
Mm-hmm. Um, the other hurdle, I, one of the only other hurdles that I could identify, was that it can be time consuming. And this is, I think, looping back to your point, which is more data is not necessarily better data.
Amy Wonkka: Agreed. And I would add to those: one of my biggest gripes with data collection has always been that it removes you from the interaction.
Yep. Right. So when we think about [00:36:00] speech language pathology, a lot of what we're doing is communicating with someone else. It's transactional. Mm-hmm. You need to be present in the moment. So if you are constantly distracted and over to the side collecting data, you're also not able to capitalize on forming relationships with people.
Mm-hmm.
Kate Grandbois: And
Amy Wonkka: when
Kate Grandbois: you, I and you can miss things, your eyes are on the sheet mm-hmm. And you didn't see somebody sign, or you didn't see somebody engage in some sort of nonverbal communication act,
Amy Wonkka: and even just your, you're less present. Mm-hmm. You're less present as a communicator mm-hmm. When you're supposed to be there helping to facilitate somebody's acquisition of next level communication skills.
So when you are, are always somewhat removed, it does have an effect on the, the relationship mm-hmm. That you're having with your clients. I also, to back it up, I see another hurdle when we don't write measurable goals and objectives. Mm-hmm. So, which is unethical. Correct. But it's, but it's hard to do.
Mm-hmm. Right. So it's hard to do, and it's hard to strike a [00:37:00] balance between writing a goal and objective that is measurable, but also doesn't trap you into something that's rigid and artificial. Mm-hmm. And detracting from, again, a, a genuine transactional relationship with another person. Yes. I think that this is where, with data collection, having done, you know, the ABA coursework, I think people tend towards sort of, sort of a massed trial, repetitive practice.
Repetitive practice of something is the easiest way to collect data. Mm-hmm. I'm gonna do the same thing with Kate over and over and over again and just make a plus or a minus depending on how she did it, which, which might be necessary for you to learn it at first. Mm-hmm. But that's not real life, right.
We don't. That's not how people live. So I think the more naturalistic you get, the harder it is sometimes to craft that goal, and the harder it is to find a way to collect data that doesn't feel weird and extraneous.
Kate Grandbois: Yep. Definitely. And I think that it's somewhat related to my next point, but maybe not at all, which is I know [00:38:00] one of the hurdles can be access to information.
Mm-hmm. So you have a complex behavior or complex communication act that you're trying to measure, and you don't know what the best measurement is, or, yeah, you're less familiar with how to take duration data, or, you know, you're less familiar with time sampling data. Yep. You're less familiar with anything other than a tally mark.
I think, you know, we, we really heavily rely on tally count data basically. Um, but there's lots of other different kinds of data out there.
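For anyone who wants to see a few of these measurement types side by side, here is a minimal sketch in Python of how a frequency count (tally marks), a rate, a percent correct, and a duration measure might each be pulled from the same session notes. Every number, name, and session below is hypothetical and for illustration only; none of it comes from the episode or from a real client.

```python
# A minimal sketch (hypothetical numbers) of a few common measurement types:
# frequency count, rate, percent correct, and duration.

# Hypothetical 20-minute session: each entry is one opportunity to produce
# the target, recorded as correct (True) or incorrect (False).
opportunities = [True, False, True, True, False, True, True, False, True, True]
session_minutes = 20

# Frequency count ("tally marks"): how many correct productions occurred.
frequency = sum(opportunities)

# Rate: correct productions per minute; useful when sessions differ in length.
rate_per_minute = frequency / session_minutes

# Percent correct: correct productions out of opportunities presented.
percent_correct = 100 * frequency / len(opportunities)

# Duration: total time engaged in a target behavior, e.g. seconds of
# on-topic conversation across three observed stretches (hypothetical).
stretch_durations_sec = [45, 120, 30]
total_duration_sec = sum(stretch_durations_sec)

print(f"Frequency: {frequency} correct")
print(f"Rate: {rate_per_minute:.2f} correct per minute")
print(f"Percent correct: {percent_correct:.0f}%")
print(f"Duration: {total_duration_sec} seconds total")
```

Rate per minute is one simple way to keep samples of different lengths comparable, which echoes the earlier point about a 30 minute versus a 15 minute language sample.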
Amy Wonkka: I would agree with that. I feel like my initial data collection methods were almost all frequency count, right? Mm-hmm. That's like another word for tally marks. Mm-hmm.
So sometimes I, I also developed a relatively elaborate coding system where I would track, I would write different letters depending upon the type of prompting that, that was effective for the person. Oh, like a
Kate Grandbois: g for a gesture prompt. Mm-hmm. Mm-hmm. Or like a, I use BSG for button specific prompt.
Okay. As an AAC [00:39:00] person,
Amy Wonkka: I use SP, sustained point, but that's cool. Okay. SP. And I, and I think, to step back from that a little bit, sometimes, sometimes you need that information and it's helpful, but what I found is that that just became my default and I collected that level of information when I didn't need to.
Yep. Um, so just being mindful about, again, like, what your actual question is, what's your clinical question? What am I trying to solve with my data? What am I trying to figure out? I'm smirking because,
Kate Grandbois: because in, in my family, SP means secret poop. So when you said, I call it sp, that's what someone in my family would say when they needed to go have a secret poop.
So,
Amy Wonkka: so again, I mean, operational definitions, whatever acronym you
Kate Grandbois: use, whatever acronym you use, I think they just need to be specific to you. Oh, yes.
I hope I don't regret operational. Oh, I don't regret that. Anyway, moving on. Okay, so, so access to information mm-hmm. is a hurdle. The fact that it's bulky in and of itself is a hurdle. It's very time consuming. Um, and it can take away from the interaction.
So now that we've identified some hurdles, I think it would be really helpful to talk about some strategies. Mm-hmm. And one of the first strategies I have listed here in my notes is something that you've already touched on, which is taking good data less often. I think this is my favorite one.
Okay, good. That's excellent. Yes, so, so in terms of taking good data less often, can you, is there a way for you to recap exactly what that means? 'Cause we've touched on a, a lot of different points.
Amy Wonkka: So I feel like for me, good data to go back to what we were saying before, you know, good data is going to be that data that is accurate.
So it's gonna tell you that the potatoes weigh one and a half pounds when the potatoes weigh one and a half pounds. Right. It's going to be reliable. So if I weigh [00:41:00] the potatoes today, and I don't take any potatoes away, potato analogy
Kate Grandbois: is gonna haunt me in my sleep. Sleep and
Amy Wonkka: tired. We're gonna make french fries.
So secret. Secret poop. Secret poop.
Kate Grandbois: This is gonna haunt me in my sleep. Put it out for, it's a good thing. This is a data collection episode that nobody's gonna listen to
Amy Wonkka: because
Kate Grandbois: data is so boring.
Amy Wonkka: No data's great. Everybody should listen. Okay. Tell your friends I interrupted
Kate Grandbois: you. Go ahead.
Amy Wonkka: No, it's good. So we want it to be accurate.
We want it to be reliable. We want it to be one and a half pounds over and over. And we also want it to be valid. We want the potatoes to be measured in mass, not density. Right? Right. So, okay. When I think about taking good data, less often, I want it to be all of those same things. So thinking about duration, how long am I collecting it for?
I think that that gets at, mm-hmm, the validity piece, right? Am I comparing apples to apples? Am I getting, like, different types of information? Um, making sure that there are clear operational definitions about what you're recording. So that comes even more into play, I think, as an individual who's collecting the data yourself, but even more so if you're considering having other people collect [00:42:00] that data.
Mm-hmm. So an operational definition that is understood clearly by one, one data collector may not need to be quite as specific as it would if I was then to say to you, Kate, take my data sheet and go collect data on client X, Y, Z. You might have questions. And some of the things that, like, mean something in my head, like SP, might mean different things when I hand that data sheet to you and might need additional definition. A permanent product, uh-huh.
That's right. That's right. Can't take it back. Moving on. It's cool. Um, go ahead. Um, so, so thinking about that, what level of, what level of operational definition do you need? Um, and then just thinking about making sure that it's defined enough so that no matter who's doing it, you're gonna get a consistent measurement over and over and over again.
Kate Grandbois: Right. Um, and I think when you consider all of those things and do it less often, the actual act of data collection can feel way less overwhelming.
Amy Wonkka: Well, and it like stops me from hating it for [00:43:00] changing and like making my interaction with another person artificial and weird.
Kate Grandbois: Right.
Amy Wonkka: So I feel really good about, you know, maybe every week if I see you frequently, maybe every month if you are coming to see me outpatient, like, once a week. Maybe every month I do some probe data where... she's laughing. We have different words for things. It's fine, Caitlin. Go on. It's good.
Um, you know, I do some data where I'm not spending the whole session collecting data. I'm spending a defined period of time collecting data on a specific set of defined behaviors. And then the rest of the time I'm just being a therapist. Yeah. Right. And I'm focusing on the interaction. I'm focusing on cues.
I'm not overly hung up, and, you know, the client hopefully feels like I'm present and with them and a genuine communication partner, right?
Kate Grandbois: So my favorite strategy is not take good data less often. It's
Amy Wonkka: graph
Kate Grandbois: on my notes here I have graph, graph, graph, exclamation point, [00:44:00] exclamation point, exclamation point.
Graphing is such a great way to quickly and visually analyze data. So if you have your sticky note or whatever with your tally marks or percentages or whatever measurement you took, and you throw it into an Excel spreadsheet, you can throw a graph together with the internal Excel tools. You know, they have the little, like, TaskRabbit; they don't have the paperclip anymore.
That dates us. Remember that little paperclip that danced around? Yes, yes. So, you know, they have different tools that you can use to help you graph something, but then you throw it together and you can see trends. You can compare data collected in one setting to data collected in another setting.
Um, it's a really quick and easy way to analyze data that I don't think is utilized often enough by speech pathologists.
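If Excel isn't handy, the quick visual analysis described here can be sketched in a few lines of Python with matplotlib. Everything below is hypothetical: the session numbers, the percentages, and the phase labels are invented for illustration, and matplotlib is assumed to be installed. The point is only how quickly a baseline-versus-treatment trend becomes visible once the tally marks are graphed.

```python
# A minimal sketch of graphing session data to spot a trend.
# All values are hypothetical; matplotlib is assumed to be installed.
import matplotlib.pyplot as plt

# Hypothetical percent-correct data from a baseline phase and a treatment phase.
baseline_sessions = [1, 2, 3]
baseline_percent = [20, 25, 20]
treatment_sessions = [4, 5, 6, 7, 8]
treatment_percent = [30, 45, 50, 60, 70]

plt.plot(baseline_sessions, baseline_percent, "o-", label="Baseline")
plt.plot(treatment_sessions, treatment_percent, "s-", label="Treatment")
plt.axvline(x=3.5, linestyle="--", color="gray")  # marks the phase change

plt.xlabel("Session")
plt.ylabel("Percent correct")
plt.title("Medial /s/ productions (hypothetical data)")
plt.legend()
plt.show()
```

The same numbers pasted into Excel or Google Sheets and charted there tell the identical story; the tool matters much less than actually looking at the trend.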
Amy Wonkka: Okay. I, I still love take good data less often, but I also like that one. Well, it's not a
Kate Grandbois: competition. I also like that one. No,
Amy Wonkka: I mean it kind of is,
Kate Grandbois: kind of is.
Amy Wonkka: Oh yeah. Yeah. So another, I think we can agree on the third one.
I have one more related to graph. Graph. Graph, okay. You [00:45:00] don't even have to draw a graph. You can use some other software. Probably Excel. My Excel skills are just like so, so subpar, but looking at Google sheets. Again, be aware of your privacy compliance needs, but if you have a hipaa, FERPA compliant G suite.
Suite account, G suite, yeah. Um, you can also set formulas in sheets where if you enter a certain value, it just changes the color code of the cell. So you could set a criteria at, you know, let's say I want 50% of the time, I want you to respond by turning your body toward me when I say something to you.
Mm-hmm. To work on some of those joint attention skills and reciprocal communication. When I plug in my data for each date, I can just visually look really quickly and see like how many cells are green.
Kate Grandbois: Right. Okay. I like that. So you can
Amy Wonkka: set it under conditional formatting. Interesting. And so that if you don't have the time or you don't wanna fuss with building graphs.
And you're just kind of reviewing [00:46:00] it yourself. Okay. I feel like graphs are better for sharing with other people, but if you just want a quick and dirty, like visual scan, you can share that
Kate Grandbois: with a parent or something you, and just explain. You think it's, it just looks a little
Amy Wonkka: less fancy.
Kate Grandbois: That's all. Looks sort of like your own version of like a pie chart, isn't it?
Which, which section of the pie is bigger or
Amy Wonkka: just lets you look and
Kate Grandbois: see. Isn't it
Amy Wonkka: like how much green? Is it like a Navy special? Is there more green? It's, yeah, it's a Navy special.
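The same at-a-glance check can be sketched outside of Sheets, too. Here is a minimal Python illustration of flagging which sessions meet a criterion; the dates, proportions, and the 50% criterion are hypothetical, and the printed MET / not met labels simply stand in for the green and red cells that conditional formatting would give you. It is an illustration of the idea, not the Sheets feature itself.

```python
# A minimal sketch of the "color the cell if criterion is met" idea:
# flag each session at or above a 50% criterion. All values are hypothetical.

CRITERION = 0.50  # e.g., orienting toward the speaker on 50% of opportunities

# Hypothetical session date -> proportion of opportunities with the target response
sessions = {
    "2020-01-06": 0.30,
    "2020-01-13": 0.45,
    "2020-01-22": 0.55,
    "2020-01-29": 0.60,
}

for date, proportion in sessions.items():
    marker = "MET" if proportion >= CRITERION else "not met"  # stands in for a colored cell
    print(f"{date}: {proportion:.0%} -> {marker}")

# Quick summary, like scanning how many cells are green.
n_met = sum(p >= CRITERION for p in sessions.values())
print(f"{n_met} of {len(sessions)} sessions at or above criterion")
```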
Kate Grandbois: It's a navy special. Okay. So, but I think something we can agree on is having a good data sheet. Yes. Hearts. Hearts. For data sheets.
Hearts for data sheets. So shameless plug, we are gonna put up some tried and true data sheets available for download on our website. Mm-hmm. Um, and I think this is, the data sheet component piece is a very, um, personal thing. I mean, it really does need to be designed to the specific needs of the client. I personally like data sheets that are check format.
So you just like write the target word and check, check, check all along the row. Yeah. To list all the different, 'cause then you can collect lots of different [00:47:00] kinds of data within one row.
Amy Wonkka: I think it de is really variable depending on who is collecting the data and what type of data you're collecting.
Mm-hmm. So I find that I'm able to use a lot more descriptive and detailed data if I am collecting data about maybe the frequency and type of utterance for somebody who's a more emergent communicator. Mm-hmm. Right. So if you're gonna have a low frequency behavior mm-hmm. Which would be somebody who's not using a lot of words right now, and when they're using words, they tend to be single words.
Mm-hmm. Or they're using their body to communicate, that type of stuff you can kind of write down as a bit more detailed data. Mm-hmm. When you, one way I can tell that a student has made a large jump in their communication skills is when I used to be able to maintain log data. And then I go in to do an observation and they say so many things, oh yeah.
That I can't keep up, and then I have to go back to the drawing board and revise my [00:48:00] data sheet. And that is always such an exciting time. Right, right. But it's also, you know, it's, it's very telling because when you reach a certain point of length of utterance mm-hmm. It's very hard. And that's why, you know, there's a lot of recording mm-hmm.
That goes on if you're doing like more involved mm-hmm. Narrative analysis and things like that. That was gonna be
Kate Grandbois: my next thing, is that, you know, if you're in a setting where you are allowed to record your students mm-hmm. Either by video or audio, and then go back and code it later, that can be helpful.
Or if you have an SLPA or a speech pathologist in your department who's willing to come and help you out, um, having an extra set of hands can be a really nice way to work around some of those issues.
Amy Wonkka: And for people who are in a school environment, sometimes I'm the person who's collecting that data.
When, when the student is in a different environment. Right, right. So I go in and I'm able to collect and code that data. Um, but yeah. Recording. You always wanna make sure that you have written consent,
Kate Grandbois: video and audio recording. Yep. Yep, yep. Um, which brings us to the point that we've already made a couple of times.
Mm-hmm. Just, you know, maximizing your technology and what [00:49:00] technology you have access to. Um, I think when you're talking about AAC users, that's something that becomes a little bit, um, more fluid because you may or may not already have privacy permissions, um, in place with that family. If they already have a device and you're storing their data on a cloud or a backup system and all those kinds of things.
Um, but maximizing technology is definitely something, um, to consider. There is a good article that reviews this, um, called Technology to Support Data Collection and Management in the Public Schools. It was written by Jennifer Walz Garrett and published in Perspectives on School-Based Issues; that will also be up on our website.
There's a lot that goes into technology, so we're not gonna talk about it too much here. Um, but that's a really good reference if it's something that you wanna learn more about and then take to your administrators 'cause you think it would be a good strategy for your setting.
Amy Wonkka: And I think the big takeaway for technology with respect to data collection is just be aware that that should be raising questions for you, that you should do a little bit of research before you go ahead and move forward [00:50:00] with it.
Mm-hmm.
Kate Grandbois: So final talking point for the day, what kind of data should you collect and why? Um, this is a really good opportunity to loop back to the Jane Korsten article, um, Meaningful Data: Making Sense of the Plus and the Minus. Um, in terms of why you decide what kind of data to collect, the kind of data that you decide to collect should be rooted in what your goals are, um, and what the minimal performance criterion is for the task at hand.
And those two questions together are really gonna help you flesh out exactly what components of your client's communication it is that you really wanna capture in your data. Um, there are lots of different measurement types that lend themselves to different kinds of communication behaviors. So, um, again, not to keep referencing this article, Olswang and Bain, but it was so good.
They give, you should read it, they give a really, really great description of the different kinds of data measurements and why you would use each one of them. [00:51:00] So, for example, quantitative versus qualitative data. What is quantitative data? So quantitative is objective, measurable, um, data that measures communication behaviors using quantitative concepts.
Um, quantitative concepts are those behaviors that, um, are, I think I'm repeating myself. They're just objective and can be usually quantified by a number. So like stuff with numbers? Yeah, basically. Okay. Stuff with numbers. Okay. Um,
Amy Wonkka: but stuff with numbers doesn't tell the whole
Kate Grandbois: story. Well, and it's really what numbers do you use?
For what you're trying to do. So again, this article has a really nice, um, appendix at the end and we're just gonna go over some of them. Um, there's frequency, which we've, we've talked about a couple of times. Yeah. I think most people are most familiar with count or tally, which can also be called a count measurement or a tally measurement.
It's the number of times that a target behavior was observed and counted. So Johnny said a word eight times. There's rate, which is the same as frequency, except [00:52:00] there's a specific time component. So Johnny said a target word eight times in 15 minutes. Mm-hmm. Um, that may be something that you're interested in looking at for, um, someone who has an apraxia issue or some other movement issue or some other behavior that's distracting them, or a hurdle to communication.
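To put numbers on the frequency-versus-rate distinction, here is a minimal Python sketch using the Johnny example just given; the eight productions and fifteen minutes come from the episode, and the script itself is purely illustrative.

```python
# Frequency vs. rate, using the Johnny example: frequency is a raw count,
# rate divides that count by the observation time.

count = 8             # target word produced 8 times
minutes_observed = 15

frequency = count                         # 8 productions, no time component
rate_per_minute = count / minutes_observed

print(f"Frequency: {frequency} productions")
print(f"Rate: {rate_per_minute:.2f} productions per minute")  # about 0.53/min
```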
You want them to be more fluent in their communication. What else would you use rate for?
Amy Wonkka: I mean, I can think about rate in terms of activities. Mm-hmm. So thinking about a school environment, thinking. Perhaps elementary school, maybe you have a 15 or 20 minute recess time. You have a student and you want them to respond to their peers or initiate with their peers a certain number of times compared to what they maybe right now they're not doing that at all.
So you might look at rate in terms of, you know, within the context of right, this activity, how many times do they
Kate Grandbois: do it? Yep. There's duration, which is the measurement, um, of the amount of [00:53:00] time that someone engaged in a behavior. So Johnny jointly attended to a task with a peer for 45 seconds. That was really one of the only examples, um, I could think of with duration, because so many components of communication are discrete.
Kate Grandbois: I think voice,
Amy Wonkka: Voice is where you think about duration a lot, right? So phonation and respiratory support, how long can you sustain?
Kate Grandbois: Mm-hmm. Or maybe social skills, like thinking of those communicators who have preferred topics, so mm-hmm. They engaged in a dialogue about trains for 40 minutes without letting their communication partner have a turn.
Those kinds of things.
Amy Wonkka: I mean, could you also think about duration relative to the level of support also? So during, oh yeah, oh, I like that, you know, centers, so and so, you know, my client will engage with peers for an increasing duration of time. Yeah, I like
Kate Grandbois: that. Um, there's fluency, which is the measurement of how quickly a learner can give a response within a [00:54:00] period of time.
And when we say fluency, we don't mean anything to do with disfluency or quote stuttering. It's more related to the ease with which a learner can engage in a task. So thinking of fluency as related, uh, I think a good example is thinking of fluency as it relates to reading or a naming task that's related.
Yeah. Rapid naming related to word-finding difficulties. Um, so using the same example, Johnny named 15 animals in 20 seconds.
Amy Wonkka: Mm-hmm.
Kate Grandbois: Um, response latencies, uh, this is one I think that's really underused. I really like response latency data. So latency refers to the amount of time that passes between when you expect someone to engage in a task and when the response actually begins.
So this is something that could be, I use this a lot for initiating communication, um, responding to verbally delivered directions, or executing a task after a verbal direction has been given. So for some of those kids who, like a lot of my more emergent communicators with autism, they have a lot of, [00:55:00] um, self-stimulatory behaviors that can be hurdles to initiating communication.
So I like to see the response latency measurement decrease. So they show interest in a stimulus, they show interest in their snack or whatever, and I want them to pick up the picture of the snack and hand it to their communication partner, and not get distracted walking around the room, engaging in stereotypy, and then eventually walk over to their communication partner.
There's a lot that goes on in that example, but response latency I do think is a measurement that does not get enough attention.
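Here is one small Python sketch that pulls together the three time-based measures discussed in this stretch of the conversation: duration, fluency, and response latency. Every number in it is invented for illustration, apart from the 45-second joint attention episode and the 15-animals-in-20-seconds naming example mentioned above.

```python
# One sketch covering the three time-based measures just discussed.
# Values are illustrative only.

# Duration: how long an episode of a behavior lasted
episode_start_s, episode_end_s = 12.0, 57.0        # seconds into the session
duration_s = episode_end_s - episode_start_s        # 45 s of joint attention

# Fluency: how many responses in a fixed window (15 animals in 20 seconds)
items_named, window_s = 15, 20
fluency_per_minute = items_named / window_s * 60    # 45 names per minute

# Response latency: time from the opportunity to the start of the response,
# averaged over several opportunities in a session
latencies_s = [8.5, 6.0, 4.2, 3.8]
mean_latency_s = sum(latencies_s) / len(latencies_s)

print(f"Duration: {duration_s:.0f} s")
print(f"Fluency: {fluency_per_minute:.0f} items per minute")
print(f"Mean response latency: {mean_latency_s:.1f} s")
```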
Amy Wonkka: And I think response latency comes into play a lot with responding to joint attention acts also. Mm-hmm. Also. Mm-hmm. Mm-hmm. So, thinking about, you know, pointing to indicate, what is the duration of time between the communication partner's act and the response to the joint attention act. I also consider response latency sometimes as a prompt in itself.
So just being aware of response latency. [00:56:00] Mm-hmm. And then, when you've reached a certain criterion, mm-hmm, then it would kind of qualify as independent. We all use sort of a therapeutic pause with that, like an expectant face. Mm-hmm. Like it's your turn to say something. Mm-hmm. Mm-hmm. Um, and that is still some, some level of cue.
Mm-hmm. Which for some folks might remain an accommodation, something that they always need, but being able to fade that latency out, you might, you know, identify, at this point in time, this is, this is the goal latency responding time.
Kate Grandbois: No, I think that's a really good point. Um, that's a cursory list of different quantitative measurements that, again, you can find, um, in the article that we've listed on our site. In terms of qualitative data, qualitative data is, generally speaking,
not the number data, so it's more subjective. Um, it can be, often it's in a narrative form. Um, and if it's done properly and systematically, it can be very [00:57:00] valuable. But I think the properly and systematically part is the key component to making sure that qualitative data is, um, effective, accurate, reliable, and valid.
Um, qualitative data can come in the forms of interviews, document reviews, observational reports, and questionnaires. Um, I think, at least in my practice, I see qualitative data coming a lot from communication partners. Mm-hmm. So, oh, the, you know, the wonderful story you hear, oh my gosh, my son brought me his communication materials while I was in the shower and asked me for a snack.
And that's never happened before, right? Mm-hmm. Like that's like you love those stories, um, but sometimes those little anecdotes or those staff reports or parent reports can be muddled with different qualifying variables that you, you weren't there, you didn't see the exact cut and dry of what was happening.
So I do think that unless you have a good grip [00:58:00] on the systematic application of that measurement, you might not, it might not be accurate and valid and reliable.
Amy Wonkka: I feel like I love me some qualitative data. Um, but when it's, I think data paints the most accurate picture for us when there's a little bit of both.
Mm-hmm. So, to come full circle mm-hmm. When we think about the SOAP note, the subjective, objective, assessment, and plan. Aren't you so
Kate Grandbois: glad you Googled that?
Amy Wonkka: I really am. I really am. The subjective component kind of correlates with this qualitative data. So when we think about people's reports, how we felt about it, what we noticed subjectively about the client, that's, that is qualitative data that can also color our quantitative data.
Mm-hmm. So if I'm collecting my latency data, but I also notice that my client appears really tired and mom said he didn't sleep well last night, mm-hmm, I do wanna [00:59:00] make a note of that, because that's a qualitative, that's a very good point, qualitative modifier. That's a very good point. That informs my interpretation of my quantitative data.
Numbers don't tell me that. Mm-hmm. But numbers alone aren't a whole person. Yep. And so the combination of qualitative and quantitative data, in my perspective, is sort of necessary to make, again, we're not running a research study. I'm, I'm not a researcher, I'm a regular Joe. You know, I don't, I, I'm, I'm doing therapy, right.
I'm doing therapy, and I am providing consultation about real people in real time, in real life, in real situations. And so without that qualitative piece, the quantitative data kind of isn't fully complete. Mm-hmm. And so I think that finding a way to, and they, they talk a bit about it in that awesome article, Olswang and, and Bain.
Um, Olswang. Olswang. I think Olswang. Yes. I'm sorry. If we're saying your name wrong, Olswang and Bain, send us an email and we apologize, but we love your article. Uh, they do [01:00:00] talk about, you know, ways that you can sort of better quantify qualitative data. And I think that that's an interesting thing to think about as well.
Uh, anybody who's been to the doctor who's used the pain scale mm-hmm. That's sort of what they're trying to do there. Right? Right. Like, they're trying to take a pretty nebulous concept like, didn't sleep well. Well, did you like not sleep well, it's a 10, right? Or did you not sleep well? It's a four. Right.
And not that we need to do that for all of our qualitative data, but I think that that's also an interesting idea that they brought up.
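As a hypothetical sketch of that pain-scale idea, here is one way a subjective caregiver rating could be recorded alongside the quantitative measure it might color, so both end up in the same session note. The field names, the 0-10 scale, and the flagging rule are all assumptions made up for illustration, not anything prescribed in the episode or the articles.

```python
# Hypothetical sketch: pair a subjective 0-10 rating (pain-scale style)
# with the quantitative measure it may color, so both are recorded together.

session_note = {
    "date": "2023-01-16",
    "mean_latency_s": 6.2,          # quantitative measure from the session
    "alertness_rating": 3,          # subjective 0-10 caregiver/clinician rating
    "comment": "Parent reports poor sleep last night",
}

# Made-up rule of thumb: flag sessions where the subjective rating suggests
# the numbers may not reflect typical performance.
if session_note["alertness_rating"] <= 4:
    print("Interpret today's quantitative data with caution:")
print(session_note)
```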
Kate Grandbois: Definitely. So in summary, data is important.
Amy Wonkka: Mm-hmm.
Kate Grandbois: Data is ethical, taking data is ethical. It's part of our ethical code. Um, data, good data is better than bad data. Good data is valid, accurate, and reliable.
It's better to take little amounts of good data instead of lots of bad data. So take good data less often. Um, this, if you read nothing else, this Olswang and Bang article. Bang, [01:01:00] I said, and Bain. Poor guys. I, oh, this is really terrible. Yeah. We apologize to you. I'm sorry. Formally, formal apology. Um, this article, if you read nothing else, read that article.
Um, in summary, I would say that using properly systematic qualitative data in combination with quantitative data is your best bet. Mm-hmm. Would you agree?
Amy Wonkka: And I would say as you're writing your goals and objectives, have data in the back of your brain, like, think about that. Yep. As you're writing your goal, okay, this is really what I want my client to do, how am I gonna measure that?
Mm-hmm. Because if you write all of your goals as frequency or tally mark goals, then that's gonna make it hard for you to be flexible at all with your data collection. Yes,
Kate Grandbois: definitely. Um, so yeah, data collection is maybe not as boring as we thought it was. Turns out it's riveting. Turns out, turns out, coming full circle. [01:02:00]
So, in closing, um, there is a certificate for certification maintenance hours available on our website. Go to our website, www.slpnerdcast.com. Click on the episode, take a short three-question quiz, and purchase your certificate. There's also a handout available, which will have a list of our key talking points, as well as, um, not PowerPoint slides, a brief handout with our talking points and the reference list.
The reference list is also listed on the episode description on our website. And email us with any questions. Thanks guys.
Amy Wonkka: Collect your data.
Kate Grandbois: Collect your data. Yay.