Making Your Data Work For You
Video Transcription
Hi everybody, welcome to Making Your Data Work for You with Dr. Bethany Backus and Dr. Julia Olomi from the University of Central Florida Violence Against Women Cluster. Before we get started, I'd like to offer a disclaimer. This webinar is hosted by the IFN, and it is funded through the Office for Victims of Crime under a grant through the U.S. Department of Justice. The opinions, findings, and conclusions or recommendations expressed in this presentation are those of the contributors and do not necessarily represent the official position or policies of the U.S. Department of Justice. Again, this is hosted by the IFN, and we are excited to have you here today. Without further ado, I will let Bethany and Julia take over. Great. Thanks so much, Shay, and we're happy to be here. It's great that it's a small group of folks, so feel free to post questions in the chat as we go along. We'll definitely try to get to them. We also have some polls as well. So just so everyone knows, and I know a few of you that are on this have already worked with us in some way or are currently working with us, I'm Bethany Backus. I'm in the Department of Criminal Justice and Social Work here at the University of Central Florida and serve as co-lead of the Violence Against Women Faculty Cluster, which is an interdisciplinary group of researchers at the university looking at primary, secondary, and tertiary prevention and responses to violence against women. Just a little bit about me. I've been in academia for about four years.
Before that, most of you know me primarily through my work at the Department of Justice, at the National Institute of Justice, where I was for a decade and oversaw our programs of research on gender-based violence and violence against women. I did a lot of work on the forensic side of things, social science forensics, sexual assault kits, sexual assault forensic examiners, so I worked with some of you on those projects, and also did a lot of work on our domestic violence and stalking programs as well. So a lot of what I'm going to be talking about stems from what we did with our practitioner partners at the agency to help them really understand their data and put them in a better position, maybe to go after funding, to present to boards of directors, to garner more community interest, and things like that. All of my work is very community engaged, so I like working with practitioners and people in the field. Prior to DOJ, I spent about eight years in the field doing direct services, both clinically as a social worker and in community health and research positions. So I'll let my colleague, Dr. Olomi, introduce herself now. Thank you, and I feel like the order should have been flipped because I feel self-conscious following such a wonderful, wonderful resume. My name is Dr. Julia Olomi, and I'm an assistant professor in psychology at the University of Montana. My research interests focus on gender-based and family violence, and then systemic responses to that violence and how those prevent or increase risk for re-exposure and re-victimization. I am also a clinical psychologist, and I primarily focus on working with trauma-exposed children and their families. And I was Dr. Bethany's postdoc at UCF last year, so I'm really happy to be able to present with her today.
My research is also very much community engaged, and part of my mission is to make sure that the research we produce here is community engaged and informed, to make sure that the findings not just, you know, advance science, but also provide applicable, relevant, and sustainable recommendations for our community partners. So I'm very excited to hear how this presentation might be useful for you all. Yeah, we want to thank IFN and OVC for providing us this opportunity to talk with you today. And let's see if I can get my slides to work. So this funny statement, Shay already kind of went over it. We also have it here, this disclaimer, just saying that the opinions, findings, and conclusions or recommendations are those of the contributors and do not necessarily represent the official position or policy of the U.S. Department of Justice. I'm very familiar with that disclaimer, and it's required on everything. So just to let you know what our agenda today is, we're going to first tell you about the purpose of our presentation. We're going to get into the nitty-gritty of what is data, what does it mean, what are we talking about when we say the word data. We're going to talk a little bit about how to take stock of your agency and community data. And throughout this, we also want to challenge you to maybe jot down some notes or start to think about things you might already use, or things you might want to access in your community data that you might not have thought of before. We're going to spend some time talking about building a logic model. We're not going to build one, but we're going to give you resources so you can build one or enhance the one that you already have. And we're going to talk a little bit about elevating your profile with data.
And if we have time, we have what we would probably think of as handouts, these tips and tools or guides and resources at the end of this, but we will be providing the presentation to folks as well, so you can easily access that information. So for today, our purpose, or our goals: we want to help you take stock of the data you already have available and can easily access, and think about how to use that data in new and different ways. We also hope you'll be able to learn more about how you can use data for your agency's benefit, and what that benefit is and what that looks like. It's going to look different for everyone. There's not really a one-size-fits-all. Agencies are very different in terms of what their needs are, their locations, and what resources they might already have available and what they don't. I think also, particularly for SANE programs, it might depend whether you're hospital-based or community-based, where you're located, and what type of sustainability and support you already have versus what you might not have going into developing a program or maintaining it. And then I'm hoping that you'll think about how you want to use your data. So these are kind of the three things I want you to be thinking about throughout, taking stock and jotting down notes: how you want to use your data, or what may be important for you to know; what's important for your community to understand, so what types of questions are you asked that you might not always be able to answer, but it's a common question. And it could be anything from, well, how much money do you really need for equipment, something like that. That's a data point about money, versus how do you know that survivors benefit from this program. And that's another thing that you can capture or be able to explain better with data.
So start to think about the ways you might want to use your data. So we have a poll here, and I think Amy is going to help us post it for you to do. And we want to know, how would you rate your knowledge of using and applying data? Do you think you have very strong knowledge, strong knowledge, moderate knowledge, low knowledge, or no knowledge? And this is helpful for us. I also want to let you know that we know people are at all different skill levels related to research and data, the use of data, and how to apply it. So hearing this will be helpful to us, but we also want you to know that some aspects of this presentation may be really helpful to some, and some of it might be things that you already know. Okay, so it looks like, can people, can you see that? I don't know if people can see it on their screen when I'm sharing, but it looks like we have about 50% of people rating theirs as moderate knowledge, about 38% as low, and only about one person, or 13%, as strong knowledge. This is a great example of data and statistics. We only have eight people, but if we look at percentages, we say one of eight, 13%, feel this way, right? So looking at your denominator is important, and Julie will talk a little bit about that too, because we're looking at just eight people. So that's helpful for us to know as we get started. And I'm going to hand it over to Julie for this next slide. Great. Perfect. Thank you. So we're going to start with just, you know, what is actual data, and what do we mean by that when we're talking about it today? We can think of data as either words or numbers, and working with data doesn't have to be hard. I don't know how much anxiety you're feeling at the prospect of talking about data today. Honestly, the hardest part about working with data is being very specific about what you collect and then very consistent in how you collect it. And then thank you so much.
So I'll go over a little bit what I mean by numbers and then what I mean by words. Numbers are going to be what we call quantitative data, and words are qualitative data. Quantitative data is primarily going to relate to numbers. And what I mean by that is when you're looking at the output, the result, or the report that you're going to create with quantitative data, it's primarily going to be reports about how many or how much something is happening, how often something is happening, or how often people are reporting that they're feeling a certain way. And it's usually gathered by counting things or by measuring something. That can be slightly or drastically different from qualitative data, which relates more to words or language. That's more about understanding the why or how of something that is happening. So you might have an inkling based on your quantitative data that something happens very often, but you want to understand why that something happens very often. And that's where qualitative data can help you generate some themes or some ideas. And lastly, we tend to think about qualitative data as being grouped into meaningful themes or categories. So less about, again, how often or frequencies, and more about thematic ideas. All right. Next. Perfect. Thank you. Okay. So first, let's deep dive just a little bit more into quantitative data itself and what that might look like. Usually, quantitative data is going to be a survey or questionnaire, and it's going to be highly structured, or perhaps kind of half-and-half, semi-structured. And by structured, I mean that the questions will be pre-written, and sometimes those questions can already have the answers provided, so you have to select an answer that's already provided. And it can be administered online, on the phone, or in writing. You can also administer it yourself.
So even though it is a structured survey, you can still read it out loud and provide, you know, the choices. So it is a little bit more versatile in that sense: if you are administering a quantitative survey, you could have somebody complete it on their phone without having you there, without having the person who's collecting that information there, which is nice if you're limited in time or limited in resources. Quantitative surveys can also come in the form of polls, and even though they can be very structured, they can be either very long or very short. Obviously they tend to be shorter, and polls in particular are a very good example and useful tool for quantitative data because they're meant to measure a very specific thing. Some of the polls that we're using here, for example, are a good way to show how you can use a poll and how short it can be. It doesn't have to be this really, really long survey. Another good thing polls show us is the different ways quantitative data can be collected. Like I said, somebody can be filling it out on their phone, on their own time, or polls can be used in vivo, live, like right now. So if you're thinking about quantitative data, consider, for example, whether you're going to use a poll at the end of a meeting as opposed to the end of a quarter or the end of a season or something like that. So quantitative data can be flexible in that way, and it doesn't necessarily take as much time to score, because the survey is already structured in a way that lets you look at these different numbers that I'll go into in detail in a second. And then one thing that is really important to remember is that because it is numbers, because it's quantitative in nature, it's very important to be consistent. And what I mean by that is two things.
First of all, it's very important to be consistent in who you're administering it to and the format in which you're administering it. If you're going to provide one poll to one group and then a completely different poll to another group, it's going to be very hard for you to compare those numbers across those two groups. So you want to be consistent in how you provide and administer it. But you also want to be consistent in terms of the response options that you provide. So for example, in your first group, let's say we ask the poll about very strong knowledge or moderate knowledge of applying data, and then in our second poll, we ask about, you know, "I feel okay about it." Those are two very different answer sets, and again, that'll be really hard to compare as well. And Julie, let me just chime in for a second to explain, because I think you made a good point about being consistent. But it's also thinking about, even when administering the same questions to people, how you're doing that. So if you're administering it online, in person, or verbally, you want to make sure there's some consistency. And you can also think about, if you're studying something, like, oh, we're administering all these things to people who work here, and then maybe a bunch of people have only worked there for a short period of time, and so their responses may be very different. So when you're administering these things, to be consistent, you also want to think about what types of data you need to know to be able to ensure that it is being administered consistently, in the same way and in the same manner, to a similar group of people or a different group of people. And if it's a different group of people, then understanding what those differences are and how to collect those. Thank you. That's a really good point. All right. Next slide, please. Perfect. Okay.
So let's say you've been consistent, and you administered your quantitative survey of some kind. What are some of the data that you can receive, and how can you use the results from these different surveys? The most simple or practical output from quantitative data is what we call descriptive data, which is pretty self-explanatory: essentially describing the data that you received. That could be, like I mentioned earlier, showing how often something occurs, so counts or frequencies. How many times has this happened in this person's day, or this group's day, or this site's day, for example? Or how often do certain nurses respond this way, as opposed to that way? That would be a really simple example of descriptive data. Another way would be to use percentages. And that's important in two ways. First of all, percentages are a good way to transform your numbers, to get a better understanding of them in relation to everyone else. So for example, I tell you, hey, five nurses reported that they felt horrible about this interaction. That's relatively informative. But if you then learned that it's actually 2% of your nurses that felt horrible about that interaction, that speaks to two very different experiences, or gives you a different understanding of the way that data is described. On the flip side of that, just like Bethany mentioned earlier, you want to be careful with percentages. Because if I were to say, oh my God, 99% of the nurses reported that their interaction was horrendous, and then when you look at how many nurses that actually was, and it was only two, that changes the picture. So you want to look at the denominator, because you want to make sure that data is representative of what you're actually trying to describe.
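To make the denominator point concrete, here is a tiny illustrative sketch in Python. The counts are hypothetical, not from the presentation; the idea is simply to always report the raw numbers alongside the percentage.

```python
# Illustrative sketch (made-up numbers): report the percentage AND the
# underlying counts, so readers can judge whether the rate is meaningful.

def describe_rate(count, total, label):
    """Return a report string showing both the percentage and the raw counts."""
    pct = 100.0 * count / total
    return f"{pct:.0f}% of {label} ({count} of {total})"

# Five dissatisfied nurses out of 250 vs. out of 5 tell very different stories.
print(describe_rate(5, 250, "nurses"))  # 2% of nurses (5 of 250)
print(describe_rate(5, 5, "nurses"))    # 100% of nurses (5 of 5)
```

Either line alone ("2%" or "5 nurses") could mislead; printing both together keeps the denominator visible.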
But having said that, quantitative data is really helpful for things like counts, frequencies, and percentages. Now, you can also use means, modes, and medians. I hope this is not giving you nightmares from high school algebra. But these are also really helpful for getting a sense of the data at a glance, and of the experiences of, in this example, the nurses: what would be the trending number, the trending score, or the majority experience. So instead of looking at percentages or counts, you might say, on average, nurses might be experiencing this interaction that way, or on average, this nurse might be experiencing it that way. So it's very practical if you're trying to illustrate general trends, or the most commonly indicated responses, and just get a bird's-eye view of the data. Now, with all of this descriptive data that I just talked about, you can then either do something completely different or add a different layer: thinking about group comparison. So for percentages, we can look at a group as a whole, or we can use those same percentages but compare by group. Bethany mentioned earlier something about experience. So maybe some individuals have been in the organization for a short amount of time, and other individuals have been in that same organization for a longer amount of time. One way to use quantitative data is to be able to compare groups. So I could, for example, compare individuals who've been in the organization for a short time to those who've been there a long time, and look at their percentage, or their experience of that interaction, if I'm using that weird abstract example. And quantitative data can be used for these kinds of meaningful groupings.
Another way, or the most simplistic way of thinking about it, is age groupings. So for example, does our experience differ by age? But you can think of other ways, right? Like location, software use, resources, whatever might be interesting. And obviously, you don't have to compare just two groups; you can compare any number of groups you want. After a while, you might not have enough data and it might not make sense, but consider groupings within the data set as well. And some other groupings that might make sense for you all as providers, when thinking about your clients, might be where they were referred from. So thinking about referrals, so you know what agencies are referring them in; maybe the percentage of those that move forward with the criminal justice system compared to those that don't. So really think about what groupings are meaningful for you all and the work you do. Maybe it's looking at those who follow up with you or who respond, so how many you're able to get in contact with for a follow-up call versus those you can't, or those that use advocacy services versus those that don't. There are also different things to consider in community demographics, or cultural things that you want to see. So maybe you want to see, are we serving our community in a way that's representative of our community? And so you could compare your clients to the demographics of your community, whether it's age, race, ethnicity, or socioeconomic status, and see whether those match. Those are the really common groupings we tend to see in this work, in terms of looking at services to clients: satisfaction with services, continued use of services, and experience with services are typically some of the things we might look at.
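As one illustration of group comparison, here is a small Python sketch using only the standard library. The referral sources and satisfaction scores are invented for the example; the point is just grouping records by a meaningful category and summarizing each group.

```python
# Hypothetical sketch: comparing a satisfaction rating (1-5 scale) across
# a meaningful grouping -- here, where the client was referred from.
from collections import defaultdict
from statistics import mean

# Each record: (referral_source, satisfaction score). Made-up data.
records = [
    ("hospital", 4), ("hospital", 5), ("hospital", 3),
    ("law_enforcement", 2), ("law_enforcement", 3),
    ("self", 5), ("self", 4),
]

# Group the scores by referral source.
by_group = defaultdict(list)
for source, score in records:
    by_group[source].append(score)

# Summarize each group: average score and group size (the denominator!).
for source, scores in sorted(by_group.items()):
    print(f"{source}: mean {mean(scores):.1f} (n={len(scores)})")
```

The same pattern works for any grouping mentioned above (age band, advocacy use, follow-up reached or not); only the first element of each record changes.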
And then there are internal groupings that Julie talked about in terms of looking at staffing. So years of experience of a nurse, nurses in forensic nurse settings versus other settings, and so on. Thank you, Bethany, and that's a really good point that fits perfectly into pre and post tests. So if we are going to consider, for example, satisfaction surveys, or even looking at whether or not you are serving the community you're aiming to serve, pre and post tests are very helpful for those. What that means is essentially collecting data, quantitative data or a satisfaction survey, before you implement a change, and then administering that same survey, for example the satisfaction survey, after that change is implemented. And then you're able to compare before that change and after that change, to see whether or not your efforts were effective in, for example, changing your satisfaction ratings, or whether or not you improved in your ability to serve a larger chunk of the community or a more representative sample of the community. Now, something about pre and post tests that is really, really important is to definitely have a pre and a post. Having a satisfaction survey just one time is very helpful and can be informative. But in the context of measuring growth, progress, or whatever change you're trying to implement, it's very important to have a baseline to start with, in order to be able to compare it with your satisfaction survey or other type of survey after. And yeah, anything you want to add to that, Bethany? So one of the resources we'll be sharing, that some of you might already know about, and this happened at NIJ when I was there, is the SANE toolkit developed by Dr. Rebecca Campbell and colleagues that actually helps walk through doing a pre and post test related to the implementation of SANE programs and looking at criminal justice outcomes.
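The basic pre/post comparison described here can be sketched in a few lines of Python. The satisfaction scores are made up for illustration; the essential requirement is the same survey administered before and after the change.

```python
# Minimal pre/post sketch (invented satisfaction scores, 1-5 scale):
# administer the same survey before and after a change, then compare averages.
from statistics import mean

pre_scores = [3, 2, 4, 3, 3]    # baseline, collected before the change
post_scores = [4, 4, 5, 3, 4]   # same survey, collected after the change

change = mean(post_scores) - mean(pre_scores)
print(f"pre mean:  {mean(pre_scores):.1f}")
print(f"post mean: {mean(post_scores):.1f}")
print(f"average change: {change:+.1f}")
```

Without the `pre_scores` baseline, the post-change average alone could not tell you whether anything improved, which is the point made above about always having both a pre and a post.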
And that type of framework, that guidance, you can apply to doing pre and post tests, not just in that particular scenario, but for other things. And maybe it's just like, hey, we're going to see whether having an advocate on call, or sitting in the room with you, or being in court, is helpful in changing survivors' perceptions of the services they received, or in us being more trauma-informed. And so there are different things you can do. It doesn't have to be this huge intervention. It could also be a slight tweak or a slight modification that you're testing to see if there's a change. And there are different ways to set that up so it's not overwhelming. We're not saying you have to implement something huge to be able to test it. Sometimes it's a very small, simple change. That's a really good point, actually, because it could also be a litmus test for making sure you are continuing to do what you're meaning to do. So let's say you have reached your goals. It could be a way of not necessarily implementing a change, but just monitoring the quality, essentially, of what you're doing. And maybe it's about not having a change as well. So that's a really good point, that it doesn't have to be a big thing. On the contrary, it could be the opposite. Thank you. Excellent. Perfect. Okay. Pivoting to collecting qualitative data. So qualitative data, unlike quantitative, is much less specific in terms of numbers. And it's more about either doing observations, so you're not necessarily interacting with somebody, but observing them from afar or reading different files and taking one step back. It can be semi-structured interviews. I talked about those semi-structured surveys in the previous slides.
In semi-structured interviews, the idea is that you might have some questions that you want to ask, and then you're also seeing where the conversation is going to lead you, and where the person you're interviewing is going to go with the information they're providing to you. And similarly, to piggyback off that semi-structured survey we were just talking about, you could have a questionnaire that is half quantitative, so very specific numbered questions, and then a second part that is more open-ended questions, which are less about circling a number and more about generating thoughts, content, and themes. And then content review would be reviewing different files, whether that is patient records or different policies and procedures at different organizations, and looking through those different documents and extracting or analyzing data that way. All right. So what does analyzing qualitative data actually look like? First, you'll want to organize your data. And for qualitative data, it depends on what type of data we're talking about. So let's say we're talking about open-ended interviews that were conducted with a bunch of different nurses about their experience with a specific implementation. Then you would maybe, for example, print out all of the transcripts of those conversations and group those transcripts together. That would be one way of organizing the data. Or if you're reviewing policies and procedures, it would be extracting and getting all of those policies and procedures in one place. And then it also means gathering any other additional data that might help you make sense of things. So let's say you were interviewing nurses. What other data will be important for you to be able to analyze? For example, demographics, how long they've been in that organization, their experience level, etc.
Similarly for policies and procedures: are we talking about rural organizations versus more urban ones? What are some other data that are important for you to know to create this context once you start diving in? So once you have your data organized and you're ready to go, you're going to review and explore that data. And that's likely going to require you to read this data, these different files or transcripts, probably several times, so definitely more than once, just to get a sense of what it contains. With qualitative data, you're not always sure what to expect. You're diving in, and you want to read to get some idea of what it is that you're reading. You may want to keep notes, thoughts, and reactions that you're having as you're reading this data. But it's not quite the same as analyzing just yet. So now that you've organized your data and explored it, and you have a general sense of what you're seeing, this is when, in step three, you can create some initial codes. And what that means, depending on your process, can be anything: if you're still working on paper, putting sticky notes everywhere, or highlighting, or creating a concept map on a big whiteboard and putting stickies everywhere. Just creating these initial codes or initial themes that you're seeing, anything really to help you get some general sense, some general themes, in your data set. So now you have some initial codes or initial themes for your qualitative data. And now you're going to review those codes or combine them into themes. What you're going to do is reread all of these different transcripts or policies and procedures, depending on what it is that you're doing, and see whether or not they fit in those initial codes.
They might, which is great, or you might need to revise those codes, or realize sometimes those codes are redundant. You don't need two codes for the same thing, and you might end up merging them into larger themes. But essentially, you're going back and forth between steps three and four. And then lastly, let's say, okay, you've read your data, you've analyzed it, you've created these thematic categories that really capture the different contents of your qualitative data set. You're now going to essentially translate that for the public. And that's obviously going to depend on who it is that you're delivering these results to. So depending on the purpose of the study, your audience, and the content, you want to decide what is best included in your results to be able to illustrate your mission or present your study in a way that's impactful. So, for example, if we're looking at this new intervention that nurses are doing, and you're trying to advocate that this intervention is something very positive, when you present those themes, you might use illustrative examples, excerpts, or quotes from all the themes that really illustrate how positive, powerful, and impactful it was. Let's say you are trying to show that it is cost effective or beneficial from a business standpoint; then you might want to highlight themes that really explore how efficient it was or how time-saving it was. Essentially, you're using that data, not fudging themes, but rather highlighting important themes depending on your audience and the mission or purpose of your study.
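Even the organizing step of qualitative analysis (steps three and four above) can be supported by a few lines of code once excerpts have been tagged. This is a hypothetical sketch, with invented quotes and code names, of grouping coded excerpts so related codes can be reviewed together and merged into themes.

```python
# Hypothetical sketch: each excerpt is tagged with one or more initial codes,
# then grouped by code so redundant codes can be spotted and merged into themes.
from collections import defaultdict

# (quote, [codes]) pairs -- both the quotes and the code names are invented.
coded_excerpts = [
    ("Having the advocate there calmed me down.", ["support"]),
    ("The exam room felt safe and private.", ["environment", "support"]),
    ("I didn't know what would happen next.", ["communication"]),
]

# Group quotes under each code so every code's excerpts can be reread together.
by_code = defaultdict(list)
for quote, codes in coded_excerpts:
    for code in codes:
        by_code[code].append(quote)

for code, quotes in sorted(by_code.items()):
    print(f"{code}: {len(quotes)} excerpt(s)")
```

Sticky notes and whiteboards work just as well; the code is only a way of keeping every excerpt attached to its codes while you iterate.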
All right, and thanks, Julie. I think a piece of that, one thing we didn't put in here that I'm thinking we should have, is also using a combination of quantitative and qualitative data. A lot of people do this in their annual reports or other things, where they might have some numbers of clients served and then a quote from a provider, or from a client or survivor, about the services and what they mean to them. Those can be very powerful and moving for different groups of people because it hits both lenses. It gives people some numbers or concrete information, and some people just feel better having that type of thing, where other people relate to data in a different way; they want to feel emotions or feel connected to the data. And so sometimes qualitative data can be really helpful in helping people feel connected to what's happening and understand how it's really impacting and helping folks. So moving on now: that was kind of a primer on data, the types of data, and what we're talking about when we think about data, these numbers or words. There are different ways to get them. Some are easy ways that are at your fingertips. Some might be more difficult and might involve actually doing some sort of research or collecting data in a more formal or systematic way. So some examples of data that you probably already have, and if you don't, could easily implement, and a lot of folks might have, and we'll have a poll next to talk about this: intake forms, so forms that you might do generally that take down information, demographics, issues, address, and so on about different clients or people that you're serving; and follow-up and referral, so where is this person going, or what referrals did you provide them, and how many? So there could be two parts of that.
Some of it's descriptive, saying, for example, the types of referrals that you're giving out most often: we're primarily giving referrals to the food bank to get food, and that's really interesting; I don't know why we're giving so many referrals to the food bank right now. Or you're looking at it and going, wow, our referral numbers increased. On average monthly, we were giving each patient five referrals; now we're giving them eight to twelve referrals. And so there are different ways to look at some of these data that are descriptive from a word standpoint and then descriptive numerically. You can look at the types of services provided, and it might be helpful to look at this by grouping. So maybe you're providing those that are younger, 18 to 24, certain types of services compared to those that are older. That helps you better understand your clients and the types of partnerships and other things you might need to address related to different demographic groups. You can implement client satisfaction or experience surveys. Some people do this regularly, some do it on occasion, and some don't do this at all; they might just follow up or call people and ask them about it. But you can also collect information this way by giving everyone who comes into your place, after they leave, a link to take a quick survey that gives you some feedback on the program. And then organizational information. People don't realize how important some organizational information is, especially to thinking about getting funding or showing your reach.
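As a minimal sketch of the two calculations just described, average referrals per client and service types grouped by age band, here is one way it could look in Python. The field names and figures are invented for illustration; in practice they would come from your intake forms or case management system.

```python
from statistics import mean
from collections import Counter, defaultdict

# Hypothetical intake records; the fields and values are made up.
clients = [
    {"age": 19, "referrals": 5,  "services": ["counseling", "food bank"]},
    {"age": 22, "referrals": 8,  "services": ["counseling", "legal aid"]},
    {"age": 41, "referrals": 12, "services": ["housing", "food bank"]},
    {"age": 56, "referrals": 7,  "services": ["housing"]},
]

# Average referrals per client -- the kind of number that would flag
# a jump from ~5 referrals per person to 8-12.
avg_referrals = mean(c["referrals"] for c in clients)
print(f"Average referrals per client: {avg_referrals:.1f}")

# Tally service types within age bands (e.g., 18-24 vs. 25 and older).
by_group = defaultdict(Counter)
for c in clients:
    band = "18-24" if 18 <= c["age"] <= 24 else "25+"
    by_group[band].update(c["services"])

for band, counts in sorted(by_group.items()):
    print(band, dict(counts))
```

Run over a real export, the same few lines give you both the numeric descriptive (the average) and the word descriptive (which services dominate in which group).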
So think about who's coming to your meetings, what meetings you're going to in the community and with partners, how many you're attending, the staffing, and what type of funding or donations or events are coming out of these different community partnerships and community events that you're doing, because then you're able to show your reach, show who's working with you, and show impact at the community level. So for the next poll, if you all could take this, we want to know what types of agency data you currently analyze or use. Do you use any of this regularly? And I'm not thinking about what you typically report back to OVC or OVW or some of your funders, but do you use any of this to analyze things within your agency, to look at it, to evaluate how you're doing? So intake forms (and check all that apply), the types of follow-up and/or referrals that you're giving, services provided, client satisfaction or experience surveys, organizational information, or none of the above. OK, here we go. OK, so it looks like all of you are using some type of data already, whether it's intake forms. It seems like most of you are implementing some type of client satisfaction or experience survey, either regularly or at some point. Some of you are tracking services, organizational information, and other things. So that's great that you're already using this. Hopefully what you can think about today is what you're already compiling, and then how to take it a step further or use it, or what else you want to look at. OK, and then other data that you might already compile: you're already compiling data in a different way that you might be able to pull out to use for different purposes, whether it be for funders. So what are you providing to the funder, either metrics that you're reporting back to funders or what you're putting in an application for funding?
Your annual reports: most agencies will do some type of annual report, or your services or information will be part of an annual report for a broader agency. Internal assessments and meetings: are you providing internal or organizational information at internal assessments or meetings? Are you doing internal audits? You're probably collecting information about efficiencies, effectiveness, or staffing, so that type of material could be important. And then the types of training and technical assistance. It's also important to show that we're actively trying to be trained, using evidence-based methods, and obtaining technical assistance. So even counting the number of people trained, how often you're getting or requesting technical assistance, and the types you're getting is important to show that you're actively engaged in the field in doing this work. And then some examples of community data. We don't have a poll for this, though now I kind of wish I did, to see what other types of community data you use. What's great is that there are ways to get other types of data, directly related to what you're doing in your agency and to your overall goals and processes. Some of this is more criminal justice and systems-related data, so you may have partners there. When we're thinking about Dr. Campbell's toolkit in particular, she was looking at criminal justice outcomes: how did a new SANE program impact case processing and that type of thing? So some of the data they were looking at was arrest and case processing data, case outcomes, and case files. But there could be other things that might be helpful for you. If you really want to make a problem statement, you might have to say, we need this because these are the calls for service for sexual assault in our community, these types of things are happening, and we don't have adequate support to provide this.
So you might be able to get some information about crime rates or calls for service. You might be able to get health care data, depending on different restrictions; you might be able to get aggregate information about who's being seen and the types of health issues that might be happening in your community. Social services data: when I think of social services data in the context of gender-based violence, I'm thinking about data that might be related to adult protective services, child and family services, and general social services. So you might want to capture some of that, depending on whether it's of interest or related to your population or demographic. And then at the top we have the census and the American Community Survey, so just having an understanding of your community. This is something really simple. When I was working with some communities in Texas, they were trying to say, we have so many people that we're serving, we can't meet these needs. And so one of the things we did was just look at population growth over the past five years. That was a really simple thing to look at, just how much the population in that county had grown, but then they could turn around and say to funders, hey, our population, especially in this age range, has grown dramatically over the past five years; we need more money and funding to provide these types of services because we can't handle the amount of caseload that's coming in. And so sometimes there can be some really simple solutions with data that's already available and pretty accessible. We'll talk a little bit more about census data. So you might be thinking, well, this is great, but I don't know how to get this data.
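The population-growth example above is simple arithmetic once you have two census or ACS figures. A quick sketch, with made-up numbers (the real figures would come from the Census Bureau for your county):

```python
# Hypothetical county population figures for illustration; substitute
# actual Census / American Community Survey estimates for your county.
pop_five_years_ago = 210_000
pop_now = 265_000

growth = pop_now - pop_five_years_ago
pct_growth = growth / pop_five_years_ago * 100

print(f"Population grew by {growth:,} people ({pct_growth:.1f}%) over five years")
```

The same percent-change calculation works for any subgroup (say, residents aged 18 to 24), which is how you would back up a claim that demand in a particular age range is outpacing your capacity.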
Sometimes you might have good relationships with your systems providers and they'll share the data with you through some kind of data-sharing arrangement. But you also can try to get these data a different way, through Public Information Act requests. At the federal level we have something called the Freedom of Information Act, where people can request anything. You could request to see successful grant applications; you could submit a request to the Office for Victims of Crime, the Office on Violence Against Women, health services, the FIPS office, wherever, to ask to see successful proposals to guide what you want to do. Same thing at the state level: you can ask for the same types of things from your criminal justice system. You can get case files; they might redact some information. But if you're saying, you know what, we're not seeing any of our sexual assaults moving forward or being prosecuted, you could do a Public Information Act request for some case files to look at that, to look at case processing if that isn't open in your state. So there are different ways to go about doing this and writing that request, but the public is guaranteed access to things within reason. And we actually do that a lot with our own research. For one study I'm working on related to homicide, we're doing Public Information Act requests to better understand intimate partner and familial homicides, because the way the department asked us to get the data was through a Public Information Act request, and that's how we're going to be able to look at it. So there are different ways to supplement what you have, and that's just one avenue.
Some of you might have already used that process before, but just be aware that if you are a public entity, then people can also use Public Information Act requests to get information from you. It really relates to public and government entities, so private entities wouldn't be subject to this. All right, I'm just checking the chat to make sure there are no questions. All right, so now, and feel free to put questions in the chat as you have them, and we'll also wait until the end to take them, I'm going to walk through some information about logic models. I'm sure many of you have maybe worked with a logic model or created one. I always hated logic models when I was in direct services. They're extremely frustrating, and even now they sometimes frustrate me, but they're really helpful. I shouldn't say I force my students, but in my community-based research class, an activity I have my students do is to really think through a logic model, because it's so helpful in thinking about not only what you have but what might be missing, and also in helping you think long-term about goal-setting and what you really want to know, which might be important for sustainability. That's ultimately the goal with a lot of these programs: we don't want one-off funding, funding for three years or five years. We really want to be able to sustain these programs and make them a cornerstone of our communities, and one way to do this is developing a sound logic model that can help you track this information over time to see what you're putting into it. A logic model is, first and foremost, a really helpful visual tool. People can see it and can hopefully understand what's happening in it.
It typically includes programmatic inputs, activities, and outputs, and it helps you think about how to measure and track those things. Then it can get at more esoteric things like long-term outcomes: in 10 years, what do we want to see happen? What do we want our program to look like? So it gives you a map to think with, and it can also be really helpful in strategic planning. The other thing I want to say about logic models is that a lot of funding agencies are now requiring you to provide or create a logic model in your grant applications, so it's also an important tool to be thinking about, because whether you have one, and what it looks like, might be evaluated in your grant application. Luckily, the Office for Victims of Crime, through the SANE Program Development and Operation Guide, does have Building a Sustainable SANE Program: Program Goals and Objectives, which is a helpful planning tool. We have the link here; many of you probably already know about it. It's also at the end, and I think Julia might post it in the chat as well. It provides some information, and we'll walk through each piece. For your inputs, you really want to be thinking about the resources required to support the development and implementation of the program. What are all the resources?
And you might want to group this, too, into resources that we have already and resources that we might need. There are ways to make the logic model really simple and streamlined, here are our inputs, and ways to make it multi-tiered: this is what we have, and then a box below, this is what we still need for our inputs. Then for activities, you want to think about the key services provided by the program, which could be anything you're providing through the program. Then outputs, the direct results; really think about immediate impact or direct results. This is where you get at some of your numbers or widgets: we're serving this many people, we're logging this many hours, we're providing this many training hours. Think about all of those, the direct results that you can count immediately; this is helpful in tracking success over the year. And then outcomes. A lot of times people break outcomes into different groups, so short- and long-term, or initial, intermediate, and long-term. Some of these might be your larger things: what do you really want to know about your program? Did it increase prosecution rates? Did it improve healing for survivors? Did it make survivors feel empowered? Were there fewer medical complications? Did people get needed medication? What are the effects of the program that you're thinking of? So, go to the next slide. This is the logic model I'm going to walk us through, and it's one of the examples in the Toolkit for Practitioners by Dr. Campbell, evaluating a SANE program, so I thought this would be a good example to show you all. It relates to the work you're doing, and even if you're not doing SANE or SAFE work, if you're working in a dual-serving agency or doing other gender-based violence work, this still can apply.
You can still build something that looks very similar; a lot of these things apply to many areas in our field. The first thing we're going to look at, and what I'm trying to do is link what things are with how you might measure them, what data pieces you can use to capture some of this. So in this logic model, the first thing is inputs. You might think, what inputs do I really need? But sometimes you might need to show this over time. One of the inputs they have here as an example is positive relations with police and local hospitals to identify and refer survivors. I probably would have phrased this as existing relationships with these pieces, and put "positive" more in the initial or intermediate outcomes, hoping that if you don't have that, it will become positive. But to know whether that's happening, you could do something like a collaboration survey, where you're asking each of the agencies to rate how they feel about their relationship with each other, how strong it is; there are existing measures out there that capture this. You could also measure this by asking who's participating in our sexual assault response team meetings or any of our coordinated response meetings. Do we have representation from all these places? Are we able to work together? This could also relate to who's coming to our trainings: if we're doing community-based trainings for systems providers and community-based providers, who's showing up? Who's engaging in this training with us? For funding, you can look at total funding by type and what it supports, whether it's funding specific to equipment or funding for staff. You may also consider reporting attempts at funding: we applied to eight different grants and we received three.
And so you can show that you're actively attempting to get funds from different places as well. Next we're looking at the activities. What types of things are you doing? In this one, we have provide patient-directed care by treating patients one-on-one, working within the patient's boundaries, adapting to each patient's needs, et cetera. You can get at this through training and debriefing sessions (do you feel like you're doing this?), or maybe by reviewing medical charts, which could be an internal audit or a one-on-one or peer review of different sessions to see what's happening. The other one is convey professionalism to patients. You could get at this through client experience and satisfaction surveys or feedback. You could also get at it through observations and so on, but I know a lot of places don't necessarily want a work environment where peers or supervisors are observing. So thinking about what makes the most sense for your environment and your particular structure is also important, and client experience and satisfaction surveys might be a better fit here than having someone watch someone do the work. Is that comfortable for the survivor? Is that comfortable for the worker? Think about some of those things. Next, outputs: these are the things we can count or get at quickly. This one was written as one big catch-all: are we serving sexual assault survivors of diverse ages, races, ethnicities, classes, languages, religions, sexualities, and abilities seeking medical attention and/or forensic evidence collection? This is really thinking about who's being referred to your program and whether they're representative of the community. And it's helpful for showing who's not being referred, or whether agencies aren't referring to you.
Maybe there are some trust issues, or there needs to be more networking done to build collaboration, mutual respect, and trust with that agency so that they are referring patients over to you. You can look at client demographics and referrals by agency, and then compare that to community demographics or to client demographics at different service agencies. Typically you can share aggregate information between agencies, not individualized information on clients. So you could know that they're serving 50% African-American, 25% white, and 25% Asian women, and we're getting all white women into our program. So what's going on? What's happening here? What do we need to do to better meet the needs of other clients who are in the same situation but aren't coming to us? This can be a really helpful way for people to think about who they're serving, how to expand their outreach and services, and how to potentially be more culturally responsive. Next, these three slides relate to the outcomes. The way they broke this one down, they looked at initial, intermediate, and long-term outcomes. For initial outcomes, you want to think about these being more immediate; maybe you can see some of them within six months to one year. And this is thinking about survivors. The two I highlighted here were survivors feeling that someone cared and believed them, which is something you can gather through client experience and satisfaction surveys and feedback right away, sending that out to them after they leave, having them take it with them, or sending it by mail or online at some point. And then the next one is that survivors will know where to go for help, information, and/or additional services.
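The aggregate comparison just described, your client demographics against a partner agency's or the community's, can be a few lines of arithmetic. A hedged sketch with invented percentages (real figures would be the aggregate shares each agency agrees to share):

```python
# Aggregate (non-individualized) demographic shares, in percent.
# All figures here are made up for illustration only.
partner_agency = {"African-American": 50, "White": 25, "Asian": 25}
our_program    = {"African-American": 5,  "White": 90, "Asian": 5}

# Gap = how far our share of each group trails the partner's share.
gaps = {
    group: partner_pct - our_program.get(group, 0)
    for group, partner_pct in partner_agency.items()
}

# Flag any group we serve at least 20 percentage points less often.
flagged = [group for group, gap in gaps.items() if gap >= 20]
print("Under-served relative to partner agency:", flagged)
```

The 20-point threshold is arbitrary; the point is that a simple gap calculation turns "we're getting all white women into our program" from an impression into a number you can bring to a SART meeting.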
For this, you can think about the number of referrals you've given them for additional information, or the amount and types of resources you gave them. So maybe it's a checklist: these are the things I provided to them and gave them resources for. You also could get at this through an experience survey or a longer-term follow-up survey, to figure out if they did go to any more services or if they knew about them. Because sometimes it's not about whether they went; it's about whether they have knowledge of them. We know that people will seek help when they're ready and when they want to. So I like to see people have more outcomes that relate to victim autonomy rather than "did they do this?" Just them having the knowledge is a big gain for us, because a lot of times when we hear that people didn't seek help, it's because they didn't know something exists, right? So just knowing that they have knowledge of additional services, or were provided information on additional services, is an important indicator. When you think about intermediate outcomes, depending on your timeline, these are post one year, but I wouldn't go beyond about three years; long-term might be more than three or five years. So for intermediate outcomes, you want to think over a year but maybe less than five years. Emotional healing for survivors is one of them, and that can be really difficult to measure. What does that look like? This could potentially be measured really well through qualitative focus groups or longer-term follow-ups with survivors, if you have the time and resources. You might not be able to get this, so some of these might just be: this is what we want for our community.
Maybe another way you can get at this is that you're having more referrals, with people being referred by people who had used your service before, so you know that they appreciate the work. But being able to start to get at some of this through longer-term follow-ups or longitudinal work could be important to help us understand it. And some of this could be borrowed: if there are other studies out there examining sexual assault forensic examination services and providers that have already found that this trauma-informed model or this evidence-based approach leads to long-term emotional healing, then you can say, well, we're using models that lead to this. You might be able to replicate some of the questions they ask survivors, or just know that you're aligned with current best practices. The other box I highlighted here was improved standard of care for sexual assault survivors. Some of this you might be able to measure through questions like: are we keeping up to date on our training? Are we internally reviewing our work to make sure we're providing the level of care that we want? Are we getting technical assistance and taking advantage of it when we can? So there are different ways to look at whether you're at that standard of care, and that's going to depend on how you define your standard of care and what elements you want to address in your review. And then in terms of longer-term outcomes, we're thinking much longer term; this might range from five years on, or sometimes I see these as 10 years. For this, you can use survivors will see long-term improvement in physical health.
So follow-up interviews, or using evidence-based practices shown to result in improved physical health or no negative impacts to physical health, could be ways of measuring this particular area. And then I have question marks here because I was actually surprised to see that this model, although it is really about patient care and psychological well-being impact, didn't include the thought that part of being able to provide good care is making sure you're funded, you have staff, and you're able to maintain relationships and referral networks and so on. So thinking long-term, these are some other things you might consider incorporating into your logic model: what program enhancements you might want, sustained funding, increased staffing, improved relationships and networks. This is a different example of a logic model. They don't all have to look the same, and they can look very different; there are lots of examples online. This is a different diagram laying out the inputs, the activities and outputs, and then the impacts, and at the bottom you see an overarching long-term improved standard of care outcome. All right. And so what might elevating your own agency profile with data look like, and what are some of the steps you need to consider in order to go down that route and digest everything we've said until now? A few things to think about, adding to what Bethany said about what you can do with data: what data do you want to use, in terms of what message you want to send? What's important?
So thinking about what data you want to use: it could be that, day to day, you are observing something, and you think what you're observing is important, important to elevate, important to talk about. If that's the case, consider whether that's qualitative or quantitative data. Then, how do you want to use that data? That's going to be important for a couple of reasons. If you are, let's say, using qualitative data and you want to use that data to do the different things we'll talk about, it will be important to consider how far that data can go and how much of it you're allowed to share. How do you want to use it, in terms of how much time it's going to take you to analyze it and what resources are at your disposal to use it the way you want? And then, who is going to be your audience? Who needs to learn about your data, and why? Let's say you're observing something happening in your workplace and you want to publicize it, either to raise awareness or to change it. That's going to be very different from advocating for something, for example, raising awareness about the services you're providing. Those differ in how your data is disseminated or advertised and in who your audience is. Similarly, if you are raising awareness about something to the community, the medium in which your data is presented will be very different than if you are trying to advocate for something with your funder. For the funder, that might look like a long report with very specific quantitative numbers, as opposed to a flyer or infographic for the community that highlights a few key numbers or qualities of your data.
And then, thinking about elevating your agency: we talked a little bit about data you already have, so you might not need to collect new data but instead use data you're inadvertently already collecting in a new way, or you might need to consider collecting new data or collecting it a different way. Perhaps you're collecting some kind of data already, but only in a quantitative manner, and you'd like to add some qualitative questions to what you're already collecting. For example, if your intake forms are primarily qualitative, do you want to also include a couple of quantitative questions, or vice versa? Examples of ways you can elevate your agency profile, and Bethany mentioned a few of these already, include informing the design of new programs and services or improving existing services. To Bethany's point earlier, we don't want to intimidate you with huge overhauls; rather, in what ways can you use the data you already have, or data that might be easy to collect, to improve existing services, for example, or to identify staff training or supervisory needs, all the way to increasing or maintaining funding, or spreading awareness about the services you're providing or about changes you're making that are being effective? So, one way to distill some of the changes you're seeing or making would be translating them into toolkits for other agencies or practitioners. That's a way of sharing with the field that is different from more academic avenues, such as conferences or publishing your work in a journal. Elevating your, oh, sorry. Nope, yep, we can stop there, yep. Thanks. Sorry, are you finished? I can go on.
No, go for it. Yeah, absolutely. I thought you were finished, but maybe I was wrong. Okay, so our final poll is: have you ever worked with an external researcher before? This isn't someone who works in your agency as a researcher, but someone external, at a university or a firm or something like that. Okay, so half and half: about half of you have worked with a researcher before, and half haven't. You know, I often hear about people having really bad experiences with researchers, and some people having really good experiences. So we wanted to talk to you a little bit about how to know whether you might need a research partner, especially if you don't have an internal person allocated to you. Some larger organizations might have someone who works with them internally; I think we had someone on here who did compliance or assessment, so that might be someone who does some work to support looking at numbers for folks, or some people might have an evaluation person on their staff. But if you don't, some of the questions to ask when deciding whether you might need a researcher are: Are there key questions that I can't answer on my own, where I need someone to help me answer them? Am I required to produce evidence or demonstrate that I'm an evidence-based program? If so, that typically means you need someone external to provide an objective evaluation or lens to show that it's an evidence-based program. Do you as an agency want to engage in larger research programs or be part of a larger evaluation? Then you might want to seek out opportunities to be a site for a larger project, especially if you feel like you can contribute because you think your program is really great, or you can gain additional resources from being part of the research project.
Like I said, you might lack the staff expertise to examine some of the internal data. And this can be simple. I do this a lot, and I think Julia does too with community partners: sometimes they're just like, I need help. I need to answer these things and I don't know how to make sense of it, or how to put it in some sort of form. I have this question; how do I answer it with these data? Sometimes it's just a short consultation with someone to figure out how to look at the data. And, like I said, there may be a need for an objective and independent approach, which is sometimes required depending on who your funder is or what you want to do. Having someone external evaluate or do some of your research for you, or analyze some of your data, provides more evidence backing for the program as well. It's one of those things where, if you produce the data yourself, maybe it's not seen as being as objective as if someone else had produced the same data. I don't know if I did a good job explaining that. When thinking about identifying potential research partners, some of the first things you might ask are: Do I have preexisting relationships with folks? Are there people I like to work with, and people I don't? Is there a local college or research group in my community? Do I know someone who worked with a researcher who could refer me to someone? You want to make sure there's commitment on both sides, that you're committed to them and they'll be able to commit to you, and that there's mutual respect. One of the important things is making sure they're interested in the topic and the agency's mission. I think we researchers do a better job when we're actually interested in the topic and appreciate and care about the agency's mission.
And you want to make sure the research partner is going to be communicative and inclusive of you as their practitioner partner, and that they keep you involved and apprised of what's happening. One of the worst things we hear a lot from practitioners is where a researcher comes, gets their data, and then the practitioners never hear from them again; that researcher goes and publishes a paper, gives them some limited information, and doesn't explain it to them. Those are really horrible situations. Luckily, I don't think they happen as much anymore, but it's definitely something that we've seen, that I saw a lot at the federal level and also doing community-engaged work, working with communities and partners that have been burned by folks like that in the past. Then think about what a research partner may expect from you. Most likely, researchers are going to want the ability to publish from the data or from the project. Is there a way for them to do that? I often publish collaboratively with my practitioner partner if they want to be on a paper, or maybe we're working on more of a white paper or something like that. But it's really up to the agency. I've been on projects where you can't use the data for research, and that's fine too. But you want to lay that stuff out front and center, and you want someone who's transparent about what they might want to do. They might need access to data or clients to do the work. So if we're doing qualitative work, or we're going to do surveys, they might need access to those people, and might need your help in getting access to and referring those people. Typically, you're not able to simply give us a list of people you saw, because that's illegal, right? We don't want that. So sometimes it's having you be the person who asks someone, hi, we're working with a research partner, are you willing to talk with them, can I send your name on, and getting consent from that person.
That also is part of the Violence Against Women Act confidentiality practices and procedures that a lot of grant funding is subject to; you want to make sure that that research partner understands those and can work within those confidentiality provisions. A research partner will expect that you're going to participate and give feedback on materials in a way that's objective. A lot of times I might ask for feedback on: am I asking the right questions? Is this getting at what you want? Because ultimately, you're my client, and I'm working with you, and I want to make sure I'm giving you what you need and want. Research partners may ask for financial compensation, and that's something you'll want to think about at the beginning, because a lot of places aren't able to financially compensate. And so maybe there's a different arrangement: maybe it's okay if I don't get financial compensation because I can publish a paper on this instead, or maybe there's a way to write the person in to get some financial compensation. So you do want to think about this when you're writing grants. If there's a call for doing some sort of research or assessment, you do want to think about budgeting, and some of that money might need to go to providing incentives for participants or survivors to participate in a research study, or to getting certain data, software, or other types of things that aren't necessarily income for the researcher. And then they might want a recommendation to other agencies: if they did a good job, they might want referrals to other types of agencies they can work with. And then I think Julia's going to, oh, there we go. All right. So let's say we've convinced you, you're ready, you're going to collect all the data, and you really have all these burning questions about your own practice and your organization.
What are some ethical considerations that are important to keep in mind as you embark on this new data-driven journey? First of all, it's very important to ensure that data are not identifying. What we mean by that is many things. We've all heard of the idea that it's very important to maintain the privacy and the anonymity of a person who's completing your survey or from whom you're collecting data, but there's a different way data can be identifying. For example, suppose you are in a small organization and you're collecting data to assess, say, nurse satisfaction within your organization, and you were then to report on those numbers and say, in this department, they reported this level of satisfaction, but that department only has two nurses in it. That can inadvertently be identifying in a way that you wouldn't want. So you want to be very intentional when you are collecting and displaying data to make sure that you're not indirectly identifying possible members of your organization or members of your community. Similarly, if you are going to report, let's say, that this past year you provided services to 500 people and two of those people were from a particular marginalized community, but in your community there are only two people who identify as such, that would be a different way that the data are indirectly identifying. So being very mindful of how data can be indirectly identifying is one ethical consideration. Another will be to consider whether you need approval from an IRB, or Institutional Review Board. Now, there is such a thing as a private IRB, but unfortunately that costs money.
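[Editor's note: the small-cell problem described above is often handled in practice with "small-cell suppression," where any group count below a minimum size is masked before a report goes out. The sketch below illustrates the idea; the department names and the threshold of 5 are hypothetical examples, not from the webinar, and your own policy or funder may set a different minimum.]

```python
# Illustrative sketch of small-cell suppression: before publishing
# grouped counts, mask any count below a minimum cell size so that
# very small groups (like a two-nurse department) can't be used to
# identify individuals.

MIN_CELL_SIZE = 5  # hypothetical threshold; choose per your own policy

def suppress_small_cells(counts, min_cell=MIN_CELL_SIZE):
    """Return a copy of {group: count} that is safer to publish:
    counts below min_cell are replaced with a suppressed marker."""
    return {
        group: (count if count >= min_cell else f"<{min_cell}")
        for group, count in counts.items()
    }

# Example: survey responses per department
responses_by_dept = {"Emergency": 41, "Pediatrics": 12, "Night shift": 2}
print(suppress_small_cells(responses_by_dept))
# The two-person "Night shift" cell is reported as "<5" rather than 2.
```

The same masking should be applied to any cross-tabulation you display, since a small cell can reappear once you break a large total down by department, shift, or demographic group.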
That could be one pro of partnering with a research collaborator: they might be at an institution that already has an IRB, and that might save you a little bit of time. That is something to consider as well. And then there is the more traditional aspect of confidentiality and privacy when we think about data collection. When we talk about privacy, we talk primarily about a person's privacy: the extent to which a person is able to control how much others know about where they've been, who they're seeing, who they interacted with. For example, say you are collecting some data in your institution or organization, and everyone knows that you're the person in charge of collecting that data. Then if a colleague of yours is seen entering your office, that might be considered a breach of privacy, because others might know that, oh, this person is entering this office, so we now know they're participating in this research. That would be an example of privacy, which is different from confidentiality, which is more about the information: protecting access to the information that this colleague or research participant has provided. And that's more traditionally what we think about when we think about confidential research practices, which is, for example, keeping the data in a locked cabinet, in an office that is also locked, or on a hard drive that requires a password. So these are the ethical considerations about confidentiality and privacy that are really important to think about, especially if you are collecting data in your organization or in a smaller community where these things are easily breached. Take the privacy example: if you are in a small community and you might be wearing several hats, it might be really hard to maintain somebody's privacy just because of the nature of the space and resources that you might have.
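[Editor's note: one common way the confidentiality principle above is put into practice is to separate identifiers from responses: survey answers are stored under random study IDs, and the name-to-ID "key file" is kept in a separate, locked-down location. The sketch below illustrates this; the names and ID format are hypothetical examples, not from the webinar.]

```python
# Illustrative sketch: store responses under random study IDs, and keep
# the mapping from names to IDs separately (e.g., encrypted or locked
# away), so the response dataset alone does not reveal identities.
import secrets

def assign_study_ids(names):
    """Map each participant name to a random, non-guessable study ID."""
    return {name: f"P-{secrets.token_hex(4)}" for name in names}

# Key file: stored separately from the data, with restricted access.
key_file = assign_study_ids(["Participant One", "Participant Two"])

# Shared/analyzed dataset: contains only study IDs, never names.
responses = {key_file["Participant One"]: {"satisfaction": 4}}
```

If the key file is ever no longer needed (for example, once follow-up contact is complete), destroying it converts the dataset from confidential to effectively anonymous.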
And then when we think about collecting data, it's very important to be transparent about the purpose of the research and how it's going to be used. For example, if we're again assessing workplace satisfaction, it'll be really important to explain in a transparent manner how that information is going to be used, because it's very possible that those completing the survey don't want any negative repercussions from being honest about their experience. Similarly, for patients who are coming in and completing different surveys, like a satisfaction survey or any other survey, be very transparent about the fact that the questions being asked are not at all linked with the services or the medical services that they'll be receiving, or any other possible services they may receive by coming to you; it must be very clear that those services are not linked to your data collection. For example, if they're somehow involved with the legal system, make sure you're being very transparent about how that data is going to be used and whether it's going to go back to other services. And hand in hand with that, it's really important to consider workplace hierarchy. So again, if we're using the example of organizational workplace satisfaction, there are different ways of collecting data; one of them would be in a group setting as opposed to an individual setting. If you are collecting data as a focus group or in a group setting, it'll be very important to be cognizant of workplace hierarchy so that you're not asking different nurses to report on their satisfaction in the presence of their supervisor, for example.
And then similarly, as you are disseminating the data or reporting back on the results, it'll be very important to be cognizant of workplace hierarchy and how there might be repercussions of the reported results on the different employees or nurses who provided that feedback. In the same way, it's really important to consider power dynamics when you are administering these surveys. For example, if you are reaching out to the community or to different patients and asking them about their satisfaction or their experience or what have you, consider the power dynamics that come into play when you are asking those questions, and whether they might be self-conscious or highly aware of the power dynamic they perceive you having, even if you don't. So even though you might be in the position of being a medical provider or a SANE nurse, they might perceive you as being linked with the legal system in some way, and things like that. Those are other ethical considerations, especially when you think about communities that are immigrant communities, non-English speakers, or marginalized populations who might have had different experiences with service and legal provision. Great. I think, Shay, this might be a good place to stop, because I know we're at 4:20, and a lot of the rest of the material we have is more information and resources with hyperlinks that we can send out. The next section we were going to cover was examples of how to display your data, then how to access census data, links to different guides on logic models, working with researchers, and sharing your data, and also links to the SANE toolkit guide by Dr. Becky Campbell that provides information about pre- and post-tests. So I think this is probably a good place to stop. I can go to questions or put up our contact information and see if there are any questions.
I didn't monitor it. I mean, I monitored the chat, but I didn't see any come through as you were talking. So it would be a good time, if anybody has questions, to put them in the chat, or you can raise your hand and we can unmute you if you want to ask. It doesn't look like anybody's raising their hand. So again, we'd like to thank both Dr. Olomi and Dr. Backus for presenting on this really informative and important subject today. I learned a lot myself. Thank you for bringing it to us, and for the time it took to present, but also to prepare. If anybody does have any questions, the contact information is on the slide right now, but we'll also be providing the slide deck as handouts once we put this up in the learning management system and on the SAFE TA website. So again, thank you for presenting, and thank you to the OVC for funding this project so we can bring it to our partners and our sites.
Video Summary
The webinar, presented by Dr. Bethany Backus and Dr. Julia Olomi of the University of Central Florida Violence Against Women Faculty Cluster, focuses on the importance of using data to inform decision-making and improve services for survivors of violence against women. They explain the difference between quantitative and qualitative data and provide examples of various types of data that agencies can collect and analyze. The process of analyzing qualitative data is also discussed, emphasizing the steps involved in organizing, reviewing, and refining the data. The presenters highlight the value of consistency and of combining quantitative and qualitative data to gain a comprehensive understanding of the issues. They also encourage agencies to consider collecting community data to better understand the needs of their communities and advocate for additional resources. The ultimate goal is for agencies to recognize the value of data and how it can be used to improve services, evaluate programs, and advocate for change. Ethical considerations and collaboration with researchers are stressed, along with the importance of sharing data with stakeholders. Resources for accessing census data, developing logic models, and working with researchers are also provided.

Credits: Dr. Bethany Backus and Dr. Julia Olomi, University of Central Florida Violence Against Women Faculty Cluster.
Keywords
webinar
data
decision-making
services
survivors
violence against women
quantitative data
qualitative data
data analysis
community data
ethical considerations
collaboration
International Association of Forensic Nurses
6755 Business Parkway, Ste 303
Elkridge, MD 21075