Blog

Try “Implegration”: Tool #6 in our “Tools for Implementation” Series

In our last edition, we explored Communities of Practice, which leverage the expertise developed when each member of an ongoing learning group cultivates his/her MI skills, or any other evidence-based skills. Because Communities of Practice are peer-led, each member is motivated to invest in ownership of the group, and to deepen skill proficiency in the process.

This is where the term “implegration” comes in – a term coined by Carl Ake Farbring in Sweden. As staff develop their individual expertise with new skills, along with expertise on what does and doesn’t work for skill development, they become a rich source of ideas for creating engaging ways to support, master and integrate EBPs into practice. In other words, “implegration” invites you to actively integrate staff ideas into the implementation process.

Tremendous potential immediately on tap

Staff participants in EBP implementation have tremendous untapped potential for improvising solutions when it comes to implementation issues and/or roadblocks. Who better to generate ideas for solutions than those who are actively experiencing the implementation process! The beauty of implegration is that, just like a Community of Practice invites ownership of the implementation process on the micro level, implegration invites ownership of the implementation process on the macro level by encouraging participant input and ideas for overall implementation success.

Internal vs. outside experts

Rather than going to outside “experts”, look to internal experts by setting up focus groups to explore staff’s ideas for addressing whatever hurdles are presenting themselves in your implementation process. Ensure that these focus groups provide a safe environment for idea-sharing, perhaps by encouraging dyad or triad conversations to brainstorm ideas, then sharing those ideas with the rest of the group. Ideas should include not only potential solutions, but also why, from experience, staff think each solution might work. If there is already an established Community of Practice, that same group might serve as the focus group, or a peer facilitator who is particularly excited about the skill might be designated to convene a group (for instance, if the focus group is about MI implementation, a staff member who is an “MI champion” and has developed a natural enthusiasm for the skill).

Some examples of “Implegration” topics

To make this concept real and tangible, here are some examples…

A seemingly small, yet quite impactful example:

Each week, one officer puts a new and different post-it note on her computer to remind herself of a different MI skill or technique she would like to practice during her office visits. When this officer shares the “set of cards” she has developed over time, a whole new tool is introduced to accelerate everyone’s learning. In addition, the tool has already been tested and proven helpful.

An example on the group level:

A group of officers, joking around in the break room one day, creates a lunchtime game: in order to open the refrigerator door, you’ve got to answer an MI skill question posted on the door, then replace that question with a new one. They find it a quick way to keep themselves thinking about the skills, and the element of curiosity (i.e. “I wonder what’s gonna be on the door today”) keeps the process fresh and interesting. The officers share the game with the larger implementation board, which then offers it as a new tool for officers across the organization.

An example on the organizational level:

The implementation team, interested in using an “implegration” approach, convenes a focus group of participants in an MI roll out. During the meeting, they discover that participants are having a hard time completing their required recorded training sessions because the particular clients needed are only available during times when the participants have been assigned other organizational responsibilities. After some brainstorming, the staff participants come up with a solution that the implementation team could never have identified on its own – in fact, without creating the opportunity for an implegration meeting, the team would not even have known the problem existed.

Summary

Whenever new norms are established that support new practices, and those new norms come from the inside out, implegration is present, and the process of implementation has deepened itself by actually walking its talk – implementing with real attention to the ultimate integration of the innovation.

If you’re interested in talking more with us about all things implementation, we have several programs that support EBPs. Please explore the rest of our website, or contact us at 303-544-9876 or lisah@j-sat.com for more information.

Communities of Practice: Tool #5 in our “Tools for Implementation” Series

Communities of Practice are not just your average work meeting

In our last edition, we explored the value of Decision Support Databases to gather performance assessment data and watch for big over-arching goals such as achieving competency and long-term skill sustainability. In this edition, we take the next step into Communities of Practice – the first “tool for implementation” that engages the power of group learning rather than simple individual feedback.

A Community of Practice, or CoP, is an informal, peer-facilitated gathering that convenes regularly to continue learning and practicing skills that were acquired during training and follow-up coaching sessions. Another way to define CoPs is with the following three criteria: 1) an informal group relatively committed to the joint enterprise of learning a skill, 2) mutual engagement (i.e. no hierarchy), and 3) a shared repertoire of techniques, terms, skills, tools and resources. A CoP can focus on any kind of skill development, Motivational Interviewing being just one example.

CoPs are often the difference between average and accelerated skill learning. They bring individual skill development to a group context, resulting in a multiplier effect for skill development. Once you’ve seen the powerful effect that CoPs can have in the implementation process, as we have here at J-SAT, you’ll never go back to implementation without CoPs in the mix.

At first glance, Communities of Practice can look like just another training or work meeting, but they are actually an entirely different animal. In fact, the success of a CoP depends on management and participants understanding the key differences between a CoP and a typical training or work meeting!

The key phrase to remember is “peer-facilitated”

The descriptor phrase “peer-facilitated” appears in virtually every definition of a CoP. The irony is that this key phrase is lost on many implementation teams. Because of our culture’s norms about how to “run a group”, the usual approach to a CoP is to designate a group leader who then makes sure the CoP happens. More often than not, this person becomes the one in charge of everyone else’s learning – facilitating the skill practices for each meeting and perhaps even giving mini training sessions as part of each CoP get-together. This misses the whole point of having a CoP! It is an easy trap to fall into because we are so used to gathering in groups with some sort of leadership in place while everyone else sits back and consumes the training, guidance or direction the leader provides. But that is passive learning, and it gets old pretty quickly. So let’s take a look at what a CoP can be instead…

No one person in charge of everyone else’s learning

This is the big shift to make when forming and norming a CoP. At J-SAT, we just created a whole manual on implementing and sustaining thriving CoPs, and the biggest “aha” in the manual describes this very concept – make sure that no one person is in charge of everyone else’s learning. The key to making your Community of Practice a thriving, engaging, generative and (dare we say) fun learning experience is to continually give each participant the opportunity to bring his/her ideas, input, creativity and expertise to the group. This means that topic facilitation regularly rotates from one participant to the next, and rather than a group leader, there is a group coordinator who simply makes sure the group meets but is not in charge of everyone’s learning.

If you always orient CoP decision-making from the perspective of “no one person in charge of everyone else’s learning”, the group will not be in danger of slipping into a mind-numbing teacher-and-listener dynamic, and each participant will instead feel ownership of the group and a sense of playing a key role.

The reason this leads to true engagement

Because the key to CoPs is this “from the bottom-up” rather than “from the top-down” approach, participants have a sense of their own power to influence their development, as well as the development of others. Participants also experience that their presence and participation in the group truly matter, and are therefore more likely to show up and actively engage. The more informal nature of a CoP allows for more spontaneity than typical office meetings, and there is room for creative, “out of the box” thinking to stimulate learning, growth and genuine interest.

Our director, Brad Bogue, wrote an article describing how the size and flattened authority of CoPs in some ways conform to ancient network patterns that are easily understood by everyone. Akin to hunter-gatherer times, CoPs band together groups of 7–15 people, a size that capitalizes on small-group dynamics. The individual members who sustain the group in turn draw strength from it. At the same time, CoPs (i.e. small groups of people banding together) can and do support not only one another, but the larger tribe or agency. What people learn individually through practice, feedback reports and coaching is readily transmitted and shared within CoPs, and the result is a significant multiplier effect in learning.

At J-SAT we’ve been experimenting with Communities of Practice for quite some time, and have created a comprehensive manual on what our experience shows works best. We love to talk about it, so if you’re interested, please contact us at 303-544-9876 or lisah@j-sat.com for more information.

Tool #4 in our “Tools for Implementation” Series: Decision Support Databases

What’s your Decision Support Database?

In our last edition, we explored the value of performance assessment, and how the timing of gathering data and the way in which the feedback is presented are key factors to success for your overall training implementation and for your training participants.  In this edition, we take the next step into decision support databases, which use the accumulated performance assessment data to discern when individuals achieve competency or partial competency and overall training goals have been met, especially when sustainability is of significant interest.

Most importantly, decision support databases help you do just that – make decisions.  Teamed with the coaching and performance assessment elements that we shared in our previous editions of our “Tools for Implementation” series, you can refer to a decision support database on a regular basis to check in with the long-term story of training implementation and to make course corrections along the way, if needed.

A simple version of a decision support database

The term “Decision Support Database” is quite a mouthful and can sound rather daunting to the average person.  Let us de-mystify it for you:  you have a decision support database even when you track just a few simple performance measures in a simple Excel spreadsheet. For instance, if you are implementing Motivational Interviewing training, an extremely simple version of a decision support database might involve keeping an Excel sheet that tracks the numbers of open and closed questions that participants use…

Over time, you can watch this data in your simple decision support database (Excel sheet) to determine whether folks are making the shift from using closed to open questions.  This useful “meta-information” then guides you in decisions, like what kinds of practice exercises to offer when the training group meets for a skill refresher – if the overall meta-data says that participants are primarily using open questions rather than closed, you could move on to another skill practice, like increasing reflections.  But if your meta-data shows that closed questions are the primary kinds of questions used, then you can focus your skill practice to strengthen ease and familiarity with asking open questions instead of closed.
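To make this concrete, here is a minimal sketch of that same “Excel sheet” logic in code. The officer names and question counts below are invented for illustration; in practice these numbers would come from your spreadsheet or coding reports.

```python
# A minimal sketch of the "simple decision support database" described
# above: track open vs. closed question counts per participant over
# time, and compute the share of questions that were open-ended.
# All names and counts here are hypothetical.

# Each record: (participant, month, open_questions, closed_questions)
records = [
    ("Officer A", "Jan", 4, 16),
    ("Officer A", "Mar", 12, 8),
    ("Officer B", "Jan", 6, 14),
    ("Officer B", "Mar", 15, 5),
]

def percent_open(open_q, closed_q):
    """Percentage of all questions asked that were open-ended."""
    total = open_q + closed_q
    return round(100 * open_q / total) if total else 0

for who, month, open_q, closed_q in records:
    print(f"{who} ({month}): {percent_open(open_q, closed_q)}% open questions")
```

Watching that percentage rise (or not) from one month to the next is exactly the “meta-information” described above: it tells you whether to move on to another skill practice or keep working on open questions.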

More complex decision support databases

Knowing how folks are doing with open and closed questions is good information, and if you are implementing for sustainability, you will likely find yourself wanting to track more than that!  That’s when your decision support database becomes more complex and tracks multiple layers of information.  For instance, instead of just tracking open and closed questions, you might also track all of the skills involved in reaching MI competency (e.g. open questions, closed questions, complex reflections, affirmations, no advice-giving, reflection-to-question ratio) so that you can see how your participants are faring in reaching an overall level of skill.  In this case, you would likely employ an Excel sheet that generates graphs to visually show you all kinds of relationships between the different data elements that you have gathered.
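As a rough illustration of the multi-measure tracking described above, the sketch below checks a participant’s scores against a set of competency cutoffs. The threshold values and measure names here are made up for the example – they are not official MITI benchmarks.

```python
# Illustrative only: check several tracked measures against competency
# thresholds. The cutoff values below are invented for this sketch,
# not official MITI benchmarks.

thresholds = {
    "percent_open_questions": 50,        # hypothetical cutoff
    "percent_complex_reflections": 40,   # hypothetical cutoff
    "reflection_to_question_ratio": 1.0, # hypothetical cutoff
}

def competency_report(scores):
    """Return, per tracked measure, whether the score meets its cutoff."""
    return {name: scores.get(name, 0) >= cutoff
            for name, cutoff in thresholds.items()}

# Example participant scores (also invented)
scores = {"percent_open_questions": 62,
          "percent_complex_reflections": 35,
          "reflection_to_question_ratio": 1.2}
print(competency_report(scores))
```

A report like this, aggregated across participants, is what lets you see at a glance which skills the group has mastered and which ones need more practice time.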

The information gathered in a more complex database guides you to make larger decisions about what to implement next, based on how participants are doing and your overall goals for implementation. For example, in one state-wide project for scaling up MI (50+% of the officers achieving MITI-3 competency thresholds), a subset of officers expressed a great deal of frustration at one point.  These officers had submitted many tapes for independent ratings and were becoming discouraged that they still weren’t able to achieve competency. When coordinators looked at the data that emerged from centrally collecting all the MITI-3 measures, along with other coding, training and coaching information for hundreds of officers, a pattern was detected that explained a good part of what was going on. The officers who had turned in the most tapes tended also to be the officers with the fewest phone or face-to-face coaching episodes. In other words, some officers, in their eagerness to achieve competency, were submitting one tape after another without undergoing much coaching. Because the coordinators had a decision-support database they could refer to, once this pattern was detected the remedy was relatively simple… provide a better balance of coaching and coding.

The range of examples of what a decision-support database can assist with is really extensive. Anywhere fidelity measures or performance assessments are taking place, feedback and skill development are invariably involved as well. This makes for a very dynamic and seemingly chaotic system context – like a popcorn machine with an open top – unless a database is quickly established for tracking things. If the time and location of each corn kernel is recorded before the heat is turned up, again when each kernel pops to its highest point, and again when it lands, you have the basis for aggregate data from which many patterns can be detected. When one sector (e.g., residential community corrections versus parole) requires a much longer cycle to achieve fidelity for a program, it’s time to look at the data and see what it says. Or if you have a group of people coding tapes and you want to see whether they have similar inter-rater reliability rates, pulling aggregate samples of each coder’s profile will quickly show you where the group’s strengths and weaknesses are.
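The inter-rater check mentioned above can be sketched very simply: compare each coder’s codes on a shared sample against a consensus rating and compute percent agreement. The codes and coder names below are invented for illustration, and percent agreement is just the simplest possible reliability statistic, chosen here for clarity.

```python
# A rough sketch of an inter-rater reliability check: compare each
# coder's codes on a shared sample of tape segments against a
# consensus rating. All codes and names here are hypothetical.

consensus = ["open", "closed", "reflection", "open", "affirm"]
coders = {
    "Coder 1": ["open", "closed", "reflection", "open", "affirm"],
    "Coder 2": ["open", "open", "reflection", "closed", "affirm"],
}

def percent_agreement(codes, reference):
    """Percentage of segments where the coder matched the consensus."""
    matches = sum(a == b for a, b in zip(codes, reference))
    return round(100 * matches / len(reference))

for name, codes in coders.items():
    print(f"{name}: {percent_agreement(codes, consensus)}% agreement")
```

Aggregating a profile like this for each coder quickly shows where the coding group’s strengths and weaknesses lie, just as the paragraph above describes.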

Building decision-support databases can’t be done very effectively through large Management Information System (MIS) technologies.  Getting a ‘job ticket’ and waiting through all the bureaucratic committee processes takes too long. Fortunately, all that is necessary is an Excel spreadsheet and a little moxie to get started. Over time the data and the spreadsheet can be refined and better defined, and formulas can be utilized to automatically graph and report various indices from the data. The key is to start throwing the spaghetti at the wall – gathering the data in ways that are eventually apt to be meaningful.

The difference when decision support databases are and aren’t used

Without a decision support database to guide your choices, you will still likely see outcomes from learning a new skill like MI: staff feeling better about some aspects of their jobs, having fewer conflicts with their clients, or having deeper insights about what is really going on with their clients.  Your agency may experience some beneficial changes in the norms for interactions with clients.  All of these are desirable and positive changes, and using a decision support database can take your overall outcomes to the next level.  With the decision support database as your guide, you identify the level of fidelity to recognized measures and thresholds of performance, so that you can reflect on the overall picture of outcomes for your MI learning process, make decisions about what you need to supplement, and celebrate where you are doing well.

At J-SAT we’ve benefited from using decision support databases to assess the outcomes of the coding and coaching process with our clients.  If you are interested in Assessment or MI-Only coding and coaching services, please contact us at 303-544-9876 or lisah@j-sat.com for more information.

Tool #3 in our “Tools for Implementation” Series: Performance Assessment

Performance Assessment, aka “How are we doing with these skills, anyway?”

It’s not uncommon to hear the words “performance assessment” and feel stressed in some way.  These words can remind us of being tested in school, or of comparing ourselves to others, or they can make us think about all the work we feel we need to do to “measure up”, with seemingly not enough time in the day to do it.

Fortunately, there are some things you can do to set the stage for all involved to feel less stressed about skill assessments like direct live observation or tape recordings, and maybe, dare we say, even look forward to them.  How, you say?  Below we address why performance assessment is such a powerful tool, and also share three key things you can do to make performance assessments more palatable to those involved…

The power of data

When framed dryly, performance assessment is all about data.  Upon taking a closer look though, we see that performance assessment paints a rich storyboard to show us where we’ve been, how we are doing and where we can go.  There are some simple, straightforward guidelines from implementation science that we can follow to get the most benefit.  First, don’t measure anything you aren’t going to use, and second, make sure to use your data twice, not once.  Recognize performance assessment as part of a profoundly helpful information flow to assess, coach and then support individuals in skill competency, plus to also make decisions about next steps in the overall skill development program.  When you do so, you use the rich storyboard of information once to support individuals in their best forward movement, and then once again to watch for important trends and course corrections over time and/or by unit.

Key #1: Make it friendly

Next is the art of framing performance assessment to those involved, or the art of positively experiencing performance assessment if you are one of those involved.  You can choose words and frame feedback in ways that are supportive rather than critical, such as calling feedback information “reports” rather than “critiques”.  You can personalize the assessment process by working with trained observers who see and present themselves as “in service” to the participant’s skill development, rather than as critics of skills.  Lastly, participants can be shown examples of how the storyboard of data gives them a wealth of information to experiment with and refine skills that feel relevant, e.g. examples in which the storyboard of data gave practitioners insights that profoundly affected their job performance and perhaps even their job enjoyment.  When performance assessment is seen as a useful tool rather than a form of judgment, the door opens to a true deepening of learning and skill.

Key #2: Make it timely

Presenting the performance measures in a timely way can also make a big difference.  Nobody’s going to get very engaged over yesterday’s news. So it’s important to share the storyboard of data with those who can benefit from seeing it at certain key points in the learning process.  For example, it might seem most beneficial to provide aggregate data and/or individual data pertaining to MI, CBT or practice model skills once a month or quarterly. However, if it feels more timely and helpful to participants to have that information sooner (or later), that is something to check in about.  Providing the storyboard of information at the best time for participants, and getting their input about this timing, will give the overall project an even more collaborative feel.

Key #3: Make it supportive, not controlling

To make performance assessment truly interactive, and therefore much more interesting and powerful, see if you can view it as a tool for development rather than a marker to reach.  For instance, scoring or competency markers can quickly become the “mountaintop” to reach, and once reached, the story is over.  However, performance data comes alive when used as a tool for ongoing development – information that helps us refine skills, and then not just sit pretty, but refine even more!

At J-SAT we’re big on the art of performance assessment and generate hundreds of MI and Assessment skill development reports every year.  Please contact us at 303-544-9876 or lisah@j-sat.com if you’d like more information.

Tool #2 in our “Tools for Implementation” Series: MI Coaching

MI Coaching, plus some of its often-overlooked roles

MI Coaching is of course a big part of what we do here at J-SAT, because we believe in it so much. It also happens to be one of the key drivers for effective implementation of any new skill within an organization. So we get just a tad excited about this subject!

In this article, we define “coaching” as the one-on-one meeting between an individual who is learning MI and an MI coach who is proficient in MI and trained to support new MI practitioners to recognize emerging skills and develop their own unique style and proficiency for using them. Ideally these meetings continue over time as the new practitioner increases skill and confidence.

Coaching Role #1: Making the most of the initial training

Though not as often overlooked as some other aspects of coaching that we discuss later, it is still important to recognize coaching as a key driver for successful implementation. One of coaching’s biggest roles within the implementation process is to ensure that the initial training does not fade from the memories of those who attended. We’ve all had the experience of taking a two-day training, feeling inspired about applying the new skill in our jobs, then finding the skill binder some months later, dusty on a shelf because we got so busy we never had time to return to it.

The presence of a coach and the structure of regularly scheduled meetings address this gap in the implementation process. Rather than get dusty on some shelf, the training manual is pulled from the shelf time and again as the new practitioner practices skills in preparation for his/her next coaching meeting. All of that money invested in initial training is actually put to use rather than going down the drain.

Coaching Role #2: The power of relationship

One of the most powerful, and perhaps one of the most overlooked roles of coaching is the development of the coaching relationship between the coach and new practitioner. As we step into any new skill, we are vulnerable and uncomfortable because we inevitably need to stumble as we experiment and discover what works and what does not. Imagine having your own personal cheerleader – one who not only cheers you on, but who is also genuine with you in identifying where you can stretch and develop your skills to bring out the best in your ability. With coaches walking alongside them in their learning process, new practitioners feel seen, understood, celebrated and supported to hone their strengths plus work through their challenges. All by someone who really gets to know the practitioner’s unique style and whose job it is to identify what will best support his/her learning process. In a real sense, the relationship is the message – you are not alone in the enterprise.

Coaching Role #3: Developing and celebrating individual skill and style

And speaking of unique style, this is yet another role that coaching plays in the implementation of a new skill within an organization. Learning new skills requires adaptation – oftentimes new practitioners must adjust a mindset or a belief about what works in order to truly embody the heart of the new skill. This is very true in the case of MI. It is not uncommon for practitioners to walk away from a training wondering how they can fit this skill in with the logistics of their job and/or their personal style and approach. A coach’s role is to help the new practitioner recognize how MI fits with his/her unique personality and approach to interviewing. The key here is that the answer is different for everyone, and that is why individualized attention from a coach who gets to know the practitioner is so helpful.

Of course, there’s so much more to say about coaching, but let’s leave it for now to think about these often-overlooked key roles that coaching plays. And if you would like to talk more about coaching with those of us who love thinking about it and doing it, please contact us at 303-544-9876 or lisah@j-sat.com.

Tool #1 in our “Tools for Implementation” Series: Consider Voluntary and Non-Voluntary Selection

When it comes to selecting staff to develop MI skills, there are two camps of thought.  In one camp, all staff are required or “voluntold” to be trained and held to the same standard, creating an “all systems go” culture in which everyone experiences the new learning.  In the other camp, the belief is that training and learning MI should be voluntary.  There are, of course, merits to both approaches.  Here are our Director Brad Bogue’s thoughts on choosing the voluntary camp when implementing MI with your staff…

If you can include some elements of voluntary selection in your implementation process, you may find greater intrinsic motivation in your pool of learners.  Intrinsic motivation is likely related to a greater capacity for reflective thinking and stronger abilities to adapt to using the new skills.  In short, when you can create opportunities for staff to say “yes” to the training, rather than having to go, you likely will fill the training room with brains more open, ready and hungry to learn.

This leads to another benefit of voluntary selection – by avoiding pushback, implementations of MI are more likely to go deeper and faster.  Once the first round of inspired staff reach critical skill and style thresholds in their use of MI, interest in the skill can go viral as other staff observe them demonstrating genuine expertise and the increased positive outcomes that go with it.

Of course, there can be challenges to overcome when aiming for voluntary selection…

What if very few self-select?

To increase the likelihood that staff say “yes” to voluntary training, strongly highlight the reasons to say yes.  Write up or give a talk about the exceptional benefits that come from learning the skill, and specifically the tangible positive shifts staff are likely to see in their interactions with clients, i.e. “how this could make you even more effective and satisfied while doing your job.”  Also highlight any kudos that learners might receive by participating in the training.

What if you don’t have the luxury of being able to present training as an option?

Consider a first round where all staff must attend the initial training, and then a second round where inspired staff have the option to self-select for more in-depth training, such as follow-up coaching and/or attending peer-learning groups where the participants become tried-and-true experts in the skill.

What are your thoughts and challenges around the recommendation to aim for as much voluntary participation as you can when implementing new training?  As always, we are here to help, and you are welcome to give us a call or shoot us an email to brainstorm what might fit for your specific situation.

If you’d like to learn more about our Skill-Builders coding and coaching services for MI skill development, please click to our Skill-Builders page for more information.