Solving problems is what we do. But one of the biggest wasters of time, money and effort is our propensity for selecting strategies that don’t do what we need. In working with jurisdictions, the mantra is to understand what is driving the performance you are getting before you decide how to fix it.
A useful practice is to anchor all strategy development in empirical data. This can be quantitative, qualitative, or, ideally, both. What do you actually know about your system? Observation is key. Once you’ve observed something, it is critical that you get behind or beneath the data — to the root cause — to understand what is driving this finding. Only when you’ve done this well can you select a strategy that speaks to the needs of your system. It does not matter that it has worked somewhere else. It doesn’t even matter that it’s evidence-based. If it does not speak to the conditions on the ground in your jurisdiction, you are wasting your time.
Example: There has been a great deal of focus on caseworker visits as a means to improve permanency outcomes for children. A jurisdiction finds that it is achieving 85% of the expected visits, short of the current federal standard of 90%. What to do? To be both effective AND efficient, it is important to know why your system is not performing optimally. Is it a simple data entry issue — that is, are workers failing to record their visits in a timely way? Or is it a more fundamental issue: the visits themselves are not occurring? Digging into your data and discerning which it is tells you what to fix. Discovering that a handful of workers in two units are driving this outcome means that your focus and energy are well directed — and your workforce is not burdened by a misdirected effort on your part.
Once the appropriate intervention is implemented, you are ready to monitor it to make sure that you really are achieving the intended outcomes. When your intervention is tightly connected to what’s driving your system, a mid-course correction, if needed, can readily get you where you need to go. An equally important point about monitoring: you need to watch carefully for unexpected outcomes occurring as a result of your intervention. Keeping an eye on the whole system pays dividends.
Example: A large number of children waiting to be adopted are stuck. A jurisdiction looks carefully at what these children need in order to move to finalization, and a well-targeted project is launched to remove barriers and move them to permanency. This is a familiar “backlog” scenario, one that many states have undertaken. As you track and monitor the backlogged children, the question should be: what is happening “upstream”? Did you redeploy staff from elsewhere in order to expedite the adoptions, adding to the very stresses that produced the backlog in the first place? In short, while looking hard at moving children through the back end of the system, did you stop paying attention to the front end — and stop moving children who are earlier in the process — thereby creating your next backlog?
Always keep a broad, systemic view of the child welfare system, even when the interventions you propose are highly targeted. The field of system dynamics underscores that action on one part of a system may well cause unexpected changes in other parts. Much like a balloon that bulges at one end when squeezed at the other, child welfare systems have elastic properties: if you deploy finite resources or focus all your attention on one part, there may be effects elsewhere in the system. Vigilance is the key, and maintaining a broad view of the whole system is paramount. A broadly based set of indicators, regularly monitored and discussed, goes a long way toward keeping jurisdictions from being surprised by the unintended consequences of highly focused work.