
TRAINING AND DEVELOPMENT IN

Human Services
The Journal of the National Staff Development and Training Association

SPECIAL ISSUE
Organizational Development

Volume 6, Number 1, 2011


ISSN 1941-2193
TRAINING AND DEVELOPMENT IN

Human Services
The Journal of the National Staff Development and Training Association

Editor
Dale Curry, Kent State University

Associate Editors
Anita P. Barbee, University of Louisville
Michael J. Lawler, University of South Dakota

Guest Editors
Phil Basso, American Public Human Services Association
Kathy Jones Kelley, American Public Human Services Association

Managing Editor
Jann Donnenwirth, University of California, Davis

Editorial Review Board
Jane Berdie, Berdie and Associates
Charmaine Brittain, Denver University
Grace Clark, International Center on Global Aging
Gary Crow, Lorain County Children Services
Michael Nunno, Cornell University
Peter Pecora, University of Washington and Casey Family Foundation
Andrew Schneider-Munoz, University of Pittsburgh

Editorial Consultant
Karen VanderVen, University of Pittsburgh
Acknowledgements
The editors thank the following organizations
for support in producing this issue:
Center for Human Services,
University of California, Davis
Children and Family Training,
Office of Children, Youth, and Families,
Colorado

Volume 6, Number 1, 2011 ISSN 1941-2193


Guidelines for Article Submission

In general, submissions should follow the guidelines of the Publication Manual of the
American Psychological Association (latest edition). Submissions should pertain to one of
the following areas:

• State of the Field
• Learning Activities
• Instrumentation
• Conceptual or Empirical

Learning exercises should include the following 11 areas: 1) introduction to topic (human
service area learning points) and brief review of the literature (conceptual underpinning to the
learning activity), 2) learning activity title, 3) activity goals/objectives/competencies
addressed, 4) group size, 5) time required, 6) required materials, 7) physical setting,
8) procedures/process, 9) cautions (how might the activity be “uncomfortable/threatening” to
participants), 10) adaptations, and 11) supportive materials.

Instrumentation articles should include the following nine areas: 1) conceptual background
from which the instrument was developed, 2) description of the instrument, 3) intended use,
4) procedures for utilization, 5) data regarding sample norms (if available), 6) reliability,
7) validity, 8) cautions, and 9) a copy of the instrument or how to obtain.

Articles must be typewritten (MS Word) and submitted electronically (e-mailed as an
attachment) to:

Dale Curry, PhD, LSW, CYC-P
Associate Professor, Human Development and Family Studies
Director, International Institute for Human Service Workforce Research and Development
School of Lifespan Development and Educational Sciences
Nixson Hall
Kent State University
P.O. Box 5190
Kent, Ohio 44242
Email: dcurry@kent.edu
TRAINING AND DEVELOPMENT IN

Human Services
The Journal of the National Staff Development and Training Association

CONTENTS

Editors’ Introduction .................................................................................................................... 5

Section One: Conceptual and Empirical

Organizational Development and Effectiveness in Human Services .................................. 8
Dale Curry, Phil Basso, and Kathy Jones Kelley

Organizational Development Specialist Competency Model ........................................... 20
Freda Bernotavicz, Kay Dutram, Sterling Kendall, and Donna Lerman

Evaluating Organizational Effectiveness in Human Services:
Challenges and Strategies ................................................................................................. 37
Cynthia Parry

Section Two: State of the Field

Embedding Field-wide Guidance for Agency Effectiveness:
The 2010 PPCWG Institute ............................................................................................... 52
Phil Basso and Bertha Levin

Improving Local Office Supervision:
Arizona Department of Economic Security ...................................................................... 68
Phil Basso and Elizabeth Hatounian

Building Organizational Effectiveness Capacity in a Training System:
Pennsylvania Child Welfare Training Program ................................................................ 84
Kathy Jones Kelley

Texas: A Case Study in Organizational Effectiveness ...................................................... 98
Jon Rubin and Daniel Capouch

Use of a State Practice Model in Re-design of the
Minnesota Child Welfare Training System ..................................................................... 107
Kathy Jones Kelley

Section Three: Instrumentation and Methodology

The National Child and Youth Care Practitioner Certification Exam ............................. 119
Child and Youth Care Certification Board

Application Potential of Professional Learning Inventory—APPLĪ 33 .......................... 129
Dale Curry, Michael Lawler, Jann Donnenwirth, and Manon Bergeron
EDITORS’ INTRODUCTION

Training and Development in Human Services is the journal of the National Staff
Development and Training Association (NSDTA), an affiliate of the American Public Human
Services Association (APHSA). APHSA, founded in 1930, is a nonprofit, bipartisan
organization of individuals and agencies concerned with human services. The association’s
mission is to pursue excellence in health and human services by supporting state and local
agencies, informing policymakers, and working with partners to drive innovative, integrated,
and efficient solutions in policy and practice. NSDTA, founded as an APHSA affiliate in
1985, is an interdisciplinary professional organization comprised of training and development
professionals serving diverse populations in a variety of settings across the lifespan. Some of
these disciplines include social work, public administration, counseling, psychology,
recreation, education, child and youth care work, family life education, juvenile justice,
economic assistance, mental health, and other human service fields. Further, within the
training and development function are a variety of roles: administrative support,
communications specialist, evaluator/researcher, human resource planner, instructional media
specialist, instructor/trainer, manager, organizational development specialist, and training
program and curriculum designer.

The mission of the NSDTA is to build professional and organizational capacity in human
services through a national network of membership sharing ideas and resources on
organizational development, staff development, and training. It has a vision of competent and
caring people in effective organizations creatively working together to improve the well-being
of society’s children, adults, and families. NSDTA accomplishes its mission by:

• Promoting a network of contacts to discuss and disseminate best practice methods and
strategies
• Providing a national forum for discussion of staff development and training issues
• Providing leadership in development of local, state, and federal programs and
procedures that enhance the skills of staff and develop standards and evaluation
criteria for training programs nationwide
• Developing public policy recommendations and advocating for staff development and
training issues
• Creating opportunities for continual learning and professional development for itself
as an organization and for its members



The primary goal of NSDTA’s journal Training and Development in Human Services is to
provide a venue for both human services training scholars and practitioners to contribute to
the knowledge base and advance the field of human services training and development.

The publication is structured in a way that encourages the traditional conceptual and empirical
journal articles that contribute to advancement of the field. The journal also provides very
practical articles that can be immediately applied by training and development practitioners. It
is typically sectioned into four areas: (1) Conceptual and Empirical, (2) State of the Field, (3)
Learning Activities, and (4) Instrumentation and Methodology. This special issue focuses on
organization development (OD) in human services and contains articles on all but the
Learning Activities area.

The first section emphasizes empirical and conceptual articles that have the potential to
promote scholarly thinking and move the profession forward. The first article in this section
provides an overview of organization development principles and application to human
service settings. This is followed by an abbreviated reprint of the NSDTA competencies
model for the OD practitioner. The last article in this section provides a conceptual
understanding of issues pertaining to evaluation of OD activities.

The second section contains articles that provide benchmarks for the state of the field of
training and development in human services. This issue provides in-depth organizational
development case studies from four statewide human service systems and the Positioning
Public Child Welfare Guidance (PPCWG) Institute.

The third section provides information concerning instrumentation and methodology in


human services professional development and training. This issue provides information on the
National Child and Youth Care Practitioners Certification Exam and a shortened version of
the Transfer Potential Questionnaire (Application Potential of Professional Learning
Inventory—APPLĪ 33).

Overall, this issue contains articles that build upon the NSDTA roles and competencies
model. In particular, this issue emphasizes the broadened role of the human services
professional development and training practitioner, with a focus on organization development.
Several articles illustrate the use of OD principles to improve organizational effectiveness in
human service systems. In particular, examples from the field of the APHSA organizational
effectiveness model are provided. We hope that you will take advantage of the authors’
contributions to the journal and the field.

We welcome comments concerning the structure and content of the journal. Please consider
contacting journal staff with comments or contributions for future issues.

—Dale Curry, Phil Basso, and Kathy Jones Kelley



SECTION ONE
CONCEPTUAL AND EMPIRICAL

Articles in this section typically encompass a wide range of content areas in human services
training and development. Some of these areas may include issues related to curriculum
development, training delivery strategies, fiscal analysis and budgeting, research and
evaluation, training of trainers and training managers, and ethical issues in training and
development. Articles in this section are expected to either articulate a conceptual approach or
provide empirical support for theory related to human services training and development.
Both qualitative and quantitative evaluation and research are encouraged.

Since the field of training and development in human services is relatively young and still
emerging, further conceptual development and empirical validation of practice is much
needed. Practitioners are strongly encouraged to reflect upon their work and add to the
knowledge base by submitting articles to this journal.

Although millions of dollars are spent annually on human services training and development
activities, programs often are developed and implemented without the use of established
conceptual frameworks or empirical evaluation of program effectiveness. Continued
conceptual development and empirical inquiry in human services training and development is
essential.

The conceptual and empirical articles included in this issue provide a broadened view of
human services professional development and training and evaluation. The first article
provides a brief overview of the conceptual background that underlies many organization
development initiatives. The second article is excerpted from the NSDTA roles and
competencies model and delineates competencies for the organizational development
specialist. The last article engages the reader in a discussion of conceptual issues pertaining to
evaluation of OD activities.



ORGANIZATIONAL DEVELOPMENT
AND EFFECTIVENESS IN HUMAN SERVICES

Dale Curry, Kent State University, and


Phil Basso and Kathy Jones Kelley, Organizational Effectiveness,
American Public Human Services Association

Over the last several decades, the field of human services professional development and
training has moved from a narrow perspective of training to broader concepts of human
performance improvement, human resource development, and, increasingly, the environment
in which training and the application of training interventions occur. Furthermore, leaders
have advocated for a more strategic use of professional training and development activities,
necessitating a broader systems perspective including the identification of key players
affecting individual, team, organizational, and service delivery system outcomes
(Bernotavicz, Curry, Lawler, Leeson, & Wentz, 2010; Curry, Lawler, & Bernotavicz, 2006;
Lawson, Caringi, McCarthy, Briar-Lawson, & Strolin, 2006).

The National Staff Development and Training Association (NSDTA), an affiliate organization
of the American Public Human Services Association (APHSA), and APHSA’s Organizational
Effectiveness department embraced this broadened perspective in several ways. APHSA
developed an organizational effectiveness (OE) consulting practice that has delivered models,
tools, and facilitation services to more than 25 state public human services agencies over the
past seven years. (See OE field project list at http://www.aphsa.org/OE/OE_Services.asp.)

In addition, NSDTA developed and disseminated its roles and competencies model that
delineates competencies emphasizing nine key roles (administrative support, communication
specialist, evaluator/researcher, human resource planner, instructional media specialist,
instructor/trainer, manager, organizational development specialist, and training program and
curriculum designer). (See http://nsdta.aphsa.org/resources.htm#CompetencyGuides.)

One of these important roles pertains to organizational development (OD), the application of
behavioral science knowledge to enhance an organization’s effectiveness and efficiency. OD
focuses on organizational culture, processes, and structures within a total systems perspective
with an emphasis on planned organizational change (Huse, 1978; Jackson, 2006).

Many principles of OD practice are similar to principles of human service work, except that
they are applied to work teams, organizations, and other systems (e.g., problem identification,



data gathering, assessment, collaborative planning, readiness for change, change planning,
implementation, strategies to prevent backsliding, evaluation of change efforts). Human
service practitioners can parallel process these organizational change concepts to more
familiar client (children, youth, adults, families) change/intervention concepts.

NSDTA has supported the work of the APHSA OE department to promote the development
of tools and processes that in turn advance organizational effectiveness in human services
through strategic organizational planning and change/development processes. APHSA OE
team members have regularly presented workshops pertaining to OE at recent NSDTA
professional development institutes. Within this current journal issue, several organizational
case studies provide descriptive examples of OD principles applied to human service
organizations/systems.

This article provides an overview of several key conceptual approaches to OD, while
attempting to connect these organizational frameworks to approaches that are commonly used
in human services with individuals and families.

Conceptual Approaches

The related disciplines of organizational science, OD, and industrial psychology emphasize a
myriad of conceptual frameworks. While lacking a unified theoretical approach, conceptual
models guide the activities of OD professionals, though practitioners typically apply or adapt
a mixture of frameworks (Brown, Pryzwansky, & Schulte, 1987; Edmondson, 1996;
Piotrowski, Vodanovich, & Armstrong, 2001). A description of a few of the most common OD
conceptual frameworks follows.

Problem Solving to Facilitate Change

Typically, a basic OD process parallels the general problem solving process and begins with
(1) sensing a need for change, based on a problem or opportunity, followed by (2) developing
an organizational consultant-client relationship, (3) assessing the situation, (4) identifying a
desired organizational state/goal, (5) developing and implementing action change plans to
improve organizational effectiveness and (6) monitoring and evaluating the effectiveness of
the action plans. This process is repeated until the organizational goal is attained (Jackson,
2006).
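
To make the cyclical nature of this process concrete, the brief sketch below is our illustration (not drawn from Jackson): the step labels follow the text above, and the goal check is a hypothetical placeholder, expressed in Python as a simple loop that repeats until the organizational goal is attained.

    # Illustrative sketch only: the six-step OD problem solving cycle described
    # above, repeated until the organizational goal is attained. Step labels
    # follow the text; the goal check is a hypothetical placeholder.
    from typing import Callable

    OD_STEPS = [
        "sense a need for change (problem or opportunity)",
        "develop the consultant-client relationship",
        "assess the situation",
        "identify the desired organizational state/goal",
        "develop and implement action change plans",
        "monitor and evaluate the action plans",
    ]

    def od_cycle(goal_attained: Callable[[], bool], max_cycles: int = 5) -> None:
        """Work through the steps in order, repeating the cycle until the goal is met."""
        for cycle in range(1, max_cycles + 1):
            for step in OD_STEPS:
                print(f"cycle {cycle}: {step}")
            if goal_attained():
                print("organizational goal attained")
                return
        print("goal not yet attained; the cycle continues")

    # Example use with a placeholder goal check.
    od_cycle(goal_attained=lambda: True)

In practice, of course, each step involves people and judgment rather than code; the sketch is intended only to highlight the iterative, goal-checking structure of the process.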

A general problem solving process is commonly used in most human service settings with the
unit of analysis and target of change being the individual client or family. A typical case flow
process follows the general problem solving model. For example, Rycus and Hughes (1998)
describe the casework process in child protective services as including (1) establishing
relationship, (2) conducting assessment, (3) determining goals/objectives, (4) identifying
intervention activities, (5) implementing case plan, and (6) reassessing. The development of



treatment plans and educational plans in mental health and educational settings follow a
somewhat similar approach.

In an OD context the unit of analysis and target of change becomes the team, organization, or
human service system. The importance of providing sufficient attention to each step (e.g.,
assessment phase) and avoiding the tendency to move to strategies/solutions too soon is an
important tenet of this approach. Although the problem solving approach is fairly
straightforward, it is not uncommon for teams and organizations to lack an explicit model for
solving problems and/or pursuing potential opportunities. Problem solving in OD is
considered a strategic process (Jackson, 2006).

Systems Theory

Characteristics and principles of systems can be used as metaphors to help understand how
organizations function. Some of these include: isomorphism; purpose/goals; roles;
boundaries; wholes and parts; interrelatedness; input, output; equilibrium, self-
regulation/rules; equifinality and multifinality; and entropy, negative entropy, and
differentiation (von Bertalanffy, 1968; Garbarino, 1982; Mattaini, Lowery, & Meyer, 1988;
Melson, 1980; Senge, 1990; Zastrow & Kirst-Ashman, 1987). Below are descriptions of these
characteristics and principles as summarized by Holden and Holden (2000).

Isomorphism. This is the tendency for system principles to remain the same across systems.
For example, human service workers use system principles for understanding families. OD
practitioners apply many of the same principles to teams, organizations, and other performing
systems.

Purpose/goals. The purpose and/or goals among systems may be different. But all systems
have a purpose/function. Organizations often make the purpose formal through mission and
vision statements. Human service workers are also concerned with the functions of a system—
the family. How families fulfill their functions (e.g., economic provision, responsible
reproduction and parenting, supporting family emotional needs) is often the focus of human
service work.

Roles. In order to achieve system goals (above), somewhat predictable or expected patterns of
functioning often occur among the parts of a system. In organizations, patterns of functioning
of employees sometimes translate into formal job descriptions. However, informal roles may
also exist. In the absence of role clarity and commitment, role confusion and role
diminishment may occur. Again, human service workers may see the parallel in families. For
example, roles such as the disciplinarian, breadwinner, house manager, and comforter may
form to achieve family goals. Unfortunately, less desirable roles may also form such as the
family member designated as the person with the problem. Both positive and negative roles
can emerge in teams and organizations as well.



Boundaries. These are dividing and defining lines between individuals, subsystems, and
other systems. They influence degree of separateness or connectedness among the systems
and subsystems. Thus, boundaries also affect the flexibility of communication, activity,
movement, and other interaction among the system elements (e.g., employees). Hierarchy
within organizations creates formal boundaries (chain of command). As with roles, unclear
boundaries can result in misunderstandings or more significant negative patterns. Human
service workers can make comparison to the generational hierarchy and boundaries that exist
in families.

Wholes and parts. Zastrow and Kirst-Ashman (1987) define a system as a set of elements
which form an orderly, interrelated, and functional whole.

Interrelatedness of the parts. As stated above, the system elements influence each other.

Input and output. Energy and communication flow into a system from other systems/subsystems (input) and out from it to them (output).

Equilibrium. Systems have a tendency to maintain a relatively steady state among the system
elements. Thus, initial system resistance to change can often be expected.

Self-regulation/rules. A system has self-regulating mechanisms to achieve its purpose. In


organizations, these may be formalized into work rules. In families, rules (spoken and
unspoken) influence family member behavior and provide feedback to family members to
maintain course.

Equifinality and multifinality. Equifinality refers to the fact that there are many ways to
achieve the same purpose; multifinality, to the fact that similar starting points can lead to
different outcomes. A change in one part of the system, even if distant from the
perceived problem, may effect positive change in another part of the system. Thus OD
practitioners may attempt to positively influence organizational change in ways that may
seem less directly related to the perceived problem. For example, personality clashes among
employees may be dealt with by clarifying organization or team goals, roles, and rules rather
than intervening directly with the individuals perceived to have the problem. Similarly,
human service workers may focus on strengthening the marital relationship as a strategy to
improve a child’s misbehavior.

Another implication of this principle is that impacting one part of the system may have
several intended as well as unintended effects on the whole. In many ways, one cannot really
understand a system until one tries to change it (Lewin, 1951). In both OD and in family
engagement efforts this leads to a cycle of engagement and change that is developmental and
often relies on experiential learning.

Entropy, negative entropy, and differentiation. All systems have a natural tendency to
move toward disorganization, deterioration, and death. At the same time, there are countering
forces toward growth and a more complex existence.



In many ways, systems theory seems easy to understand in concept. However, it requires a
change in thinking (systems thinking). In OD, systems thinking requires one to shift focus
from the individual (employee, manager) to the team and/or organization and back again. In
our western mainstream culture, we have been socialized to view causality (achievement,
responsibility, blame) from an individualistic perspective. In addition, our tendency to expect
cause and effect to be closely related in time and space can interfere with our ability to view
problems more holistically (Senge, 1990).

Parallel processing again to human service work, we see the same individualistic versus
systems thinking influencing practice and interaction with the community. It is common for
many in society to view success and failure as an individual’s responsibility, failing to see the
influence of family, community, broader society, and the developmental process itself.
Statements such as “If she really wanted to leave that violent husband, she would have a long
time ago” are common among lay persons and even among some human service
professionals.

Field Theory

According to Lewin (1951, p. 169), “There is nothing so practical as a good theory.” Field
theory provides a simple, easy-to-understand approach to change, involving the interplay
between two opposing sets of forces. A field is a representation of the complete environment
of the individual and significant others—one’s life space. As individuals move through life
spaces (e.g., juvenile court, client homes, office), driving and restraining forces are influenced
by one’s perception of the current psychological field (e.g., support for organizational change)
and influenced by individual needs (e.g., relevance of training to work needs) (Lewin, 1951;
Smith, 2001).

Individual, team, and organizational change occurs when equilibrium is disrupted. An existing
field of forces is changed by increasing driving and/or decreasing restraining forces for
change. The number and strength of driving and restraining forces will determine if change
occurs. If the total strength of the driving forces is greater than that of the restraining forces,
change is likely to occur. If the total strength of the restraining forces is greater than or equal
to that of the driving forces, new individual, team, and/or organizational behavior/change will
not occur or will not be sustained (Broad & Newstrom, 1992; Curry et al., 1991, 1994; Curry,
1997; Curry & Caplan, 1996; Lewin, 1951). An example of an
application of field theory pertaining to transfer of training in human services is discussed by
Curry, Lawler, Donnenwirth, & Bergeron (in press) in this journal issue.
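
The arithmetic behind a force field analysis can be shown with a small numeric sketch. This is our illustration only; the force labels and strength ratings below are hypothetical and are not data from any study cited here.

    # Illustrative sketch only: comparing hypothetical driving and restraining
    # forces, each rated for strength. As described above, change is likely
    # only when the driving total exceeds the restraining total.
    driving_forces = {
        "supervisor support for applying the new skills": 4,
        "training clearly relevant to daily work": 3,
        "peer reinforcement on the job": 2,
    }
    restraining_forces = {
        "heavy caseloads leaving little time to practice": 4,
        "unclear expectations after training": 3,
    }

    driving_total = sum(driving_forces.values())          # 9
    restraining_total = sum(restraining_forces.values())  # 7

    change_likely = driving_total > restraining_total
    print(f"driving = {driving_total}, restraining = {restraining_total}, "
          f"change likely: {change_likely}")  # change likely: True

In this hypothetical case the driving total (9) exceeds the restraining total (7), so change is likely; weakening even one restraining force, such as clarifying expectations after training, would widen that margin further.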

Application of field theory is not limited to organization development. Human service


workers have also applied force field analysis to a host of human service change areas. For
example, Curry & Dick (2000) provide an example of the use of field theory in helping
human service workers understand why survivors of domestic violence may remain in
harmful family environments.



Behavioral Change Process Model

The behavioral change process model as described by Kurt Lewin (1958) and Edgar Schein
(1987) in its simplest form involves three basic stages (unfreezing, change, refreezing). This
change model has similarity to the levels of competence model that familiar to human services
training and development professionals (Caplan & Curry, 2001; Curry & Rybicki, 1995). In
order to unfreeze old habits of behavior, one must become aware/conscious of the need for
change (transition from unconscious competence to conscious competence). The creation of
psychological safety and reassurance is necessary to create the climate to unfreeze. Once
unfrozen, the new awareness may create sufficient motivation for change (learning readiness
according to the levels of competence model).

In the second stage (change), individuals develop new behaviors (the stage of conscious
competence in the levels of competence model). This is considered a critical stage where a
significant amount of both individual and organizational support for change is necessary. The
new behaviors must be congruent with individual and organizational values and norms
(Jackson, 2006; Schein, 1980).

In the third stage (refreezing), change is integrated at the unconscious level
(unconscious competence) and feels as natural as the previous behavior. In this stage, new
individual behaviors and organizational norms become more routinized, a part of everyday
work life. Although the concept of refreezing may sound rigid and unchangeable, if this
change cycle is ongoing, an internalization of the developmental process itself becomes the
norm, rather than a specific organizational change.

These same change principles can be seen in the individuals and families human services
workers serve. For example, a parent embracing, learning, and proficiently demonstrating new
parenting behaviors can be observed going through the unfreeze, change, and refreeze stages.
For instance, helping a parent to recognize the need to discipline differently often requires
that the parent feel psychologically safe enough to explore alternative strategies.
Similarly, when implementing new behaviors, a significant amount of support may be
necessary to maintain or refine and not relapse to previous behaviors.

Organizational Learning

Learning can occur at the individual, team, and organization/system level. Organizational
learning is the basis for organizational change and is considered essential for organizational
survival. Organizational learning occurs when the organization develops systematic processes
to acquire, use, and communicate knowledge (Jackson, 2006). Senge (1990) describes five
essential components (disciplines) of organizational learning (personal mastery, mental
models, shared vision, team learning, systems thinking). Creating a culture and climate for
individual, team, and organizational learning is critical for effective implementation of
learning (transfer climate). Organizational support for professional development and training



is also associated with long-term worker retention (Curry, McCarragher, & Dellmann-Jenkins,
2005; Holton, Bates, & Ruona, 2000).

Action Research/Action Learning

Planned change, described as a cyclical process by Lewin (1948) and known as action
research, is fundamental to OD. Action research is a systematic method of data collection and
feedback, action, and evaluation based on further data collection—the process of problem
solving in action (Jackson, 2006; Lewin, 1948). According to McTaggart (1996) action
research is not a method or a procedure for research but a series of commitments to observe,
solve problems, and take advantage of organizational opportunities to develop through
practicing a series of principles for conducting social inquiry. Action research involves both learning and
doing. According to Lewin, both the learning and doing are part of the same process.

Experiential learning (learning by doing) is a common feature of many OD efforts. Human


service professional development and training practitioners are familiar with experiential
learning in the traditional workshop/classroom setting. Use of Kolb’s learning style inventory
and an understanding of the cycle of experiential learning (Kolb, Rubin, & McIntyre, 1974)
are often prerequisites to becoming a trainer in many human service training systems (e.g.,
Ohio Child Welfare Training Program). OD practitioners typically extend experiential
learning beyond the classroom to organizational change initiatives.

Sociotechnical Model of Organizational Effectiveness

A basic premise of this model is that the success of an organization is dependent upon the
degree of congruence between the organization’s core technology and the organization’s
social context (Glisson, 2007). In human service settings, examples of core technology
include practice/treatment models, assessment tools, case/care planning approaches, and case
review processes. The social context includes factors such as norms, expectations, and
informal procedures (the way things are typically done in organizations).

Recent research in human service settings indicates that organizational culture and climate
influence key service elements including employee morale, innovativeness, turnover, and the
quality of service delivery, ultimately affecting service outcomes for individuals and families
(Glisson, 2002, 2007; Glisson & Durick, 1988; Glisson & Hemmelgarn, 1998; Glisson &
James, 2002; Jaskyte, 2005).

Implementing best evidence-based practices to promote positive service outcomes involves
planned intervention in a human service agency’s core technology within the organizational
social context.



The APHSA Comprehensive OE Approach

In 2004 APHSA created an OE department to shift the focus of its leadership development
efforts from classroom-based curricular training to organizational development consulting and
facilitation. The APHSA OE approach has incorporated many of the concepts discussed
above. APHSA OE initiatives in human service organizations and systems attempt to integrate
principles of OD and implementation science to promote organizational effectiveness and
better service outcomes for individuals and families. These initiatives have resulted in a
comprehensive practice model and set of supporting models, tools, templates, and narrative
guidance that is, itself, being subjected to a learning-by-doing cycle of improvement and
further development. Case examples from several state human service systems (Pennsylvania,
Minnesota, Arizona, and Texas) are described in this journal issue.

Summary

The role of the human services professional development and training practitioner has
expanded from development of individuals to include assessment, intervention, and
evaluation at the team and organizational/system level. A well-established knowledge base
that transcends fields and disciplines such as organizational and industrial psychology and
management science is being utilized in human services settings.

As an example, the establishment of the child welfare implementation centers by the U.S.
Department of Health and Human Services, Children’s Bureau provides support for planned
system-wide interventions in various regions. For instance, the Midwest Child Welfare
Implementation Center is helping to build Ohio’s capacity to implement evidence-informed
and promising child welfare interventions. Some of the activities supported by the center are
formal assessment of organizational culture and climate; development and installation of the
technical assistance model; a rule review; implementation of
organizational structural and functional changes to facilitate the new technical assistance
model; and ongoing fidelity monitoring.

Another example is the learning leaders/learning community initiative by the Atlantic Coast
Child Welfare Implementation Center, which provides guidance on topical areas of interest
and explores new and creative approaches for peer learning to advance child welfare and
organizational implementation of best practices. Examples of leadership at the national level
for organizational development in human services settings are the APHSA OE initiatives and
the activities of the National Resource Center for Organizational Improvement at the
University of Southern Maine.

Human services professional development and training practitioners must recognize the
important role that has emerged to promote organizational effectiveness at the team,
organization, and broader system level. New challenges exist for understanding how to use
the existing OD knowledge base as well as add to that with examples from human services



organizations. For example, how do we expand upon our increasing ability to evaluate training
programs and effectively evaluate organizational/system interventions such as state and
national technical assistance networks? How well do evidence-based OD interventions that
have proven successful in business environments work in human service settings? The field
and emerging profession of professional development and training in human services
continues to grow!

References

Bernotavicz, F., Curry, D., Lawler, M., Leeson, K., & Wentz, R. (2010). A new key to
success: Guidelines for effective staff development and training programs in human
services agencies. Washington, D.C.: National Staff Development and Training
Association and American Public Human Services Association.

Broad, M.L. & Newstrom, J.W. (1992). Transfer of training: Action-packed strategies to
ensure high payoff from training investments. Reading, MA: Addison-Wesley.

Brown, D., Pryzwansky, W.B., & Schulte, A.C. (1987). Psychological consultation:
Introduction to theory and practice. Boston: Allyn and Bacon.

Caplan, P. & Curry, D. (2001). The wonder years: Professional development in child welfare.
Proceedings of the Trieschman/Child Welfare League of America Workforce Crisis
Conference. Retrieved from http://www.cwla.org/programs/trieschman/2001fbwrecap.htm

Curry, D.H. (1997). Factors affecting the perceived transfer of learning of child protection
social workers. Unpublished doctoral dissertation, Kent State University, Kent, Ohio.

Curry, D.H., Caplan, P., & Knuppel, J. (1991). Transfer of Training and Adult Learning.
Paper presented at the Excellence in Training International Conference, Cornell
University, 1991.

Curry, D.H., Caplan, P., & Knuppel, J. (1994). Transfer of training and adult learning
(TOTAL). Journal of Continuing Social Work Education, 6, 8-14.

Curry, D. & Dick, G. (2000). The battered woman’s field. Training and Development in
Human Services, 1, 40-51.

Curry, D., Lawler, M., & Bernotavicz, F. (2006). The field of training and development in
human services: An emerging profession? Training and Development in Human
Services, 3, 5-11.



Curry, D., Lawler, M., Donnenwirth, J. & Bergeron, M. (in press). Application Potential of
Professional Learning Inventory—APPLĪ 33. Training and Development in Human
Services.

Curry, D., McCarragher, T. & Dellmann-Jenkins, M. (2005). Training, transfer and turnover:
The relationship among transfer of learning factors and staff retention in child welfare.
Children and Youth Services Review, 27, 931-948.

Curry, D. & Rybicki Sr. M. (1995). Assessment of child and youth care worker trainer
competencies. Journal of Child and Youth Care Work, 10, 61-73.

Edmondson, A.C. (1996). Three faces of Eden: The persistence of competing theories and
multiple diagnoses in organizational intervention research. Human Relations, 49, 571-
596.

Garbarino, J. (1982). Children and families in the social environment. New York: Aldine.

Glisson, C. (2002). The organizational context of children’s mental health services. Clinical
Child and Family Psychology Review, 5, 233-253.

Glisson, C. (2007). Assessing and changing organizational culture and climate for effective
services. Research on Social Work Practice, 17(6), 736-747.

Glisson, C., & Durick, M. (1988). Predictors of job satisfaction and organizational
commitment in human service organizations. Administrative Science Quarterly, 33, 61-
81.

Glisson, C., & Hemmelgarn, A.L. (1998). The effects of organizational climate and
interorganizational coordination on the quality and outcomes of children’s service
systems. Child Abuse & Neglect, 22, 401-421.

Glisson, C., & James, L.R. (2002). The cross-level effects of culture and climate in human
service teams. Journal of Organizational Behavior, 23, 767-794.

Holden, J.C., & Holden, M.J. (2000). The working system. Training and Development in
Human Services, 1(1), 34-38.

Holton, E.F. III, Bates, R.A., & Ruona, W.E.A. (2000). Development of a generalized
learning transfer system inventory. Human Resource Development Quarterly, 11, 333-
360.

Huse, E.F. (1978). Organization development. Personnel and Guidance Journal, March, 403-
406.



Jackson, J.C. (2006). Organization development: The human and social dynamics of
organizational change. Lanham, MD.: University Press of America, Inc.

Jaskyte, K. (2005). Organizational culture and innovation in non-profit human service


organizations. Administration in Social Work, 29(2), 23-41.

Kolb, D.A., Rubin, I.M., & McIntyre, J.M. (1974). Organizational psychology: A book of
readings, 2nd edition. Englewood Cliffs, NJ: Prentice-Hall.

Lawson, H., Caringi, J., McCarthy, M., Briar-Lawson, K., & Strolin, J. (2006). A complex
partnership to optimize and stabilize the public child welfare workforce. Professional
Development: The International Journal of Continuing Social Work Education, 9(2-3),
122-139.

Lewin, K. (1948). Resolving social conflicts: Selected papers on group dynamics (G.W. Lewin,
Ed.). New York: Harper & Row.

Lewin, K. (1951). Field theory in social science. New York: Harper and Row.

Lewin, K. (1958). Group decision and social change. In E.E. Maccoby, T.M. Newcomb, &
E.L. Hartley (Eds.), Readings in social psychology. New York: Holt.

Mattaini, M.A., Lowery, C.T., & Meyer, C.H. (1988). Foundation of social work practice.
Washington D.C.: NASW Press.

McTaggart, R. (1996). Issues for participatory action researchers. In O. Zuber-Skerritt (Ed.)


New Directions in Action Research. London: Falmer Press.

Melson, G.G. (1980). Family and environment: An ecosystem perspective. Minneapolis:


Burgess.

Piotrowski, C., Vodanovich, S.J., & Armstrong, T. (2001). Theoretical orientations of


organizational development practitioners. Social Behavior and Personality, 29(3), 307-
312.

Rycus, J.S., & Hughes, R.C. (1998). Field guide to child welfare. Washington, D.C.: CWLA
Press.

Schein, E.H. (1980). Organizational psychology, 3rd ed. Englewood Cliffs, NJ: Prentice-Hall.

Schein, E.H. (1987). Process consultation: Its role in organization development. Reading,
MA: Addison-Wesley.



Senge, P.M. (1990). The fifth discipline: The art and practice of the learning organization.
New York: Currency Doubleday.

Smith, M. K. (2001). Kurt Lewin, groups, experiential learning and action research. The
Encyclopedia of Informal Education. Retrieved from http://www.infed.org/thinkers/et-lewin.htm

von Bertalanffy, L. (1968). General system theory: Foundations, development, applications.


New York: George Braziller, Inc.

Zastrow, C., & Kirst-Ashman, K.K. (1987). Understanding human behavior and the social
environment. Chicago: Nelson-Hall.



ORGANIZATIONAL DEVELOPMENT SPECIALIST
COMPETENCY MODEL

Freda Bernotavicz, Kay Dutram, Sterling Kendall, and Donna Lerman


Institute for Public Sector Innovation, Muskie School of Public Service,
University of Southern Maine

As part of its responsibility to provide resources to the human services staff development and
training field, the NSDTA Standards Committee has developed a series of guidebooks that
define the roles and competencies needed to staff a training program. In 1997, based on a
review of the research and discussion with leaders in the field, the committee identified nine
major roles: Administrative Support, Communications Specialist, Evaluator/Researcher,
Human Resource Planner, Instructional Media Specialist, Instructor/Trainer, Manager,
Organizational Development Specialist, and lastly, Training Program and Curriculum
Designer. These roles are not synonymous with jobs or people and, depending on the size of
the organization, individuals may have only one or multiple roles.

In addition to the competencies specific to each role, each guidebook includes a description of the
nine roles, the outputs, definitions of competencies, and an assessment tool. Training programs
can use these guidebooks in a number of ways including human resource planning, developing
job descriptions, recruitment and screening, performance management, and developing
curriculum to train on the roles. Copies of the guidebooks are available on the NSDTA
website.

The text that follows is excerpted from the 2002 publication Human Services Staff
Development and Training Roles and Competencies: Organizational Development Specialist.
While some of the competencies may be outdated (e.g., the minimal expectations in 26.00
Basic Computer Skills), the majority have stood the test of time and continue to reflect the
competencies needed for this role in promoting organizational effectiveness.

_____________________________

Development of the Organizational Development Specialist Competency Model

The role of the organizational development (OD) specialist in Human Services training and
development is a relatively new one and is evolving to meet the emerging needs of
organizations. The field began with a focus on classroom learning and in recent years
increased emphasis has been placed on the transfer of learning to on-the-job performance.
Gradually, there has been increased attention paid to identifying organizational strengths and
facilitating processes whereby employees jointly create organizational structures and work
processes that maximize personal productivity and organizational effectiveness, and improve



organizational climate. The OD specialist focuses on the link between individual performance
and organizational performance. Unlike the instructor role, the OD specialist role does not
include direct classroom instruction. Instead, as the OD specialist works with groups of direct
service staff, supervisors or managers, information is gathered that informs the training
process. The OD specialist role is useful all along the continuum of change, from external
environmental occurrences forcing organizational response to internal performance barriers
requiring inquiry and dialogue leading to transformative action.

This model was developed by a group of experienced OD specialists who brought both their
theoretical and practical knowledge and experience to bear in identifying the competencies
needed in this role. In doing so, they built upon the existing competency models for the
Instructor and Manager roles and included data from other sources as shown in the
references. The list of competencies illustrates the many areas of expertise and knowledge
required of OD specialists. It is anticipated that the value placed on the various competencies
will differ from organization to organization and reflect an organization’s needs and purpose
at a particular time.

Organizational Development Specialist Competencies Outline

Administration
1. Organizational Ability
2. Human Service Policy and Framework
3. Training Administration

Communication
4. Facilitation Skills
5. Oral Communication
6. Interpersonal Communication
7. Nonverbal Communication
8. Cultural Sensitivity

Conceptual Knowledge/Skills
9. Problem Analysis
10. Judgment
11. Conceptual Thinking
12. Systems Thinking

Evaluation and Research


13. Evaluation and Research Concepts
14. Data Collection
15. Data Input and Analysis



Group Dynamics and Process
16. Interpersonal Skills
17. Group Process
18. Managing Process
19. Group Climate
20. Teamwork and Cooperation

Human Resource Management


21. Human Resource Concepts and Systems
22. Human Resource Maintenance
23. Performance Management
24. Task/Work Analysis
25. Competency-Based Approaches

Information Management
26. Basic Computer Skills
27. Management Information Systems
28. Accessing Information

Instructional Management
29. Training Systems
30. Assessment and Transfer

Learning Theory
31. Learning and Human Development
32. Learning Climate

Person/Organization Interface
33. Impact and Influence
34. Approaches
35. Initiative
36. Organizational Development
37. Effective and Efficient Work Process

Self-Management Skills
38. Self-Responsibility
39. Self-Concept
40. Self-Control
41. Flexibility
42. Job Commitment
43. Professional Standards/Ethics



Organizational Development Specialist Competency Model Comprehensive Listing

Administration

1.00 Organizational Ability


Ability to demonstrate organizational skills.
01.01 Work Management: Shows ability to plan, schedule, and direct the work
of self and others.
01.02 Work Assignments: Balances task requirements and individual abilities
(matches people and assignments).
01.03 Work Organization: Organizes materials or activities to accomplish tasks
efficiently.
01.04 Goal Setting: Sets challenging yet achievable goals for self and others.
2.00 Human Service Policy and Framework
Ability to demonstrate understanding of human services policy and laws.
02.01 Human Services Philosophy and History: Understands relevant human
services history, theory, values and ethics.
02.02 Federal and State Laws and Regulations: Comprehends federal/state laws,
legislation, regulations, and agency guidelines.
3.00 Training Administration
Ability to demonstrate training administration skills.
03.01 Planning: Demonstrates knowledge of the concepts of strategic, operational,
and long range planning.
03.02 Current Issues: Demonstrates understanding of current issues that affect the
organization.
03.03 Policies and Procedures: Understands key policies and operating procedures of
the organization.

Communication

4.00 Facilitation Skills


Ability to demonstrate effective facilitation and process consultation skills.
04.01 Session Objectives: Presents each objective at the beginning of the session both
orally and in writing.
04.02 Questioning: Asks participants open or closed questions which are related to
the objectives; answers all relevant questions.
04.03 Job-Relevant: Uses job-specific materials and references to enhance the
relevancy of the presentation.
04.04 Modes of Expression: Uses multiple modes of expression to convey and to
gather information (e.g., uses all five senses).
04.05 Range of Techniques: Selects from a range of techniques (e.g., paced
presentations, concrete examples, analogies, nonverbal activities, or media
presentations) to convey key ideas.



5.00 Oral Communication
Ability to use the spoken word to effectively communicate.
05.01 Projection: Speaks loudly and clearly enough to be heard and understood by
everyone in the room.
05.02 Highlights Information: Speaks to emphasize important points and
demonstrate personal interest.
05.03 Varies Speech: Frequently varies vocal characteristics (e.g., pitch, speed,
modulation) throughout the session.
6.00 Interpersonal Communication
Ability to effectively communicate with individuals and groups.
06.01 Expectations: Discusses individual participant expectations and relates them to
session objectives.
06.02 Examples: Provides or elicits from participants relevant examples, anecdotes,
stories, and analogies, and appropriately uses humor.
06.03 Clear Explanations: Adequately explains participant roles, requests feedback,
and monitors learning activities.
06.04 Probing Techniques: Uses interviews, questions, and other probes to gather
information and stimulate participant insight.
06.05 Feedback: Provides constructive feedback to individuals and groups in order to
encourage continued progress.
06.06 Modeling: Models effective communication competencies (e.g., coaching,
counseling, and contracting).
06.07 Listening Skills: Uses active listening skills to gather information, encourage
discussion, and elicit feedback from the participants.
06.08 Engagement: Engages all partners in the OD process, jointly constructing
purpose, shared norms and appraisal of outcomes; values the knowledge and
contribution of all participants.
7.00 Nonverbal Communication
Ability to use nonverbal behaviors to effectively communicate.
07.01 Eye Contact: Frequently makes eye contact with people.
07.02 Appropriate Behavior: Refrains from distracting behaviors and mannerisms.
07.03 Emphasis: Uses gestures and nonverbal behavior to emphasize important points
and to demonstrate personal interest.
07.04 Stimulates Involvement: Uses unspoken cues and mannerisms to stimulate
participant involvement throughout the session.
8.00 Cultural Sensitivity
Ability to demonstrate effective cross-cultural communication techniques.
08.01 Cross-Cultural Diversity: Adapts communication and behaviors in order to
interact effectively with different types of individuals or groups.
08.02 Cross-Cultural Sensitivity: Recognizes and validates the differences in
cultural, ethnic, and religious values, perceptions, customs, and behaviors.
08.03 Cross-Cultural Discussion: Engages participants in the discussion of cultural
issues.



08.04 Managing Conflict: Manages disagreement and conflict concerning cultural
issues.
08.05 Cross-Cultural Communication: Communicates effectively with persons of
diverse cultures.

Conceptual Knowledge/Skills

9.00 Problem Analysis


Ability to demonstrate effective problem analysis skills.
09.01 Assessment: Seeks out relevant data and examines complex information to
determine the important elements of situations.
09.02 Approaches: Uses critical judgment to assess alternative approaches to
problems or decisions. Values diversity of opinion and mental models as a way of
bringing multiple perspectives to bear on a problem.
09.03 Analysis: Uses a systems framework to identify both causes and consequences
of breakdowns. Discerns gaps between theory and practice.
10.00 Judgment
Ability to demonstrate sound judgment.
10.01 Using Information: Facilitates identification of, and communicates cause and
effect relationships. Reaches sound conclusions and makes reasonable decisions based
on available information.
10.02 Balance: Weighs both short- and long-term needs and consequences. Attends to
both human and organizational needs.
10.03 Priorities: Sets priorities for tasks in order of importance.
10.04 Discernment: Understands long-term cause and effect relationships; multiple
causation and problem archetypes.
10.05 Objectivity: Maintains objectivity in handling difficult issues, events, or
decisions.
11.00 Conceptual Thinking
Ability to synthesize information, to discern patterns in events and relationships.
11.01 Frameworks and Experience: Uses theoretical frameworks as well as learning
from past experience to guide analysis or actions.
11.02 Past Experience: Practices self-reflection to enhance learning from experience.
Distinguishes crucial similarities and differences in present and past situations.
11.03 Creative Thinking: Is comfortable with risk-taking in thinking. Plays with
ideas.
11.04 Systems View: Looks at the big picture. Focuses on interactions and dynamics
to create a climate for action. Recognizes tension as a catalyst for positive change.
12.00 Systems Thinking
Ability to focus on interactions and dynamics, taking a long view which includes the past, the
future, and the immediate present.
12.01 Dynamic Thinking: Understands complex relationships and interdependencies.
Frames a problem in terms of patterns of behavior over time.



12.02 System-as-Cause Thinking: Discerns the patterns of recurring problems not
driven by daily events.
12.03 Global Thinking: Sees the entirety of a situation. Reframes issues or problems.
Balances short-term and long-term needs and perspectives.
12.04 Operational Thinking: Understands how a behavior is actually generated.
Questions any and all underlying assumptions.
12.05 Closed-Loop Thinking: Views causality as an ongoing process, not a one-time
event, with effect feeding back to influence causes, and causes affecting each other.
12.06 Quantitative and Qualitative Thinking: Knows how to quantify and how to
incorporate intangible variables into the system model.

Evaluation and Research

13.00 Evaluation and Research Concepts


Ability to demonstrate understanding of basic evaluation and research concepts.
13.01 Basic Concepts: Understands the process and importance of identifying the
purpose, audience and questions to be answered.
13.02 Evaluation Levels: Understands the levels of impact of evaluation, the issues
related to each, and uses this framework to recommend an evaluation plan.
13.03 Evaluation and Research Design: Understands basic concepts and
terminology related to design such as pre- and posttests, case study, sampling, control
and comparison groups, experimental and quasi-experimental design.
13.04 Data Collection Instruments: Describes types of tools for data collection
including questionnaires, attitude or satisfaction surveys, paper-and-pencil or
performance tests, interviews, focus groups, observations, and administrative
performance records.
13.05 Data Collection Issues: Understands basic concepts related to data collection
such as validity and reliability, and practical issues in selection and administration of
data collection instruments.
13.06 Data Analysis: Understands general concepts of data analysis including
qualitative and quantitative data, process and outcome data, descriptive or inferential
data, and cost benefit analysis.
13.07 Reporting and Dissemination: Understands types of reports (formative and
summative) and the role of interim and final results reporting. Provides leadership and
support for stakeholder involvement, providing feedback and disseminating results.
14.00 Data Collection
Ability to identify useful data and methodologies to collect it.
14.01 Organizational Research: Designs and uses field research methods, including
ethnographic studies; and evaluation methods for change processes.
14.02 Managing the Consulting Process: Engages with staff/clients, diagnoses
conditions and organizational patterns, designs and implements appropriate
interventions, manages unprogrammed events.
14.03 Questioning: Collects data via pertinent questions asked during surveys,
interviews, and focus groups for the purpose of performance analysis.
14.04 Survey Design and Development: Comprehends survey approaches that use
open and closed style questions for collecting data. Prepares instruments in written,
verbal, or electronic formats.
15.00 Data Input and Analysis
Ability to use data for organizational benefit.
15.01 Research Methods/Statistics: Understands research/statistical methods:
measures of central tendency, measures of dispersion, basic sampling theory, basic
experimental designs (e.g., case study, pre-/posttest, control group), and sample
inferential statistics.
15.02 Analysis/Diagnosis: Conducts inquiry into all organizational systems (self,
individuals, groups, organization, and multiorganization) to determine root cause(s) of
the system's effectiveness.
15.03 Designing and Selecting Appropriate and Relevant Interventions: Selects,
modifies or designs interventions that effectively move the organization from its
current state to its desired future state, using strategies that address root causes or
performance gaps.
15.04 Performance Data Analysis: Interprets performance data and determines the
effect of interventions on clients, suppliers and employees.

Group Dynamics and Process

16.00 Interpersonal Skills


Ability to apply interpersonal skills through building trust, providing feedback, and valuing
diversity.
16.01 Builds Trust: Remains neutral and objective as relationships are built; uses
reflective listening techniques, and remains flexible while expressing confidence in the
process; and names and conducts safe discussions concerning issues or cultural norms
affecting the organization.
16.02 Diversity: Understands and values diversity and different styles of perceiving,
learning, communicating, and operating.
16.03 Feedback: Provides timely, sensitive, and relevant feedback. Challenges a
participant’s ideas in a way that maintains self-esteem.
17.00 Group Process
Ability to apply group process theory, including task and maintenance functions.
17.01 Theory: Knows and understands theories and principles of group dynamics
such as distinguishing task and growth groups; task and maintenance functions; phases
of group development; and small group behavior.
17.02 Task Functions: Implements group interaction task functions such as
information or opinion seeking, information or opinion giving, clarifying,
summarizing, and consensus testing.
17.03 Maintenance Functions: Implements group interaction maintenance functions
such as encouraging expression of group feelings, harmonizing, modifying,
gatekeeping, and evaluating.
18.00 Managing Process
Ability to manage group process, including conflict and difficult situations.
18.01 Managing Conflict: Resolves problems and manages conflicts through
negotiations, aiming for win-win agreements.
18.02 Maintains Focus: Refocuses straying groups to adhere to the agenda or plan.
18.03 Difficult Situations: Manages problem situations (e.g., hostile participants,
disengaged participants, people who monopolize) in a way that maintains participants’
self-esteem.
19.00 Group Climate
Ability to establish and maintain effective group climate.
19.01 Ground Rules: Negotiates and clarifies with the group what constitutes
effective/ineffective behavior and establishes ground rules.
19.02 Group Decision-Making: Involves the group in discussing and making
decisions on process and procedures.
19.03 Environment: Creates an environment where participants feel psychologically
safe to explore ideas, disagree, and move in and out of role-plays and simulations.
20.00 Teamwork and Cooperation
Ability to support a group of individuals as they exchange information, establish trust, make
decisions, implement and evaluate interactions and plans.
20.01 Dialog: Supports groups in discovering and questioning assumptions;
uncovering the big picture and seeing connections between parts; learning by inquiry
and disclosure; and creating understanding of, and commitment to, each other and the
group’s purpose.
20.02 Relationship Building Practices: Models clear statement of intention,
reflective listening, appropriate questioning techniques, and respect for others’ ideas
and differing perspectives.
20.03 Team Culture: Describes team dynamics using process observations, assisting
the team in understanding its developing norms and realizing implications for the
team’s work.
20.04 Roles within the Team: Describes necessary roles within a team and assists the
group in effectively managing and utilizing those roles.
20.05 Responsibility: Models and reminds team members of the role of self-
responsibility in a high performing team; structures meetings and collaborative efforts
to support group responsibility.

Human Resource Management

21.00 Human Resource Management (HRM) Concepts and Systems


Ability to demonstrate understanding of human resource management concepts and systems.
21.01 Human Resource Understanding: Understands issues and practices in human
resource areas: job design, human resource planning, selection and staffing, human
resource research and information systems, compensation and benefits, employee
assistance, and union/labor relations.
21.02 Staff Selection Theory and Application: Understands the theories, techniques,
and appropriate applications of staff selection interventions used for performance
improvement.
21.03 Human Resource Systems: Understands theory and techniques and appropriate
applications of effective career development, performance management, and
succession planning systems.
21.04 Human Resource Research: Uses existing personnel information to analyze
organizational situations to determine the appropriate research methodology, and
interpret and communicate the results to senior management.
21.05 Human Resource Information Systems: Understands and uses data from HR
information systems.
21.06 Laws/Regulation/Contracts: Understands issues related to federal and
personnel laws and regulations, union contracts, discipline and grievance processes
and their impact on individual and organizational performance.
21.07 Career Development Theory and Application: Understands theories and
techniques, and uses appropriate applications of career development interventions
used for performance improvement.
22.00 Human Resource Maintenance
Ability to understand and implement systems to maintain human resources.
22.01 Reward System Theory and Applications: Understands the theories,
techniques, and appropriate applications of reward system interventions used for
performance improvement.
22.02 Training and Development: Develops and implements systems to assess and
address employee training and development needs, plans training
activities/interventions to meet current and evolving organizational needs, and
formulates skill-building development plans for employees.
22.03 Workplace Learning: Understands the importance of and develops
opportunities for ongoing job related learning.
22.04 Health, Safety and Security: Understands the strategic health, safety, and
security issues and concerns from both the employee and organizational perspective;
the laws and regulations that impact this functional area; and the programs, systems,
and rules to maintain employee wellness and organizational stability.
23.00 Performance Management
Demonstrates understanding of performance management issues.
23.01 Performance Gap Analysis: Performs “front-end analysis” by comparing
actual and ideal performance levels in the workplace. Identifies opportunities and
strategies for performance improvement.
23.02 Performance Theory: Recognizes the implications, outcomes, and
consequences of performance interventions; distinguishes between activities and
results.
23.03 Process Consultation: Uses a monitoring and feedback method to continually
improve the productivity of groups.
23.04 Self-Assessment: Understands and communicates the importance of self-
assessment, including design of learning goals, criteria for judging progress, and
process for determining action steps.
23.05 Techniques for Assessment: Uses a consistent assessment process, using
multiple types of quantitative and qualitative data and drawing on multiple sources to
make judgments.
23.06 Linking Individual and Organizational Effectiveness: Knows approaches for
linking individual and organizational effectiveness.
24.00 Task/Work Analysis
Ability to understand task/work analysis and its applications.
24.01 Approaches: Understands and knows of the various approaches to task/work
analysis, including functional job analysis and position analysis questionnaires.
24.02 Job Descriptions: Understands steps in developing comprehensive job
descriptions based upon task/work analysis.
24.03 Validity: Is familiar with approaches to ensure content validity of task/work
analysis.
24.04 Work Flow Analysis: Understands the techniques of work flow analysis,
including observation, interviewing key informants, and document analysis.
25.00 Competency-Based Approaches
Ability to use competency models and methodologies.
25.01 Competency Models: Knows and understands the theory, structure, and use of
competency models for improving human performance.
25.02 Competency Methods: Knows and uses different methodologies to construct
competency models, including Task Analysis, Behavioral Event Interviews, and
Subject Matter Experts.
25.03 Competency Assessment: Knows and uses procedures for designing
competency-based assessment instruments.

Information Management

26.00 Basic Computer Skills


Ability to apply basic computer concepts including e-mail, scheduling, and word processing.
26.01 Computer Basics: Applies basic computer concepts including the ability to turn
the computer on and off, log on, display start and help menus, open programs, use the
mouse, move the cursor within documents, delete files, and use the recycle bin.
26.02 E-Mail: Uses e-mail to compose, reply to and forward messages; attaches files
and saves attachments; sets up files; and files messages.
26.03 Scheduling: Uses scheduling functions, including creating new tasks; viewing,
editing, and deleting appointments; and printing appointments.
26.04 Word Processing: Uses the word processing function, including editing and
maneuvering; selecting, deleting and inserting text; indenting paragraphs; cutting,
copying, and pasting; saving and securing documents.
27.00 Management Information Systems
Ability to understand purpose and basic functions of management information systems.
27.01 Management Information Systems Concept: Understands the concept of
information systems in general and is aware of the core interfaces with program
information.
27.02 Data and Information: Distinguishes between data and information and is able
to convert data into information to guide decision-making.
27.03 Policy and Practice Link: Uses information systems to convey the link
between program policy and practice.
27.04 Quality of Data: Understands and articulates the link between data quality
and capacity for meaningful analysis.
28.00 Accessing Information
Ability to locate and use data in a management information system.
28.01 Data in System: Knows the data available in a system and its potential for
providing information.
28.02 Critical Data: Demonstrates the ability to identify and locate critical data
within a system.

Instructional Management

29.00 Training Systems


Demonstrates understanding of issues related to training systems.
29.01 Needs Assessment: Identifies systemic needs and tailors training efforts to
specific parts of or the entire system as needed.
29.02 Development of Training Objectives: Facilitates collaboration within the
organization to achieve commitment to and clear understanding of purposes and
expected results of training initiatives.
29.03 Evaluation of Training Initiatives: Facilitates design and implementation of
effective follow-up and investigation into training outcomes.
29.04 Versatility: Maintains and applies different training techniques on a system-
wide basis as needed.
30.00 Assessment and Transfer
Ability to assess learner performance and promote transfer of learning.
30.01 Transfer of Learning: Facilitates practical application of classroom learning.
30.02 Evaluation of Participants: Assesses learner performance as it relates to
training objectives.
30.03 Modification: Uses participant feedback to modify training design and improve
future presentations.

Learning Theory

31.00 Learning and Human Development


Ability to apply understanding of how adults learn.
31.01 Learning Theory: Understands the principles of adult learning and human
development theory as the basis for providing experiential learning, and physically
and emotionally safe and stimulating environments—including accommodating
diversity and shifting group members’ roles between learner and expert.
31.02 Organizations as Learning Entities: Understands that the quality of human
interaction affects the redesign of formal, organization-wide systems as well as day-to-
day work systems within the organization.
31.03 Contexts for Learning: Understands the impact of current experiences on
values, attitudes and behaviors; integrates a group’s success with own development;
honors and utilizes disagreement and/or conflict as a source of creativity; develops
polarized issues as opportunities to explore diversity and new ideas.
31.04 Self-Directed Learning: Motivates learners to credit their own experiences as
guides in building and applying new knowledge. Supports learners as they integrate
what is known and what is discovered.
31.05 Personal/Learning Styles: Knows conceptual frameworks for describing
different personal and learning styles and their implications for individual and
organizational development.
32.00 Learning Climate
Ability to create a positive learning climate.
32.01 Positive Climate: Develops a climate of trust conducive to risk-taking and
experimentation; remains aware of developments regarding kinesthetic, artistic,
environmental, emotional and other intelligences and their effects on learning styles.
32.02 Physical Environment: Understands the importance of physical comfort as it
relates to the ability to concentrate and learn (e.g., light, air, exercise, water).

Person/Organization Interface

33.00 Impact and Influence


Ability to demonstrate understanding and skills of impact and influence.
33.01 Anticipation: Anticipates effects of an action on people, programs, or events.
33.02 Action Science: Inquires how human beings design and implement action in
relation to one another.
33.03 Capacity: Recognizes own perceptions, and is mindful of the impact they have
on the work being done; is willing to suspend own pre-conceived notions.
33.04 Organizational Awareness: Demonstrates awareness of organizational goals
and concern for image and reputation of the agency and program.
34.00 Approaches
Ability to use appropriate approaches to achieve desired results.
34.01 Open to Cues: Looks for potential opportunities for miscellaneous information
that may be of future use.
34.02 Using Information: Distinguishes advocacy from inquiry; acts on valid
information generated by inter-subjectively verifiable data and explicit inferences.
34.03 Analysis: Deliberately combines seemingly unrelated problems in devising
a solution to a single problem.
34.04 Versatility: Relates effectively with all levels of administration inside and
outside the organization.
34.05 Double Loop Learning: Trusts in the "action research" process, emphasizing
open inquiry rather than suppressing conflict. Chooses among competing sets of
standards/paradigms as a framework for action, anticipating that the end result will be
a combination of what is planned and what evolves along the way.
35.00 Initiative
Ability to take the first step.
35.01 Action: Initiates timely action rather than waiting to react as situations develop.
35.02 Strategic Thinking: Recognizes and seizes opportunities.
35.03 Confronts Barriers: Challenges barriers to effective performance and takes
action to overcome them.
35.04 System Knowledge: Uses knowledge of the system to identify long-term
opportunities and problems; provides different frames through which the system can
be viewed and demonstrates how the framing impacts the situation.
36.00 Organizational Development
Ability to help groups initiate and manage change by facilitating healthy inter- and intra-unit
relationships.
36.01 Collaboration: Establishes and leverages relationships with key stakeholders to
jointly design work processes and outcomes.
36.02 Intervention: Works with small and large groups to set attainable goals; models
appropriate behavior and provides follow-up; is competent in bringing about intended
consequences.
36.03 Leadership Development: Expands a person’s capacity to be effective in
leadership roles and processes; enables groups of people to work together in
productive and meaningful ways.
37.00 Effective and Efficient Work Process
Ability to bring about intended results through iterative feedback loops.
37.01 Quality Assurance: Practices self-reflection and participates in reflective
practice interviews to identify gaps in own learning and practice.
37.02 Continuous Reassessment: Uses iterative queries at all levels to co-create
desired outcomes and methodologies. Creates opportunities to try new ideas in a safe
environment.
37.03 Environmental Scanning: Looks ahead using key informants and trends
analysis to anticipate risks, opportunities, and strategic positioning.

Self-Management Skills

38.00 Self-Responsibility
Ability to engage in ongoing learning to improve professional capabilities.
38.01 Reflective Practice: Uses reflective practice in a regular and systematic way to
assess the effects of own choices and actions on others; recognizes espoused theories
versus theories-in-use; uses own performance to reframe issues based on feedback.
38.02 Self-development: Engages in continued efforts to clarify personal values and
to carry out plans for professional development to meet personal and agency needs.
38.03 Knowledge of Field: Stays up-to-date on policy and best practice.
39.00 Self-Concept
Ability to believe in own capabilities and judgment.
39.01 Self-Awareness: Identifies own personal values, needs, interests, style, and
competencies and their effects on others; recognizes when personal feelings have been
aroused; works within limits of capabilities; manages personal defensiveness.
39.02 Pride: Takes pride in own expertise and in ability to handle situations.
39.03 Feedback: Actively seeks and accepts feedback for improvement without loss
of self-esteem and without responding defensively.
39.04 Assertiveness: Advocates for professional standards, and for others and their
needs.
40.00 Self-Control
Ability to maintain emotional equilibrium and optimism.
40.01 Self-discipline: Manages biases; performs effectively in the midst of chaos and
in an atmosphere of ambiguity. Maintains self-control in high stress situations.
40.02 Checks Behavior: Inhibits impulses to do or say inappropriate things.
40.03 Self-Monitors: Monitors own personal values and biases so that they do not
undermine objectivity and professionalism.
40.04 Patience: Shows patience and tenacity in working for desired results.
41.00 Flexibility
Ability to respond to challenge and change.
41.01 Stress Reduction: Manages own well being; finds ways, such as humor,
to reduce or manage stress.
41.02 Coping Skills: Perseveres in the face of disappointment, hostility, or adverse
conditions; resists dwelling on disappointments; motivates self to make the best of
things.
41.03 Openness: Is open to new information and to changing own opinions; suspends
own judgment and helps others to learn to suspend their judgment; makes own mental
models explicit and doesn’t impose own mental models on others.
41.04 Flexibility: Is able to shift gears and redirect activities.
42.00 Job Commitment
Ability to demonstrate commitment to the role and responsibilities of an organizational
development specialist.
42.01 Enthusiasm: Shows enthusiasm for organizational development and sees
self as a role model for colleagues.
42.02 Responsibility: Takes responsibility for projecting and maintaining a
professional image for the organization.
42.03 Follow-Through: Demonstrates willingness to solve problems and see things
through to completion.
42.04 Sets Standards: Sets high standards for self and others.
42.05 Initiative: Takes the first step to correct problems, meet new needs, or address
unexpected developments.
42.06 Focuses on Goals: Stays focused on larger goals and strategies to achieve them.
42.07 Indicators: Uses a variety of indicators to gauge success in achieving
objectives.
43.00 Professional Standards/Ethics
Ability to conduct self in an ethical and honest manner.
43.01 Legal Issues: Complies with all copyright laws and the laws and regulations
governing the position.
43.02 Confidentiality: Maintains confidentiality and integrity in the practice of the
profession.
43.03 Professional Conduct: Supports peers and avoids conduct that impedes the
practicing of their profession.
43.04 Public Service: Improves public understanding of public service and
government bureaucracies and the role of government.
43.05 Accurate Representation: Fairly and accurately represents credentials,
qualifications, experience, and abilities.

References

Boyatzis, R.E. (1982). The competent manager. New York, NY: John Wiley & Sons.

Hughes, R.C., and Rycus, J.S. (1989). Target competent staff: Competency-based inservice
training for child welfare. Washington, D.C.: Child Welfare League of America.

Klemp, G.L. (1981). Job competence assessment: Defining the attributes of the top performer.
Alexandria, VA: American Society for Training and Development.

Kinney, T., Coke, K., and Fox, R. (1982). Public child welfare staff development: A role and
competency framework for curriculum development and instruction. Albany, NY:
Nelson A. Rockefeller College of Public Affairs and Policy, State University of New
York at Albany.

Lawson, T. E. (1989). The competency initiative: Standards of excellence for human resource
executives. Alexandria, VA: Society for Human Resource Management.

Linkage (1998). Competency model for building a competency-based learning system.
Workshop Handout, Boston, MA.

McLagan, P. (1989). Models for HRD practice. Alexandria, VA: American Society for
Training and Development.

Powers, B. (1992). Instructor excellence: Mastering the delivery of training. San Francisco:
Jossey-Bass.

Rothwell, W.J., Sanders, E.S., and Soper, J.G. (1999). ASTD models for workplace learning
and performance: Roles, competencies, and outputs. Alexandria, VA: American Society
for Training and Development.

Senge, P. (1990). The fifth discipline: The art and practice of the learning organization. New
York: Doubleday.

Spencer, L. and Spencer, S. (1993). Competence at work: Models for superior performance.
New York: John Wiley & Sons, Inc.

Varney, G., Worley, C., Darrow, A., Neubert, M., and Cady, S. (1999). Guidelines for entry
level competencies to organization development and change (OD&C). Available online
at http://www.aom.pace.edu/odc/report.html.

_____________________________

Members of the NSDTA Standards Committee who were involved in development of the OD specialist
competency model and their affiliations at the time:

Chairperson
Freda Bernotavicz, Institute for Public Sector Innovation, Muskie School of Public Service, University of
Southern Maine

Committee Members
Kathy Jones Kelley, Pennsylvania Child Welfare Competency-Based Training and Certification Program,
University of Pittsburgh
Michael J. Lawler, Center for Human Services, University of California, Davis
Gerry Mayhew, Bureau of Partner Services, Wisconsin Department of Workforce Development
James McGroarty, Division of Benefit Programs, Virginia Department of Social Services
David Wegenast, Buffalo State College
Rose Marie Wentz, Training for Change

EVALUATING ORGANIZATIONAL EFFECTIVENESS
IN HUMAN SERVICES:
CHALLENGES AND STRATEGIES

Cynthia Parry, C.F. Parry Associates

Introduction

Human services agencies are becoming interested in organizational effectiveness (OE) for a
variety of reasons. As agencies move to implement new evidence-based practice models,
there is increasing interest in organizational interventions that align the day-to-day business of
the agency and support new ways of serving children and families. Agencies also are striving
to meet federal Child and Family Services Review (CFSR) standards related to child safety,
permanence, and well-being, and simply to do business more efficiently and effectively in
tough economic times.

Organizational systems thinking has also begun to influence key functions within agencies. In
recent years there has been increasing interest in moving from traditional training systems to
broad-based staff development systems aligned with—and supporting—agency priorities.
Such efforts have expanded the traditional scope of staff development to include (a) more
consultative and consumer-driven individual and group training and coaching, (b) facilitated
technical assistance provided around specific agency or individual needs, and (c) more
alignment with other functions such as human resources and quality assurance (Kanak,
Baker, Herz, & Maciolek, 2008; National Staff Development and Training
Association, 2010).

Evaluation can play a critical role in supporting organizational effectiveness through such
activities as identifying areas of need, monitoring the fidelity of implementation, providing
feedback for continuous quality improvement, and offering evidence of the effects of
organizational change on service provision and outcomes for children and families. However,
evaluation in the complex and ever evolving environment of a human services agency poses a
host of challenges, particularly when the goal is to attribute an outcome to a specific
intervention.

Much has been written on how to conduct program evaluations and on the role of evaluation
in implementation (Fixsen, Naoom, Blase, Friedman, & Wallace, 2005). The goal of the
current article is not to recapitulate that work, but to offer some reflections on how a set of
well established guiding principles for evaluation might look in the context of evaluation of
organizational effectiveness, and offer some potential strategies for conducting these
evaluations.

Challenges in Evaluating Organizational Effectiveness

The fact that OE interventions operate within complex agency systems and attempt to make
sustained, systemic changes poses particular challenges for evaluation. These relate to the
broadly defined, fluid, and often diffuse nature of OE interventions, attribution of impacts on
outcomes to the intervention, and accounting for the complexities of the organizational
environment in design and analytic strategies.

Organizational effectiveness is a broad umbrella encompassing many possible changes in how
an agency does business and many possible interventions. The American Public Human
Services Association’s Organizational Effectiveness Handbook (APHSA, 2010) defines OE
as “a systemic and systematic approach to continuously improving an organization’s
performance, performance capacity and client outcomes.” Within that definition, interventions
could include, for example, such diverse activities as helping an agency to develop a strategic
plan, implement an agency-wide practice model, or conduct succession planning. While some
tasks are relatively discrete or focused on one function, such as training or human resources,
others are comprehensive and have agency-wide ramifications. Regardless of the scope or
type of intervention, research suggests that the more clearly the core components of an
intervention are known and defined, the more likely it is that the program or practice can be
implemented successfully (Bauman, Stein, & Ireys, 1991; Dale, Baker, & Racine, 2002;
Winter & Szulanski, 2001; cited in Fixsen et al., 2005). An evaluation’s success also depends
on a clear understanding of the parameters of the intervention (Urban & Trochim, 2009); that
is, what is and is not included in the scope of the intervention. The need to have a good
understanding of the intervention itself is not unique to the evaluation of OE interventions.
However, the task of defining the core components of the evaluation, as well as the expected
effects on other parts of the agency and the agency’s clients, is often more complex, although
no less critical, when the intervention itself can differ widely in focus and scope, and is
embedded in a complex and dynamic system such as a human services agency.

Because there is no one organizational development intervention, there is no blueprint or set
of evaluation techniques suitable for all interventions. There may, however, be an overarching
set of goals that specific and differing local interventions are all working toward; for example,
federal outcome standards related to safety, permanence, and well being in the case of child
welfare agencies. Agencies are often under considerable pressure to show the effects of
organizational improvements on long-term client outcomes. However, attribution of
responsibility for achievement of outcomes is particularly complex in OE interventions.
Although they operate on a local level, they are expected to have an impact at a global policy
level (Urban & Trochim, 2009). Agencies and evaluators have to deal with the issue of when
and how to connect these local implementations to global goals of safety, permanence, and
well-being, when traditional experimental and quasi-experimental designs may not be
possible. Attributions are further complicated by the fact that the impacts of some
organizational development efforts are not seen immediately but emerge at a later point in
time, which may be well after the initial intervention (Mackay, Horton, Dupleich, &
Andersen, 2002). Making appropriate attributions of positive outcomes to a specific
intervention can be a formidable technical problem for evaluators and involve a substantial
commitment of time and resources for an agency.

OE interventions take place within complex systems and are themselves dynamic. Over the
course of implementing an OE intervention a typical agency environment may be
characterized by changes in staff and/or leadership, fiscal cutbacks, implementation of
multiple initiatives, and shifting policies and priorities. Agencies and the structural units
within them also are characterized by a distinct organizational climate and culture, and
differences in such factors as readiness and openness to change, patterns of communication,
leadership, and resources. These differences have been shown to affect the success of program
implementation under certain circumstances (Glisson & Hemmelgarn, 1998; Fixsen et al.,
2005) and must be considered in evaluation designs. The dynamic and fluid context within
which interventions are implemented and evaluated is particularly challenging in the case of
OE interventions, which are not time-limited initiatives but seek to make lasting changes in
how an agency does business.

Ten Recommendations for Conducting OE Evaluations

Conducting an effective OE evaluation rests on many of the same foundational principles as
conducting any good program evaluation. However, certain aspects of designing and
conducting a program evaluation take on a different emphasis in an OE context. These aspects
are described below as a series of 10 recommendations for evaluating OE interventions.

Recommendation 1—Adopt a Systems Perspective

Organizations are systems and OE interventions are also systemic, systematic, and dynamic
(APHSA, 2010). Although they differ somewhat in specifics, models advanced for
understanding organizational development and change emphasize the multi-level nature of the
systems within which interventions are implemented and the dynamic nature of both the
larger system and the intervention itself. For example, Roth, Panzano, Crane-Ross, Massatti,
Carstens, Seffrin, and Chaney-Jones (2005) outline a model for research and evaluation of
organizational change that considers factors at five levels: environmental (e.g., professional
norms); inter-organizational (e.g., the communication between an agency and a change
agent); organizational (e.g., organizational culture, hiring practices, workload); project (e.g.,
dedicated resources, commitment, and access to technical assistance); and innovation (e.g.
scientific and/or experiential evidence). Fixsen et al. (2005) see what they call “core
implementation components” (e.g., training) as central to changing behavior, and place them
within a context of organizational components that enable and support them (e.g.,
administration, selection and program evaluation) and “influence factors” (e.g., the social,
political and economic context) that exist in the larger environment. Similarly, the APHSA
model includes a set of strategic support functions that make the core work of the organization
possible. This model places training within this category, along with human resources, quality
assurance, and other organizational functions. Strategic support functions impact work at
multiple levels from operations through key processes, structure and culture, and strategy
(APHSA, 2010).

In an organizational development context evaluation will be more successful if it is also
framed within a systems perspective. Trochim and colleagues define systems evaluation as
“an approach to conducting program evaluation that considers the complex factors that are
inherent within the larger ‘structure’ or ‘system’ within which the program is embedded”
(Cornell Office for Research on Evaluation, 2009, p. 5). They go on to identify several
characteristics of systems that are relevant to evaluation: specifically, that systems consist of
parts, wholes, and their interrelationships; that systems are embedded within other systems;
that human systems are dynamic and evolving; that programs have lifecycles and that
different evaluation approaches are appropriate at different developmental phases; and that
nested and dynamic systems lead to multiple stakeholder viewpoints.

One implication of a systems perspective for OE evaluation is that it must take into account
that an intervention typically exists within a network of part-whole relationships. For
example, implementation of a practice model for child welfare may be nested within a human
services agency responsible for multiple programs and needs to be put in place across
different units with differing responsibilities, such as emergency response, in-home services,
or adoptions. These units may have differences in workloads, supervisory buy-in and
expectations, or culture and climate that affect the implementation and outcomes of the
intervention differently. Similarly, work may be done at the level of a core implementation
component (Fixsen et al., 2005) or strategic support function (APHSA, 2010) such as training,
but the success of the intervention may be influenced by factors at multiple levels, such as
communication and trust among units and agencies. Regardless of conceptual model,
evaluators must not only work to identify and account for organizational barriers and
facilitators within the immediate context of an intervention, but also consider the potential
influences of variables at multiple levels of the system. Evaluators must be aware of when it is
necessary to measure, and attempt to control for, factors at other levels or within other
functions of the organization, and to bear in mind that the appropriate unit of analysis may be
the individual, the unit, and/or the agency depending on the specific questions being asked.

Recommendation 2—Gear Your Evaluation to the Intervention’s Level of Development

A second major implication of a systems perspective is that OE interventions aren’t static, but
progress through stages of development from initiation to dissemination (Cornell Office for
Research on Evaluation, 2009; Fixsen et al., 2005; Roth et al., 2005). During initial
implementation, interventions tend to be changing rapidly. Procedures, activities, and
protocols are not firm and can be expected to undergo revisions, refinements, and adaptations.
Interventions at this stage may go through several cycles of improvements over an extended
period of time. Early in an intervention’s development, evaluation is most appropriately
focused on providing formative feedback to support and monitor the development and
implementation of the intervention (Cornell Office for Research on Evaluation, 2009). This
stage of development does not support evaluation of implementation fidelity as yet, nor does
it support an evaluation of outcomes. As interventions mature, and models, activities,
protocols, and procedures become more standardized, the focus of evaluation can shift to
include measurement of fidelity of implementation and relationships of activities to outcomes.
However, although mature implementations of an intervention will support an outcomes
evaluation, it is still desirable to collect information regarding fidelity of implementation to
avoid committing a Type 3 error. In the words of Gilliam, Ripple, Zigler, and Leiter (2000)
“Outcome evaluations should not be attempted until well after quality and participation have
been maximized and documented in a process evaluation. Although outcome data can
determine the effectiveness of a program, process data determine whether a program exists in
the first place” (p. 56, as cited in Fixsen et al., 2005).

Recommendation 3—Think of Evaluation in Cycles

As evaluators we often talk about a cycle of evaluation from planning through
implementation, with results feeding back to inform future program development and
evaluation efforts. This idea is embodied in the notion of pilot studies that help to refine
evaluation questions, measures, and procedures. It is also incorporated in strategic planning
for evaluation that lays out a coordinated series of evaluation projects over several years, all
leading to the answers to overarching system-wide questions. The idea of planning evaluation
as a series of evaluations, each refining and feeding into the next, is highly compatible with
the view of OE interventions as supporting continuous quality improvement through learning
by doing (APHSA, 2010). Since OE interventions in child welfare are relatively new, many
are still in the early stages of implementation and will not yet support an outcomes evaluation.
Getting good data regarding the linkage of an intervention to agency- or client-level outcomes
may best be approached iteratively, through multiple cycles of evaluation conducted as the
intervention develops and matures.

Recommendation 4—Treat Evaluation as a Part of Program Development

The goal of OE interventions is not to put in place another initiative, but to change the way an
agency does business (APHSA, 2010). Similarly, evaluation should be thought of as an
ongoing part of the business of intervention and not as an add-on done after the program has
been implemented. Done throughout, evaluation can offer valuable insights in shaping the
intervention. In turn, the lessons learned from implementation can inform future evaluation
cycles and refinements in measurement tools and methods. Implementing and evaluating an
intervention share many of the same processes; for example, a need for broad stakeholder
input, and a shared understanding of a theory of change. As activities proceed in these areas in
parallel and over time, insights from one can inform and improve the other. When evaluation
is done only in a summative context at the end of implementation, it can be difficult, if not
impossible, to distinguish the failure of an intervention from the failure of implementation
(Fixsen et al., 2005).

Recommendation 5—Develop a Shared Understanding of the Boundaries of the Intervention

Since OE interventions take place within a broad organizational context of multiple, nested,
and somewhat fluid components and relationships, it is critical to an effective evaluation to
define the boundaries of the intervention being evaluated. The situation is complicated further
by the fact that the development of organizational capacity is often not viewed as a goal in
and of itself but as the means to accomplishing other organizational goals (Mackay et al.,
2002). For example, consider an intervention to restructure training to include hiring and
training a cadre of mentors. These mentors would be located in county agencies and would
work with supervisors on developing assessment and coaching skills to address specific
performance issues or enhance transfer of learning from classroom training. The boundaries
of the intervention could be drawn narrowly to include just the establishment of the mentor
program and the activities of the mentors, or they could be drawn more broadly to also
include the activities of the supervisors or even the staff with whom the supervisors work.
There is no right or wrong way to draw the boundaries of an intervention, but where those
boundaries are drawn has clear implications for the scope of the evaluation (Urban &
Trochim, 2009).

It is also critical that the understanding of the boundaries of an intervention is shared among
agency leadership, the evaluator, and other stakeholders. According to Urban and Trochim
(2009) defining program boundaries helps participants clarify both stated and unstated
assumptions about the intervention and arrive at a precise description of what the intervention
is and what it is not. It also helps all concerned to arrive at a common language for
representing the intervention, and a common understanding of its scope and activities.
Boundary analysis makes it more likely that all critical components of the intervention are
considered when determining the scope of the evaluation. If an element is not included in the
design for the current evaluation cycle, this process helps to ensure that it is excluded based
on shared decision making, rather than simply overlooked. A shared understanding of
program boundaries may allow the evaluator and the agency to avoid the situation in which an
important evaluation question cannot be answered because it was not considered when
developing the evaluation design.

Recommendation 6—Work with the Program to Articulate and Map a Theory of Change

A theory of change, often operationalized by a logic model, has become a standard part of
program evaluations in a wide range of settings (W.K. Kellogg Foundation, 2004; Mackay,
2002). These models clarify how a program is expected to work and provide a basis for
evaluating fidelity of implementation. At the simplest level, columnar logic models connect
the components of an intervention in one box to expected outcomes in another box. More
elaborate models connect specific program activities with the outputs and outcomes that are
expected to be a direct result of the activity. Urban and Trochim (2009) refer to this type of
logic model as a visual causal, or pathway, diagram and emphasize its importance in
prioritizing evaluation questions, and guiding the development of a systems evaluation plan.
Mayne (2008) refers to this type of diagram as a results chain and emphasizes its importance
in building a case for program impacts. In both diagrams the main causal pathways
hypothesized between activities and outcomes are made clear. Where there are multiple
potential causes for an outcome, these are identified. Pathway models also identify activities
that assume more importance by affecting multiple outcomes.
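
To make the idea of a pathway model concrete, the short Python sketch below represents a hypothetical results chain as a directed graph and reports which downstream outputs and outcomes each activity ultimately feeds; every node name is invented for illustration and is not drawn from any particular agency's theory of change.

```python
# Toy pathway (results-chain) model represented as a directed graph.
# All node names are hypothetical; in practice they would come from
# stakeholder mapping sessions and the program's theory of change.
pathway = {
    "mentor training": ["supervisor coaching skill"],
    "coaching protocols": ["supervisor coaching skill", "transfer of learning"],
    "supervisor coaching skill": ["worker assessment quality", "transfer of learning"],
    "transfer of learning": ["case practice fidelity"],
    "worker assessment quality": ["case practice fidelity"],
}

def reachable(node, graph, seen=None):
    """Return every downstream output/outcome reachable from a given activity."""
    seen = set() if seen is None else seen
    for nxt in graph.get(node, []):
        if nxt not in seen:
            seen.add(nxt)
            reachable(nxt, graph, seen)
    return seen

# Activities that feed multiple outcomes are higher-priority measurement targets.
for activity in ("mentor training", "coaching protocols"):
    print(activity, "->", sorted(reachable(activity, pathway)))
```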

Given the broad range of interventions that might be undertaken as part of organizational
development, and the varied effects that could be expected based on those interventions, it is
especially important to design evaluations based on a clear map of how OE activities are
expected to produce organizational change. A pathway model, as articulated by Urban and
Trochim (2009), captures the complexity inherent in a systems change effort. It also allows
for a process of what these authors refer to as “hanging the literature” (p. 548) on the
relationships hypothesized between OE activities and outcomes. To the extent that literature
can be found that supports a hypothesized linkage between an intervention and an outcome,
the case is strengthened for the effectiveness of the local intervention, which is especially
important when following a local intervention to outcomes is not feasible. The specific
linkages between program activities and expected outputs and outcomes also help to avoid
potential confusion over the correct unit of analysis. Mackay et al. (2002) point out that the
issue of whether evaluation data should be analyzed at the level of the individual, the unit, or
the agency can be problematic in many OE evaluations and that changes measured solely at
the individual level do not necessarily mean that sustainable organizational change has
occurred. For example, questions of interest in an OE evaluation in human services might
focus on measuring increases in communication between units within the agency, or increases in
interactions with community partners. A well developed visual-causal diagram can help to
make clear how the outputs of an OE intervention are expected to impact both individuals and
the organization, and how and where individual change is expected to impact organizational
functioning.

Recommendation 7—Take a Participatory Approach

It is a truism in evaluation that stakeholder involvement is a key to a successful evaluation
project. However, in many cases stakeholders take on only a limited advisory role. In OE
interventions, where the goal is for an agency to sustain new ways of doing business after
those who initially facilitate and evaluate the change are no longer involved, stakeholder
involvement takes on a somewhat new meaning. A participatory approach, and the group
work involved, is seen as a key to the development of strong sponsorship for change from
agency leadership and the emergence of a core group of champions for the innovation (Fixsen
et al., 2005; APHSA, 2010; Mackay et al., 2002). In this context, it is not enough for
stakeholders to participate in a small number of kickoff and update meetings related to the
evaluation. If change is to take place and be sustained, stakeholders at all levels of an
organization need to play a meaningful part in developing strategies for data collection, in
making meaning from information obtained, in communicating evaluation findings, and in
finding ways to incorporate evaluation feedback to improve the way the agency does its work.
Meaningful participation in the evaluation is a key part of learning by doing (Chinman, Imm,
& Wandersman, 2004; Fixsen et al., 2005; Mackay et al, 2002).

Recommendation 8—Use Multiple Methods

Evaluators frequently make use of multiple measures and data sources in answering a given
evaluation question (Chinman et al., 2004). The use of multiple measures and methods allows
for triangulating and cross checking findings, and helps in identifying important areas of
convergence and divergence based on different methods as well as different perspectives of
respondents. Multiple measures and data collection methods also allow the strengths of one to
offset the weaknesses of the other (Creswell & Clark, 2007).

Triangulation of data from multiple sources has been suggested as a particularly important
element of evaluations of organizational development (Horton, Mackay, Andersen, &
Dupleich. 2000). Because of the nesting of the intervention within a larger systems
environment, OE evaluations are apt to touch a variety of individuals in different ways based
on their relationship to the project. Those most closely involved, for example as part of a
sponsor team or as champions for the intervention, may have more detailed insights to offer
than those more peripherally involved. Mixed methods designs (Creswell & Clark, 2007) that
involve the integration of qualitative and quantitative measures seem well suited to an
evaluation that simultaneously needs to collect rich and detailed narratives from a small
number of participants and also more limited and standardized information from larger
groups. Since OE interventions aim to change how an agency does business in some way, a
variety of archival, or existing data sources (such as outcome information from SACWIS
systems, work products, meeting minutes, policy manual updates) also are likely to be
important in measuring and documenting change.

Recommendation 9—Measure Implementation

There is a growing awareness in human services that a key factor in successfully instituting
and sustaining organizational and practice change is attention to implementation (Fixsen et
al., 2005; Barth, 2009). The discussion is often framed around the fidelity of implementation
of an evidence-based practice (Fixsen et al., 2005). However, the ability to understand and
describe what has been implemented is also central to assessing the quality of program
delivery, to providing feedback to shape program improvements (Chinman et al., 2004), and
to interpreting findings related to outcomes.

Careful measurement and documentation of implementation is particularly important in
evaluations of OE interventions. These interventions are typically somewhat fluid, changing
as the work proceeds. OE work is not conceptualized as a time-limited initiative but as a
sustained change in how an agency functions that follows a continuous improvement cycle
(APHSA, 2010). In this context, the formative feedback typically collected in a process
evaluation takes on added importance. Where the intent is to link the OE intervention to
agency or client outcomes, is it is even more critical to understanding evaluation findings that
the evaluator knows what has been implemented and to what extent the design for the
intervention has been followed or adapted.

Recommendation 10—Build a Case for Impact

Although difficult to measure in the complex environment of a human services agency, the
ultimate goal of most interventions in the field is to show a positive impact on outcomes,
whether defined in terms of improved agency function or improvements in the lives of clients.
Use of traditional experimental designs that allow clear attribution of a change in outcomes to
the effects of the intervention is seldom possible in human services settings. In fact, some
(Mayne, 2008; Mackay et al., 2002; Horton, Mackay, Andersen, & Dupleich, 2000) have
argued that the traditional attribution paradigm is not appropriate for evaluations of
organizational development efforts. Instead they argue for a model of assessing the
contribution of the intervention to outcomes, recognizing that multiple factors are likely to
impact the same outcomes and that it is not possible in a typical agency environment to isolate
one particular program.

One implication of taking a “contribution” approach to evaluating the impact of an OE
intervention is that the theory of change articulated by the developers of the intervention takes
on additional importance. The theory of change provides a basis for building a reasonable-
person case for the impact of the intervention on outcomes; a concept that has been referred to
as an “impact chain” (Mayne, 2001), and in child welfare training as a “chain of evidence”
(Parry & Berdie, 1999; Parry, Berdie, & Johnson, 2003). Mayne (2001) describes the results
chain as the principal tool of contribution analysis. He outlines a six-step process consisting of
identifying the attribution problem to be addressed, developing a theory of change, gathering
evidence, assembling a “contribution story” including threats to the validity of the
interpretation, seeking out additional evidence, and refining the contribution story. The
contribution story serves as a vehicle for summarizing the evidence supporting the program’s
theory of change.

Another implication of a contribution approach is that building a case for an intervention’s
contribution to outcomes requires identification and measurement of contextual factors that
might affect the hypothesized linkages or provide alternate explanations of findings. There
has been a good deal written on the role of various organizational and contextual factors that
can affect program implementation (Fixsen et al., 2005); however, the identification of which
factors may be operating in a given situation still may pose a significant challenge to
evaluators.

A final consideration is that the contribution story will only be as good as the evidence that is
amassed supporting the links in the chain. Although a contribution approach reframes the
attribution problem somewhat, the need remains for a rigorous approach to gathering evidence
to support the intervention’s role in outcome change. There are many quasi-experimental
designs that could be used in such evaluations, and discussion of them all is beyond the scope
of this article. However, I would like to highlight three possible approaches for consideration
in future OE evaluations.

One approach that may offer promise when OE interventions are expected to produce
relatively immediate impact is a group of designs collectively known as single system designs
(Nugent, 2010). In these approaches the target of change in an organization is measured
repeatedly under differing conditions (for example, a baseline and an after-action phase). If a
change in the outcome of interest is observed after the intervention takes place, evidence is
provided for the effectiveness of the intervention. If the intervention is introduced to different
groups at different points in time, and an effect on outcomes is observed after each of several
implementations, the case for the intervention’s impact on outcomes is strengthened. These
designs are often used with a relatively small and manageable number of data points and may
lend themselves to examining changes in CFSR outcomes where data is collected routinely
and is available for an extended time period.
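
As a purely illustrative sketch of this AB (baseline/post-intervention) logic, the Python fragment below compares hypothetical repeated observations against a simple two-standard-deviation band around the baseline mean; the data values, phase lengths, and decision rule are assumptions made for the example, not a prescription for analyzing real CFSR data.

```python
# Minimal AB (baseline/post-intervention) single system comparison.
# Outcome values and the two-standard-deviation band rule are illustrative only.
import statistics

baseline = [0.62, 0.60, 0.63, 0.61, 0.59, 0.62]   # repeated pre-intervention observations
post = [0.66, 0.68, 0.67, 0.70, 0.69, 0.71]       # repeated post-intervention observations

mean_b = statistics.mean(baseline)
sd_b = statistics.stdev(baseline)
upper, lower = mean_b + 2 * sd_b, mean_b - 2 * sd_b

# Post-phase points falling outside the baseline band are treated as evidence of change.
outside = [x for x in post if x > upper or x < lower]

print(f"Baseline mean {mean_b:.3f}, band {lower:.3f} to {upper:.3f}")
print(f"{len(outside)} of {len(post)} post-intervention observations fall outside the band")
```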

Another design option that has been recommended (Barth, 2009) for improving the rigor of
evaluation in human services is propensity score analysis. This technique is used to reduce the
effects of selection bias and to create a comparison group that is as similar as possible to
program participants. Regression equations are used to predict which individuals from a group
that does not receive the intervention would be likely to participate if offered a similar
opportunity. Propensity score analysis has the potential to help control for important
differences in contextual variables (such as individual attitudes and motivation, workload, or
perceptions of supervisory support) in cases where a design is used that compares outcomes in
sites where OE interventions are occurring to outcomes in sites where they are not.
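
The following sketch shows the basic mechanics implied here, assuming scikit-learn and pandas are available: a logistic regression estimates each case's probability of being in an OE-intervention site, and treated cases are then matched to their nearest untreated neighbors on that score. All covariate names and values are hypothetical, and a real study would require far larger samples and checks of covariate balance.

```python
# Sketch of propensity score estimation and greedy 1:1 matching.
# Covariates, values, and the matching rule are hypothetical.
import pandas as pd
from sklearn.linear_model import LogisticRegression

df = pd.DataFrame({
    "treated": [1, 1, 1, 0, 0, 0, 0, 0],                     # 1 = OE-intervention site
    "workload": [22, 25, 19, 24, 30, 18, 21, 27],            # hypothetical covariate
    "sup_support": [4.1, 3.8, 4.4, 3.9, 2.9, 4.3, 3.5, 3.1], # hypothetical covariate
})

X = df[["workload", "sup_support"]]
df["pscore"] = LogisticRegression().fit(X, df["treated"]).predict_proba(X)[:, 1]

# Match each treated case to the untreated case with the closest propensity score.
controls = df[df["treated"] == 0].copy()
matched = []
for _, case in df[df["treated"] == 1].iterrows():
    idx = (controls["pscore"] - case["pscore"]).abs().idxmin()
    matched.append(idx)
    controls = controls.drop(idx)   # matching without replacement

print(df.loc[matched])              # the constructed comparison group
```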

Finally, OE evaluation exists within a nested set of relationships (individuals within
units, units within programs, and programs within agencies), which lends itself to approaches to data
analysis such as hierarchical linear modeling (HLM). This type of analysis considers effects at
multiple levels, and thus is ideal for the type of questions OE evaluation deals with. However,
it requires large amounts of data and may not be practical except in large-scale, cross-site
evaluations.
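
A minimal two-level sketch of this idea, again with hypothetical variable names, fits a random intercept for each agency so that worker-level effects of OE participation are estimated while between-agency variation is modeled explicitly.

    import pandas as pd
    import statsmodels.formula.api as smf

    def fit_two_level_model(df: pd.DataFrame):
        # Fixed effect for OE participation; a random intercept for each agency
        # captures between-agency differences in the outcome.
        model = smf.mixedlm("outcome ~ oe_participation", data=df,
                            groups=df["agency_id"])
        result = model.fit()
        print(result.summary())
        return result

Fully nested three-level structures (units within programs within agencies) require additional variance components, which is part of why such models demand the large, cross-site data sets noted above.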



Conclusion

As human services agencies become more aware of the importance of systemic and
organizational approaches to implementing innovation, evaluators also must develop
strategies to meet the challenges of conducting evaluation in a complex and ever-changing
system. Traditionally, evaluation has regarded complexity and programmatic evolution as
sources of error to be controlled, and has conceptualized a program as having an ultimate goal
of disseminating a well-defined service to a well-defined population of users. Newer
approaches to evaluation build on implementation science and recognize the need to
understand the mechanisms by which the context of an intervention facilitates or hampers
change efforts. However, much of the discussion regarding implementation science occurs in
the context of evaluating evidence-based practices. With these interventions, the ultimate goal
of evaluation often is conceptualized as developing an evidence base for a clearly defined and
standardized intervention. While most programs are not expected to be completely static, OE
interventions have a qualitatively different expectation regarding program development and
change grounded in the concepts of a learning organization and continuous quality
improvement. This has prompted some to suggest that new evaluation strategies are needed
(Horton et al., 2000) for evaluations of organizational improvement. As yet there is no magic
approach to the evaluation of OE interventions that solves all of the practical problems
associated with evaluation in this arena. There is much more work to be done to understand
how OE interventions can be evaluated in ways that are both rigorous and suited to their
complex and fluid nature. However, the lessons being learned from implementation science
and systems theory offer promising directions to explore.

References

American Public Human Services Association. (2010). Organizational effectiveness handbook (3rd ed.). Washington, D.C.: American Public Human Services Association.

Barth, R. P. (2009, May 29). What lies in the future for evaluation in the field of child
welfare? Closing plenary presented at the Child Welfare Evaluation Summit, Washington,
D.C.

Brown Urban, J., & Trochim, W. (2009). The role of evaluation in research-practice
integration: Working toward the "Golden Spike." American Journal of Evaluation, 30,
538-553.

Chinman, M., Imm, P., & Wandersman, A. (2004). Getting to outcomes™ 2004: Promoting
accountability through methods and tools for planning, implementation, and evaluation.
Santa Monica, CA: RAND Corporation. Retrieved from
http://www.rand.org/pubs/technical_reports/TR101.



Cornell Office for Research on Evaluation. (2009). The evaluation facilitator’s guide to:
Systems Evaluation Protocol. Ithaca, NY: Cornell Digital Print Services. Retrieved from
http://core.human.cornell.edu/documents/SEP1.pdf.

Creswell, J. W., & Plano Clark, V. L. (2007). Designing and conducting mixed methods
research. Thousand Oaks, CA: Sage.

Fixsen, D. L., Naoom, S. F., Blase, K. A., Friedman, R. M., & Wallace, F. (2005).
Implementation research: A synthesis of the literature. Tampa, FL: University of South
Florida, Louis de la Parte Florida Mental Health Institute, The National Implementation
Research Network.

Glisson, C., & Hemmelgarn, A. (1998). The effects of organizational climate and
interorganizational coordination on the quality and outcomes of children’s service
systems. Child Abuse and Neglect, 22, 401-421.

Horton, D., Mackay, R., Andersen, A., & Dupleich, L. (2000). Evaluating capacity
development in planning, monitoring, and evaluation: A case from agricultural research.
ISNAR Research Report No. 17. The Hague: International Service for National
Agricultural Research. Retrieved from
http://www.ifpri.org/isnararchive/Publicat/PDF/rr-17.pdf.

Kanak, S., Baker, M., Herz, L., & Maciolek, S. (2008). Building effective training systems for
child welfare agencies. Portland, ME: University of Southern Maine.

Mackay, R., Horton, D., Dupleich, L., & Andersen, A. (2002). Evaluating organizational
capacity development. The Canadian Journal of Program Evaluation, 17(2), 121-150.

Mayne, J. (2008). Contribution analysis: An approach to exploring cause and effect. ILAC
Brief Number 16. Retrieved from
http://www.cgiar-ilac.org/files/publications/briefs/ILAC_Brief16_Contribution_Analysis.pdf.

National Staff Development and Training Association. (2010). A key to success: Guidelines
for effective staff development and training programs in human service agencies.
Washington, D.C.: American Public Human Services Association.

Nugent, W. (2010). Analyzing single system design data. New York, NY: Oxford
University Press.

Parry, C., & Berdie, J. (1999). Training evaluation in the human services. American Public
Human Services Association.



Parry, C., Berdie, J., & Johnson, B. (2003). Strategic planning for child welfare training
evaluation in California. Proceedings of the 6th Annual Human Services Training
Evaluation Symposium. Berkeley, CA: California Social Work Education Center.

Roth, D., Panzano, P. C., Crane-Ross, D., Massatti, R., Carstens, C., Seffrin, B., &
Chaney-Jones, S. (2005). The innovation diffusion and adoption research project
(IDARP): Moving from the diffusion of research results to promoting the adoption of
evidence-based innovations in the Ohio mental health system. In D. Roth & W. J. Lutz
(Eds.), New research in mental health (Vol. 16, pp. 78-89). Columbus, OH: Ohio Department of
Mental Health.

W.K. Kellogg Foundation (2004). W.K. Kellogg Foundation logic model development guide.
Battle Creek, MI: W.K. Kellogg Foundation.



SECTION TWO
STATE OF THE FIELD

The field of training and development in human services incorporates many disciplines,
different roles and responsibilities, and varied human services settings. But amid these
differences are binding similarities as evidenced by the interest in NSDTA’s annual
professional development institute.

The state of the field section is intended to share information concerning current issues,
trends, and potential future developments in the field of training and development in human
services. Its goal is to provide markers of the current developmental state of the field, specific
to human services training and development. While there are similarities between the training
and development functions in business and in human services, there are also many distinctly
different issues that human services training and development professionals encounter.

In the current issue, in-depth organizational system case studies from the Positioning Public
Child Welfare Guidance Institute and four statewide human service systems (Arizona,
Pennsylvania, Texas, and Minnesota) illustrate the use of organization development principles
in action.



EMBEDDING FIELD-WIDE GUIDANCE
FOR AGENCY EFFECTIVENESS:
THE 2010 PPCWG INSTITUTE

Phil Basso, Organizational Effectiveness, and
Bertha Levin, Policy and Programs,
American Public Human Services Association
Contributors:
Anita Light, Policy and Programs, and Melissa Kanaya, Practice Innovation,
American Public Human Services Association

In partnership with Casey Family Programs, the American Public Human Services
Association (APHSA) and its affiliate the National Association of Public Child Welfare
Administrators (NAPCWA) launched the Positioning Public Child Welfare Guidance
(PPCWG) Institute in the spring of 2010. The purpose of the PPCWG is to establish markers
of effective performance for the field of public child welfare. The institute provided a unique
opportunity for six selected agencies to learn to apply and embed the PPCWG through a
structure for change management and a continuous improvement process. Using content from
the change management section of the PPCWG, participating agencies engaged in processes
designed to make improvements in agency effectiveness and, by extension, in outcomes for
children, youth and families.

PPCWG Development and Design

The PPCWG was completed by NAPCWA in 2010 to establish parameters and provide
guidance for effective actions to enable public child welfare agencies to achieve the goal of
providing effective services that improve outcomes for the children, youth, and families
served. It is the culmination of work by leaders and stakeholders in the public child welfare
field to define “what we do, how we do it, and how we must hold ourselves accountable.” The
PPCWG takes into account the unique and complex context in which public child welfare
work is done.

The PPCWG was developed through an initiative that began in January 2007, led by
NAPCWA in partnership with Casey Family Programs. It drew upon significant input and
support from state and local child welfare agency directors and administrators, front-line
social workers, support staff, advocates, academics, researchers, consumers, and
representatives from other human service systems that interface with child welfare, including
juvenile justice, education, and physical and mental health.



The governance structure that was used to develop and vet the PPCWG included the
NAPCWA executive committee, a national advisory committee, a sponsor group, and 14
subcommittees. Each subcommittee developed guidance in one of the following critical areas
of responsibility in child welfare: strategy, leadership, practice model, communications,
administrative practices, workforce, disparity and disproportionality, strategic partnerships,
information management, change management, research, technology, public policy, and
budget and finance.

Each of these guidance domains was determined by the executive committee and advisory
committee to be a critical factor in agency performance. Taken together, these domains form a
systemic framework for agencies that are interested in making improvements that are
sufficiently comprehensive to have the desired impact. The intersections of the 14 domain
areas are depicted in the analytical framework shown in Figure 1.

Figure 1. Interrelationships of the 14 Critical Areas of Responsibility in Child Welfare

[Figure 1, not reproduced here, shows Outcomes and Strategy at the top, with the remaining critical areas (Leadership, Practice Model, Workforce, Strategic Partnerships, Communications, Change Management, Administrative Practices, Information Management, Public Policy, Research, Technology, Budget and Finance, and Disparity and Disproportionality) arrayed beneath them under the headings Service Delivery and Organizational Capacity.]



The PPCWG is based on the premise that public child welfare has responsibility for action in
these 14 areas and that:

• All 14 areas of practice are essential and interrelated
• Agencies are more effective when all 14 areas of practice are aligned
• Performance capacity must be built in each area
• Disparity and disproportionality are reduced by making improvements in each of the
other, interrelated domains

The PPCWG website has sections about each of the 14 critical areas. Each section:

• Defines the critical area, explains why it is important, and enumerates the questions
the guidance will address
• Defines markers for effectiveness for each domain at different levels of how that
domain plays out within an agency: strategy, key processes, operations, and
implementation
• Contains links to other PPCWG pages and resources

APHSA’s PPCWG Institute

The 2010 PPCWG Institute, developed and delivered by staff at APHSA, focused on enabling
participants to generate practical solutions and approaches to their agency and child welfare
practices, and on providing a structure for participants to make real changes in their agencies.
The institute engaged participants in a learning by doing experience that tapped into their own
experiences and provided opportunities to apply PPCWG content to plan, implement, and
monitor a comprehensive change effort. Major objectives of the institute were to have
participants

• Use the PPCWG to define areas of importance and assess the related strengths and
weaknesses of their state or local child welfare system.
• Build structures and processes for continuous improvement that would best support and
sustain the implementation of identified remedies
• Learn from one another and build networks for ongoing support

Design Principles

Both the PPCWG and the institute design were informed by four guiding principles:

• Be holistic. To be holistic means to look at the big picture, whether it is the family or
the work of the agency; consider the context in which the work is done; meet the client
or the agency where they are; and put energy toward closing the gaps in performance and
capacity.
• Be inclusive. To build buy-in among partners and stakeholders, it is important that
change plans are worked through and developed with staff and colleagues. There must
be a process in place to receive input and to communicate progress and setbacks.
• Define effectiveness. The PPCWG defines effectiveness markers for the field. Using
these markers, an agency is able to understand where it is and what it would look like to
be better, as well as analyze its motivation to change. Once effectiveness is defined and benchmarked, an
agency can plan actions and develop tools to move toward its goals.
• Using one’s own experiences as a basis for learning and change. The institute
worked from the principle that the participants bring a wealth of experience to their
improvement efforts. Through the process of sharing experiences, understanding is
gained regarding why things are happening. Together, facilitators and participants can
begin assessing an organization and identifying remedies that will address the issues.
Then the most appropriate guidance to address root causes and fill the gaps can be
applied.

Structure and Process

The institute’s structure and process was based on a learn by doing approach, consistent with
adult learning theory. Six state and local agency teams came together to participate in three
two-day sessions that began in May and ended in December. The sessions were generally
comprised of facilitated discussions versus content presentation. Before and between each
session teams applied the structure for change management and process for continuous
improvement explored during institute sessions. Institute staff were available for on- and off-
site consultation throughout this timeframe.

The institute curriculum provided content related to understanding the PPCWG, particularly
the change management guidance. The institute provided a structure for organizing
sponsorship and committees to support a change effort. It also covered a systematic
continuous improvement process through which participants learned how to define an area for
improvement that would reduce the number of children in out-of-home care. Once an
improvement priority was defined, participants learned how to assess their agency’s strengths
and gaps as well as the root causes and general remedies for those gaps. This provided a
tangible critical thinking basis for participants to plan for, implement, and monitor change
efforts.

The structure and process of the institute paralleled the way agencies work with the families
they serve. Just as the experiences of the families must be considered when planning for
change, so must the experiences of the agencies.
Participants’ experiences were shared in all face-to-face institute sessions to learn about each
other’s challenges and to assess what worked, what did not, and why. This enabled the group
to identify root causes of ineffective performance, capacity-related problems, and what could
be done to solve them.



Participant Selection

Institute applicants were required to address their agency’s perspective on six dimensions:

1. Clear sense of challenges and benefits. Is there insight into the agency challenges
and areas needing improvement, coupled with an understanding of how the institute
could help meet the challenges to reach their proposed outcome goals?
2. General parameters. Is there evidence of compatibility between the agency and
PPCWG values and principles?
3. Readiness for change. Is there a general understanding of the systemic and systematic
nature of effective change efforts as reflected in past experiences?
4. Sponsorship and other resources. Are agency personnel and stakeholders who
would support the effort identified, including those with the authority to empower
change?
5. Team members. Are a project manager and other team members identified who have
the needed experience to identify problems, assess gaps and root causes, and plan,
implement and monitor change?
6. Program areas. Does the identified institute work pertain to Fostering Connections
and the safe reduction of the number of children in the agency's care?

Applicant agencies identified areas of improvement that they wanted to impact through
participation in the institute, and letters of support were written by the leadership of the
agency affirming that each designated team member would participate in all four face-to-face
institute sessions, have time to prepare for each session, and receive the time, support,
resources, and authority they would need after each institute session to plan, execute, and
monitor a change process.

Participating teams ranged from three to five members and included program and
administrative staff from the executive director level to front-line supervisors. One team
included the leaders from partner agencies within their community.

Institute Content and Materials

During Institute sessions teams got to know, support, and learn from each other's experience
as well as understand the nature and content of the PPCWG and how it could be applied to
day-to-day activities. In this way participants developed a clear picture of the strengths and
weaknesses of their organizations and designed strategies for change specific to their
situations.

Pre-Institute

Before the first session of the institute, participants were asked to read the strategy, change
management, leadership, and public policy sections of the PPCWG as illustrative of the
PPCWG as a whole. During pre-institute calls with the assigned APHSA staff facilitator each
team was instructed in the use of an agency readiness tool to assess its current capacity for
change. This tool was organized around the dimensions of organizational readiness,
leadership readiness, general readiness to improve and innovate, and staff readiness.

Teams were also introduced to the idea that effective learning and improvement relies on
teams that:

• Are sincere about their intentions for participating in learning and improvement
programs.
• Make things happen through a bias for action and learning by doing
• Make connections between what they want to improve today and what they need to
improve overall to achieve goals and sustain progress
• Know when they need on-the-ground support and ask for it
• Accept advice and direction on ways to do things differently

Participants were asked to think about and be prepared to discuss a critical area from the
PPCWG analytical framework that their team would work on during the Institute. This area
for improvement would be further defined in their charter with their sponsors; the
process of sponsor group formation and chartering would be an integral part of the first
institute session.

In addition, teams were asked to start thinking about (1) a sponsor group of people with the
authority and willingness to provide the support needed to implement the work the team decided to
do, and (2) a continuous improvement team (CIT) that would be engaged to manage and
ensure work was done.

Session 1

The first session was structured for the participants to get to know each other better so they
would be more likely to work as a team as they developed their agency improvement plans. A
case study for each agency would be written to document challenges and approved by the
respective jurisdiction prior to its release by APHSA; participants signed release forms for
video and pictures. Each team reported to the group on their perceived readiness and what
readiness factors might need to be addressed to meet the challenges that a change effort would
present.

The agencies showed considerable variation in level of readiness to change. Each identified
non-monetary resources they needed to bring change about, including human capital and
inter-organizational partnerships. There was a general acknowledgement that information
technology and human resources had to be part of any change process and be represented on
change teams. The teams also reflected that often change efforts are conducted without
consideration for what is going on culturally within the agency or what staff capacity is
available to implement a change effort. For example, if a state or locality was charged with
implementing a new program or practice that would achieve improved outcomes for children,
a systematic change planning or implementation process would not typically be employed,
thus negatively impacting the success of the new program or practice.

The second day of the first session centered on APHSA's model for continuous
improvement, DAPIM. This method focuses improvement teams on the following themes:

• Defining a problem leads to its clarification and the ultimate aim of an improvement
effort, positioning a team to increase engagement and commitment.
• Assessing a problem requires critical thinking to uncover root causes. Doing so saves
time and minimizes frustration by ensuring that activities undertaken for change are
relevant. Root cause analysis is critical for identifying improvements that have the
greatest chance of paying off.
• Planning is a preparation process in which concrete remedies are developed and tasks
and activities are mapped out to achieve goals and objectives.
• Implementing includes assigning tasks with leads and timeframes, and assuming
responsibility for managing the process toward heightened accountability.
• Monitoring promotes learning and adaptability and paves the way for updating plans
toward achieving the desired impact.

Figure 2. APHSA's DAPIM Model for Continuous Improvement

[Figure 2, not reproduced here, depicts DAPIM as a continuous cycle of Define, Assess, Plan, Implement, and Monitor, applied to performance and capacity.]



Intersession work was reviewed with the teams. The expectation was that by the time the
teams returned in July, each would have finalized the sponsor group and continuous improvement team
rosters, have completed their organizational assessment, and created a charter for their
identified project.

In the session’s wrap up and evaluation, participants reported that they liked:

• Rotating the facilitators (four were used in this institute)
• The encouragement to participate actively throughout the session
• The ability to speak openly
• The frequent check-ins to ensure everyone was engaged in the process
• Meeting with the facilitators during the team work time to help them apply learning,
gain clarity in their goals, and work on a charter

They requested that subject matter content be presented for shorter periods and be
interspersed with more hands-on work so it would be more concrete and less theoretical. This
feedback affirmed the institute design approach of minimizing lectures.

Intersession

During the first intersession, facilitators assessed the strengths and struggles of teams and
developed curriculum for the next session accordingly. Support to teams was based on
varying needs and ranged from telephone calls, to site visits, to meeting with sponsor groups
and helping form and facilitate continuous improvement teams. Each team had their own
challenges and experienced varying degrees of success based on their grasp of the process and
challenges within the agency and community served.

Lesson Learned

This intersession experience was consistent with the learning-by-doing principle. As teams
began to struggle with new ways of doing things, the facilitator/consultants used that
experience for coaching, encouragement, and support. Intersession experiences provided real-
world material to process in the second institute session.

Session 2

The teams reported on their intersession activities and their assessment of what was going
well and what needed improvement. Reports were divided into two sections: (1) sponsor
group formation, chartering, and continuous improvement team formation and (2) defining
and assessing areas for improvement. It was anticipated that the teams would have
accomplished the first two steps of the DAPIM process.



Each team’s sponsor group and continuous improvement team rosters were in various stages
of development. While the concept of having a sponsor group for a change initiative may not
have been new to some teams, the utilization of it in a change process as a critical first step
was. Teams experienced problems in firming up the sponsor group roster and securing sponsors'
subsequent engagement because of the limited availability of the needed sponsors and the time
required to follow through with those initially identified. The continuous improvement team roster
was difficult for some to finalize because of a lack of the resources and time initially promised
by the sponsors, legal constraints on some potential members of the continuous improvement
team, and a lack of buy-in from staff needed to carry out implementation tasks.

The charter template was found to be helpful, but the teams’ charters were also in various
stages of completion. Teams approached the charter work in various ways—some as an
individual activity and others as a collaborative effort engaging colleagues, staff, and external
stakeholders. Some of the challenges in completing the charters reflected the challenges of engaging
and finalizing the sponsor group and continuous improvement team rosters. Teams that were
able to present charters to their sponsor groups faced responses ranging from suggestions to
narrow the scope of the work to suggestions to broaden the scope of the work.

All participating teams initially focused on one defined area for improvement as outlined in
their application for the Institute. But as they assessed their organizations’ readiness, and
began to assess their strengths and gaps for their initially defined improvement area, many
teams elected to either broaden or refocus the scope of their project. This is consistent with
the DAPIM method, which operates as a flywheel with interrelated steps, not as a linear
method. For example, defining a topic for improvement drives what you would initially focus
on for assessment; conversely, as you begin to assess the initially identified area of
improvement, your findings may help you refine what it is you want to improve.

By the end of the first day in Session 2, it was clear that the teams had achieved varying levels
of success. Facilitators adapted the institute curriculum and engaged participants in a learning-
by-doing experience specific to the institute itself. In real time on day 2, the facilitators used
the process of DAPIM to make adjustments to the overall timing of the institute. Participants
learned more about how to take an issue, define it clearly, assess the strengths and gaps
related to the issue, plan for change, implement the change, and monitor its progress. Near the
end of the second day, teams worked in small groups with their APHSA staff liaisons to
develop customized action plans for meeting their commitments by the next session. In the
session’s wrap up and evaluation, participants reported that the “mini-DAPIMing” of the
institute itself was very important in the learning process, taking a theoretical concept and
applying it to a real-time experience embedded the concept.

Lessons Learned

While establishing a sponsor group and continuous improvement team roster may seem like a
simple task, it is one that is overlooked in many change initiatives. When these teams are not
in full operation as outlined in the PPCWG change management chapter, new programs or
practice improvement initiatives cannot be successful, particularly when there is a crisis or
workforce capacity issue that might derail the initiative. It was also clear that the guidance
related to change management and strategy is fundamental to initiating a change initiative for
a new program or practice improvement. Teams reported many connections between their
intersession efforts and the PPCWG domain materials, including:

Change management: used the principles of inclusiveness and being holistic in the
development of sponsor group and continuous improvement team rosters. For
example, support functions (HR and finance) that could conceivably impede project
progress for legal and resource reasons were deemed important to include.

Strategy: (1) Established both quantitative and qualitative outcome goals and
recognized the need for both (e.g., it is one thing to serve greater numbers and another
to ensure that the services are appropriate and adequate). (2) Embraced the concept of
slowing down to speed up. It is necessary to go slow enough to ensure sound planning,
to think methodically about outcomes, to promote buy-in with staff and stakeholders,
to ensure that all good ideas are considered, and to provide enough time for effective
implementation. Slowing down in this way then enables the team to speed up, reinforcing a sense of
urgency that the improvement work is critical to outcomes for children.

Intersession

During this intersession APHSA liaisons maintained regular, ongoing engagement with their
assigned teams by way of telephone contacts and site visits. The liaison intervention depended on
team needs and receptivity. On-site activities included DAPIM define and assess work, such as
uncovering root causes by conducting focus groups with front-line staff. Planning and
implementation consultation was provided to the agencies that were further along in their
continuous improvement effort. The role of the facilitator was to share observations and
challenge participants’ decisions to enable their success.

Session 3

This session included extensive team reports covering their define and assess efforts as well
as the three plans: continuous improvement, communications, and capacity. Structured
presentation questions to be addressed were developed and provided to the participants to
guide and focus their reports.

In Session 3 it became clear that all teams were adapting the institute's materials to their
own particular situations. Institute staff endorsed this and reinforced support for the teams
moving at different paces. Doing so seemed to increase the teams' energy and momentum for
completing their projects in a timeframe that made sense to them. The team reports clearly
showed that participants were drawing upon many of the PPCWG chapters. Every team report-out
included a "because we realized" comment that referenced a PPCWG chapter.

It was also evident that the four PPCWG and institute principles that resonated with the
participants in Session 1 were embedded in their work. The participants were comfortable
critiquing each other as supportive peers. And the teams were, both overtly and intuitively,
applying mini-DAPIMs to tasks within their plans. When this was pointed out, the response
was "well, it just became part of the way we do things."

Teams also noted that they had become more mindful and intentional in their work,
recognizing, for example, that slowing down the change process may actually lead to better
outcomes. Teams acknowledged that they were each in a different place in the change
process, but felt that they were all moving in a positive direction. Each team was also able to
look at where they were on the DAPIM flywheel and clearly understood how they got there.

The schedule for day 2 was adjusted to accommodate participants’ having more time for
cross-team feedback. Facilitator presentations covered implementation, monitoring,
sustainability, and data.

Lessons Learned

As the teams began to internalize the PPCWG and change management techniques, they were
able to apply the content presented in prior sessions. Understanding content from the guidance
does not happen easily or immediately, but it is application that produces understanding and
continued use. Some of the content that was specifically utilized by the institute teams:

• Capacity assessment helps determine appropriate options when initiating and planning
a change effort.
• PPCWG critical areas of budget and finance, public policy, research, and technology
are the foundation for strengthening and supporting the work of others; play a vital
role in improving practices; make work easier, better, and faster; and help in allocating
resources.
• Quick wins need to be used fairly early in a change process, and providing stories of
how quick wins are having an impact helps sustain the momentum for change.
• Key messages within well-developed communication plans, backed up with solid data
and commentary, are a vital lever in sustaining the momentum and energy for a
change effort.
• One can hire staff for their core competencies, but making a difference in terms of
safety and permanency requires analyzing data in ways that drill down to the basic
units of accountability and improved case planning.
• Resistance to change can be constructive or non-constructive, and it is critical for
those leading change efforts to engage resisters, understand the reasons for their
resistance, and respond accordingly.



Intersession

The major work during the intersession was to review the project progress, identify what still
needed to be done, and, most importantly, link the activities to outcomes. APHSA staff engaged
with the teams as in previous intersessions, helping them prepare for the formal closure of the
institute and to plan for sustaining their momentum for change. All the teams were at different
stages of progress at this point, but each team’s efforts incorporated the following:

• Gain sponsor group support and approval for improvement plans
• Capture lessons learned from quick wins and make related adjustments to them
• Implement continuous improvement, communication, and capacity plans
• Monitor the progress, impact, and lessons learned from implementing these plans and
make revisions as necessary
• Establish measures and monitoring methods for directly linking improvement efforts
to outcomes goals
• Consider what would be needed to sustain the team’s change effort in 2011

Session 4

Each team reported on their overall progress since the last session, including the barriers to
progress that they anticipated going forward. Each team also presented data dashboards for
connecting improvement efforts to outcomes, next steps for implementing these on an
ongoing basis, and a plan to monitor their dashboards and use them to make ongoing adjustments.

Implementation and monitoring concepts and tools were presented in an interactive lecture
format, and the issue of post-institute sustainability was addressed in a facilitative manner.
Barriers to sustainability were identified, and potential ways to address them were considered
with input from the teams, including:

• Support from above (sponsors, new leaders, and others who may be needed). The work
already done on the project and the PPCWG could be used to educate, inform,
and connect with new leaders. Teams should stay persistent and tenacious and
maintain connections through all parts of the agency for support. Teams should use
outcome measures to support their projects and provide a compelling vision to those
with the authority to empower them.
• Workforce and continuous improvement team capacity and financial resources. This
must be continuously evaluated as to its impact on the change initiative because of the
current economic climate of budget cuts and staff layoffs. At times, change efforts
should be limited or slowed down in line with available capacity to support them.
• Fidelity to the PPCWG/APHSA organizational effectiveness model for change
management and continuous improvement. This may be difficult without someone
checking in, asking questions, and providing feedback; without the peer pressure of needing to
present a progress report to the institute group; and without the value of the ideas and support
received from the group. APHSA will explore the possibility of setting up a virtual
group on the website to enable all team members to continue to check in with each
other and gather group support and ideas.
• Not seeing improvements. Teams need to monitor plan implementation to determine if
more time is needed, or if the plan must be adapted to emerging data. Measures
themselves may also need to be reconsidered from time to time.

The session and institute came to a formal close with a celebration of the progress each team
had made, and statements from team members regarding how the institute experience had
helped them and their agencies.

Reflections through the Lens of APHSA’s Evaluation Framework

As we look to further implement the PPCWG in the field of child welfare, it is important to
understand where child welfare agencies can utilize the guidance themselves and where
consultation is needed for them to do so successfully. The institute offered a glimpse into the
criteria that will ensure successful implementation of the guidance.

Agency readiness assessment assumed much greater importance as the institute unfolded. The
related tools resonated with the participant teams and appeared to the APHSA
facilitators/liaisons to be a useful gauge for determining the speed, scope, and required
support for a change effort. Participants commented that readiness assessments provided
insight into the scoping of improvement priorities, and they suggested that such tools be used
before and during a change effort.

It is critical not to supply too much information to participants before they have a chance to
apply it. Participants were prepared to move quickly through the curriculum and were eager to
get to break-out activities. The first day of Session 1 was about 50 percent lecture
and group presentation and 50 percent discussion. Lecture periods did not exceed 20 minutes,
but participants indicated they thought the lecture component was too long and theoretical. It
became clear that the greatest learning takes place among peers, so even brief lectures should
be punctuated with discussion periods.

There must be detailed exploration of how well the participants understand what they are
supposed to do in an institute well before the first session.

Conducting a mini-DAPIM on the institute itself assumed great importance as a way to model
and reinforce the need to be systematic throughout any improvement effort.



Team Updates

Though each agency focused on different areas for improvement, the changes that each
envisioned through applying the PPCWG analytical framework and domain content,
supported by the institute's change management structure and continuous improvement
process, resulted in enhancements in agency capacity or practice. These enhancements are
perceived by the participants as positively impacting the capacity of families to provide safely
for their children in their own home.

Each participating agency team perceives a cause-and-effect link between improved practice and
the reduction of children in out-of-home placement:

• Kansas: By the close of the institute this team was poised to integrate the institute's
change management structure and continuous improvement process into daily
operations and at all agency levels (state, regional, and local). In June 2011 this team
reported that each region was employing the structure and process, and that the
approach was taking hold on an informal, grassroots level in many local settings.

Throughout the time that they participated in the institute, the project team set targets
for regions to work toward. The goal was to move forward a previously successful but stalled
process for safely reducing the number of children in out-of-home care. In state fiscal
year 2011 two of the six regional offices achieved their benchmarks. Between July 1
and November 30, 2010, Wichita reduced the number of children in care by 62, from
998 to 936. During the same period, the southeast region reduced the number of
children in care by 38, from 520 to 482.

On an aggregate level, there were 8,362 Kansas children in care at mid-year 2009,
8,275 at mid-year 2010, and 7,685 at the end of April 2011.

• Larimer: The Larimer team has been working with their community's Family
Consumer Council (FCC) to help it become more effective and a stronger community asset.
In June 2011 the team reported it was very much engaged in the effort to strengthen
the FCC and was applying structure and process lessons from the institute in various
and novel ways.

Larimer reported that in July 2009 there were 251 children in care. In July 2010 there
were 196 children in care, and as of May 2011 there were 199 children in care. During
that two-year period the rates of recurrence of abuse remained virtually the same. It is the
hope that once the FCC’s strategy is fully realized, the community will be more
responsible for caring for the children and families within it, safely reducing the
number of children in placement with the public agency.



• Charlottesville: By taking steps to create an agency environment that encourages open
dialogue within and across divisions, resulting in better coordination of resources and
improved supervision, this agency’s supervisory team hopes to increase staff capacity
to offer effective and efficient client services. As a result of improved services, the
team believes that vulnerable populations will be safer and children who might previously have
entered placement will be able to remain in their own homes. Reduced entry
rates will most likely lead to a reduction in the number of children in out-of-home
placement.

Results through early 2011 were promising. There was a 20 percent annualized
decrease in foster care placements and a 22 percent annualized increase in discharges
to permanency through March. In 2010, the agency facilitated one family partnership
meeting compared to 11 in early 2011. At the beginning of 2010, Charlottesville had a
total of 215 children in foster care and in December of 2010, the number of children in
foster care had decreased to 106.

• Calaveras: This team believes that by lowering the number of referrals coming into
the protective services hotline, through analysis of why the rate of calls is so
high and of where service needs may be met by other community agencies, staff
capacity will be increased to provide essential services to families where the risk of
out-of-home placement is the greatest. With improved and appropriate
services to these families, more children will most likely be able to remain safely at
home. This will hopefully result in a reduced number of children in out-of-home
placement.

Calaveras is a small locality, and from December 31, 2009 to December 31, 2010 the
number of children in out of home care remained the same at 74. At the end of May
2011 that number had increased to 79.

• Baltimore: By reinvesting financial resources from high-end residential care into
community services, Baltimore hopes to step down children from high-end facilities to
less costly community and family settings, including safe reunification with their own
families and relatives. They hope that such community service initiatives will have the
broader effect of enabling families to keep children at home and avoid placement
entry. They hope that both the step-down/reunification and avoidance routes will
lead to a reduced number of children in out-of-home placement.

As of January 31, 2010, the County of Baltimore had 595 children in care. By June 30,
2010, the number of children in care had decreased to 560. On May 31, 2011, the number of
children in care had increased to 572.



• District of Columbia: A focused, engaged, and systematic continuous improvement
process is taking shape, and by April 2011 the agency was implementing a range of
quick wins and planning to do so for a range of other remedies.

By successfully embedding the agency's practice model in a way that engages the
entire agency, DC believes that it will establish better collaboration between its
casework functions and administrative functions (such as legal, fiscal, and human
resources). They believe that this, in turn, will increase the capacity of the direct
service staff to improve their services, strengthen parental capacity to provide for children
safely in their own homes, and reduce the number of children in placement.

As of December 31, 2009, the number of children in out-of-home care was 2,103 and
as of September 30, 2010, the number of children in out-of-home care decreased to
2,092. Since DC initiated improvement plans in the spring of 2011, trends are
expected to become evident later in the year.

Looking Forward

Casey Family Programs is providing funding through 2011 to follow up with the agencies
engaged in this first institute and to provide assistance as needed to sustain improvements.
Depending on the needs of the agency, the following types of assistance will be offered:

• Periodic telephone follow-up to inquire how the project is progressing
• Website section to enable ongoing interactive communication among the teams
• Onsite assistance by the APHSA staff liaison

Any future PPCWG institute will be redesigned based on any new information gleaned from this
year of follow-up and on lessons learned from the 2010 cycle. Changes based on this institute
would include:

• Increasing pre-institute preparation to move the teams further into the process prior to
the first in-person session
• Providing an even clearer and more realistic preview of the work effort required by the
institute
• Clarifying intersession responsibilities to ensure that participants understand fully and
are able to connect their ongoing efforts to the PPCWG
• Conducting the institute over a full year rather than nine months to ensure that all
agencies, regardless of their readiness, have the opportunity to get to the
implementation phase of the work
• Reducing up-front reading of the guidance, focusing more on how to navigate the
PPCWG website and find the parts within each domain area that are applicable to a
selected project as participants proceed through improvement efforts



IMPROVING LOCAL OFFICE SUPERVISION:
ARIZONA DEPARTMENT OF ECONOMIC SECURITY

Phil Basso, Organizational Effectiveness,
American Public Human Services Association, and
Elizabeth Hatounian, Organization and Management Development,
Arizona Department of Economic Security

Background

Since the mid-1990s Arizona’s Nutrition Assistance program has been one of the fastest
growing in the country. It has experienced many ups and downs.. The federal Food and
Nutrition Service (FNS) supervises the Supplemental Nutrition Assistance Program (formerly
known as food stamps). In 1996, the FNS levied a hefty penalty on Arizona for substandard
performance during the prior three years. In 1999 and 2003, Arizona earned bonus funding for
being among the top performing states. In 2007, Arizona was penalized $1,500,000 for being
one of the poorest performing states during the preceding two year period.

FNS and Arizona settled on an agreement for the agency to invest 50% of the sanction, or
$750,000, on new strategies for performance improvement, with the remaining 50% held in
abeyance pending the results of this investment. The state also faced class-action litigation
regarding its food stamp application processing.

Arizona’s Nutrition Assistance program is part of the Family Assistance Administration


(FAA) within the Department of Economic Security (DES). At the same time as FAA was
exploring ways to improve its performance, the entire agency was undertaking efforts to
enhance itself. DES had developed a formal business model, which identified: (1) DES vision,
mission, goals, and values; (2) related foundational strategies for better serving the public; and
3) strategies to strengthen the agency itself. One of the core strategies to strengthen DES was
to ensure a quality workforce. This included ensuring that the agency was hiring the right
people for the right positions; aligning training, development, and performance expectations
with its organization’s values; and enhancing training and supports for supervisors. This fell
squarely in line with what FAA needed to do to improve its performance and to satisfy the
terms of the Settlement Agreement with FNS.

Through an internal analysis, FAA determined that its staff recruitment, development, and
retention were deficient and in need of improvement. FAA also determined that the key
positions of first level supervisor and manager had the greatest impact on performance and
retention, yet were the least supported in terms of training and development. DES set out to
find a supervisor and management development program that would help the agency reach
and maintain a high level of performance over time.

APHSA’s Approach

In early 2007 DES learned of a relatively new APHSA continuous improvement program that
had met with early success for a few state agency teams that had implemented it. The initial
request for APHSA support was for a two-day intensive continuous improvement
seminar/workshop for all FAA supervisors in Arizona’s most populated county.

APHSA’s initial consulting approach was to back away from the suggested intervention and
take things from the top—to probe for the agency’s concrete goals, objectives that the agency
believed would best address these goals, current strengths and gaps related to these objectives,
root causes for the identified gaps, and potential remedies for addressing the root causes.

Through a structured process of reflection, it became clear that FAA was certain about its
goals, objectives, and related gaps, but less certain about the root causes and remedies for
their primary gap—supervisory effectiveness. Through discussion of the initial request, it
became clear to FAA that while effective supervision was an important gap that needed to be
addressed, there were many reasons besides lack of training for the gap. Further, those reasons
were likely different and nuanced for each local office management team, subject to the
motivations and collaborative spirit of each of these teams. It also became clear that the
remedy employed would need to be focused on working directly with these actual teams to
continuously improve themselves, in essence “learning by doing.”

After discussing with APHSA concrete alternatives to a classroom-based training approach,
FAA decided to develop an innovative supervisor development program and pilot it in 11 of
its largest and underperforming local offices through two phases of facilitated improvement
work. Once the pilot was mature enough for lessons to be learned and positive results to be
seen, the agency planned to use internal resources to replicate the pilot within all 85 of its
local offices.

FAA formulated a plan to employ this program and contracted with APHSA to deliver
consultative services through direct facilitation work with the initial 11 office management
teams. In July of 2007 FAA kicked off the program with a full day workshop wherein the
concept of learning by doing as a method of continuous improvement was introduced to the
management teams for the first five participating offices. Also in attendance were a number of
management personnel representing policy development, training, systems, quality control,
and quality improvement. This initial workshop provided a forum for identifying and
understanding both agency and office priorities, which established the groundwork for each
team’s activities.



In the ensuing months, each local office team worked on areas they had identified as most in
need of improvement. The APHSA consultant facilitated these teams through five steps for
continuous improvement:

1. Define the areas for improvement in operational terms
2. Assess the current and desired state, and identify root causes for any related gaps
3. Plan the solution or remedial action
4. Implement the plan and
5. Monitor the progress and outcomes, adjusting plans as necessary

This DAPIM model provided a systematic approach within which team members learned
continuous improvement by actually solving real problems they had identified. Many times
throughout the process, as teams employed the DAPIM method, a tool was introduced from
the consultant's tool kit or emerged from the team's own work as a solution to a particular
problem. This is where more traditional training content fits into the DAPIM approach,
ensuring immediate application in the real world.

Another key aspect of the DAPIM method is to initially identify and work toward quick wins
that can be achieved within a month. Experiencing early success serves to reinforce the
approach and motivate team members to continue working towards mid- to long-range
solutions that may be more complex and time-consuming. For continuous improvement, the
analogy of turning a heavy flywheel was given to describe the difficulty and effort necessary
to make the first turn. Once momentum is established, the second and subsequent turns of the
flywheel become increasingly easy and intuitive.

One team, for example, created a time management tool to help them understand why
supervisors were not able to meet all of their job responsibilities during the work day. By
charting the time they spent on each activity, it became evident that supervisors had been
devoting their time to work that belonged to their staff. This phenomenon, described by
APHSA as “unconscious demotion,” had eroded the available time supervisors had to actually
supervise—limiting staff development, increasing the urge for supervisors to “take over” the
work of their staff, and intensifying this negative cycle.

Reversing this cycle became a central focus of this office’s supervisory team, initially in ways
that were quick wins, including improving meeting management, staging some fun and stress-
reducing activities, and eliminating some low-priority reporting activities. Over time, though,
the team began to consider strategies for mentoring and developing staff that would take more
time—rotating leadership responsibilities among staff members, developing trust through
thoughtful recognition, communication and teamwork activities, having more effective
“crucial conversations” with staff, establishing clear and consistent supervisory standards
throughout the office, delegating decision-making authority where appropriate, and
strengthening their performance management practices with support from HR.



Overall Structure of the Intervention

Phase I

The APHSA consultant’s Phase I work with the first five teams was completed in October
2007, with all five teams reporting successful results from their quick win solutions and
putting longer-term continuous improvement plans in place as well. Most importantly,
participants on these teams had learned the DAPIM continuous improvement methodology
and continued to add strategies to their tool box on their own. Finally, each team enhanced
its level of trust with other teams and with upper management by having open and
constructive discussions, often about difficult and revealing subjects.

These observations were made by the participating teams themselves, as well as by their
managers, who participated in the facilitated sessions as observers, and most of FAA’s senior
management, who participated in monthly monitoring sessions. These sessions and the
participating management, including DES human resources and training staff, came to be
known as the “sponsorship,” better ensuring sustainability in terms of priority, time, and other
resources.

Phase II

FAA sponsorship and APHSA began Phase II of the effort by further considering the question
of sustainability and how to replicate the learning-by-doing approach as broadly and quickly
as was practical. Upon identifying the six additional local office teams to work with the
APHSA consultant, four agency employees who had served as observers during Phase I were
designated to be co-facilitators. The intent was for co-facilitators to implement the DAPIM
model at subsequent sites and, in effect, learn by doing themselves. Session observers were
expanded to include members of each office’s regional management team, ensuring a stronger
linkage between improvement and alignment efforts at different tiers of the organization.
Phase II of the effort met with similar success at the local office level, this time with
significant interest from a number of participants in becoming DAPIM facilitators in the future.

The APHSA consultant was therefore tasked to first demonstrate the DAPIM and facilitation
methods involved, and eventually to train internal FAA staff to facilitate the DAPIM process.
People who were chosen to learn the facilitation process were familiar with FAA, the offices,
and the people. Their role as a DAPIM facilitator was in addition to other job duties. Most had
little or no experience facilitating groups, but their commitment and passion for the DAPIM
process as a way to improve organizational effectiveness made them eager learners.

The process to train a DAPIM facilitator took three to six months. After observing the
process, the new facilitator was teamed up with an experienced facilitator to work with an
office as co-facilitators. Each prospective facilitator, then, had the opportunity to observe an
experienced facilitator work with an intact workgroup from start to finish. They also
participated in facilitator development sessions, where peer facilitators shared their progress,
raised questions and challenges, and received advice and encouragement from the rest of the
group. These sessions were a key element of learning the facilitation process, as they provided
participants time to share, reflect, and learn from their own personal experience.

At the Phase II sponsorship meetings, 10 additional internal facilitators were identified for
subsequent phases of the effort, including three staff from the DES training function. In
jointly planning for Phase III, to begin in March 2008, FAA sponsors and APHSA also
decided to develop an organizational effectiveness (OE) handbook to support FAA
facilitators in continuing their practice well after APHSA had stopped providing
direct support. This handbook would contain all of the models, tools, templates, activities,
examples, and narrative guidance that would be needed for an internal continuous
improvement facilitator.

At this point, FAA was viewing the DAPIM and learning by doing methods as an emerging
way of doing business for all of its management levels to learn and employ. In addition, after
receiving DES executive leadership support, plans were launched by the DES training
function to incorporate the approach in other areas of staff development throughout the
agency.

Phase III

In Phase III, four FAA facilitators from the first cohort worked alongside the new facilitators
in the local offices, with the APHSA consultant observing on an ad hoc basis in more
challenging local office settings or where internal facilitators were not yet available. The local
offices selected were a combination of five from the first 11 offices that still needed further
support, and three new local office participants.

At this point there were enough internal facilitators involved with the effort to also stage bi-
monthly sessions where the FAA facilitators' practice was itself formally subjected to the DAPIM
process. Each facilitator would bring successes and challenges to the table, and the collective
team would then consult with one another on strengths, gaps, root causes and remedies, and
plans to continuously improve their individual and collective facilitation effectiveness.
Reports from the participants themselves and at the sponsor meetings cited this activity as
vital peer-to-peer support and development. These sessions were also instrumental in drafting
and revising key sections of the OE handbook, as many of the remedies identified constituted
team activities whereby the group created a new facilitation tool or guidance for addressing
facilitator challenges.

Phase IV

To kick off Phase IV of the effort in September 2008, the first edition of the Organizational
Effectiveness Handbook was delivered to the 16 internal facilitators developed during this effort. From this
point forward the APHSA consultant was used primarily as a coach, either from a distance by
phone and email or through four additional in-person sessions with the FAA facilitators.

At the beginning of this phase the sponsor group and director of FAA established eight 2008-
09 goals and objectives for sustainability:

1. Regional management teams would all establish a working knowledge of learning by
doing, using DAPIM principles and steps when solving problems.
2. All FAA staff would be informed of the program, its importance, and its benefits.
3. Each region would “own” their Phase IV plan, versus relying on a centralized
sponsoring authority. Region-specific evaluation methods would be in place, designed
from these options: the FAA-developed climate survey; the APHSA readiness model;
completion of DAPIM work products; local office performance results.
4. A fully dedicated learning by doing coordinator position would be established.
5. Every region would have a learning by doing success story, illustrated by the region-
specific evaluation approach.
6. Every region would have a dedicated DAPIM facilitator.
7. Continuous improvement projects would be successfully completed for the regional
leadership team and each region’s management team.
8. Learning by doing would be concretely linked to the overall FAA and DES strategy.

During Phase IV, feedback on the first edition of the OE handbook was converted into a list of
improvements that were delivered by APHSA at the beginning of 2009. The APHSA
consultant was also asked to work with the regional program manager team on the continuous
improvement of how they worked together and with their supervisor, the director of
FAA. Finally, the APHSA consultant began working with a project manager hired in
November 2008 as part of the goals and objectives for strengthening the sustainability of this
effort.

Unfortunately, heading into 2009, DES and FAA entered a period of sudden and severe
budgetary crisis with the onset of the recession. The ensuing changes resulted in a 65%
increase in applications at the same time as a 35% decrease in staffing. For primarily these
reasons, Phase IV wrapped up in January 2009 with a tenuous hold on sustainability.
Sponsorship for the effort at the FAA leadership level, DES executive level, from the training
function, and from a number of the FAA regional managers was not sustained, and the project
manager position created and staffed in late 2008 was eliminated. Today there are solid
pockets of DAPIM and learning by doing being practiced within FAA and some interest in
reviving the effort, but no formal plans for doing so are in place.

DAPIM Facilitation

Local FAA offices were chosen to learn the DAPIM process based on the need to improve
their timeliness and accuracy numbers as well as on the size of their office. Offices were
located in various parts of the state, from Tucson in the south to Flagstaff and Show Low in
the north. The management teams worked with an APHSA-trained DAPIM facilitator in six to
eight full-day sessions to allow enough time to go through and learn each DAPIM phase.

A key to the successful transfer of the DAPIM process to each management team and office is
the DAPIM facilitator. The role of the facilitator is to create a safe environment in which
participants can learn the process through first-hand experience and continue using the
process on their own. A critical piece of this facilitation was getting the participants to reflect
during each phase. This produced two positive results. First, participants became aware of
how their behaviors had either a positive or negative impact on the office climate and culture.
When participants chose to change behaviors, the climate and culture improved along with the
processes. Second, participants learned that when an implemented plan did not work, they
could learn from it and move forward to try again. Over time, participants came to realize the
process is continuous.

This facilitation approach creates a learning-by-doing experience unlike traditional classroom
training that attempts to simulate on-the-job challenges. The learning by doing approach has
several core elements that make it effective:

• Work directly with intact teams that work together every day.
• Build a safe, high-trust, team-oriented learning environment.
• Encourage teams to tackle real life work challenges through creativity and
experimentation.
• Facilitate continuous improvement for the aspects of performance of greatest
significance to the teams themselves.
• Build the capacity of participating teams to handle new and emerging challenges as an
ongoing way of doing business.
• Use team members’ expertise and insight about their own challenges to determine
which developmental models and tools should be introduced and when.
• Use an organizational needs assessment to determine developmental priorities in
alignment with organizational goals and objectives.
• Measure success by identifying concrete improvements to learners’ performance on
the job and to the lives of the organization’s clients.

Prior to the first session, each management team completed a survey and a one-on-one
interview with a DAPIM facilitator. This helped the facilitator determine each team member's
overall perception of office operations and the level of mutual
trust and cohesiveness among the office staff. A key principle of DAPIM is to work on both tasks
and relationships as a holistic approach to organizational effectiveness.

The DAPIM process included a standard set of work products that the teams developed in
their sessions. Work products included identifying initial feelings about the work, setting
ground rules for relating with one another during the sessions, identifying areas for
improvement, and conducting a root cause analysis. Once the root cause analysis was
completed, the teams developed long-term and short-term solutions. Short-term solutions, or
quick wins, were implemented within the first 30 or so days; mid- and long-term solutions
took more planning time for implementation and monitoring. Management teams also were
encouraged by facilitators to include a communication plan to keep staff involved and
prepared for the implemented changes.

In the first few sessions, the DAPIM facilitator introduced the DAPIM process to the
management team, working with them to identify improvement areas for the office in terms
that were operationally specific (define phase). Each targeted improvement area was then
assessed by its current state against the desired state, with strengths, gaps, root causes for
priority gaps, and related general remedies all identified (assess phase). A work plan for long-
and short-term remedies was then established (plan phase). By the third or fourth session,
implementation of each targeted area had begun (implementation phase). Many management
teams by this time were including their front-line staff in contributing ideas for the process
improvements. By the fifth and sixth sessions, the implemented short-term goals were
monitored and adjusted as needed (monitoring phase), while the team continued to develop
and implement their longer-term goals.

Facilitators introduced team activities as needed to address specific relational or process
issues. The team activities addressed issues like trust, open communication, or time
management. Additional sessions with a facilitator were added, when needed, until a
management team built enough confidence and trust to carry on without the facilitator.

The implemented strategies that came out of each office varied even though the desired
business outcome was the same: to improve supervision and improve timeliness and accuracy
numbers. The variation in strategies demonstrated that once the work teams were taught to
use the DAPIM process, they felt free to be creative, innovative, and open to experimentation.
This was a very important aspect of the process for the individual offices. By
allowing each office to use DAPIM to solve their own problems rather than a one-size-fits-all
solution, each office was able to address specific issues and problems for their location and
demographics.

Results

Local Office Results

Many of the local offices involved in this effort reported that they experienced significant
positive results in supervisory effectiveness and office results. Over time, several local office
management teams continued to use the process on their own without the need of a
professional facilitator. Many supervisors developed self-confidence through their successes
with the DAPIM process. Frontline staff were often brought into the process to improve office
operations, resulting in increased employee engagement in several local offices. Many
relational issues were resolved, which was found to not only have a positive effect on office
morale but also to contribute to success in improving timeliness and accuracy numbers.

Some of the more successful strategies that were implemented included:

• A workload forecasting tool for supervisors
• A community resource fair with community partners
• Cross training of staff to cover the front lobby
• Central Appointment Registration (CAR) calendar
• Protected time for staff and supervisors to focus on processing paperwork
• On-boarding new team members to the DAPIM approach so it continues to be "the way
we do business"
• Tracking caseloads by work units within the same office as a competition
• Increased involvement of front-line staff in process improvement efforts

This approach to learning was a win-win for the participants and the organization. It
facilitated learning, improved individual and organizational performance, and reached the
desired business results of improved timeliness and accuracy numbers.

Throughout the plan phases and sponsorship meetings, the APHSA consultant charted local
office progress in terms of the facilitated work products that each team was expected to
generate through their session efforts:

Initial Feelings
Ground Rules
Improvement Topics Defined
Strengths
Needs/Gaps
Root Causes
General Remedies
Quick Wins
Longer-Term Plans
Monitoring and Adjustments
Team Activities
Staff Communication
Staff Input and Management Follow-Up

In Phase I, all five participating teams accomplished all of their deliverables (although one
team struggled somewhat to establish effective two-way communication with their office
staff). Many of these teams and the observers of these teams also reported personal and
collective breakthroughs in their teamwork, trust, and mutual respect. Some offices reported
major improvements to both timeliness and accuracy within six months, and one office
embarked on an innovative community partnership initiative that served as a benchmark for
like efforts across the program.

In Phase II, all six participating teams accomplished their deliverables through general
remedies, and two of the six teams accomplished all of their deliverables, with each reporting
significant personal and collective breakthroughs in teamwork, trust, and mutual respect. One
office that had particular success implementing a range of quick and mid-term initiatives
reported, “I have been the Local Office Manager in [my office] for over 12 years and was
unable to accomplish what DAPIM did in a matter of weeks and months.”

For the four teams that did not accomplish all of their work products, two of these teams fell
short on longer-term planning but were otherwise moving forward effectively, while two
others seemed to be stuck. The primary root causes involved were determined to be (a) the
level of support provided by the local office manager or the regional manager, and (b) the
relative effectiveness of the still-developing facilitator involved.

In Phase III, internal FAA facilitators became responsible for monitoring the progress of the
local offices with which they were working, using an internal evaluation format that focused
on timeliness, accuracy, customer service scores, and local office manager self-reporting.

The APHSA consultant's focus shifted to guiding the progress of the internal facilitators, who
also reported their progress during the facilitator DAPIM sessions and sponsor sessions. Most
of the internal facilitators reported acquiring a working knowledge and hands-on proficiency
with DAPIM and learning by doing; this was affirmed and reinforced during the collective
continuous improvement sessions with the APHSA consultant and demonstrated by the work
products yielded by the local office teams with whom they worked. However, one of the most
proficient internal facilitators left the agency, while two others struggled to acquire a working
proficiency. Paradoxically, these two individuals worked in trainer roles, while all but one of
the other internal facilitators were either local office managers or in FAA's regional
management.

During Phase IV, the initial plan was to transfer ownership of the effort to the regional
offices. Monitoring and evaluation efforts were very limited, and the effort was no longer a
formal one by February 2009. The comprehensive set of goals and objectives for sustaining
this effort through 2009, while adopted by the full sponsor group in September 2008, was
never formally monitored.

Overall FAA Results

By the end of 2007, FAA had improved Nutrition Assistance program accuracy to 95.13%, quickly
fulfilling the sanction settlement agreement requirements. New and more ambitious accuracy
goals were established for 2009. FAA received a substantial monetary award in 2008 for
results well beyond FNA standards during 2007, maintaining high levels of both accuracy and
timeliness results. The program’s own customer service results also markedly improved
during this time.

The FAA program results achieved in 2007 have not been sustained since. According to FAA,
error rates from 2008 to 2010 increased over the 2007 rates due to the state budgetary crisis.
Approximately 20 FAA local offices were closed, and workers were moved to other offices to
handle the increased workloads stemming from the recession and other factors. Front-line
workers were unable to keep up with the number of incoming applications, both on-line
and in person. FAA management's energy went into the reconfiguration of
offices and dealing with the upward spiral of applications.

Reflections Through the Lens of APHSA’s Evaluation Framework

APHSA’s evaluation framework can be summarized through the following four statements:

A. Effective facilitation and continuous improvement tools, coupled with sponsor
willingness and relative agency readiness, will result in an effective structure and
process for change management. Effective work products, communications, and
related activities come from that process.

In the case of Arizona’s FAA, the facilitation was typically good to excellent, but with some
planned variation due to the fact that the organization was developing internal facilitators as it
progressed through the project phases. The OE tools were also a work-in-progress for the
internal facilitators who were developing their practice. Top sponsor willingness to make the
changes needed for the organization to accomplish its goals was typically good to excellent,
though that same level of willingness became inconsistent and questionable when the
ownership of the effort began to shift from a central point of sponsorship to ownership at the
regional level.

Many organizations experience change as an event initiated at the top, regardless of whether the people
on the front lines are ready, willing, and able. In a spirit of innovation, FAA tried a different
approach to get the organization ready and willing to adapt to the changes it might face. The
DAPIM process was given to the front-line offices first rather than to the mid to executive
management teams.

Starting at a grassroots level, the expectation was that “seeding” the DAPIM process as a way
of doing business at the local office level would produce the desired individual and
organizational performance improvements across the division. The intent was to give front-
line offices the DAPIM tool first to help them improve their internal office operations so they
could respond proactively to the needs of their specific local communities while meeting
federal and state mandates.

Each group of offices that had been scheduled to receive DAPIM had a kick-off session and
debrief session with their mid to executive management team sponsors. In the beginning, this
was perceived as a punishment by the local offices, because the initial offices had below-
standard timeliness and accuracy numbers. The office management teams’ trust level in their
mid to executive leaders was low and the “us and them” mentality was high. This attitude
affected the teams’ readiness and willingness to participate.

Facilitators faced resistance at first and worked hard to build relationships and trust with the
office management teams. Only when the management teams began to see some positive
results from the work did they come to believe that DAPIM was a tool to help them rather
than a punishment and that the mid to executive leaders were really trying to support them in a
positive way. The trust began to increase and the “us and them” mentality began to decrease.
The willingness and readiness to change also increased.

Over the first year, the kick-off and debriefing sessions on office results eventually began to
build organizational enthusiasm and excitement. Management teams hearing of the positive
results started to request that the DAPIM process be brought to their offices. The mid to
executive level management sponsors were happy with the relational and operational
improvements that were occurring horizontally through the program and that the desire to
learn the process was growing. But a disconnect still existed between the local offices and the
sponsors.

For the most part through the first three phases of this effort, the results were positive, as
expected. Some early phase participants reported the experience as a “breakthrough,”
“transformative,” or “career-changing.” One of the more successful local office managers was
promoted, another with significant personal identity concerns was recognized for much
improved performance, and one of the early facilitators became far more effective in all work
efforts as reported by their supervisor.

Beginning in the fourth phase the results were far more negative and not sustained, as would
be expected when sponsorship wanes (see lessons learned for more on this). Other
contributing factors to Phase IV difficulties included the coming budgetary crisis and,
ironically, a sense that the effort had paid off and did not require as much ongoing support
and vigilance.

B. The work products, communications, and related activities coming from the
learning-by-doing process will result in improvements to the agency’s capacity to
perform.

In the local offices where the learning by doing and continuous improvement methods were
embedded and the related work products were accomplished, many improvements to the local
office climate—teamwork, trust, communication, and morale—were noted by the local office
leadership, observers, and regional management. In addition, team activities introduced during the
facilitated sessions resulted in new skills and techniques being employed.

Interestingly, facilitators did not need to introduce a comprehensive competency model or
skills inventory for supervisory effectiveness. Rather, the local office teams themselves
generated remedies for root causes to priority gaps. This led them to identify most of the
topical areas for supervisory development that are found in competency or skills frameworks.
The facilitator then introduced a model, tool, or template to address that topic from a
comprehensive toolkit. APHSA calls this the inside-out technique: begin developmental work
from the client’s own perspective and trust the fact that, over time and through the DAPIM
process, the client will make broader and appropriate connections to what they need in order to
improve and grow.

C. Improvements to the agency's capacity to perform will result in actual agency
culture and performance improvements, and the sustainability of the agency's
improvement efforts.

In the offices where many improvements to climate, teamwork, trust, and communication
were noted, improvements were also evident in staff efficiency, customer service scores, staff
retention, absenteeism, and staff complaints about their management. APHSA notes that a
task-relationship balance is very useful to teams that are selecting topics for improvement and
identifying root causes and associated remedies. DAPIM facilitators are trained to scan for
imbalances in a team’s or agency’s culture in this regard, and “lean the other way” to ensure a
balance begins to emerge.

In retrospect, there are a number of reasons why the DAPIM process did not develop deep roots
within the organizational culture and practices of its senior leadership. First, the organization
was spread out from one end of the state to the other. Second, the need to go back and follow
up with some offices slowed down the schedule for reaching the remaining FAA local offices
around the state. Third, the new DAPIM project manager position became a casualty of the
budget cuts, as did the availability of DAPIM facilitators with time that could be dedicated to
working with offices.

When the state budget deficit came into full effect in late 2008 and early 2009, no DES
human service program was left unmarked. Most state agencies had a reduction in their
workforce either by leaving vacancies unfilled or through terminations. Some programs even
lost funding or ran out of funds before the end of the fiscal year. Not all FAA local offices
had been exposed to the DAPIM process yet; the practice was not rooted enough to become
part of the culture throughout the state and it lost momentum. The good news is that the
DAPIM process is still used by a few offices as a way to solve local office problems, and
those managers trained as DAPIM facilitators continue to use it as a way of doing business
when they see an opportunity to do so.

D. Agency culture and performance improvements and the sustainability of the
agency’s improvement efforts results in better client service, client outcomes, and
community partnership.

As improvement efforts took hold at the local office level in many settings, and despite limits
to the effort’s overall sustainability, the impacts on client experience, timeliness, and accuracy
were evident in these settings. This may argue for beginning broad agency improvement
efforts where they will have the most likely impact on those being served on the front lines.
However, sustainability was limited by this very fact: since the effort was primarily focused in
local settings, ongoing sponsorship at the regional and central office levels was not as strong
as if these had been the first or primary improvement teams being facilitated.

Lessons Learned

First, and as described above, the role and importance of sponsorship cannot be overstated in
terms of the sustainability of this type of intervention. Where those in authority were
committed to the value of this effort, results were far better than in settings where this was
not the case. As APHSA has evolved its practice within other agencies, it has become far
more focused on this very condition of sponsorship and its impact on sustainability, and will
continue to be so.

The FAA executive leadership and mid-management teams might have gone through their
own DAPIM process earlier, perhaps even at the same time as the first set of offices. Knowing
that their leaders were also going through DAPIM might have lowered the
resistance of the first offices; indeed, whenever the FAA director communicated about her
own “personal DAPIM work,” the APHSA consultant and internal facilitator team noted a
significant increase in energy for the effort overall.

Also, the strategies and results coming from the local offices could have been more concretely
aligned with the improvement strategies and results of the mid and executive teams. Each
level of the organization had its own DAPIM work to do, and it all needed to intersect and
work together for the overall good of the program and strengthen the relational link between
all levels of the program.

If the DAPIM process had been disseminated from the top down and across the program at the
same time, it would have had a better chance of truly becoming part of the organization's culture,
creating a proactive and dynamic organization that could have withstood the reduction in the
state's budget and the resulting reduction in workforce, because the program would have been
ready and willing to act proactively at the first sign of trouble.

Second, and not discussed above to any significant extent, is the role and importance of
internal project management for an effort such as this. In Phases I-III of the effort, a project
manager with solid related skills and relationships was assigned, and the effort benefitted
greatly from their work to maintain schedules and structures for meetings and communication.
Late in Phase III this individual’s priorities were shifted to other work, and the newly hired
project manager never really had the intended impact before their position was eliminated due
to the agency’s budgetary crisis.

Third, the experience in Arizona has informed APHSA’s ongoing efforts to develop internal
DAPIM facilitators in other agencies.

When FAA facilitators started to learn to facilitate the process, there were no reference tools.
This had been purposefully done so that facilitation did not become a structured pattern that
had to be followed. DAPIM is a fluid and flexible process. The best analogy to describe this is
the difference between classical music and jazz. Classical music is played by musicians as
written; no notes are added or omitted when a composition is played. In contrast,
accomplished jazz musicians improvise along with other musicians and still produce a
harmony with an appealing sound. With that analogy in mind, APHSA wanted the experience
of the DAPIM process to be the central focal point rather than a rigid or rote process on paper
that had to be followed step-by-step.

Most of the FAA facilitators felt somewhat lost without something written that they could
refer to during their work sessions. A facilitator's handbook was developed that included
tips and techniques for facilitating the DAPIM process. There was disagreement among the
FAA internal facilitators as to when to give a new facilitator the handbook: as
they begin to learn, or after they have learned, so that they first have time for their own DAPIM
experience.

APHSA continues to strive for a balance between providing structure and clarity as new
facilitators develop their practice and allowing for each facilitator to make the practice their
own so it is intuitive, accessible, and delivered in a dynamic and responsive manner.

An intriguing result was that two of the three internal facilitators selected from the training
function were not successful. Further study is recommended regarding the differences in skill
and temperament between what a classroom trainer does when performing well and what a
consultant or facilitator does. As described above, the best analogy may be one of playing
either classical trumpet or jazz horn: while built upon similar foundational skills and practice,
each type of performer’s temperament and motivation for playing music may impede their
progress in the other form of playing. In this case, the APHSA consultant concluded that one
of the trainers was too rigid about an instruction-centered approach to acquire a more
facilitative technique. The other trainer was too content-centered—accustomed to presenting
conceptual training content for its own sake—versus facilitating in a manner that yields
customized and concrete work products from a team. In contrast, the trainer who was
successful was highly skilled in adaptive and participant-centered training techniques. A
model that elaborates upon trainer characteristics in these ways may be useful to the field.

After the first year of this effort, several unexpected circumstances came up that resulted in a
gap between the number of available facilitators and the number of offices designated to go
through the DAPIM process. One unexpected circumstance was the reconfiguration of
management teams by upper management, moving members around from one office to
another to fill in where needed. The composition of several management teams became a mix
of people who knew the DAPIM process and newly arriving members who had not learned
the process yet. Another was that some offices that had gone through the process were asking
for extended follow-up sessions. In other instances, upper management requested that an
office go through it again because relational issues had not been fully resolved, which resulted
in some operational issues remaining unresolved. These circumstances placed a scheduling
burden on facilitators trying to work with new, intact work groups and work with offices that
had already been through the process.

It is also important to understand that FAA internal facilitators were conducting these sessions
in addition to performing their normal job duties. The facilitation of the DAPIM process was
voluntary, with the approval of their immediate supervisor. As workload increased for many
of the facilitators, in large part due to the budgetary crisis and increased applications that
ensued beginning in late 2008, their availability to facilitate decreased. In some cases they
were pulled from providing facilitation assistance altogether.

A final lesson learned is that despite the many positive impacts of embedding continuous
improvement methodologies at the supervisory level of an agency, a major trauma to the
broader organization and environment can limit full implementation of DAPIM.

BUILDING ORGANIZATIONAL EFFECTIVENESS
CAPACITY IN A TRAINING SYSTEM:
PENNSYLVANIA CHILD WELFARE TRAINING PROGRAM

Kathy Jones Kelley, Organizational Effectiveness,
American Public Human Services Association
Contributor:
Maryrose McCarthy, Pennsylvania Child Welfare Training Program,
School of Social Work, University of Pittsburgh

The Pennsylvania Child Welfare Training Program (CWTP) is a collaborative effort of the
University of Pittsburgh School of Social Work, the Pennsylvania Department of Public
Welfare, and the Pennsylvania Children and Youth Administrators. It was established to train
direct service workers, supervisors, administrators, and foster parents in providing social
services to abused and neglected children and their families. The CWTP is centrally managed
and regionally administered by the University of Pittsburgh School of Social Work.

During the first round of Child and Family Services Review (CFSR) in 2003, the CWTP was
recognized to be in the top tier of child welfare training systems by the Administration for
Children and Families. Despite the national recognition, the CWTP engaged in a
transformation initiative following the first round of CFSR. The goal was to expand a strong
classroom-based system for developing staff knowledge and skill to a system that also would
have the capacity to support agencies in improving overall organizational effectiveness (OE).
To achieve the goal, the CWTP collaborated with the American Public Human Services
Association (APHSA). The CWTP documented its success and challenges throughout its
transformation through an informal tracking system. In 2009, the CWTP re-engaged with
APHSA with the goal of familiarizing OE staff with updated APHSA OE models and tools.
At the same time, Pennsylvania was engaged in the second round of CFSR.

What follows is a description of the process the CWTP engaged in to build OE capacity as
well as the impact of OE facilitation in one local county agency. Also discussed are lessons
learned and planned next steps to support positive outcomes for the children, youth, and
families served by the Pennsylvania child welfare system.

Background

Pennsylvania’s child welfare system is state supervised and county administered via 67
county agencies. The CWTP provides both training and technical assistance to county
agencies. The CWTP has four regional teams to support county agencies. The regional teams
are aligned with the four regions of the Pennsylvania Office of Children, Youth, and Families
(OCYF). Through this regional delivery system the CWTP maintains a focus on providing
individualized training and technical assistance services to each county agency. This case
study will focus only on the OE-related technical assistance and support.

OE facilitation was first introduced to county agencies by the CWTP in 2004. At that time,
OCYF was interested in the CWTP supporting practice improvement efforts identified in the
Program Improvement Plan (PIP) such as family engagement, youth engagement, building
systems of care, staff retention, visitation, and risk/safety assessment.

The CWTP partnered with APHSA to develop practice improvement specialists through
coaching and mentoring by APHSA. The OE efforts were designed to transform the CWTP
from a system that delivered traditional classroom training based on individual needs of staff
toward a system that offered a continuum of products and services that ranged from
traditional classroom training to OE related technical assistance based on organizational
needs.

At the core of this transformation was the development of training and technical assistance
teams to support local county agencies. The practice improvement specialists became key
members of the teams along with transfer of learning specialists and independent living
specialists. The regional teams were supported internally by departments accountable for
curriculum and trainer development, coordination of training delivery, and technical and
administrative support. Since 2004, teams have collaborated with county agencies to
complete organizational needs assessments to identify the training and technical assistance
needs. The assessment leads to an individualized training and technical assistance plan that
the team uses to coordinate a continuum of services to support county agencies toward
improving outcomes for children, youth, and families.

OCYF and the Pennsylvania Children and Youth Administrators view this continuum of
service as critical in supporting county agencies in using OE models and tools to support
quality improvement efforts. When developing the PIP in 2009, Pennsylvania linked the
APHSA OE approach to the newly developed CQI process that was embedded in the PIP as a
primary strategy to support change at the local level. As part of this linkage, the CWTP is
viewed by both state and county leaders as a key resource in supporting county agencies in
the development of continuous improvement plans to improve services to children, youth, and
families.

APHSA Approach to Organizational Effectiveness

Since 2004, APHSA has continued to develop OE models and tools and to support
organizations in building OE capacity. In 2009, the APHSA OE department compiled the
models and tools into an OE handbook. The overarching purpose of the handbook is to serve
as a resource for organizations that are making continuous improvement a way of doing
business.

APHSA has defined OE as a systemic and systematic approach to continuously improving an
organization's performance, performance capacity, and client outcomes. Systemic refers to
taking into account an entire system, or in the case of OE an entire organization; systematic refers to
taking a step-by-step approach. In simple terms, OE is a step-by-step approach to
continuously improving an entire organization.

DAPIM is the primary APHSA OE model. DAPIM enables work teams to drive continuous
improvement. The approach involves defining what the organization wants to improve;
assessing strengths and gaps in performance capacity, performance actions, outputs, and
outcomes; planning for remedies; implementing plans for maximum impact and
sustainability; and monitoring progress, impact, and lessons learned to sustain follow-
through and continuous improvement.

Teams engaged in a facilitated learning-by-doing process using the DAPIM model become
familiar with OE models, tools, templates, and methods to continuously improve in priority
areas. Work products that result from the continuous improvement effort include
development, implementation, and monitoring of quick wins; mid- and long-term continuous
improvement plans; and related communication plans. Monitoring techniques are used to
assess progress and adjust continuous improvement work as needed.

Structure of the Facilitated Continuous Improvement Effort in Pennsylvania

In 2009, the CWTP sought support from APHSA to continue the professional development
efforts of their staff by introducing them to updated APHSA OE models and tools. At the
same time, the CWTP just completed a year-long effort to restructure. The on-site effort took
place from September 2009 through March 2010.

The CWTP identified sponsors that included members of the CWTP’s leadership. The
sponsors worked with APHSA to develop a work plan to achieve desired results.

The sponsors formed a pilot group to engage in a DAPIM learning-by-doing continuous
improvement process. The purpose of the pilot was to apply the updated OE models and tools
at the local level via a regional training and technical assistance team. Based on lessons
learned from the pilot, a plan would be developed to use the models and tools throughout the
CWTP. Washington County Children and Youth Services (WCCYS) was selected as the local
pilot site.

The pilot group included all staff from the CWTP’s Western Regional Team, select staff from
each department within the CWTP, WCCYS staff, OCYF regional staff, and statewide
technical assistance partners. It should be noted that, although a conscious effort was made by the
sponsor team to include staff from each of the CWTP's departments, some staff self-selected
out of the process due to travel distance to Washington County.

From September 2009 through March 2010 the following was completed:

• Pilot group members were familiarized with APHSA’s updated OE models and tools.
• The pilot group engaged in a DAPIM learning-by-doing session to become
familiar with the OE handbook, develop a desired future state for the Western
Regional Team in supporting county agencies, and plan for application of OE
handbook narrative guidance, models, templates, tip sheets, and team activities in
WCCYS as part of their CQI efforts.
• A continuous improvement plan was developed to support the Western Regional Team
in achieving its desired future state.
• Using lessons learned from the pilot, a plan was developed for engaging all CWTP
staff in learning-by-doing continuous improvement efforts. The plan was designed to
introduce all staff to OE models and tools and the content of the OE handbook in order
to make OE a way of doing business within the CWTP. The success measure
identified in the plan was for each of the four regional teams to use updated OE
models and tools in their work with county agencies resulting in county continuous
improvement plans. In addition, all staff members would use the OE models and tools
in planning for their day-to-day work tasks—modeling best practice for state and local
leaders. The pilot group planned to use the learning-by-doing approach versus
traditional training-on-content approach to the content of the OE handbook. The pilot
group served as the implementers for building internal OE capacity. As part of the
plan pilot group members facilitated each regional team and all departments in
learning-by-doing sessions.
• The pilot team engaged the WCCYS sponsor team and continuous improvement team
in DAPIM learning by doing, resulting in a continuous improvement plan for the local
county.

Three Months Post Facilitation

In June 2010, APHSA conducted focus groups and interviews with the pilot group and
WCCYS leadership and staff to gather lessons learned and assess the impact of the continuous
improvement efforts in the CWTP and WCCYS. The data collected focused on the following
key areas:

• effectiveness of the learning-by-doing process the pilot group engaged in
• impact the OE models and tools were having on the OE facilitation practices of the
Western Regional Team, specifically in WCCYS
• effectiveness of the plan developed to introduce all CWTP staff to the OE models and
tools (moving toward the goal of making OE the way of doing business in the CWTP)
• connections between OE and statewide CQI efforts
• impact of OE efforts in WCCYS
• lessons learned and recommendations for next steps toward building OE capacity
within the CWTP

At the time the data was collected the CWTP was still implementing its internal plan to build
OE capacity and the Western Regional Team was continuing to support WCCYS in OE
efforts. The following outlines the findings from the focus groups and interviews.

Reflections from Pilot Group Members

Understanding of Purpose

There was consensus among pilot group members on the purpose and anticipated outcomes
for the pilot. Generally, pilot group members understood that the purpose of the pilot was to
introduce participants to the updated OE models and tools, enhance the OE facilitation skills
of all regional team members, engage all CWTP staff in OE as a way to model and promote
continuous quality improvement, and that “OE was the way we were to do business in
Pennsylvania counties.”

As the pilot progressed, an awareness developed among members of the pilot group around the
connections OE would have to Pennsylvania's CQI process; specifically, the pilot brought
to light the critical role the CWTP practice improvement specialists would have in CQI as OE
facilitators.

Value of Learning by Doing and Impact of OE Models and Tools on the Practices of
Regional Teams

Members from the pilot group agreed that the learning by doing approach was valuable to
them. One member commented, “I did not appreciate the learning by doing approach until
about midway through the process. It is powerful and infiltrates the process in subtle ways.”
Another commented, “The learning by doing approach allowed for the opportunity to learn
about the models while getting the feedback from the county to frame how we should do our
work.”

Pilot group members observed that the parallel process of learning about the OE models and
tools while applying them in WCCYS had a positive impact. Processing the use of the tools in
the county agency allowed them to understand how to effectively facilitate OE based on
county input. The learning-by-doing approach “allowed the opportunity to learn while getting
the feedback from the county; it framed how to do the work.” Processing the work in WCCYS
modeled a team approach to working with counties.

The DAPIM model was viewed by all pilot group members as a valuable and critical part of
their work moving forward. Pilot group members’ comments about DAPIM included:

“DAPIM showed us how to focus on the desired future state for guiding our work. It
was missing in our child welfare work efforts. Defining allows us to assess.”

“DAPIM structure is helpful. It is driven by the customer to create quality.”

“Every component of DAPIM is important. Each step is important. Modeling the
flywheel was critical.”

“The DAPIM model has had a big impact. I have a clearer understanding of what I am
trying to do in a county. My role is to help them find their own solutions versus
jumping to solutions for them. This approach to problem solving and project
management works.”

“The biggest thing for me was the define work. We were not stopping to define. We
just jumped to solutions.”

“If we really want to change we need to know where we are going. We need to do
really good define work.”

“[The] process has helped us to ask others: what is the target, what is your desired
future state?”

One system partner who participated in the pilot commented “as a partner in the process it
became clearer what OE is and how we are there to support the OE efforts after the D[efining]
and A[ssessing] work is done. We can focus on remedies to gaps.”

Members of the pilot group also noted the value of root cause and remedy work, specifically,
how remedies were organized into recommendations, commitments, and work team activities.
The pilot group saw this as a way to organize the work for county agencies into manageable
“things they can do and what needs to be bumped up as recommendations.” Root cause and
remedy work was also viewed as a strength-based way to advance the needs of the county
agencies to OCYF. The pilot group viewed the recommendations that would surface during
DAPIM as a positive way to inform statewide planning efforts “versus planning then telling
the counties to implement.”

Pilot group members commented on developing an appreciation for “monitoring” and using
lessons learned to adjust plans versus “just moving forward.” Developing concrete work plans
and then monitoring the work plan helped pilot group members see the impact on outcomes. It
was reported that monitoring is currently being used in the day-to-day work of pilot group
members. Pilot group members also reported encouraging all CWTP staff to use monitoring
techniques.

Other OE models and tools that impacted the pilot group included the capacity planning team
activity, communication planning template, trust model, safety and accountability model,
markers for effective OE facilitators, and the readiness model.

The capacity planning team activity allowed pilot group members to define the tasks they
should be doing and then reflect on the tasks they actually do. This team activity resulted in
the development of a work plan tool. The tool was viewed as an important data collection tool
moving forward. Pilot group members felt the data could be used to make decisions on
agency structure and workload assignments. One pilot group member commented, "the tool
would allow decisions based on data not assumption." In addition, it was reported that the capacity
planning team activity has been completed in county agencies with success.

Communication was identified as a gap for the CWTP. It was reported that the
communication planning template was being used by pilot group members in their day-to-day
work. The tool was reportedly assisting CWTP staff in thinking through all of the elements to
consider when planning for communication.

The trust model and safety and accountability model provided a way for pilot group members
to think about root causes for some of the organizational gaps they face. The models gave
pilot group members a safe way to talk about sensitive topics. One pilot group member
commented, “The power of using the tools was an important part of the process.”

The markers for effective OE facilitators helped pilot group members to understand the
difference between training and facilitation. The markers are currently being used to assist
with professional development of the Western Regional Team. The CWTP is creating markers
of effectiveness for other roles in the CWTP to assist in role clarity.

The readiness model was viewed as a tool that could assist the CWTP in selecting county
agencies for CQI reviews.

No tools were reported as ineffective.

Structure of the Pilot

Pilot group members reported the structure of the continuous improvement effort has had a
positive impact on the Western Regional Team. Examples of the impact on the team included
the following:

• Using the DAPIM model in team meetings to discuss and plan for work with counties
• Monitoring the continuous improvement plan developed during the pilot at team
meetings
• Using the markers for effective OE facilitators to develop professional development
plans

One member of the Western Regional Team commented “we are more intentional with our
decisions, not jumping to training as the only solution for counties.” Team members credit
their supervisor for much of the impact the pilot has had for them. One member commented
“she is really moving us forward, she is very invested and is trying to move it forward
program wide.”

The supervisor for the Western Regional Team expressed that the pilot provided the opportunity to
develop a supervision style to support the Western Regional Team using OE models and
tools.

Some pilot group members felt the structure of the pilot allowed those not on regional teams
to develop a better understanding of OE and how it can support county agencies in developing
and delivering quality services to children, youth, and families. One pilot group member
commented “just spending time together with OE staff helped to share a unity of purpose.”

The pilot structure brought to light the importance of OE to the CQI efforts in the state. OE
was viewed as the way to facilitate the changes needed to ensure quality services to families.
Pilot group members expressed having a clearer understanding of how important practice
improvement specialists would be to CQI.

The system partners that were members of the pilot group felt the structure allowed them to
have a “stronger relationship in the west.” It was reported that pilot group members are
continuing to collaborate on work with county agencies and that there is a greater awareness
of the need to connect with each other as technical assistance and training providers.

Concerns about the structure of the pilot focused on communication, lack of involvement by
all regional team supervisors, and lack of role clarity on using OE models and tools in the
day-to-day work of the CWTP.

Pilot group members did express concern that the structure of the continuous improvement
effort resulted in a lack of buy-in from all CWTP staff. It was reported that "staff have been
slow to respond." Some felt the reason for the lack of buy-in was poor communication during the
pilot process itself. Despite communication planning at the end of each pilot session, internal
communications about the activities taking place during the pilot appear to have been lacking.
With the pilot completed, pilot group members felt that communication plans
developed and implemented during the pilot did not have the intended impact. It was reported
that CWTP staff generally did not make the connection between the OE kick-off session in
September, which all CWTP staff attended, and the pilot. It was also reported by pilot
members that staff thought OE would only occur in the OE department. Comments on the breakdown in communication included:

“The pilot and OE appeared to be a mystery to staff.”

“When staff asks about OE they were told information is coming versus sharing what
was happening.”

“We missed a great opportunity to talk about OE in our staff meetings and day-to-day
interactions.”

Pilot group members reported that not having regional team supervisors participate in the pilot
placed supervisors in a challenging position. As a result, supervisors are not all operating from the same knowledge base of the OE models and tools. Some pilot group members expressed a belief that a lack of understanding of OE by supervisors was creating an
“uncomfortable environment for supervisors.”

Some pilot group members recommended that role clarity for supervising regional teams
using OE models and tools be completed. Some felt the work of role clarity around
supervision should happen immediately in order to support the professional development of
supervisors.

Pilot group members expressed that if the supervisors and managers continued to receive support
in developing an understanding and working knowledge of how to use OE models and tools in
planning for the organization and coaching staff, the CWTP would remain viable to the
county agencies and state.

Impact of Plan to Engage All CWTP Staff

As part of the pilot, a plan was developed to engage all staff in the OE effort. The plan
included a full-day meeting to re-introduce the OE models and tools to all staff. It also
included plans to facilitate learning-by-doing sessions to complete a DAPIM with each
department and regional team from May through December 2010.

The full-day session with staff was well received. It was reported that some staff speak of
DAPIM often while others continue to see the effort as an OE department initiative. Follow-up learning-by-doing sessions have been slow to be planned, with the exception of the Statewide Quality Improvement Department and two regional teams. Root causes for the
reactions from staff appear to be “capacity issues” and “a feeling of not being included in the
pilot.” Leadership is viewed as committed to the OE work.

As the plan was being implemented, some pilot group members expressed concern about
maintaining the fidelity of the model. Some pilot group members have expressed a lack of
skills and knowledge among the CWTP staff to do OE facilitation effectively. One team
member commented, “We have work to do to develop these [OE facilitation] skills in our
facilitators but we are embracing OE and are committed. Monitoring our plan will be
important.”

Impact of the Facilitator

Pilot group members reported it to be a strength of the process to have an experienced OE facilitator help them continue development of their OE facilitation skills. The facilitator was viewed as objective and demonstrated a willingness to "call out tense moments" and "point out our challenges" in order to "move the group forward." The facilitator held team members
accountable to commitments and was able to use the correct tools and models to support the
team. The pilot group members reported trusting the facilitator and “this helped [us] to trust
the process.”

Reflections from Washington County Children and Youth Services

Understanding of Purpose

WCCYS leaders saw participation in the pilot as a way to support them in managing the
changes they were experiencing. They saw OE as a way to help them identify structural areas
to be enhanced in departments and units of their agency by engaging staff as part of the
process. Leaders specifically wanted to improve practice by improving attitudes of staff and
engaging families.

WCCYS staff who participated in the pilot viewed the OE efforts as an initiative of the new
leader: “new leader, new direction.” Staff also saw the purpose as a way to create a better
internal working environment. Staff viewed the effort as important, stating “the OE effort is
important; we think this can work.”

Value of Learning by Doing and Impact of the OE Models and Tools

WCCYS leaders felt the learning-by-doing process was helpful in “understanding what the
agency is hoping to accomplish.” The leadership indicated that they “started large” by looking
at global issues in the agency. The leadership recently realized that if they could start over, they
would start DAPIM work in “smaller areas” with a more specific focus. WCCYS leadership
stated they would start by working with the management team and expressed a belief that a
stronger management team would support the agency in OE efforts.

WCCYS leadership also reported the agency has a focus on OE: "The agency is capable of positive change." They believe the OE technical assistance process will support them in that change.

WCCYS staff shared the views of leadership on the importance of the OE work. One staff
member commented “It is important to know that we do think this [OE] can work. We care
about what happens in our agency.”

WCCYS staff reported that the learning-by-doing process was working with regard to defining roles and developing a better understanding of "why [staff] people do what they do in the agency." The WCCYS staff expressed that they worked well together, respected each other's ideas, and could talk about controversial ideas in their OE sessions. WCCYS staff said
“We believe in our new administrator; things can really change with this administrator.” They
also felt the DAPIM process could work in their agency.

WCCYS staff said the OE process and activities have had an impact on their day-to-day
practice. They talked about being “more open-minded, aware of layers in the agency, and
more respectful for other staff.” One supervisor reported using DAPIM in supervision.

Staff participating in the pilot suggested the WCCYS sponsors of the OE work meet with
them to monitor work completed to date and discuss the recommendations they had for
moving forward with the OE work. WCCYS staff suggested working on methods for
engagement and teaming internally “just like we do with families.”

The OE models and tools viewed as having the most impact on the organization include the team activities on role and authority; development of a mission, vision, and set of values; creation of a visual case flow; the frequently asked questions tool; and the safety and accountability model.

Linking OE and CQI

WCCYS leadership was asked specifically about linking OE to CQI. WCCYS leadership
suggested starting with a readiness assessment to determine if the agency has a strong
foundation to start CQI, and if not, start with OE to build readiness. Leadership felt OE could
help them prepare the agency for improving practice by creating a “strong foundation to
support change and then building strong practices on top of the foundation.” They were
hopeful that linking OE and CQI would lead to a single plan that is “do-able” for the county
agency. Leadership also expressed an interest in and commitment to incorporating DAPIM
into their internal quality assurance reviews on case records to build improvement plans.

Impact of the Facilitator

Both the WCCYS leadership and staff felt the facilitator had a positive impact on the OE
efforts in the county agency. The facilitator was viewed as “straightforward, engaging, and
knowledgeable of the state system, understanding of county needs and barriers and a good
translator from implementation team to sponsor team.” They felt the facilitator has been able
to balance the process and is attempting to meet the needs of both leadership and staff. They
described the facilitator as being available and creating a trusting environment with no hidden
agenda. Some staff shared this view: "Difficult messages are easier [to] talk about with Jen here."
Staff specifically reported that the facilitator “does well with clarification and causes them to
think outside the box.”

One Year Post Facilitation

Based on data collected in interviews, focus groups, and surveys of the pilot team and
WCCYS staff, as well as APHSA recommendations, CWTP leadership continued to support
internal efforts to build OE capacity. APHSA support included a day-long session with all
staff accountable for facilitating OE to review models, tools, and skills of effective
facilitation. In addition, quarterly meetings were planned throughout 2011 with APHSA and
the OE facilitators to capture lessons learned and provide technical assistance on skill
development.

In March 2011, APHSA conducted an after action review with all CWTP staff and leadership
from WCCYS to collect lessons learned on continuous improvement efforts, progress toward
stated goals, and updates on implementation of recommendations from the final APHSA
report.

Assessing Impact from the CWTP Staff Point of View

During the after action review, staff reported it had been difficult to separate the impact of
their OE efforts from a major restructuring of the CWTP that took place at the same time.
Some staff felt the DAPIM model should have been used for the restructuring.

The staff reported the following positive impacts of engaging in OE efforts:

• The DAPIM model helps provide a common language, and the structure of DAPIM
makes it “do-able” to learn.
• The tools were viewed as helpful to have when engaged in discussion and planning
efforts; specifically some staff found the communication plan, project management,
and data analysis tools useful.
• Developing a desired future state keeps staff focused on the direction of the CWTP.
• Developing clear work plans has made the CWTP more effective in monitoring and
communication. Monitoring activities have shown staff “we have gotten things done.”
Many felt accountability was improved, and there was a renewing of commitments by
staff.
• A common language now existed within the CWTP for approaching and
communicating with counties about OE.
• Engaging in DAPIM sessions helped with role clarification and “bringing us together
as a team.” The sessions helped to “connect the organization, brought us out of silos,
and brought administrative support staff into the organization.”
• Engaging in DAPIM helped to balance a focus on both the product and process.
• Overall, there was a sense among many staff that the process works. They reported the model "gives you a way to think critically about what a county is saying." Staff seemed excited about using DAPIM in counties and when completing organizational needs assessments with them.

Some staff reported they would have changed the order of how staff engaged in OE efforts
following the pilot. They felt the leadership team should have been first, followed by the OE
department, and then the remainder of the staff. Staff reported this would have allowed
leaders to have access to and understand the models and tools before staff, which would have
created full leadership buy-in. Staff also reported they felt this would have created clarity on
desired future state, reduced confusion on decision making, and provided clarity on how
teams should move forward with recommendations.

Staff felt having the OE department engage second would have allowed all regional teams to benefit from the experience of the Western Regional Team and would have developed additional internal capacity to facilitate OE more quickly. Staff also felt this would have prevented regional
teams from moving at different speeds in learning about OE models and tools, which had a
negative effect on the momentum and progress for some teams.

Some staff reported the effort was focused on tasks and lacked attention to the relationships
(trust of internal facilitators and the overall process). Internal facilitators reported having a
new sense of and understanding for the “struggles faced by other teams and departments.”

Some staff were concerned the DAPIM focus might take away some of the critical thinking
skills of staff, stating “we are dependent on the OE handbook.” And some staff were
concerned the CWTP had taken on the DAPIM model and excluded other models. One
observation: “It is causing us to recreate things that already exist.”

Despite a written plan to engage the entire CWTP in OE efforts, there was confusion by some
staff about the goals and benchmarks for the effort. Some staff did not remember reviewing a
written plan.

Moving forward, the staff would like to pay close attention to the organizational readiness
model, reporting “it would have been helpful prior to developing the work plan.”

Assessing Impact from the Point of View of WCCYS Leaders

In the after action review WCCYS leaders observed that the OE efforts have brought out the
strengths of staff and their dedication to delivering quality services to children, youth, and
families. Leaders reported it helped them make decisions about the roles managers and staff could play in the organization and that the process helped with decision making and getting buy-in from staff on change efforts. They also have found monitoring of quick wins to
be successful, but are struggling with how to monitor impact of mid- and long-term action
plans. Leaders did report meeting more frequently with staff has helped with monitoring.

Leaders have observed decreased energy in staff not engaged in the OE effort and plan to
broaden involvement.

One thing WCCYS leaders reported they would do differently is start with small and tangible
items. Leaders observed staff struggling when trying to address a cultural issue such as trust.
When staff focused on tangible items such as the case flow process, however, they began to
develop trust by working together and achieving success.

Recommendations WCCYS leaders have for the CWTP in implementing OE in other counties
include (a) providing additional clarity upfront on goals and objectives and (b) providing more direction on selection of the sponsor team and internal continuous improvement team. They suggested a visual road map to show agency leaders the potential impact of "starting big" or focusing on small items. Lastly, they suggested sharing examples of successes and challenges from
other counties to assist them in decision making.

As the CWTP continues efforts to build OE capacity, leadership demonstrates a strong commitment to supporting staff. Staff are excited and encouraged about having a common
language and process to support counties in improving outcomes for children, youth, and
families. But more importantly, county and state leaders see the CWTP as a key partner in the
provision of the training and technical assistance needed to achieve positive outcomes for the
children, youth, and families served by the Pennsylvania child welfare system.

TEXAS: A CASE STUDY IN
ORGANIZATIONAL EFFECTIVENESS

Jon Rubin, Organizational Effectiveness,
American Public Human Services Association and
Daniel Capouch, Texas Department of Family and Protective Services

Introduction

APHSA defines organizational effectiveness (OE) as a systemic and systematic approach to continuously improving an organization's performance, performance capacity, and client
outcomes. Since 2007, APHSA’s work in Texas through partnerships with Casey Family
Programs and the Texas Department of Family and Protective Services (DFPS) has been an
example of how dynamic OE work can be. These partners demonstrated systemic and
systematic growth from working in one region on a specified improvement area (staff
retention) to developing internal OE capacity that impacts the DFPS system statewide,
supporting solution-focused efforts across regions and programs.

It is typical of APHSA to use this inside-out approach to OE. Once an organization begins to
improve one area of performance, other areas of performance are affected and improved as
well, leading toward positive momentum for continuous improvement. Once continuous
improvement efforts take hold, it is no longer an initiative or externally facilitated effort. It
becomes but the way business gets done, the way decisions get made, and the way problems
get solved in the organization, leading to improved performance throughout. To reach this
goal in Texas, plans were made not just to lead staff members through OE processes and
procedures but also to build capacity for sustainability of OE leadership.

This article gives a chronology of OE efforts in Texas, describes attempts to build the
capacity and sustainability of OE work in the state, and reviews some specific examples of
change plans and lessons learned along the way.

History

Building Partnership and Sponsorship

In 2007, Texas’ Child Protective Services (CPS) partnered with Casey Family Programs and
APHSA to launch a continuous improvement initiative for its regional directors, program
administrators, program directors, and front-line supervisors. At that time CPS sought to drive
both immediate and longer-term improvements to regional performance and capacity. This
initiative was launched at a time of considerable growth in the staff size of the organization as a result of legislative direction toward CPS reform beginning in 2005. Thousands of new staff were added over a very short period of time. In state fiscal year 2007 over 2,500 new full-time employees were added, creating both great excitement and pressure. More staff were allocated in the next two legislative sessions.

APHSA came to the partnership with strong experience helping state and local human
services agencies continuously improve organizational effectiveness and outcomes. Casey
Family Programs brought significant financial, logistical, and project management support
and experience helping CPS innovate its front-line child welfare practice. CPS came forward
with a committed core group of managers dedicated to staying the course for building
significant changes in their system. The partnership set the stage for a multi-phased work
effort to improve retention of front-line staff, a specific area of focus linked to both Casey’s
2020 outcomes and Texas’ proposed program improvement plan.

Key individuals from the partnership formed a sponsor group to provide oversight for the
work effort. The sponsor group secured resources; ensured strategic alignment with the
vision, mission, and outcomes of CPS; and obtained buy-in from regional leadership teams.

Essential to the success of the partnership was the direct and honest communication between
the members; respect for the strengths each member brought to the table; effective use of all
strengths throughout the planning, implementation, and monitoring phases; and the constant
flexibility all members demonstrated in order to meet the needs of the regions.

Unique to this effort was the planning approach facilitated by APHSA with the sponsor group.
Core to the approach was the reflective thinking engaged in by the sponsor group during each
phase of work as it was occurring, and the ongoing adaptability of everyone involved. This
allowed for decision making in real time to plan for future phases.

Traditional planning processes look at overarching vision and strategy and then plan
initiatives and activities to engage in over a multi-year time frame. The inside-out approach
allowed the sponsor group to begin by clearly defining a vision for continuous improvement
work. This vision supported staff in being solution focused and in using their own expertise to define problems, seek root causes for these problems, and develop and
implement plans that would lead to sustainable system improvements. The initial phases of
work were then planned. The vision remained in the forefront for the sponsor group as data
was generated from statewide data sets and regional teams. The sponsor group used this data
to monitor the impact of continuous improvement work on retention of front-line staff and use
of the continuous improvement process by regional leadership teams. This ongoing
monitoring led to incremental, dynamic, and phased planning to support the success of the
regions.

From November 2007 through June 2008 a first phase of work was completed in Region 6. It
focused on turnover of front-line staff in the Houston and Harris County conservatorship
programs, which serve youth in out-of-home care. The first phase of work contributed to a
significant reduction in workforce turnover in Harris County and set the stage for the
continued growth of the initiative and the building of OE capacity across the state.

From May to June 2008, a second phase of work was completed with regional directors and
their management teams who were new to their roles. At that time CPS was interested in
providing them with a leadership development session that would be foundational for their
work moving forward. The session resulted in the completion of strategic readiness
assessments and short-term plans in each region. Implementation of the plans prepared the
new regional directors and their teams for the third phase of work involving all regional
directors and their management teams within CPS.

From August to November 2008 a statewide leadership institute was conducted. This third
phase of work built on the gains from the first two phases. During the institute, regional
management teams along with statewide disproportionality specialists and state quality
assurance specialists gathered together to learn the OE methods and choose areas of
improvement that were most relevant to them. (Disproportionality specialists are staff
assigned in all 11 Texas regions to provide consultation on reducing the disproportionate placement of children of color into foster care and on disparate outcomes related to service provision.) The sponsor team decided to use existing quality assurance staff who were subject matter experts in the Child and Family Services Review and investigations rather than staffing this initiative with university-based experts. This decision was made to
assure that this change was more sustainable and that CPS would build this capacity truly
from the inside out. The primary focus of the institute was to provide regional teams with the
tools necessary to engage in continuous improvement work that would lead to reduced
turnover statewide.

At the institute each region was given the opportunity to focus on one continuous
improvement topic area related to reasons for staff turnover. Using the Retention Fact Tip Sheet
from APHSA, a tool used to help groups determine the underlying causes of retention-related
issues, regional teams identified strengths and gaps in the system and root causes for gaps.
Specific topics identified for regional work included staff development; communication;
recruitment and hiring practices; and work environment.

Disproportionality specialists brought a perspective of assisting regional teams in understanding how to continually assess and plan for organizational and front-line practices
to ensure equitable services and treatment to all children, youth, and families. As a result of
the disproportionality specialists’ participation in the Institute, regional teams recognized a
need to make the specialists permanent members of their regional leadership teams. Their
participation allowed for every program, initiative, and service that is planned for and
provided to be reviewed and examined through an anti-racist lens to ensure equity while
improving safety, permanence, and well-being outcomes for children, youth, and families.

Building Capacity and Sustainability

In planning for the third phase of work, the sponsor group prioritized sustainability of the
continuous improvement effort once APHSA facilitators completed their role at this particular
Institute. Quality assurance specialists emerged as the best candidates to form a sustainability
team within CPS. Quality assurance specialists attended sessions with APHSA with the goal
of developing the skills necessary to perform as the statewide OE Sustainability Team. The
sessions provided clarity around this new role and a list of quick win commitments to support
their development as an OE Sustainability Team.

From February to June 2009 the final phases of the statewide leadership institute were
completed. The OE Sustainability Team completed individual skill assessments and
established their individual development plans with team leads. The team then attended a
four-day session designed to support their skill development. The first two days, presented by
Casey Family Programs (CFP), addressed basic facilitation and platform skills. The last two days, facilitated by
APHSA, provided coaching on the use of OE models, tools, and templates; conducted skill
practice; and introduced the OE Sustainability Team to the content of the APHSA
Organizational Effectiveness Handbook. The session resulted in furthering the commitment of
the OE Sustainability Team to their role in continuous improvement work. Overall the four
days created a sense of both safety and accountability for performing as OE facilitators.

From March to June 2009 regional sessions were facilitated by the OE Sustainability Team
with coaching and support provided by APHSA and the team leads. Specific outcomes and
measurements were determined by each region. The sessions resulted in an increased number
of formal continuous quality improvement initiatives, including cross-departmental work
groups, chartered improvement projects, communication plans, and continuous improvement
plans.

As 2009 came to a close, OE efforts began to spread across the state, resulting in connections
from CPS to the state’s Center for Learning and Organizational Excellence (the staff
development arm of DFPS), as well as presentations to front-line practice supervisors who
were encouraged to use APHSA’s DAPIM model as a supervision technique and a frontline
casework practice model that encouraged family engagement, critical thinking, and problem
solving. (The DAPIM model involves defining what an organization wants to improve;
assessing strengths and gaps in performance, and identifying root causes and remedies;
planning remedies that include quick wins, medium- and long-term improvements and
addressing root causes; implementing plans for impact and sustainability; and monitoring
progress, impact and lessons for continuous improvement.) At the beginning of 2010, a team
of 11 facilitators was in place and working throughout the state with regional management
teams to improve outcomes for CPS clients. Each trained OE facilitator conducted at least one
DAPIM continuous improvement process.

The OE model was used by Texas CPS as the primary process for developing an
implementation plan for an enhanced family safety decision-making initiative in coordination
with the National Resource Center for Child Protective Services. It was envisioned that the
OE team would play a key role in implementing improved practice processes and techniques.

As the work spread throughout CPS, project sponsors wanted to further develop the capacity
for OE work within their system and ensure the sustainability of the OE effort within the state.
There was also the desire to expand the OE capacity beyond CPS and into the larger DFPS
organization including adult protective services, child care licensing, and operations. To that
end, APHSA proposed to support DFPS by helping to build capacity for OE work by
increasing the number of trained OE facilitators; improving the sustainability of OE work by
providing a written guide for training future OE facilitators and developing leadership staff to
oversee the OE work across the state; and working to improve the skill level and capacity of
the original team of OE facilitators.

Over the course of 2010, APHSA, with the support of Casey Family Programs, was able to
fulfill these commitments and train 11 additional staff members from across the spectrum of
DFPS programs to become in-house OE facilitators. Further, APHSA staff worked with the
tenured OE facilitators to develop a curriculum for on-boarding new OE facilitators as the
state determines necessary. There is a plan to develop an additional class of OE facilitators in
2011 that will bring the total number of OE facilitators in Texas to over 30. Finally, APHSA
worked with CFP and DFPS to build and support a team of OE liaisons who will oversee the
assignment and monitoring of the OE work in the state, assure communication with the
project sponsors, collect outcome data, facilitate OE sessions as needed, and serve as
champions of the OE effort, maintaining positive momentum for using OE as a way of doing
business in Texas.

Through 2010, Texas OE facilitators led over 100 on-site continuous improvement sessions and developed continuous improvement plans on such topics as staff retention, communication,
work environment, hiring practices, disproportionality, supervisor development and
mentoring, welcoming new staff, meeting management, supervision workloads, leadership,
decision making, and ensuring child safety.

Lessons Learned

“When you give someone a fish, you feed them for a day, when you teach someone to fish you
feed them for a lifetime.”—ancient proverb

One of the primary values of APHSA's OE department is to help clients become self-sufficient, building their internal capacity and continuously improving their own performance.
Nowhere was this value more clearly demonstrated than in the work in Texas where the
mission was to build a team of OE facilitators capable of leading OE work across the
programs and regions of DFPS.

As this effort unfolded, APHSA was challenged to develop the skills of teaching
others to do what APHSA OE staff had learned to do. Performing a skill and teaching others
to perform the same skill are two very different actions. Along the way APHSA staff came
across many lessons about becoming better teachers, better performers, and better partners.
Below are some of the lessons learned while working with Texas DFPS.

Readiness and Willingness

Concurrent with the work in Texas, APHSA became much more conscious of the need to assist
organizations with understanding their readiness for change prior to beginning major
improvement efforts. While an organization may feel the need to improve certain areas of
performance, not every organization is prepared to move forward immediately with change
plans. APHSA does not use readiness assessments to rule out working with an organization,
but instead as a tool to see if there are foundational areas that an organization might need to
address as an initial stage of continuous improvement work.

The readiness assessment tool helps focus on whether an organization is ready to move
forward based on a current evaluation of areas such as decision making, role clarity, use of
data, teamwork, use of strategic support functions, and organizational climate. While building
OE capacity in Texas (teaching others to fish), APHSA learned to assess individual readiness
to facilitate and lead others through continuous improvement processes. By developing an
understanding of the skills needed to teach others to facilitate, APHSA was able to develop a
deeper understanding of individual readiness to perform these leadership roles.

Some of the skills developed in staff that are needed to be ready to perform OE facilitation include:

1. Facilitate versus lead the sessions; avoid being prescriptive and overly directive.
Instead, guide clients based on a balance between their energies and need to complete
work products. Do this while developing trust and respect of participants and
maintaining focus in the group.
2. Adjust the session agenda in real time, balancing the speed the team can reasonably
achieve goals with the ultimate objectives of the project.
3. Actively engage and listen to others.
4. When teams go off on tangents, provide them with a line of sight for their discussion
to come back to the problem they are working on and the discussion at hand.

Readiness is different from willingness. While supporting those interested in continuous improvement efforts and meeting clients "where they are at" in terms of their readiness, there needs to be a certain level of willingness to be successful doing OE work.

OE work is hard. To truly achieve results, to be accountable for work products, and to
improve outcomes takes significant effort and commitment to the task. If an organization or
individual is unwilling to perform to this level, it is very important for an OE facilitator to
determine why (root cause) the resistance is present. Ultimately, if resistance is constructive
(when an organization or an individual intends to continuously improve but has a different
perspective as to how or what the current priorities are), that resistance should be listened to
and included in the discussions and decisions that follow. If resistance is not constructive
(meant to maintain a status quo or comfort zone), lack of willingness will only impede OE
efforts in the future.

Organizations not willing to change, or individuals not willing to work to continuously improve, will not be successful using APHSA's OE model of inductive efforts toward
continuous improvement.

Leadership and Sponsorship

When building OE capacity for an organization, the success of the work lies not simply in the
teaching done in the classroom, but also in the commitment of leaders and sponsors to support
the OE work moving forward and allow OE work to permeate the organization.

To embed a new practice across an organization as large as Texas DFPS (which has 254
counties across 11 regions covering over 25 million people), commitment of funds and
resources is certainly a necessary step for success. Beyond that step, having leaders cascade
down a commitment to this new process and to respecting continuous improvement plans and
teams is the most precious resource the sponsors and leaders can provide. Sponsors and
leaders need not initially have the skills and natural proclivity to facilitate continuous
improvement sessions, but they must demonstrate openness to the process, a trust in the
models, and the desire for continuous improvement efforts to occur under their watch to
insure that the process will take hold. Ultimately, these sponsors and leaders will need to
develop the skills to lead through facilitated problem solving. For leaders who are used to
controlling meetings and directing decisions, this will be a learn-by-doing experience.

Being Outcome Driven/Commitment to Sustainable Change

OE is not a quick fix. It can be used (and frequently is) to develop immediate quick wins,
steps an organization can take to immediately improve an area of performance or demonstrate
those efforts to staff to build buy-in and create positive momentum for change. But generally
speaking, OE is a process to create long-term plans that will result in improved outcomes
based on addressing the root causes of problems, creating long-term sustainable change.

This process takes time, commitment, and a willingness to improve. Without these
components, OE efforts are not likely to succeed in supporting major system changes.

OE sessions themselves require a commitment of time to develop not just simple plans, but
clear definitions of problems, specific and observable findings of strengths and gap areas of
performance, and deep discussions as to the root causes for these gap areas.

These discussions require building trust and a shared sense of safety in the team that only
comes from a shared commitment to improving the organization. There is no requirement that everyone agree immediately to a plan, but everyone does need to agree to fully participate in the process. This takes time and a desire to do what is best for the
organization moving forward, not a particular individual goal or department initiative.

By being driven toward improved organizational outcomes and by demonstrating a willingness to work towards long-term strategic and sustainable change rather than just quick fixes, organizations that embrace the OE methods will find quick wins along with the long-term change plans that will serve to benefit the children, youth, families, and communities our clients serve.

Monitoring Leads to Continuous Improvement

By establishing and maintaining a commitment to monitoring plans and commitments, APHSA believes that continuous improvement efforts will not only continue, but evolve and
improve through lessons learned and ongoing changes. When monitoring does not occur,
plans are delayed, timing changes, trust is lowered, and morale disappears through the lens of change fatigue. This is true not only of plans but of people as well.

When performance is assessed, when outcomes data is collected, and when public
commitments are made for performance, follow through is more likely to occur. With OE
facilitators, maintaining a commitment to monitor change plans as well as personal
performance is the only way to continuously improve work products and the skill and
performance of individuals. As we engage in this work, we should honor our commitment to support plan monitoring for our clients and those we work with, but we should also honor our commitment to monitor our own performance and seek to continuously improve our performance and outcomes.

Summary

The unfolding progress of Texas CPS successes will allow APHSA to test its evaluation
framework and see what outcomes can be achieved by “teaching others to fish.” APHSA has
already seen successes in Texas CPS with the first round of OE facilitators in 2009. By
monitoring Texas’ future progress, we can test evaluation hypotheses, understand how
APHSA facilitation and OE models and tools combined with agency readiness and
willingness lead to improved agency capacity and performance, and ultimately create
improved client services and client outcomes.

The growth of OE across Texas from a targeted effort in one region to a statewide initiative
that cuts across programs and levels of the organization is an example of APHSA's inside-out approach, and demonstrates how improving one area of effectiveness for an organization can grow and serve to improve the organization as a whole, resulting in sustainable change and
improved outcomes for the children, youths, adults, families, and communities served.

USE OF A STATE PRACTICE MODEL
IN RE-DESIGN OF THE MINNESOTA CHILD WELFARE
TRAINING SYSTEM

Kathy Jones Kelley, Organizational Effectiveness,
American Public Human Services Association
Contributors:
Christeen Borsheim and Richard Dean,
Child Safety and Permanency Division, Minnesota Department of Human Services

The Minnesota Department of Human Services (DHS) has a long history of reform efforts
impacting practice and policy to support positive outcomes for children, youth, and families
served by the child welfare system. Throughout reform efforts DHS maintained the value that
a well-developed, clearly defined child welfare practice model is the basis for reform efforts.
DHS viewed the training unit of Child Safety and Permanence Division (CSP division) as key
support to county child welfare agencies and tribes in implementing the practice model.

In November 2008 the CSP division engaged in a system readiness assessment to define the
purpose and role of the training system in supporting reform efforts. The assessment was
facilitated by the American Public Human Services Association (APHSA) Organizational
Effectiveness (OE) staff and supported with resources from the National Resource Center on
Organizational Improvement (NRC-OI).

A key finding from the readiness assessment was that the state lacked a clearly defined
written practice model to provide guidance to support functions on how to align their products
and services to the vision, mission, and values of DHS. Based on this key finding, and prior to
moving forward with a re-design of the training system, the state engaged in the development
of a written practice model in February of 2009. The process engaged state and local leaders
and front-line staff in a facilitated process to assure the practice model would have direct
application to practice.

Following the development of the state’s child welfare practice model, training staff engaged
in OE efforts to define how the training system would support reform efforts by aligning the
key program areas of the training system to the practice model. This effort resulted in a
continuous improvement (CI) plan outlining quick wins and mid- to long-term planning
activities. It also raised the awareness of training leaders of the resources and OE supports
needed internally for training staff to achieve the goals in the CI plan and ultimately provide
OE related technical assistance to the counties and tribes on the alignment of the practice
model.

This article describes the process the training staff engaged in to align to the practice model
and their efforts to build internal OE capacity. It covers lessons learned and challenges faced
by those who are embracing a major transformation effort they call the “re-design of the
training system.”

Background

Minnesota is a state-supervised, county-administered child welfare system. To support counties and tribes in the development of competent staff with the skills necessary to serve
families and children at a practical level, the DHS CSP division operates a training system.
The training unit is responsible for the overall operation of the training system.

The training system provides classroom training for county and tribal social workers,
supervisors, managers, directors, and resource families. Minnesota statute mandates that all
new child protection social workers attend foundation training within their first six months of
employment. Minnesota statute and rules both require that all child protection social workers
complete at least 15 hours of continuing education or in-service training relevant to
providing child protection services every year. This training provides tools to help social
workers to work more effectively and better serve families.

When attending foundation training, social workers learn the fundamental and essential skills
necessary for caseworker practice. Foundation courses include information on child abuse and
neglect, family dynamics, child development, attachment issues, and legal concerns. Upon
completing core courses, social workers are encouraged to attend specialized skills and/or
related skills training. Supervisors, directors, and managers may also attend leadership
training courses on managing and supervising child welfare employees. Specialized skills and
related skills training is available on a variety of topics. County and tribal foster, adoptive,
and kinship parents are encouraged to attend pre-service and/or specialized skills and related
skills training.

In 2008, leaders within DHS, the CSP division, and the training system wanted to build on the
strengths of the training system. The decision was to expand its scope from a primary focus
on traditional classroom training to include organizational effectiveness-related technical
assistance and support to counties and tribes.

The CSP division and training system leadership were particularly interested in ensuring that
the training system continued to operate at peak effectiveness and according to current best
practices in child welfare training while aligning with and building capacity and strategic
readiness to support the model of practice being implemented through child welfare reform.

To achieve the desired results, leaders sought the support of APHSA and the NRC-OI. This
led to the development of a CI plan to re-design the training system.

APHSA Approach to Organizational Effectiveness

Since 2004, APHSA has continued to develop OE models and tools and to support
organizations in building OE capacity. In 2009, the APHSA OE department compiled the
models and tools into the OE Handbook. The overarching purpose of the handbook is to serve
as a resource for organizations that are making continuous improvement a way of doing
business.

APHSA has defined OE as a systemic and systematic approach to continuously improving an organization's performance, performance capacity, and client outcomes. Systemic refers to
taking into account an entire system or, in the case of OE, an entire organization. Systematic
refers to taking a step-by-step approach. In simple terms, OE is a step-by-step approach to
continuously improving an entire organization.

DAPIM is the primary APHSA OE model. DAPIM enables work teams to drive continuous
improvement. The approach involves defining what the organization wants to improve;
assessing strengths and gaps in performance capacity, performance actions, outputs, and
outcome; planning for remedies; implementing plans for maximum impact and
sustainability; and monitoring progress, impact, and lessons learned to sustain follow-
through and continuous improvement.

Teams engaged in a facilitated learning-by-doing process using the DAPIM model become
familiar with OE models, tools, templates, and methods to continuously improve in priority
areas. Work products that result from the continuous improvement effort include
development, implementation, and monitoring of quick wins; mid- and long-term CI plans;
and related communication plans. Monitoring techniques are used to assess progress and
adjust continuous improvement work as needed.

Structure of the Efforts to Re-design the Training Unit

The continuous improvement planning process, facilitated by APHSA, engaged training staff in a learning-by-doing approach using DAPIM. Engagement of staff throughout the continuous improvement planning process was a priority for leaders, as they viewed this as a primary vehicle for developing the internal skills and capacity of the training staff to apply the OE model in ongoing work.

In designing the structure of the effort, CSP Division and training system leaders served as
sponsors. The sponsors, in conjunction with APHSA and NRC-OI, decided that the training
staff would meet as a group for four days—two 2-day sessions held a month apart. These
sessions were scheduled in May and June 2009. APHSA facilitated the sessions. DHS,
APHSA, and NRC-OI discussed including stakeholders and staff from other DHS divisions;
however, all parties agreed the training staff needed time as a group to reflect on the training
system and its future direction. All parties agreed key stakeholders and staff from other
divisions within DHS would be informed of the process and be engaged in future phases of
work as needed.

APHSA and the sponsors collaborated to develop the agenda and handouts for each session to
assure agreement as to the work to be done, the action plan for doing the work, and desired
outcomes of the sessions.

The agenda utilized a learning-by-doing approach to applying the DAPIM model to drive
continuous improvement efforts in the following key areas of work for the training system:

• Assessment
• Curriculum development
• Training delivery
• Trainer and writer development

Two remaining key areas were identified for future work by the training system: developing
OE facilitators and evaluating delivery of training and technical assistance. Sponsors and
training staff agreed alignment to the practice model in assessment, curriculum development,
training delivery, and trainer and writer development was a critical priority. This alignment
needed to be achieved before expanding services to include OE related technical assistance
and support.

Participants reflected on the practice model developed in 2009 and the system readiness
assessment conducted in 2008 to produce work products in each phase of DAPIM. Work
products included the following:

• Establishing ground rules for sessions
• Reaching agreement on a desired future state for each key area of work for the training
system
• Assessing the current state of the training system against the desired state to identify
strengths and gaps
• Identifying root causes and possible remedies for closing the gaps
• Developing a CI plan outlining quick wins, and mid- and long-term activities and
commitments. The plan has two major priorities: inclusion of the practice model in all
training system products and services (first priority) and expansion of training system
services to county and tribal agencies to include OE related technical assistance and
support (second priority).

Throughout the planning process, training staff recognized the need for additional funding
and/or re-distribution of existing training funds to fully transform the training system. As part
of the CI plan they identified the following quick win: “review resources in our contracts
(university, trainers, writers) and how do we want to use them—not focusing on ‘people’ but
the work that needs to be done to achieve our goals; what roles do we need in place for the
future? How will the decisions impact the RFP? Do we want an RFP?”

Training unit leadership submitted the CI plan to DHS leadership for approval and ongoing
support of planned training unit re-design efforts. DHS leadership agreed with the funds needed to fully transform the training system and approved training staff phasing work efforts by first completing no-cost or low-cost tasks while additional funds were identified. DHS also requested training system staff complete the quick
win around use of current resources and submit a plan for consideration.

Training staff organized the CI plan into a work products and remedies chart to allow them to
begin implementing and monitoring progress on planned commitments and activities. The
chart outlined work efforts to be completed from July 2009 to March 2010 and included both
approved activities and those still needing funding. Training staff planned to meet routinely to
monitor the impact of plan implementation and collect lessons learned.

Feedback on the Facilitation of Efforts to Re-design the Training Unit

An after action review was done immediately following the facilitation process that used the
DAPIM model and a learning-by-doing process to guide training staff through the continuous
improvement effort. Training staff reported that it provided them with clear direction and
focus. Participants commented on the importance of having a clearly defined desired future
state for each area of planning. They found the facilitated reflection on current system
strengths and gaps to be valuable. Specifically, participants commented that they had never
had the opportunity to critically look at their training system as a group, think about what was
going well, what could be improved, and how they might suggest it be improved. Participants
were observed to be actively engaged in identifying root causes and remedies for gaps.
Specific examples of this include all participants contributing to the conversation, verbal acknowledgement of each other's thoughts and suggestions, and openness toward accepting
ideas for improving areas of work completed by the training unit even when it was an
individual’s current area of responsibility. For remedies outside their control, participants
demonstrated commitment to thinking of solutions that could be advanced to DHS leadership.
Participants also willingly committed to performing tasks that were within their control.

Participants commented that the process created a safe and supportive environment for
sharing ideas. They reported having experienced peer learning and unit-driven solutions. The
group agreed that spending time together on their strengths and gaps was useful and reported
it was the first time such an opportunity had been provided. Reflecting as a group on strengths
and gaps allowed them to see the work of the training system from different perspectives and
reconsider how work should be completed to become more effective in supporting outcomes
for children, youth and families. Participants also commented that going through the process together with their leadership was extremely valuable. This allowed participants to get
answers to questions on the spot and to feel supported in their recommendations and
commitments. It was also reported that intersession work kept them focused on the process.

Participants commented on how the DAPIM model and learning-by-doing approach became easier to use as they
went through the planning process. There was recognition from some participants on how the
DAPIM process would be transferable to work with counties and tribes. Links were
specifically made to the use of DAPIM in assessment work with counties and tribes.

The facilitator observed that participants were interested, enthusiastic, creative, and
demonstrated positive attitudes towards the overall transformation of the training system and
the use of the DAPIM model and tools as a team. Participants verbalized a need for the
training system to expand its services to meet the needs of counties and tribes. They also
verbalized how experiencing the DAPIM in a learning-by-doing session allowed them to see
how a similar process could support counties and tribes in becoming more effective.

The facilitator found it valuable to have the practice model and system needs assessment
documents available as tools to guide reflective thinking and assure the team was clearly
aligning to the broader strategy of DHS. Some participants expressed concern about what the
practice model looked like in front-line practice with children, youth, and families and were
unsure if the practice model served its original purpose of providing guidance to the training
unit re-design.

Some staff also reported concerns that the re-design efforts focused too much on the
traditional training provided by the training system and did not address how the Social
Services Information System (SSIS—Minnesota’s SACWIS system) and resource family
training fit into the broader re-design efforts.

What became apparent to DHS leaders and sponsors during this process was the importance
of phased planning, implementation, and monitoring of work. Leaders expressed it was
helpful to have an overarching plan for the re-design but that flexibility to implement planned
activities and commitments in phases made the plan “do-able.” Leaders noted the importance
of deciding who to involve in which phases of systems change work. They expressed how beneficial it was to have a facilitator to plan the phased work and to provide input on decisions about who to involve in each phase of work.

Implementing the Continuous Improvement Plan

From July 2009 to April 2010, the training staff implemented quick wins and mid-range
remedies while DHS leadership explored funding options to fully operationalize the CI plan.

In March of 2010, leadership identified the need to reallocate existing resources within the
training system since no additional funding options were available. Training staff re-engaged
in learning-by-doing sessions, facilitated by APHSA, with the goal of identifying how to
move forward with the tasks outlined in the CI plan while reallocating resources. This step
was taken to stay true to the commitment by leadership to monitor staff capacity throughout
the training system re-design.

The team participated in four sessions over a five-month period. They prioritized work to be
completed from the CI plan, placing the majority of the planned activities and commitments
on hold until reallocation of funds was completed. The training system then developed an
eight-month reallocation plan that would be implemented from May to December 2010. The
plan identified tasks the staff would need to engage in as resources were re-aligned to support
the implementation of the training system re-design. The following work products were
completed:

• Prioritization of curriculum for development and revision
• Development and implementation of a priority work task list
• Development of position descriptions for new positions within the system
• Completion of an internal DAPIM session with SSIS staff of the CSP division to support communication and collaboration efforts between the units during the training unit's re-design and to more fully engage SSIS trainers in the re-design efforts
• Definition of the elements of the new organizational needs assessment tool and process, and identification of strengths and gaps within the unit related to developing and implementing the new tool and process
• Definition of a new structure for the unit that would support the newly hired staff
• Development of a plan to support the building of OE capacity for identified internal staff

The unit monitored commitments during each session to ensure progress was being made in
their redesign efforts and to capture lessons learned. The unit recognized that communication,
decision making, and trust issues were preventing them from being the high performing team
they desired to be. Some team members reported not having a clear understanding of what
they were doing and why. Some expressed they did not have regularly scheduled meetings to
review planned activities and monitor progress. Some reported that when they did have a meeting, not all staff placed a priority on attendance. Many staff felt OE efforts were only discussed when the facilitator attended meetings, and many felt that communication about the purpose of those meetings beforehand was not clear, making participation a challenge.

To improve communication, the unit agreed to meet weekly to review commitments and share
results. A commitment was made to make decisions regarding the unit during the meeting so
that everyone was aware of decisions and planned next steps. The weekly meeting did show
improvements in communication and follow through on decisions. The unit also participated
in sessions with an internal DHS facilitator to work on trust issues.



Throughout the facilitated sessions both the APHSA facilitator and participants remained
flexible in order to address emerging issues due to the transition of resources. The transition
did result in additional workload responsibilities for current training staff, reducing their
capacity to move forward with planned activities from the CI plan even more than anticipated.

In December 2010 and January 2011, sponsors of the effort made the decision to develop four
internal staff as OE facilitators. This would ensure the availability of internal facilitators to
support staff in returning to implementation of the CI plan for aligning to the practice model.
Two of the four staff selected were responsible for SSIS and resource family training. The
decision to address concerns in these areas of work was not fully embraced in the training
unit. APHSA coached the four staff and one supervisor in preparing to become OE
facilitators.

Also, in January 2011, APHSA met with all training unit staff to review the efforts they had engaged in since November 2008 to re-design the training unit.

For some, the January session helped them remember decisions made to place CI plan
activities and commitments on hold during the re-allocation of resources. Some staff
expressed it was unclear to them when decisions were being made and why, and they were
having difficulty understanding the OE efforts. Some staff requested going back to the notes
from early meetings linking their work products and remedies chart to the original desired
future state, gaps, and root causes. At the time the chart was developed, it made implementation and monitoring of planned activities easier; however, a lesson learned for the unit was that, by moving remedies to a separate chart, staff could not remember why they were doing certain activities. APHSA has since developed an overview document that reconnects
remedies to the priority gap areas and root causes. The document has been provided to the
training unit leadership.

At the time this case study was prepared, internal staff were preparing to facilitate the
implementation and monitoring of mid- and long-term activities and commitments from the
CI plan related to assessment and curriculum development. The internal staff developed to support OE facilitation in the SSIS and resource family areas continue to have capacity issues in performing day-to-day tasks; because of staff vacancies and high workloads associated with the reallocation of resources, they have not been able to engage in OE facilitation for those areas of the re-design.

Feedback: One Year After CI Plan Development

In an after action review, staff reported liking the DAPIM flywheel and learning-by-doing
process. Many staff felt it was positive to have all training unit staff engaged in the re-design
efforts and that it created a “shared sense of responsibility.” However, many training unit staff felt they “took on too much,” reporting that this resulted in difficulty maintaining focus on the unit’s progress and a loss of momentum for the planned work. Looking back, a lesson learned for some staff was the recognition that they should have “gone through a DAPIM on one area of work at a time,” allowing them to experience successes more clearly versus planning for the entire training system re-design.

Some staff found that the language used in OE was confusing and prevented them from fully understanding the work they were doing. Some newly hired staff agreed that they found the language confusing.

Many staff reported they felt it was too early to provide feedback on the impact of the OE models, tools, and their planned activities and commitments on the training system. Given
the need to re-allocate resources, staff reported they were just now starting to implement the
mid- and long-term activities of the CI plan related to the areas of work completed by the
training unit.

The training unit supervisor acknowledged the impact re-allocation of resources had on the
training unit. He reported “it became the plan, the driving force, but was originally only a
small piece of the plan.” He also reported the one thing he would do differently “is to hire the
second supervisor first.” He believed that a second supervisor could have assisted in
addressing communication and decision-making gaps and that both current and newly hired
staff would have been better supported during training unit re-design.

A second supervisor was added to the training unit in March 2011. The current supervisor was
looking forward to having the additional supervisory supports in place that staff need for this
type of major transformation. He reported the new supervisor could help “to gain speed—we
have been idling.”

Planned Next Steps

Training system staff remain committed to the re-design of the training unit. They are
looking forward to the newly developed internal facilitators supporting them in OE efforts and
implementation of their CI plan. Sponsors of the efforts continue to encourage and support the
staff in the unit with re-design efforts. They acknowledge the staff openly for their continued
commitment to aligning to the practice model and building OE capacity to support counties
and tribes.

Sponsors also remain committed to hiring additional staff to facilitate OE technical support
sessions with counties and tribes. They remain committed to using lessons learned to face
challenges they encounter while implementing a major transformation effort—the re-design
of the Minnesota training system.



SECTION THREE
INSTRUMENTATION AND METHODOLOGY

Instrumentation continues to be an integral component within many human services training and development programs. Instruments (e.g., learning style inventories, attitude
questionnaires, Q-sort cards, and knowledge tests) can be used in all phases of the training
cycle from needs assessment to evaluation. However, instruments can also be misused (e.g.,
using an unreliable knowledge test to evaluate a worker’s performance while on probation).

The intent of this section is to (1) provide examples of useful instruments as well as other
training assessment and evaluation methodological approaches, (2) provide a venue for the
exploration of appropriate applications (as well as potential misapplications) of various
instruments being used in human services training and development programs, and (3)
advocate for responsible application of instruments and other assessment/evaluation
methodologies by training and development professionals.

Instrumentation articles written for this section are expected to provide information on the
following areas:

• Conceptual/theoretical background from which the instrument was developed


• Description of the instrument
• Intended use
• Procedures for utilization
• Data regarding sample norms (if available)
• Reliability
• Validity
• Cautions (e.g., how might the data results be misleading)
• A copy of the instrument or how to obtain it

The articles in this issue provide examples of how instruments can be used to promote
positive organizational and system change. The first article describes a situational judgment
assessment approach used in the development of the National Child and Youth Care Practitioner Certification Exam. The authors indicate that the exam scores are associated
with more effective performance on the job.

The second article describes a shortened version of the Transfer Potential Questionnaire. It stresses
the importance of assessing individual, organizational, and training design factors that
research has identified as affecting transfer of learning. The authors suggest that the user-
friendly instrument described in the article (Application Potential of Professional Learning
Inventory—APPLĪ 33) can help human services training and development professionals
assess a worker’s potential to use new learning on the job. Suggestions to improve the
potential for effective transfer of learning also are provided.



THE NATIONAL CHILD AND YOUTH CARE
PRACTITIONER CERTIFICATION EXAM

Child and Youth Care Certification Board

Conceptual Background

The certification exam was developed to address broad cross-field (e.g., afterschool,
residential treatment, juvenile justice) competencies that were identified from a meta-analysis
of 87 sets of competencies used in North America (Eckles et al., 2009; Mattingly, Stuart, &
VanderVen, 2002). The competencies are organized into five competency domains
(professionalism, applied human development, relationship and communication, cultural and
human diversity, and developmental practice methods).

A situational judgment exam was developed that requires practice judgments from the
examinee based on case scenarios. The scenarios were developed from case studies of actual
incidents elicited from the field from a variety of practice settings. A situational judgment
approach (SJA) to assessment emphasizes the use of realistic scenarios, typically asking test-
takers to identify the best alternative among the choices offered. The most correct answer for
each item is determined by a panel of subject matter experts. The SJA was used because research has determined that SJAs tend to have a high level of face and content validity, show less adverse impact by gender and ethnicity, can measure a variety of constructs (including interpersonal competencies that are crucial in child and youth work settings), are relatively easy to administer in bulk (paper and pencil or online), and typically have acceptable validity
correlations with job performance (Chan & Schmitt, 2002; Clevenger, Pereira, Wiechmann,
Schmitt, & Harvey, 2001; McDaniel, Morgeson, Finnegan, Campion, & Braverman, 2001).
Finally, situational judgment was felt to be a means of addressing the arguments of those who resist exams on the grounds that they are poor indicators of competence. A situational judgment test can present
complex situations and multiple questions to assess the candidates' ability to deal with
complexity and make independent decisions based on theory as well as context, thus
exhibiting professional judgment necessary to be effective in child and youth work settings
(Curry, Schneider-Munoz, Eckles, & Stuart, in press).

Description of Instrument

The exam consists of 75 situational judgment multiple choice items pertaining to 17 case
studies that were elicited from a variety of actual practice settings and situations. Each item pertains to a specific competency that a panel of experts determined would be best assessed
by the exam (compared to the supervisor assessment or portfolio). The number of items
pertaining to each competency (one to three) was also determined by a panel of experts (Curry
et al., 2009; Eckles et al., 2009).

The following is an example of a case and item that requires practice judgments pertaining to
the competencies from the Standards of Practice/Competencies for Professional Child and
Youth Work.

Competency IB5a. Access and apply relevant local, state/provincial and federal laws,
licensing regulations, and public policy (e.g., staffing ratios, confidentiality, driving
laws, child abuse/neglect reporting, children, youth and family rights).
Competency IB4c. Apply specific principles and standards from the relevant Code of
Ethics to specific problems.

You are a practitioner working in an inner-city emergency shelter that primarily serves
homeless youth. The shelter serves youth who are 14 to 21 years old. Legally, in this
state, runaways under the age of 16 must be reported to authorities. One evening a
young-looking female youth comes in and makes inquiries as to the services available
in the shelter. She tells you she is 18, but you strongly suspect she is much younger,
possibly 13 or 14. As you interview her, she reveals that she ran away from home
about a year ago and has been working as a prostitute for the past 6 months. She
refuses to tell you her real name or where she is from. When you ask her what she
needs from the shelter, she tells you that she doesn’t know but she sure could use a
place to stay overnight. You end up in the kitchen talking with her as she eats the soup
you heated for her.

As a practitioner, you:
a) Have a legal obligation to talk her into staying at the shelter until a longer-
term program can be worked out or she can be reconnected with her family.
You have no obligation to contact the authorities.
b) Have a legal obligation to make the shelter services available to her and
check to be sure she is aware of the risks involved in her lifestyle.
c) Have a professional obligation to contact the appropriate authorities if she
leaves the shelter.
d) Have no legal or ethical obligation beyond making services available to her
that she has specifically asked for.

There may be no ideal answer provided. But the examinees are expected to choose the best
answer from the alternatives provided.



Intended Use

The exam is used as one of the components of a national certification program for child and youth workers from a variety of settings (community-based, out-of-home care), who work with various populations (typical and atypical, such as youth experiencing child maltreatment
or mental health concerns), and with youth of various ages (early childhood through
adolescence). Certification applicants must take and pass the exam prior to submitting a
completed application that includes an extensive supervisory assessment and electronic
portfolio (see http://www.cyccertificationboard.org).

Although not the primary function of the exam, it is possible that the exam can also be used to
assess the effectiveness of comprehensive education and/or child and youth work training
programs. Several organizations (e.g., Casa Pacifica) are currently exploring use of the exam
for this purpose. In this way, the certification exam in conjunction with comprehensive
training and development initiatives can become more fully integrated into a systematic
organizational development plan.

The exam may also be used for research purposes (e.g., examining the relationship between
professional child and youth care judgment and various constructs—personality, empathy,
relationship building, intervention style, retention, etc.).

Procedures for Use

Administration of the exam occurs both in written (paper and pencil) and electronic (web-
based) formats. The total time permitted to complete the exam (including the instruction
overview and obtaining identifying information for examinees) is 150 minutes. This is a
proctored exam, administered at approved proctored sites including various child and youth
work conference venues (e.g., Conference of the American Association for Children’s
Residential Centers, International Child and Youth Care Conference). Agencies may inquire
about becoming an approved proctored site. See contact information at the end of this article.

Sample Norms

Norms were established from a sample of 775 child and youth workers from 29 sites in six
states (Maryland, Pennsylvania, Ohio, Oklahoma, Texas, and Wisconsin) and two Canadian
provinces (Ontario and British Columbia). The sample was very diverse, representing various
segments of the child and youth care worker population. However, the most frequent
characteristics of the sample were that the individuals were female (61%), African American (45%), spoke English as a first language (97%), practiced in a residential treatment setting
(46%), worked as a direct care worker (49%), considered themselves professional child and
youth care workers (95%), and held a baccalaureate degree (36%). The average examinee's age
was 37 (ranging from 17 to 76), and 10 years was the average number of years of experience as a child and youth care worker (see Table 1 for a summary of the sample's demographic
characteristics).

Exam scores ranged from a low of 16% to a high of 96% of the total 75 items, with a mean total raw score of 47.66 and a standard deviation of 11.646. The standard error of measurement (SEM) was 3.68, or 4.9% of the total questions.

Table 1. Demographic Characteristics-Certification Exam Validation Study


___________________________________________________________________________

Frequency Percentage
___________________________________________________________________________

Sex
Male 301 39
Female 470 61
Race
African American 337 44.9
American Indian or American Indian First 5 .7
Asian 7 .9
Caucasian 320 42.7
Hispanic 56 7.5
Multi-ethnic (more than one race) 23 3.1
Other 2 .3
First Language (English) 749 97
Country
U.S.A. 735 95.3
Canada 36 4.7

Practice Setting (Education)


Early Childhood 127 16.5
Public and Private Schools 109 14.1
Practice Setting (Out-of-Home Care)
Foster Homes 37 4.8
Residential Treatment 355 46
Psychiatric Hospitals 21 2.7
Medical Hospitals/Clinics 12 1.6
Physical Disabilities 10 1.3
Juvenile Corrections 58 7.5
Emergency Shelters 96 12.5
Basic Residential Care 127 16.5
Transitional Living 58 7.5
Developmental Disabilities 19 2.5
Practice Setting (Community-Based Services)
After School Programs 50 6.5
Prevention/Intervention Programs 122 15.8
Street Outreach 35 4.5
Developmental Disabilities 22 2.9
Early Intervention 45 5.8
In-home Detention Programs 6 .8
Physical Disabilities 13 1.7
Recreation 38 4.9
In-home Family Care & Treatment Services 45 5.8
Organizations (YMCA, Scouts, etc.) 38 4.9
Clinic-based Day Treatment Services 26 3.4
Practice Settings (Other) 57 7.4

Type of Position
Direct Care Worker 370 48.7
Educator 38 5.0
Supervisor 102 13.4
Administrator 62 8.2
Counselor 84 11.1
Therapist 6 .8
Foster Parent 1 .1
Other 93 12.2
Professional CYC
Yes 729 95.0
No 33 4.3
Education
None 99 13.6
Associate 97 13.4
Baccalaureate 263 36.2
Masters 87 12.0
Doctorate 2 .3
No degree but coursework 177 24.4
___________________________________________________________________________

Mean SD

Age 37.35 10.95


Years of Experience 10.43 8.05
___________________________________________________________________________

N=775; Settings are not mutually exclusive. Respondents may have selected more than one
setting. Also, the relatively large number of participants indicating “other” for “type of
position” is in part due to blended/hybrid positions (e.g., lead worker with supervisory
responsibilities, social worker/direct care worker).



A modified Angoff procedure involving a 10-member expert panel was used to determine a cut score (pass/fail) for the exam (Angoff, 1971). Additional test analysis scenarios were explored based on the Peng and Subkoviak approximation to estimate the number of correct classifications (pass/fail) and the accuracy of the classification over and above what would be expected by chance alone (i.e., tossing a coin and classifying people as masters if heads and nonmasters if tails) (Subkoviak, 1984, pp. 275-276). A cut score of 65% was determined to be most appropriate. Had this cut score been applied to the pilot study sample, 53% of the examinees would have passed the exam. Preliminary data from administrations of the exam since the pilot study indicate that 85% of the examinees pass the exam.
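
For readers unfamiliar with standard setting, the following sketch illustrates the general arithmetic of an Angoff-style procedure, in which each panelist estimates the probability that a minimally competent practitioner would answer each item correctly and the cut score is the mean of the panelists' summed estimates. The ratings and function names are hypothetical; they do not represent the certification board's actual panel data or procedures.

def angoff_cut_score(panel_ratings, n_items):
    # panel_ratings: one list per panelist of estimated probabilities (0-1) that a
    # minimally competent practitioner would answer each item correctly
    panelist_totals = [sum(ratings) for ratings in panel_ratings]
    mean_total = sum(panelist_totals) / len(panel_ratings)
    return 100 * mean_total / n_items  # cut score as a percentage of items

# Two hypothetical panelists rating a four-item exam:
example_panel = [
    [0.70, 0.55, 0.80, 0.60],
    [0.65, 0.60, 0.75, 0.70],
]
print(angoff_cut_score(example_panel, n_items=4))  # 66.875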

Reliability

Internal consistency of the exam is considered excellent (Cronbach's alpha is .8999). The standard error of measurement is 3.68 (4.9% of the total items).
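
Assuming the SEM was computed with the classical formula SEM = SD × √(1 − reliability), the reported figures are internally consistent: SEM = 11.646 × √(1 − .8999) ≈ 3.68, and 3.68 / 75 ≈ 4.9% of the total items.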

Validity

Content Validity

There is a high level of content validity as a result of using case scenarios from the varied
practice fields and developing each item in response to a specific competency area that was
judged (by expert panels) to be best assessed by the exam (compared to the other measures of
supervisor assessment and portfolio).

Face Validity

A survey instrument that included four items assessing face validity (Cronbach's alpha for the four items = .77) was constructed to obtain feedback from the examinees. Almost all
(97%) of the examinees completed the face validity and feedback questionnaire. Findings
indicate that the vast majority of respondents (90%) perceived that the items in the exam
accurately assess important aspects of child and youth care work and the case examples
provide realistic samples of child and youth care work.

These findings strongly endorse a belief that the exam seems to be measuring the essential
elements of child and youth care work. Somewhat fewer (80%) indicated that the content in the exam is similar to their actual job duties, and only 59% stated that they believe their performance on the exam is an accurate indicator of their actual performance on the job (34% indicated that they neither agreed nor disagreed). Apparently, the examinees viewed the exam
as an excellent indicator of child and youth care practice; however, they appeared to have less
confidence that how they performed on the exam is indicative of their job performance. Since
the exam covered practice areas from a variety of settings and ages, some of the participants
may have perceived their job duties in a more limited manner (e.g., confined to working with children ages 3 to 5 in a day care setting only). In addition, the examinees did not have access to their test results, so the relatively large number of examinees who neither agreed nor disagreed that the exam is an indicator of their job performance may have lowered this item's rating relative to the other face validity items. Despite these
findings, statistically significant differences were not found for the composite face validity
variable among the varied practice sites (education, community-based, out-of-home-care),
suggesting that the exam has satisfactory face validity across the varied child and youth work
fields of practice.

Criterion (concurrent) Validity

The total exam scores were correlated with the examinees' supervisors' assessments of worker
competence on the job. The supervisory assessment was a composite variable comprised of
six items. One item pertained to each of the five major competency domains and one item
referred to the worker’s overall competence. The item anchor descriptors ranged from
“consistently demonstrates competence” to “does not demonstrate competence.” The
composite competence score (the sum of the six items) was used as a concurrent criterion
measure of job performance (Cronbach's alpha of .94 for the six items).

A validity coefficient (r = .26) was obtained. The strength of the correlation between the SJA exam score and supervisor ratings is comparable to findings in other studies and is considered to underestimate the true strength of the relationship between the exam score and on-the-job performance. The relationship is underestimated because of range restriction (e.g., very poor performers tend to be weeded out, limiting the range of performance assessed) and because there is little variability in the supervisory assessments. The supervisory ratings from this sample primarily ranged from 3 to 5; for example, there were no ratings lower than 3 on the overall competence item, and only 10% gave ratings of 3 (inconsistently demonstrates competence).

According to Sackett, Borneman, and Connelly (2008), both range restriction and criterion unreliability (typical of efforts correlating exam scores with supervisor ratings of worker performance) lead to an underestimation of validity. Clevenger et al. (2001) estimate that typical SJA validity correlations (in the .20 to .40 range) are, after such corrections, significantly higher and among the best validity coefficients when compared with other personnel selection approaches.
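
As a point of reference only (this calculation was not reported in the validation study), the classical correction for attenuation shows how much criterion unreliability alone can depress an observed coefficient: using the exam's internal consistency (approximately .90) and the supervisory composite's reliability (.94), r_corrected = r_observed / √(r_xx × r_yy) = .26 / √(.90 × .94) ≈ .28. A further upward adjustment would be expected after also correcting for range restriction.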

Construct Validity

Factor analysis (principal components analysis) was applied to the test data to investigate
possible dimensionality of the test. Although the test items were derived from competencies
organized into five domains, the analysis did not support a five-factor domain structure. The
test appears to primarily measure one general construct (professional child and youth care
judgment). A visual inspection of the item loadings and scree plot was conducted to identify
the factors upon which most of the items loaded. The scree plot (Figure 1) indicates the presence of one primary trait or construct. Therefore, subscale scores are not determined; only the overall total score (percentage) is reported to examinees.

Figure 1. Scree Plot Indicating One Primary Factor
[Figure: eigenvalues plotted against component number; a single dominant first eigenvalue indicates one primary factor.]
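
A minimal sketch of how such a dimensionality check might be carried out on a scored item-response matrix is shown below. The data are randomly generated and the variable names are hypothetical; the board's actual analysis was conducted with standard statistical software rather than this code.

import numpy as np

# Hypothetical examinee-by-item matrix of scored responses (1 = correct, 0 = incorrect).
rng = np.random.default_rng(0)
responses = rng.integers(0, 2, size=(200, 75)).astype(float)

# Principal components of the item correlation matrix; the sorted eigenvalues are
# what a scree plot displays. With real exam data, a single dominant first
# eigenvalue (as in Figure 1) suggests one primary factor; random data will not
# show that pattern.
item_corr = np.corrcoef(responses, rowvar=False)
eigenvalues = np.linalg.eigvalsh(item_corr)[::-1]
print(eigenvalues[:5])
print(eigenvalues[0] / eigenvalues.sum())  # proportion of variance in the first component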

Cautions

The nature of child and youth care work is complex and difficult to assess through traditional
measures. Many child and youth care leaders consider the field to be more art than science.
There is no “silver bullet” assessment strategy. The exam is considered to be one component
of a more comprehensive assessment approach including the use of portfolio analysis and
supervisory assessment. Initial research indicates that the multi-measure assessment approach
is a better predictor of competence on the job than use of a single indicator (Curry et al., in
press). Other potential uses of the exam (e.g., evaluation of training) should be considered
exploratory.

The case study approach to the exam may suggest further use of case scenarios in the
education and training of child and youth work personnel. Feedback from some of the
examinees indicates that the exam helped them to think in greater depth about these competency areas. Others indicate that the wide range of case scenarios helped them better
recognize the large scope of professional child and youth care practice.

However, use of the exam results to provide specific direction for training within the five
competency domains may not be indicated. Initial research findings indicate that the exam
appears to assess an overall general factor of professional judgment in child and youth care
work. Only the overall score (percentage) is reported to examinees. Sub-scores by domains
are not reported.

Finally, exam users (e.g., examinees, evaluators, researchers, administrators) should recognize
that a cut score of 65% does not equate to a “D” grade as many of us have been conditioned to
interpret throughout our schooling. The cut score was determined by the best estimates of a
panel of experts (along with additional statistical cut score analysis) to determine the
minimum level that should be obtained by a professional level child and youth work
practitioner.

How to Access

Contact the national office of the Child and Youth Care Certification Board at
http://www.cyccertificationboard.org or (979) 764-7306.

References

Angoff, W. H. (1971). Scales, norms, and equivalent scores. In R. L. Thorndike (Ed.), Educational measurement (2nd ed., pp. 508-600). Washington, D.C.: American Council on Education.

Chan, D., & Schmitt, N. (2002). Situational judgment and job performance. Human
Performance, 15, 233–254.

Clevenger, J., Pereira, G. M., Wiechmann, D., Schmitt, N., & Harvey, V. S. (2001).
Incremental validity of situational judgment tests. Journal of Applied Psychology, 86,
410–417.

Curry, D., Eckles, F., Stuart, C., & Qaqish, B. (2010). National child and youth care
practitioner certification: Promoting competent care for children and youth. Child
Welfare, 89, 57-77.

Curry, D., Qaqish, B., Carpenter-Williams, J., Eckles, F., Mattingly, M., Stuart, C., &
Thomas, D. (2009). A national certification exam for child and youth care workers:
Preliminary results of a validation study. Journal of Child and Youth Care Work, 22,
152-170.



Curry, D., Schneider-Munoz, A.J., Eckles, F., & Stuart, C. (in press). Assessing youth worker
competence: National child and youth worker certification. In D. Fusco (Ed.).
Advancing Youth Work: Critical Trends, Critical Questions. Routledge.

McDaniel, M. A., Morgeson, F. P., Finnegan, E. B., Campion, M. A., & Braverman, E. P.
(2001). Use of situational judgment tests to predict job performance: A clarification of
the literature. Journal of Applied Psychology, 86, 730–740.

Mattingly, M., Stuart, C., & VanderVen, K. (2002). North American Certification Project
(NACP) competencies for professional child and youth work practitioners. Journal of
Child and Youth Care Work, 17, 16-49.

Sackett, P.R., Borneman, M.J., & Connelly, B.S. (2008). High-stakes testing in higher
education and employment: Appraising the evidence of validity and fairness. American
Psychologist, 63, 215-227.

Subkoviak, M. J. (1984). Estimating the reliability of mastery-nonmastery classifications. In R. A. Berk (Ed.), A guide to criterion-referenced test construction. Baltimore, Maryland: The Johns Hopkins Press.

_____________________________

Child and Youth Care Certification Board and Research Committee members contributing to this article include:

Dale Curry, Kent State University


Carol Stuart, Vancouver Island University
Basil Qaqish, University of North Carolina Greensboro
Dana Fusco, CUNY
Frank Eckles, Academy for Competent Youth Work
Jean Carpenter-Williams, National Resource Center for Youth Services, University of Oklahoma
Andrew Schneider-Munoz, University of Pittsburgh
Sister Madeleine Rybicki, Holy Family Institute
Debbie Zwicky, St. Rose Youth and Family Center
Cindy Wilson, New England Network for Child, Youth and Family Services
Pamela Clark, Independent Consultant
Michael Gaffley, NOVA Southeastern University



APPLICATION POTENTIAL OF PROFESSIONAL
LEARNING INVENTORY—APPLĪ 33

Dale Curry, Kent State University,
Michael Lawler, University of South Dakota,
Jann Donnenwirth, University of California, Davis, and
Manon Bergeron, University of Quebec at Montreal

Conceptual Background

Professional development and training practitioners in human services increasingly have recognized the need to plan, implement, and evaluate transfer of learning interventions to
facilitate effective application of newly learned skills on the job (Curry, McCarragher &
Dellmann-Jenkins, 2005; McCracken & Wilson, 2009). The identification of factors that help or
hinder application can suggest useful transfer of learning strategies—an essential activity of a
comprehensive professional development and training program.

Baldwin & Ford (1988) provided a useful framework for understanding transfer of learning.
They emphasized the importance of individual characteristics (e.g., motivation to learn and
apply), the work environment (e.g., supervisor support for learning and application), and
training design factors (e.g., identical elements—the degree of similarity of the learning
task/environment and the application task/environment). Within the field of human services
professional development and training, Curry and colleagues (1991; 1994) expanded upon
this framework and emphasized the importance of incorporating these factors with field theory
(Lewin, 1951) into a Transfer of Training and Adult Learning (TOTAL) approach that assesses
transfer driving and restraining forces before, during, and after a training event. The
identification of key persons (e.g., supervisor, worker, trainer, co-worker) within the worker’s
learning and application environments (transfer field) is another key element of this transfer
model (Figure 1).

Field theory provides a relatively simple approach to change, involving the interplay between
two opposing sets of forces. Change/transfer of learning occurs when equilibrium is disrupted.
An existing field of forces is changed by increasing transfer driving forces and/or decreasing
transfer restraining forces. The number and strength of driving and restraining forces (before,
during, and after training) will determine whether transfer occurs, as well as the extent of transfer. If the total strength of the transfer driving forces is greater than that of the restraining forces, transfer will occur. If the total strength of the restraining forces is greater than or equal to the driving forces, transfer will not occur (Broad & Newstrom, 1992; Curry et al., 1991, 1994; Curry, 1997; Curry & Caplan, 1996).
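
The decision rule implied by this force-field framing can be expressed in a few lines. The force labels and strength ratings below are hypothetical and serve only to illustrate the arithmetic, not to suggest a validated scoring scheme.

# Hypothetical force-field tally on an arbitrary 1-5 strength scale.
driving = {"worker motivation": 4, "supervisor support": 3, "trainer strategies": 4}
restraining = {"workload pressure": 5, "few opportunities to practice": 3}

driving_total = sum(driving.values())          # 11
restraining_total = sum(restraining.values())  # 8
transfer_predicted = driving_total > restraining_total
print(driving_total, restraining_total, transfer_predicted)  # 11 8 True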

Based upon this approach, the Transfer Potential Questionnaire (TPQ) was constructed in an
attempt to (1) identify factors affecting the transfer of learning of child welfare workers and (2)
predict high versus low users of training on the job. The TPQ identified 11 transfer factors and
was predictive of subsequent transfer of learning in Ohio child welfare practice settings (Curry,
1997). The TPQ was subsequently found to significantly correlate with transfer of learning with
other human service professionals (e.g., employment and training, eligibility workers) in
California (Curry, Donnenwirth, & Lawler, in press; Lawler, Donnenwirth, & Curry, in press).

Figure 1. The Transfer Field.

Transfer will occur if the total number and strength of driving forces is greater than the
restraining forces.

Significant Actor Before During After


Worker
(driving force)
Supervisor
(driving force)
Administrator
(driving force)
Co-worker
(driving force)
Trainer
(driving force)
Client
(driving force)

Client
(restraining force)
Trainer
(restraining force)
Co-worker
(restraining force)
Administrator
(restraining force)
Supervisor
(restraining force)
Worker
(restraining force)



Using data from these Ohio and California validation studies, the TPQ was shortened following guidelines suggested by Stanton, Sinar, Balzer, and Smith (2002).
External (e.g., average item correlations with the transfer of learning criterion), internal (e.g.,
factor analysis), and expert judgment criteria were used in the selection of items. Stanton et al.
(2002) recommend giving preference to external criteria when clear-cut choices are not apparent.
With this in mind, items were selected with the highest validity coefficients while also sampling
at least one item from each of the eleven factors. The results of the analysis produced a 33-item
instrument with similar psychometric properties to the full TPQ, providing support for the
reduced TPQ version.
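
The selection logic can be summarized in a short routine. The data structures below are hypothetical, and the actual analysis weighed external, internal, and expert judgment criteria as described above rather than following this exact procedure.

# Hypothetical criterion-guided item reduction: keep the items with the strongest
# correlations with the transfer criterion while ensuring that every factor
# contributes at least one item.
def reduce_scale(items, target_length):
    # items: list of (item_id, factor, validity) tuples
    selected = []
    for factor in {f for _, f, _ in items}:
        best = max((it for it in items if it[1] == factor), key=lambda it: it[2])
        selected.append(best)
    remaining = sorted((it for it in items if it not in selected),
                       key=lambda it: it[2], reverse=True)
    selected.extend(remaining[:max(0, target_length - len(selected))])
    return selected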

Description

The APPLĪ 33 is a 33-item, five-choice agree/disagree Likert scale questionnaire that was
developed from the full-length TPQ. The items sample 11 transfer of learning factors identified
by Curry (1997) (See Table 1).

Intended Use

The overall mean score on the APPLĪ 33 can be used to predict later transfer of learning of
human service workers. In essence, it can be used as a transfer of learning proxy measure,
providing a training effectiveness indicator in addition to traditional reaction measures at the end
of training.

Table 1. Subscales Derived from Ohio Child Welfare Sample

1. Trainer adult learning and transfer strategies


2. Relevance and applicability
3. Supervisor support for training/transfer
4. Organizational/top management support
5. Application planning
6. Perceived learning
7. Pre-training motivation
8. Prior experience with training/application
9. Co-worker support
10. Training/organization congruence
11. Pre-training preparation



Similar to current reaction measures, typical transfer norms can be established and routinely
provided to key persons in the training system (e.g., trainer, administrators). Training sessions
that deviate significantly below the norm may indicate a need to examine individual,
organizational, or training design variables more closely to determine how to more successfully
encourage transfer of learning to practice settings. Routinely providing APPLĪ 33 rating results
to trainers and other key personnel may influence training strategies to better emphasize transfer-
related interventions.
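
As a hypothetical illustration of how such norms might be monitored, the sketch below flags training sessions whose mean APPLĪ 33 score falls more than one standard deviation below the Ohio norm. The one-standard-deviation cutoff and the session data are arbitrary choices for the example, not recommended standards.

NORM_MEAN, NORM_SD = 4.0, 0.5  # Ohio sample norms (see Table 2)

def sessions_needing_review(session_means, cutoff_sd=1.0):
    # Return the sessions whose mean score falls more than cutoff_sd SDs below the norm.
    threshold = NORM_MEAN - cutoff_sd * NORM_SD
    return [sid for sid, mean in session_means.items() if mean < threshold]

print(sessions_needing_review({"A101": 3.2, "A102": 4.1, "A103": 3.6}))  # ['A101']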

Exploring items associated with individual, organizational, and training design factors can
provide suggestions for how to increase the potential of more effective transfer of learning. For
example, regional training administrators can explore differences in overall transfer potential or,
more specifically, supervisor and administrator support for transfer of learning by agency
(suggestive of different transfer environments). The degree of perceived applicability of training
may reaffirm or suggest necessary changes to the training design. Use of the APPLĪ 33 can
suggest ways of increasing and decreasing transfer driving and restraining forces.

Procedures for Utilization

The APPLĪ 33 is administered to participants at the end of a training session. It takes approximately ten minutes to complete.

Sample Norms

Frequency distributions for the Ohio and California samples are provided in Tables 2 and 3.
Total mean APPLĪ 33 scores for the Ohio sample range from 1.97 to 5.0 with a mean and
median score of 4.0 and standard deviation of .5. The distributions for the California sample
range from 1.88 to 5.0 with a mean score of 4.10, median score of 4.24, and standard deviation
of .5.

Reliability

Internal reliability is very strong with Cronbach's alpha of .95 for both the Ohio and California
samples.

Validity

Correlation of the APPLĪ 33 with the full scale TPQ is .97 (Ohio) and .98 (California),
suggesting that full and reduced scales assess the same construct of transfer potential.
Correlation of the APPLĪ 33 with the transfer criterion is .60 (Ohio) and .57 (California). In
addition, the TPQ was found to be predictive of long-term retention of child protective services workers (Curry, McCarragher, & Dellmann-Jenkins, 2005). Also, the mean score of the items composing the APPLĪ 33 is associated with long-term retention of child protective services professionals, but this association has yet to be examined with other human service population samples.



Cautions

Administrators and supervisors may be reluctant for professional development and training
practitioners to be aware of the participants’ assessment of supervisor and administrative
support. Care must be taken to describe how the results will be used as well as communicate the
importance of assessing individual, organizational, and training design factors that affect transfer
of learning. Similarly, the participants may be hesitant to rate their supervisor and administrator
negatively (particularly those who are still on their probationary period). They will need to be
assured of how the results will be used and who will have access to the results. See standards P.
2 and PR. 8 in the NSDTA Code of Ethics for Training and Development Professionals in
Human Services (Curry, Brittain, Wentz, & McCarragher, 2004).

Further research is necessary to determine the use of possible subscales. Research with the
APPLĪ 33 should also be conducted with additional human service populations as well as
additional transfer criterion measures.

References

Baldwin, T.T., & Ford, J.K. (1988). Transfer of training: A review and directions for future
research. Personnel Psychology, 41, 63-105.

Broad, M.L., & Newstrom, J.W. (1992). Transfer of training: Action-packed strategies to ensure high payoff from training investments. Reading, MA: Addison-Wesley.

Curry, D.H. (1997). Factors affecting the perceived transfer of learning of child protection
social workers. Unpublished doctoral dissertation, Kent State University, Kent, Ohio.

Curry, D., Brittain, C., Wentz, R. & McCarragher, T. (2004). The NSDTA Code of Ethics for
Training and Development Professionals in Human Services: Case Scenarios and Training
Implications. Washington, D.C.: American Public Human Services Association and the
National Staff Development and Training Association.

Curry, D. & Caplan, P. (1996). The transfer field: A training exercise. The Child and Youth Care
Leader, 7, 28-30.

Curry, D.H., Caplan, P., & Knuppel, J. (1991). Transfer of Training and Adult Learning. Paper
presented at the Excellence in Training International Conference, Cornell University.

Curry, D.H., Caplan, P., & Knuppel, J. (1994). Transfer of training and adult learning (TOTAL).
Journal of Continuing Social Work Education, 6, 8-14.

Curry, D., Donnenwirth, J., & Lawler, M.J. (in press). Scale reduction: Developing user-friendly
human service training and development assessment instruments. 2010 Proceedings of the National Human Services Training Evaluation Symposium. University of California
Berkeley.

Curry, D., McCarragher, T., & Dellmann-Jenkins, M. (2005). Training, transfer and turnover: The relationship among transfer of learning factors and staff retention in child welfare. Children and Youth Services Review, 27, 931-948.

Lawler, M.J., Donnenwirth, J., & Curry, D. (in press). Evaluating transfer of learning in public
welfare training and development: Factors affecting the transfer of learning of California
public welfare workers. Project synopsis in the 2010 Proceedings of the National Human
Services Training Evaluation Symposium. University of California Berkeley. Retrieved
from http://calswec.berkeley.edu/CalSWEC/03_2010_NHSTES_All_Synopses_
FINAL.pdf.

Lewin, K. (1951). Field theory in social science. New York: Harper and Row.

McCracken, M., & Wilson, M. (2009). How a supportive organisational environment may enhance transfer of training: Findings from the residential childcare sector. 10th International Conference on Human Resource Development, Research and Practice across Europe, University Forum for HRD, Northumbria University, Newcastle Upon Tyne, UK, 10th-12th June.

Stanton, J.M., Sinar, E.F., Balzer, W.K., & Smith, P.C. (2002). Issues and strategies for reducing
the length of self-report scales. Personnel Psychology, 55, 167-194.



Table 2. APPLĪ 33 Ohio CPS Workers Frequency Distributions

Total Mean Score Frequency Percent Valid Percent Cumulative Percent


1.97 1 .2 .2 .2
2.12 1 .2 .2 .3
2.39 1 .2 .2 .5
2.42 1 .2 .2 .7
2.52 1 .2 .2 .9
2.64 1 .2 .2 1.0
2.70 3 .5 .5 1.6
2.76 3 .5 .5 2.1
2.79 1 .2 .2 2.3
2.85 3 .5 .5 2.8
2.88 2 .3 .3 3.1
2.91 2 .3 .3 3.5
2.97 3 .5 .5 4.0
3.00 3 .5 .5 4.5
3.06 1 .2 .2 4.7
3.09 2 .3 .3 5.0
3.12 4 .6 .7 5.7
3.15 2 .3 .3 6.1
3.18 4 .6 .7 6.8
3.21 1 .2 .2 6.9
3.24 3 .5 .5 7.5
3.27 2 .3 .3 7.8
3.30 6 1.0 1.0 8.9
3.33 8 1.3 1.4 10.2
3.36 4 .6 .7 10.9
3.39 7 1.1 1.2 12.2
3.42 4 .6 .7 12.8
3.45 4 .6 .7 13.5
3.48 5 .8 .9 14.4
3.52 5 .8 .9 15.3
3.55 8 1.3 1.4 16.7
3.58 8 1.3 1.4 18.1
3.61 9 1.5 1.6 19.6
3.64 9 1.5 1.6 21.2
3.67 9 1.5 1.6 22.7
3.70 9 1.5 1.6 24.3
3.73 10 1.6 1.7 26.0
3.76 9 1.5 1.6 27.6
3.79 16 2.6 2.8 30.4
3.82 12 1.9 2.1 32.5
3.85 16 2.6 2.8 35.2
3.88 19 3.1 3.3 38.5
(table continues)



Table 2. Continued

Total Mean Score Frequency Percent Valid Percent Cumulative Percent


3.91 21 3.4 3.6 42.2
3.94 20 3.2 3.5 45.7
3.97 15 2.4 2.6 48.3
4.00 21 3.4 3.6 51.9
4.03 12 1.9 2.1 54.0
4.06 15 2.4 2.6 56.6
4.09 11 1.8 1.9 58.5
4.12 10 1.6 1.7 60.2
4.15 21 3.4 3.6 63.9
4.18 10 1.6 1.7 65.6
4.21 9 1.5 1.6 67.2
4.24 10 1.6 1.7 68.9
4.27 7 1.1 1.2 70.1
4.30 7 1.1 1.2 71.4
4.33 7 1.1 1.2 72.6
4.36 13 2.1 2.3 74.8
4.39 7 1.1 1.2 76.0
4.42 12 1.9 2.1 78.1
4.45 10 1.6 1.7 79.9
4.48 7 1.1 1.2 81.1
4.52 10 1.6 1.7 82.8
4.55 11 1.8 1.9 84.7
4.58 9 1.5 1.6 86.3
4.61 11 1.8 1.9 88.2
4.64 12 1.9 2.1 90.3
4.67 12 1.9 2.1 92.4
4.70 9 1.5 1.6 93.9
4.73 4 .6 .7 94.6
4.76 4 .6 .7 95.3
4.79 3 .5 .5 95.8
4.82 2 .3 .3 96.2
4.85 7 1.1 1.2 97.4
4.88 4 .6 .7 98.1
4.91 1 .2 .2 98.3
4.94 5 .8 .9 99.1
4.97 2 .3 .3 99.5
5.00 3 .5 .5 100.0
Total 576 92.9 100.0
Missing 44 7.1
Total 620 100.0

Mean 4.00; Median 4.00; SD .50



Table 3. APPLĪ 33 California Public Welfare Workers

Total Mean Score Frequency Percent Valid Percent Cumulative Percent

1.88 1 .2 .2 .2
2.64 1 .2 .2 .5
2.73 1 .2 .2 .7
2.76 1 .2 .2 .9
2.79 1 .2 .2 1.1
2.82 2 .4 .5 1.6
2.88 1 .2 .2 1.8
2.91 1 .2 .2 2.1
2.94 1 .2 .2 2.3
2.97 1 .2 .2 2.5
3.02 1 .2 .2 2.7
3.09 1 .2 .2 3.0
3.12 2 .4 .5 3.4
3.15 1 .2 .2 3.7
3.21 2 .4 .5 4.1
3.24 1 .2 .2 4.3
3.27 1 .2 .2 4.6
3.30 1 .2 .2 4.8
3.33 3 .7 .7 5.5
3.36 2 .4 .5 5.9
3.39 4 .9 .9 6.8
3.42 1 .2 .2 7.1
3.45 1 .2 .2 7.3
3.48 2 .4 .5 7.8
3.52 2 .4 .5 8.2
3.55 4 .9 .9 9.1
3.58 8 1.7 1.8 11.0
3.61 9 2.0 2.1 13.0
3.64 7 1.5 1.6 14.6
3.67 2 .4 .5 15.1
3.68 1 .2 .2 15.3
3.70 5 1.1 1.1 16.4
3.71 1 .2 .2 16.7
3.73 2 .4 .5 17.1
3.76 10 2.2 2.3 19.4
3.79 5 1.1 1.1 20.5
3.82 6 1.3 1.4 21.9
3.83 1 .2 .2 22.1
3.85 9 2.0 2.1 24.2
3.88 13 2.8 3.0 27.2
3.91 13 2.8 3.0 30.1
3.94 6 1.3 1.4 31.5
3.97 8 1.7 1.8 33.3
(table continues)



Table 3. Continued

Total Mean Score Frequency Percent Valid Percent Cumulative Percent

4.00 5 1.1 1.1 34.5


4.03 12 2.6 2.7 37.2
4.06 9 2.0 2.1 39.3
4.09 14 3.1 3.2 42.5
4.12 7 1.5 1.6 44.1
4.15 5 1.1 1.1 45.2
4.18 7 1.5 1.6 46.8
4.21 9 2.0 2.1 48.9
4.24 6 1.3 1.4 50.2
4.27 9 2.0 2.1 52.3
4.30 17 3.7 3.9 56.2
4.33 10 2.2 2.3 58.4
4.36 7 1.5 1.6 60.0
4.39 9 2.0 2.1 62.1
4.42 4 .9 .9 63.0
4.45 10 2.2 2.3 65.3
4.48 10 2.2 2.3 67.6
4.50 1 .2 .2 67.8
4.52 8 1.7 1.8 69.6
4.53 1 .2 .2 69.9
4.55 11 2.4 2.5 72.4
4.56 1 .2 .2 72.6
4.58 14 3.1 3.2 75.8
4.61 11 2.4 2.5 78.3
4.64 13 2.8 3.0 81.3
4.67 8 1.7 1.8 83.1
4.70 8 1.7 1.8 84.9
4.73 8 1.7 1.8 86.8
4.74 1 .2 .2 87.0
4.76 8 1.7 1.8 88.8
4.79 5 1.1 1.1 90.0
4.82 8 1.7 1.8 91.8
4.85 5 1.1 1.1 92.9
4.88 6 1.3 1.4 94.3
4.89 1 .2 .2 94.5
4.91 3 .7 .7 95.2
4.94 3 .7 .7 95.9
4.97 3 .7 .7 96.6
5.00 15 3.3 3.4 100.0
Total 438 95.4 100.0
Missing 21 4.6
Total 459 100.0

Mean 4.19; Median 4.24; SD .50



Application Potential of Professional Learning Inventory—APPLĪ 33
Please respond using the following criteria:
Strongly Disagree = 1 Disagree = 2 Uncertain = 3 Agree = 4 Strongly Agree = 5

Item Criteria Statement

1. 12345 As a result of the training, I substantially increased my knowledge on this topic.


2. 12345 As a result of the training, I have developed new skills.
3. 12345 The training has affected some of my attitudes concerning this topic area.
4. 12345 As a result of this training, I have a better conceptualization of what I already do on the job.
5. 12345 I am motivated to put this training into practice on the job.
6. 12345 I will meet with my supervisor to discuss application of this training on the job.
7. 12345 My supervisor expects me to use this training on the job.
8. 12345 Even if no one notices, I will use knowledge learned from this training on the job.
9. 12345 The trainer helped me to see how the training can be applied on the job.
10. 12345 The information I received from this training can definitely be used with my clients.
11. 12345 I have already made a plan with a co-worker to use this training.
12. 12345 There is at least one co-worker who will be supportive of my application attempts.
13. 12345 I will have sufficient opportunities to practice the new ideas/skills/techniques on the job.
14. 12345 My organization expects me to use the training on the job.
15. 12345 When I think back to other training I have attended, I can say that I have used the training on
the job. I can even think of specific application examples.
16. 12345 I have a plan to implement this training.
17. 12345 I am very confident that I will use the training on the job.
18. 12345 I will have the time to review materials and make an implementation plan.
19. 12345 Prior to the workshop, I was motivated to attend.
20. 12345 During the training, I was thinking of ways I could apply the training content to the job.
21. 12345 The trainer/training provided sufficient opportunities to practice new information/skills.
22. 12345 I can think of specific cases/clients to which (with whom) this training can be used.
23. 12345 My supervisor helped to prepare me for this training by discussing my learning needs and
potential applications.
24. 12345 The trainer provided some practical ideas that can be used on the job.
25. 12345 The trainer gave examples of when to use ideas/skills/strategies on the job.
26. 12345 The trainer helped motivate me to want to try out training ideas on the job.
27. 12345 The workshop objectives were adequately addressed.
28. 12345 This training content is consistent with my agency’s mission, philosophy and goals.
29. 12345 This training content is consistent with my agency’s policies and my individual
responsibilities.
30. 12345 This training will help me to continue learning in this topic area.
31. 12345 As a result of the training, I will be a more effective worker.
32. 12345 The information I learned today can help make a difference with my client(s).
33. 12345 Overall, I am very satisfied with this training.

Copyright © 2010 by Dale Curry, Michael Lawler and Jann Donnenwirth. May be used by obtaining permission
from the authors. Primary contact: Dale Curry – dcurry@kent.edu.
