What are some of the criticisms voiced against evidence-based practice in social work or social intervention? To what extent do you agree with these criticisms, and why?
Introduction
This paper will set out some of the criticisms voiced against evidence-based practice in social work and social intervention, and argue that while many of these criticisms are misinformed about what evidence-based practice (EBP) actually is, others reflect an ignorance of why the need for EBP arose in the first place and of the potential for harm that interventions can have. The essay will do this first by defining evidence-based practice as described by its originators and outlining the steps it involves; second, by examining criticisms that rest on a fundamental misunderstanding of what EBP actually is; and finally, by examining criticisms based on the view that EBP is not applicable in the social sciences, and arguing that it is in fact necessary in order to prevent harm.
Figure 1 - An Updated Model for Evidence-Based Decisions. Source: Haynes, Devereaux, & Guyatt (2002, p. 36); see also Sackett et al. (1997, p. 3) and Gibbs & Gambrill (2002, p. 454).
Figure 1 illustrates how EBP is the integration of clinical expertise, research evidence, and patient preferences. All three aspects must be taken into account in an EBP approach. One of its main features is that it is anti-authoritarian: as opposed to an "I know best" approach, the focus is on sharing knowledge and clinical expertise, with the client involved in the decision-making process. Transparency is another vital hallmark of EBP, whereby uncertainties are highlighted rather than hidden away (Gambrill, 2003, p. 4). In short, EBP can be described as closing the gaps between research and practice in order to maximise opportunities to help clients and to avoid harm (Gambrill, 2006, p. 339).
Criticisms of EBP
Businesses that parade as EBP
It is important to distinguish between criticisms aimed at EBP as described by its originators above and criticisms that are not. In the aptly titled "Evidence-Based Practice: Sea Change or the Emperor's New Clothes?", Gambrill points out that criticisms of EBP are often directed not at true EBP but at authoritarian businesses that have simply rebranded themselves as evidence-based without possessing any of the aforementioned characteristics. The paper notes that such businesses are a major obstacle to gaining credibility for EBP, as they display exactly what EBP actively seeks to suppress: pseudoscience, fads, and practices that lack transparency and ignore the real potential for harm that interventions can have (Gambrill, 2006, p. 352). Such criticisms of EBP are invalid simply because they are not actually referring to EBP.
Misunderstandings of EBP
A related category of criticisms can be grouped together because they are largely ignorant of the processes that EBP actually involves.
Ignores clinical expertise
One unfounded criticism of EBP, as pointed out by Gibbs, is that it ignores clinical expertise. In fact, although clinical expertise is not regarded as sufficient evidence that an intervention is working, it is vital and necessary for working through the steps of EBP. Relying on expertise alone is dangerous: in the Cambridge-Somerville study, practitioners had concluded that the intervention was beneficial when it was in fact harmful (McCord, 2003, p. 23).
Ironically, Webb himself points out: "What is meant by effectiveness, of course, is often a matter of personal interpretation" (Webb, 2001, p. 62).
This is exactly why it is so important that social work decisions be based on sound evidence rather than on subjective expert opinion.
Webb goes on to argue that EBP assumes professionals are rational actors, ignoring their complexity. He contends that "by underplaying the values and anticipations of social workers at the level of ideas [EBP] ignores the processes of deliberation and choice involved in their decision making" (Webb, 2001, p. 67).
However, Webb has once again missed the point: EBP holds precisely that social workers and professionals are not rational actors. It is because of their complexity that a framework is required to minimise individual biases and errors of judgement, which can ultimately lead to harm, as seen in the Cambridge-Somerville study.
Systematic reviews and RCTs not relevant to social sciences
Another criticised aspect of EBP is the applicability of systematic reviews to assessing the effects of social interventions. Certain of these criticisms, however, are simply misinformed.
One such criticism is that systematic reviews exclude qualitative data. Webb, for instance, argues that EBP relies on numerical data, implying that qualitative data is not included. However, a simple search of the Cochrane Library of systematic reviews turns up many examples containing qualitative data, such as the review titled "Barriers and facilitators to the implementation of lay health worker programmes to improve access to maternal and child health: qualitative evidence synthesis" (Glenton et al., 2013).
Webb further criticises EBP for prioritising systematic reviews and randomised trials while ignoring other research methods taught on sociology and cultural studies courses (Webb, 2001). However, Chalmers points out that critics of EBP such as Webb generally fail to address the bigger problem: different methods of reviewing evidence can lead to different conclusions (Chalmers, 2003).
One of the main advantages of the EBP approach is that it also sheds light on what is not known and involves clients, as opposed to cherry-picking positive results to present a distorted picture (Gambrill, 2003, p. 14).
Criticism of Randomised Controlled Trials
Despite the established place randomised controlled trials (RCTs) hold in the medical field, social scientists still tend to view them with suspicion, arguing that they are inappropriate for evaluating social interventions (Oakley, 1998, p. 1239).
However, the main argument EBP proponents use in favour of RCTs is that without a control group one can never be completely sure that any observed effects are actually caused by the intervention. This can be seen in the Cambridge-Somerville study: had it not been for the matched control group of boys, the practitioners would have falsely believed their intervention was doing good when it was in fact doing harm.
Oakley argued that RCTs offer the same thing in social interventions that they promise in medicine: protection of the public from potentially damaging uncontrolled experimentation, and more rational knowledge of the benefits to be derived from professional intervention (Oakley, 1998, p. 1242).
Conclusion
In conclusion, many of the criticisms of EBP are simply misinformed and will hopefully diminish as the field becomes more established and the literature grows. Other criticisms, however, not only demonstrate an ignorance of EBP but also underestimate the inherent dangers of interventions that are not grounded in evidence.
Bibliography
Chalmers, I. (2003). Trying to do more good than harm in policy and practice: The role of rigorous, transparent, up-to-date evaluations. The ANNALS of the American Academy of Political and Social Science, 589(1), 22-40. doi:10.1177/0002716203254762

Fischer, J. (1978). Effective casework practice. McGraw-Hill College.

Gambrill, E. (2006). Evidence-based practice and policy: Choices ahead. Research on Social Work Practice, 16(3), 338-357. doi:10.1177/1049731505284205

Gambrill, E. D. (2003). From the editor: Evidence-based practice: Sea change or the Emperor's New Clothes? Journal of Social Work Education.

Gibbs, L., & Gambrill, E. (2002). Evidence-based practice: Counterarguments to objections. Research on Social Work Practice, 12(3), 452-476. doi:10.1177/1049731502012003007

Glenton, C., Colvin, C. J., Carlsen, B., Swartz, A., Lewin, S., Noyes, J., & Rashidian, A. (2013). Barriers and facilitators to the implementation of lay health worker programmes to improve access to maternal and child health: qualitative evidence synthesis. Cochrane Database of Systematic Reviews, 10.

Haynes, R. B., Devereaux, P. J., & Guyatt, G. H. (2002). Clinical expertise in the era of evidence-based medicine and patient choice. Evidence Based Medicine, 7(2), 36-38. doi:10.1136/ebm.7.2.36

McCord, J. (2003). Cures that harm: Unanticipated outcomes of crime prevention programs. Annals of the American Academy of Political and Social Science, 587, 16-30. doi:10.2307/1049945

Newman, T., Moseley, A., Tierney, S., & Ellis, A. (2005). Evidence-based social work: A guide for the perplexed.

Oakley, A. (1998). Experimentation and social interventions: a forgotten but important history. BMJ, 317(7167), 1239-1242. doi:10.1136/bmj.317.7167.1239

Sackett, D. L., Richardson, W. S., Rosenberg, W., & Haynes, R. B. (1997). How to practice and teach evidence-based medicine. New York: Churchill Livingstone.

Webb, S. A. (2001). Some considerations on the validity of evidence-based practice in social work. British Journal of Social Work, 31(1), 57-79.