DELIVERABLE
Deliverable 1.3
Framework for APOLLON Evaluation and Impact Assessment, including KPI definition and measurement
Authors:
The information in this document is provided as is and no guarantee or warranty is given that the information is fit for any particular purpose. The user thereof uses the information at its sole risk and liability.
1. Summary
The aim of this deliverable is to provide an evaluation and impact assessment framework that allows the APOLLON methodology and tool set to be assessed, and the added value of cross-border Living Lab networking to be identified through specified key performance indicators.
This deliverable presents the theoretical framework that has guided the development of the evaluation framework. In addition, it presents the investigation that was performed among APOLLON partners at the beginning of the project.
This investigation served to identify relevant measures of performance among the involved project partners.
We have also used the D x.1 deliverables from the other work-packages as a means to identify relevant performance indicators for the experiments in the thematic areas. These sources of information, together with the theoretical framework, formed the basis of this deliverable.
This framework has then been used as a basis for the design of the evaluation process as well as the evaluation framework.
The process of evaluating the APOLLON methodology is described, in which the liaison person from WP1 collaboratively and iteratively evaluates the different stages of the methodology.
The process of evaluating the experiments, which is carried out in the different work-packages, is based on self-assessment: the leaders of task x.4 in each work-package apply the framework and adjust it as necessary for their context.
This report also contains the research framework that is applied to the experiments in the work-packages, to help them design and assess their experiments in a considered and researchable manner.
Finally, this deliverable contains two evaluation framework templates: first, the framework for evaluating each phase of the APOLLON methodology, and secondly, the framework for evaluating the added value and impact of the experiments for relevant stakeholders.
2. Introduction
The evaluation and impact assessment framework developed in the APOLLON project aims to monitor, analyse and assess the APOLLON methodology as well as the added value of cross-border Living Lab networking.
The aim of this deliverable is to provide an evaluation and impact assessment framework that allows the APOLLON methodology and tool set to be assessed, and the added value of cross-border Living Lab networking to be identified.
In this framework, key performance indicators are defined that will be measured in the experiments in WP 2, 3, 4 and 5, in task x.4.
This evaluation framework will therefore assess two different processes: (1) the APOLLON methodology supporting the cross-border networking, and (2) the added value of the cross-border Living Lab networking.
The developed APOLLON methodology will provide a framework for engaging, empowering and mobilizing self-organizing individuals within actor networks.
The proposed cross-industry infrastructure provides new opportunities and insights for individuals, for relationships between the organization and its members, and for relationships among actors within and across organizations.
The individual steps of the methodology will be continuously evaluated at three-month intervals during the project, in close collaboration with the other work-packages.
Research Focus
- To what extent is a trans-national innovation system able to stimulate the adaptation of innovations successful in one country to another country
- Difficulties faced with product integration
- Evaluate new ways of collaborating with partners
- What is needed for engaging users to participate; cultural differences

Research Question
- How can we transfer a contextualised project into another cross-border project, and what issues are related to that
- Difficulties with users' culture and their surrounding environment
- User co-innovation when extending more innovative applications to broader contexts
- Which culturally specific issues are problematic

Method
- Compare use of the platform in different LL contexts
- Results on the impact of regulatory environment, climate, culture and behaviour, compared between the different LLs
- Networking between LLs – impact on market fragmentation
- Integration of different solutions

Expected Benefits
- Market opportunities for …
- Potential of cross-border …
- SMEs transnational …
- The applications' ability to answer to …

Data collection
- Monitoring, interviews, questionnaires
- Study user behaviour change mechanisms
- Feedback on platform deployment and integration services
- Lingual and cultural misunderstandings

Skills Enhancement
Efforts Support
Dissemination activities
Constructs
- What are the variables that you study?
- What are the elements that you measure?
- How do you decide best practices across the experiments?
- How do you filter pilot-specific elements out?

Model
- What are the basic assumptions, causalities and outcomes that you perceive?
- What measures do you use to evaluate the validity of the assumptions?
- What are the success criteria that you use?
- How do you assess the wider applicability of the model?

Method
- What is the process for validating the assumptions?
- How do you evaluate and adjust the validation process?
- How do you justify the use of selected methods?
- How do you ensure the scalability and wider applicability of the methods?

Installation
- Who are the stakeholders at the experiment?
- How do you evaluate added value for each stakeholder?
- How do you justify the selected collaboration model?
- How do you compile your recommendations for sustainability?
Figure 2. Thematic experiments’ focus and content communicated in categories of ‘activities’ and ‘outputs’
By applying this framework, the work-packages are supported in defining the measures and key performance indicators of each experiment. This information will then be used as input to the evaluation framework for the thematic experiments.
The answers will reflect the variables that are measured at project level by Tasks X.4 in the thematic experiments.
This collected data will be fed back into the development of the APOLLON evaluation framework, and contribute to the creation of the final version of the document.
In this formative process we need to be in regular contact with the vertical experiments.
We need your contributions and experiences in order to provide you with usable advice on collaboration practices within your living lab network.
Initially, we propose the following practices:
1. Requirement collection from thematic experiments
2. Dedicated Work Package 1 members as liaisons to vertical experiments
3. Regular collaboration and formal meetings for iterative concept validation
4. A wiki as a platform to share insights and practices with vertical experiments
We propose regular collaboration and formal meetings for iterative concept validation every 3 months.
Responsibility for calling these meetings will lie with WP1, and all WP leaders will need to commit to participating in the process, either themselves or through a nominated representative.
This process would kick off at the APOLLON general assembly on September 30th, with meetings at 3-month intervals in December 2010, March 2011, June 2011, September 2011 and November 2011.
The APOLLON wiki at mybbt will be used to disseminate the latest results. This platform is open for comments and contributions at any time. Other channels are meetings, presentations and emails.
WP1 will also take a more active role in large APOLLON events as findings accumulate.
For more information, please refer to D1.2, of which this is a short concluding summary.
Methodology Evaluation
In this section the aim is to evaluate the APOLLON cross-border methodology. This evaluation will be carried out continuously by the liaison person, in collaboration and dialogue between the different work-packages and WP1.
Figure 3: Living Lab Milieu Key Components
Background Information
In the subsequent rows, some background data is required to set the evaluation in the right context.
- WP number
- Experiment description
- Involved partners
- Number of countries involved in the experiment
- Type of cross-border activities that have been carried out in the experiments
- Purpose of the cross-border activities (expected outcome)
- Experienced strengths of working in cross-border collaboration
Approach

Approach refers to the methods and techniques that have been used to support the cross-border collaboration in the APOLLON project. Hence, it has a broader scope than what is usually assessed in Living Lab activities.
In the following table, some questions require a numerical value while others are of a more descriptive character. Thus, not all questions will have a numerically measurable impact, but if other impact has been observed it should be filled in.
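For work-packages that compile these values digitally, each row of the template can be captured as a simple record. A minimal sketch in Python; the class and field names are illustrative assumptions, not part of the deliverable:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class TemplateRow:
    """One row of the evaluation template (illustrative field names)."""
    theme: str              # e.g. "Approach"
    measure: str            # e.g. "No of cross-border activities"
    value: Optional[float]  # numerical value, or None for qualitative-only rows
    measurement_tool: str   # where the data stem from, e.g. "interview"
    impact: str             # observed impact, numerical or qualitative

# Example: a filled-in quantitative row (values are hypothetical)
row = TemplateRow(
    theme="Approach",
    measure="No of cross-border activities",
    value=12,
    measurement_tool="deliverable number",
    impact="",
)
```

Keeping `value` optional mirrors the template's distinction between numerically measurable rows and purely qualitative ones.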
Theme | Measures | Value (output) | Measurement tool (where the data stem from, e.g. deliverable number, interview etc) | Impact (e.g. % ratio of ordinary values, or qualitative impacts)
Approach (the lines that only have one column to fill in aim at gathering qualitative data)
- No of cross-border activities
- No of intellectual products (methodologies, know-how etc) transferred in the experiment
- No of technology transfer activities
- Which methods were used in the experiment? Please name and/or shortly describe the methods.
… own specific wealth of knowledge and expertise to the project and thus helped to achieve cross-border networking experiments.
In the following table, some questions require a numerical value while others are of a more descriptive character. Thus, not all questions will have a numerically measurable impact, but if other impact has been observed it should be filled in.
PARTNERS: SME
In this section, the aim is to evaluate the SME engagement and the added value of their participation for them as SMEs.
Theme | Measures | Value (output) | Measurement tool (where the data stem from, e.g. deliverable number, interview etc) | Impact (% ratio of ordinary values)
PARTNERS: SME (the lines that only have one column to fill in aim at gathering qualitative data)
- No of SMEs involved in the experiment
- No of SME engagement activities
- No of new international partners
- No of signed letters of intent between partners and/or customers
- No of new businesses generated in other countries
- No of new business proposals
- No of new customers in other countries
Did the cross-border collaboration lead to increased turnover? (Yes / No / I do not know / Not relevant in this experiment)
Did the cross-border collaboration lead to increased customer retention? (Yes / No / I do not know / Not relevant in this experiment)
SME engagement activities in detail (e.g. developing technology, user tests, implementation of technology etc)
What was the role of the SME in the cross-border collaboration?
PARTNERS: Large Enterprises
In this section, the aim is to evaluate the Large Enterprise engagement and the added value of their participation for them as Large Enterprises.
Large Enterprise (the lines that only have one column to fill in aim at gathering qualitative data)
- No of LEs involved in the experiment
- No of LE engagement activities
- No of new international partners
- No of signed letters of intent between partners and/or customers
- No of new businesses generated in other countries
- No of new business proposals
- No of new customers in other countries
Did the cross-border collaboration lead to increased turnover? (Yes / No / I do not know / Not relevant in this experiment)
Did the cross-border collaboration lead to increased customer retention? (Yes / No / I do not know / Not relevant in this experiment)
LE engagement activities in detail (e.g. developing technology, implementation of experiments etc)
What was the LE role in the cross-border experiment?
PARTNERS: Local Authorities
In this section, the aim is to evaluate the Local Authority engagement and the added value of their participation for them as Local Authorities.
Theme | Measures | Value (output) | Measurement tool (where the data stem from, e.g. deliverable number, interview etc) | Impact (% ratio of ordinary values)
PARTNERS: Local Authorities (the lines that only have one column to fill in aim at gathering qualitative data)
- No of local authorities involved in the experiment
- No of local authority engagement activities
- No of new international partners
- No of signed letters of intent between partners and/or customers
- No of new businesses generated in other countries
- No of new business proposals
- No of new customers in other countries
Did the cross-border collaboration lead to increased turnover? (Yes / No / I do not know / Not relevant in this experiment)
Did the cross-border collaboration lead to increased customer retention? (Yes / No / I do not know / Not relevant in this experiment)
Local authority engagement activities in detail (e.g. implementation of experiments, experimental settings etc)
What was the local authorities' role in the cross-border collaboration?
Technologies
- No of products that have been transferred in the experiment
- No of cross-border collaboration tools that have been used in the experiment
- No of NEW (for the stakeholders) ICT tools that have been used in the experiment
- No of distributed cross-border collaboration activities
Did the cross-border collaboration tools you used lead to increased access to relevant information? (Yes / No / I do not know / We didn't use any collaborative tools)
Did the cross-border collaboration tools you used lead to increased effectiveness in communication? (Yes / No / I do not know / We didn't use any collaborative tools)
Did the cross-border collaboration tools you used lead to increased co-creation of innovations among stakeholders? (Yes / No / I do not know / We didn't use any collaborative tools)
Which collaboration tools have been used to support the cross-border collaboration in the experiment?
Research

Research symbolizes the collective learning and reflection that take place in the Living Lab, and should result in contributions to both theory and practice.
In the following table, some questions require a numerical value while others are of a more descriptive character. Thus, not all questions will have a numerically measurable impact, but if other impact has been observed it should be filled in.
In the questions where answers of a Yes/No character are asked for, please respond according to the experiences from the experiments. This is not an exact measure; it rather strives to gather impressions of the impact.
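When such Yes/No-type answers are compiled across experiments, the impressions can be summarised as simple proportions per answer option. A minimal sketch in Python; the function name is an illustrative assumption, and the option strings follow the wording used in this template:

```python
from collections import Counter

# Answer options for the Yes/No-type questions in the template
OPTIONS = ("Yes", "No", "I do not know", "Not relevant in this experiment")

def impression_ratios(answers):
    """Summarise Yes/No-type answers as proportions of each option,
    reflecting that these questions gather impressions, not exact measures."""
    counts = Counter(answers)
    unexpected = set(counts) - set(OPTIONS)
    if unexpected:
        raise ValueError(f"unexpected answers: {unexpected}")
    total = sum(counts.values())
    return {option: counts[option] / total for option in OPTIONS}

# Example with four hypothetical responses from different experiments
ratios = impression_ratios(["Yes", "Yes", "No", "I do not know"])
# ratios["Yes"] == 0.5
```

Reporting all four options, including those with zero responses, keeps the summaries comparable across experiments.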
Theme | Measures | Value (output) | Measurement tool (where the data stem from, e.g. deliverable number, interview etc) | Impact
- No of research activities that have been performed during the experiment
- No of authored journal articles
- No of research conference presentations
Management

Management represents the ownership, organization, and policy aspects of Living Labs. In this project, the aim is also to define the role of the Living Lab in the cross-border collaboration, as well as the impact of the project on local Living Labs and on EnoLL.
In the following table, some questions require a numerical value while others are of a more descriptive character. Thus, not all questions will have a numerically measurable impact, but if other impact has been observed it should be filled in.
In the questions where answers of a Yes/No character are asked for, please respond according to the experiences from the experiments. This is not an exact measure; it rather strives to gather impressions of the impact.
MANAGEMENT: Living Lab Management Role
- No of Living Labs that have been involved in the experiment
- No of new collaboration initiatives between Living Lab network members (planned, prepared or submitted)
- No of new Living Lab network members
Did the cross-border collaboration lead to increased access to user communities in other countries? (Yes / No / I do not know / Not relevant for our experiment)
Did the cross-border collaboration lead to an increased value proposition to the stakeholder community? (Yes / No / I do not know / Not relevant for our experiment)
Did the cross-border collaboration lead to increased learning of Living Lab collaboration in networks? (Yes / No / I do not know / Not relevant for our experiment)