2.1 On the Bounce Fitness website, find the policies and procedures that relate to this initiative. Discuss how
the plan you detailed in 2.3 of Assessment Task 2 will ensure the compliance, effectiveness and efficiency of
your information management system.
The ability to measure the effectiveness of your Knowledge Management (KM) program and the initiatives
that are essential to its success has been a challenge for all organizations executing a KM Program. Capturing
the appropriate metrics is essential to measuring the right aspects of your KM Program. The right metrics will
facilitate a clear and correct communication of the health of the KM program to your organization’s
leadership. In this post, I will identify metrics (or measurements) of four key initiatives of most KM Programs.
These initiatives are: Communities of Practice, Search, Lessons Learned, and Knowledge Continuity.
Community of Practice (CoP) Metrics
Search Metrics
Search Metrics are determined through tuning and optimization. Site Owners/Administrators should
constantly observe and evaluate the effectiveness of search results, and should be able to gather Search
Results reports from the KMS administrator periodically (every two weeks).
Search engine usage – Search engine logs can be analyzed to produce a range of simple reports,
showing usage, and a breakdown of search terms.
Number of Searches performed (within own area and across areas)
Number of highly rated searches performed
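The log analysis described above can be sketched in a few lines. This is a minimal illustration only: it assumes the KMS exports its search log as a CSV with `user`, `term`, and `rating` columns, which is a hypothetical format, not any particular product's.

```python
import csv
from collections import Counter

def search_usage_report(log_path):
    """Summarize a search log: total searches and a breakdown of top terms.

    Assumes a CSV log with 'user', 'term', and 'rating' columns;
    real KMS log formats will differ.
    """
    terms = Counter()
    total = 0
    with open(log_path, newline="") as f:
        for row in csv.DictReader(f):
            total += 1
            # Normalize terms so "Leave Policy" and "leave policy" count together
            terms[row["term"].strip().lower()] += 1
    return {"total_searches": total, "top_terms": terms.most_common(10)}
```

A report like this, run every two weeks, gives the Site Owner the usage figures and search-term breakdown mentioned above.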
User rankings – This involves asking the readers themselves to rate the relevance and quality of the
information being presented. Subject matter experts or other reviewers can directly assess the quality of
material on the KM platform.
Information currency – This is a measure of how up-to-date the information stored within the system is. The
importance of this measure will depend on the nature of the information being published, and how it is used.
A great way to track this is by using metadata such as publishing and review dates. With this metadata,
automated reports showing a number of specific measures can be generated:
Average age of pages
Number of pages older than a specific age
Number of pages past their review date
Lists of pages due to be reviewed
Pages to be reviewed, broken down by content owner or business group
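The currency measures listed above follow directly from the page metadata. The sketch below assumes each page record carries `published` and `review_due` dates and an `owner` field; these field names are illustrative, not taken from any particular KMS.

```python
from collections import Counter
from datetime import date

def currency_report(pages, today, max_age_days=365):
    """Compute information-currency measures from page metadata.

    Each page is assumed to be a dict with 'published' and 'review_due'
    dates and an 'owner' string (illustrative field names).
    """
    ages = [(today - p["published"]).days for p in pages]
    overdue = [p for p in pages if p["review_due"] < today]
    return {
        "average_age_days": sum(ages) / len(ages) if ages else 0.0,
        "older_than_max": sum(a > max_age_days for a in ages),
        "past_review": len(overdue),
        # Overdue pages broken down by content owner, as described above
        "overdue_by_owner": dict(Counter(p["owner"] for p in overdue)),
    }
```

The "lists of pages due to be reviewed" measure would simply return the `overdue` list itself rather than its count.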
User feedback – A feedback mechanism is a clear way to indicate whether staff are using the knowledge. Even
where many feedback messages point to poor-quality information, a high volume of feedback still indicates
strong staff use. It also shows they have sufficient trust in the system to commit the time needed to send in
feedback.
Lessons Learned Metrics
Lessons Learned Basic Process: Identify – Document – Analyze – Store – Retrieve
Metrics are determined and organized by key fields from the lessons learned template and include responses
gathered during the session. Lessons learned should be identified by the type of lesson captured (i.e.,
resource, time, budget, system, content, etc.). Summarize each lesson learned by creating a brief summary of
the findings and providing recommendations for correcting them (i.e., Findings – a summary of the
issues found during the review process; Recommendations – recommended actions to be taken to correct the
findings).
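The template fields above can be represented as a simple record, with the type breakdown as the basic metric. The field names here mirror the template described in the text but are otherwise an assumption.

```python
from collections import Counter
from dataclasses import dataclass

@dataclass
class LessonLearned:
    lesson_type: str       # e.g. "resource", "time", "budget", "system", "content"
    findings: str          # summary of issues found during the review process
    recommendations: str   # actions recommended to correct the findings

def lessons_by_type(lessons):
    """Count captured lessons by type, the key field used to organize metrics."""
    return dict(Counter(l.lesson_type for l in lessons))
```

Grouping stored lessons this way supports the Analyze and Retrieve steps of the basic process (Identify – Document – Analyze – Store – Retrieve).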
Knowledge Continuity
The key questions at the heart of knowledge continuity include:
What constitutes mission-critical knowledge that should be preserved?
Where is the targeted mission-critical knowledge, and is it accessible and transferable?
What decisions and actions are required to stem the loss of valuable and in many cases irreplaceable
knowledge?
How can the organization successfully obtain, transfer, and store the lessons learned and best practices of its
most experienced and valuable workers in a knowledge base (KM application) before those employees depart
or retire?