Abstract—To realize smart homes with sophisticated services, including energy-saving context-aware appliance control in homes and elderly monitoring systems, automatic recognition of human activities in homes is essential. Several daily activity recognition methods have been proposed so far, but most of them still have issues to be solved, such as high deployment cost due to many sensors and/or violation of users' feeling of privacy due to the use of cameras. Moreover, many activity recognition methods using wearable sensors have been proposed, but they focus on simple human activities like walking and running, and it is difficult to use these methods to recognize various complex activities in homes. In this paper, we propose a machine-learning-based method for recognizing various daily activities in homes using only positioning sensors worn by inhabitants and power meters attached to appliances. To efficiently collect training data for constructing a recognition model, we have developed a tool which visualizes a time series of sensor data and allows a user to attach labels (activity types) to a specified time interval of the sensor data. We obtain training samples by dividing the extracted training data by a fixed time window and calculating, for each sample, the position and power consumption averaged over the window as feature values. The obtained samples are then used to construct an activity recognition model by machine learning. Targeting six different activities (watching TV, taking a meal, cooking, reading a book, washing dishes, and other), we applied the proposed method to sensor data collected in a smart home testbed. As a result, our method recognized the six activities with a precision of about 85% and a recall of about 82%.

I. INTRODUCTION

In recent years, various sensing devices, including smartphones, have come onto the market and become widespread. These devices make it possible to sense various contexts in households: not only environmental information such as temperature and humidity, but also human activities and device usage. Automatic recognition of human activities is expected to enable various daily living support services, including energy-saving context-aware appliance control in homes [1] and elderly monitoring systems [2]. Such services, however, require high recognition accuracy for many different activities. Many studies utilize various sensing technologies and machine-learning-based methods to recognize human activities in homes. Several activity recognition systems use fixed cameras, such as [4], [5]. However, these systems may decrease users' comfort since the cameras violate the users' feeling of privacy (a feeling of being monitored). Some studies (e.g., [6]) attach contact sensors to objects like cups and dishes and recognize complex activities like making coffee and making pasta. However, contact sensors must be attached to almost every object, which is too costly for use in general homes. Many activity recognition methods using wearable sensors have also been proposed (e.g., [7]). However, they focus on simple human activities like walking and running, and it is difficult to apply them to the recognition of various complex activities in homes.

In this paper, we propose a method for recognizing various human activities in homes which suppresses deployment cost and does not violate users' feeling of privacy by using only positioning sensors and power meters. The proposed method targets various in-home daily activities (our final goal is to recognize all of the following activities: cooking, taking a meal, cleaning rooms, washing clothes, taking a bath, having a wash, washing dishes, going to the washroom, reading a book, studying, changing clothes, makeup, conversation, telephoning, watching TV, listening to music, gaming, using a PC, going out, and returning home).

In the proposed system, we construct an activity recognition model by applying a machine learning algorithm to sensor logs collected in a target home. To easily obtain a training data set from all the data collected in a home, we have developed a tool which visualizes the temporal variation of sensor logs, consisting of the locations of inhabitants and the power consumption of each home appliance, collected with positioning sensors and power meters. The tool allows a user to easily attach a label representing an activity type to a specific time interval of the sensor logs. It can also play back video of each room in the target home so that the user can intuitively confirm what activity actually happens while investigating the sensor value variation in the specified time interval. Each labeled time interval of the sensor log is extracted and stored as training data. Each piece of training data is then divided into samples by a fixed time window (e.g., 5 minutes), and the position and power consumption averaged over the window are calculated as the feature values of each sample. Finally, the extracted samples are used to construct an activity recognition model by associating the features with the label (activity type). Among several machine learning algorithms, we employ SVM (Support Vector Machines) to construct the model.

To evaluate the accuracy of the proposed recognition method, we collected sensor data in a smart home testbed (consisting of one bedroom, one living room with kitchen, a bathroom, a washroom, etc.) constructed at the Nara Institute of Science and Technology (NAIST), and derived the accuracy of recognizing six different activities (watching TV, taking a meal, cooking, reading a book, washing dishes, and other) by applying the proposed method to the collected data. As a result, we confirmed that the proposed method recognizes the six activities with a precision of about 85% and a recall of about 82%. We also show that degrading the positioning accuracy to some extent does not affect the activity recognition accuracy much.
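The sample-construction step described above (fixed 5-minute windows, with the inhabitant's position and each appliance's power consumption averaged per window) can be sketched as follows. This is a minimal illustration only; the function name, array layout, and NumPy usage are our own assumptions, not the paper's implementation.

```python
import numpy as np

def window_features(timestamps, positions, power, window_sec=300):
    """Split one labeled sensor-log interval into fixed windows and
    average position and per-appliance power over each window.

    timestamps : (n,) seconds since the start of the interval
    positions  : (n, 2) inhabitant (x, y) from the positioning sensor
    power      : (n, k) power consumption of k appliances [W]
    Returns an (m, 2 + k) feature matrix, one row per window.
    """
    t_end = timestamps[-1]
    features = []
    for t0 in np.arange(0.0, t_end, window_sec):
        # select all readings falling inside [t0, t0 + window_sec)
        mask = (timestamps >= t0) & (timestamps < t0 + window_sec)
        if not mask.any():
            continue  # skip windows with no readings
        feat = np.concatenate([positions[mask].mean(axis=0),
                               power[mask].mean(axis=0)])
        features.append(feat)
    return np.vstack(features)
```

Each returned row, paired with the activity label of its interval, is one training sample in the sense used above.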
II. RELATED WORK

Many studies on indoor activity recognition have been conducted. They can be roughly categorized into two approaches: processing images taken by video cameras, and using various sensors, such as pressure and contact sensors.

Brdiczka et al. [8] proposed a technique for recognizing living activities inside a smart home. Their study used an ambient sound sensor and a 3D video tracking sensor, and achieved recognition rates ranging from 70% to 90% for both individual activities, such as working and napping, and activities performed by more than one person, such as conversations and games. However, their method requires a specific camera and microphone and places the residents at risk of privacy exposure. In addition, the recognition accuracy of their method is not sufficient, as many other activities are left unrecognized.

Kasteren et al. [9] designed a system for recognizing living activities such as eating, watching TV, going out, using the toilet, taking showers, doing laundry, and changing clothes in a smart home embedded with door sensors, pressure-sensitive mats, a float sensor, and a temperature sensor. The recognition accuracy of their system ranges from 49% to 98%. It can recognize many activities, but it has high initial costs and low recognition accuracy for some types of activities.

Chen et al. [6] designed a system for recognizing complex living activities such as making coffee, cooking pasta, watching TV, taking a bath, and washing hands in a smart home embedding contact, motion, tilt, and pressure sensors. Their system achieved a recognition accuracy greater than 90%. However, this method requires many sensors, and the overall system cost will be high.

Activity recognition methods that use wearable accelerometers have already achieved accuracies greater than 90% for simple actions such as walking, sitting, running, and sleeping [10]. However, using wearable accelerometers to recognize abstract or complex activities has not yet been well explored. The method of Bao et al. [11] can recognize 20 activities, such as watching TV, cleaning, and working, using five wearable accelerometers. However, the burden on users is heavy because it requires a user to wear five sensors. Maekawa et al. [12] focused on the magnetic field generated by home appliances in use and proposed a method for recognizing living activities, such as watching TV, shaving, operating a mobile phone, brushing teeth, and cleaning, using a wearable magnetic sensor. However, their approach is limited to actions associated with the operation of appliances, and their recognition accuracy is only approximately 75%.

To overcome the limitations of the aforementioned studies, we propose a daily living activity recognition method that has low initial costs and low privacy exposure, and uses only power meters and indoor positioning sensors, which are expected to become widely used in the future. In addition, our system aims to achieve high recognition accuracy for a range of simple and abstract activities that cover the basic daily living activities in a home.

III. ACTIVITY RECOGNITION IN HOMES: REQUIREMENTS AND SENSORS USED

As addressed in Sect. I, the following requirements must be satisfied for activity recognition in homes.

(1) Abstract and various types of living activities are recognized.
(2) Low-cost and a small number of sensors are used.
(3) Low privacy exposure of the residents is realized.

The basic steps to satisfy these requirements are as follows. To satisfy requirement (1), we target twenty daily living activities, such as "cooking" and "taking a meal", to cover the basic activities in the home. For requirements (2) and (3), we use only indoor positioning sensors and power meters. The following subsections define the target living activities and the types of sensor data collected.

A. Definition of living activity

We describe the target living activities in this section. According to the Statistics Bureau, Ministry of Internal Affairs and Communications (2011), the main activities within one day are classified into the 20 types shown in Figure 1. The activities are classified as primary activities (i.e., physiologically necessary activities such as sleeping and eating), secondary activities (i.e., mandatory activities in social life such as working and housework), and tertiary activities (i.e., activities during time that can be used freely). In addition, a detailed classification with 6 large, 22 middle, and 90 small classifications of actions within one day is also defined. We refer to these definitions in our study, and we extracted 20 activities as the targets of our living activity recognition method: cooking, taking a meal, cleaning rooms, washing clothes, taking a bath, having a wash, washing dishes, going to the washroom, reading a book, studying, changing clothes, makeup, conversation, telephoning, watching TV, listening to music, gaming, using a PC, going out, and returning home.

B. Collection of sensor data

In this subsection, we describe the sensors used in our study. Data are collected from a person living in the smart home shown in Figure 2 (an experimental housing facility with one bedroom and one living room with kitchen, built at the Nara Institute of Science and Technology). In the smart home, power meters, ambient sensors (temperature, humidity, illumination, and human-presence sensors embedded in different places), an ultrasonic positioning sensor, door sensors, and faucet sensors are deployed. In the proposed method, we use only the power meters and the ultrasonic positioning sensor for living activity recognition. Sensor data acquired by both sensors are sent via ZigBee and automatically stored on a server. We describe the details of each sensor below.
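Given window features and activity labels collected as described above, the SVM-based recognition model mentioned in Sect. I could be trained as in the following sketch. scikit-learn and the random placeholder data are our own assumptions for illustration; the paper does not specify an implementation, kernel, or parameters.

```python
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# Placeholder data standing in for real samples:
# X has one row per 5-minute window -> [mean_x, mean_y, mean_power_1..3]
# y is the activity label attached to each window via the labeling tool.
rng = np.random.default_rng(0)
X = rng.normal(size=(120, 5))
y = rng.integers(0, 6, size=120)  # six activity classes

# Scale features first (positions in metres and power in watts have very
# different ranges), then fit an RBF-kernel SVM as the recognition model.
model = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
model.fit(X, y)
pred = model.predict(X)
```

At recognition time, unlabeled sensor logs would be windowed the same way and passed to `model.predict`.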
The 2nd International Workshop on Smart Environments: Closing the Loop, 2015
TABLE I. CONFUSION MATRIX IN THE CASE OF USING BOTH THE POWER CONSUMPTION AND POSITION INFORMATION

TABLE III. CONFUSION MATRIX IN THE CASE OF USING ONLY THE POSITION INFORMATION

TABLE II. EVALUATION RESULTS IN THE CASE OF USING BOTH THE POWER CONSUMPTION AND POSITION INFORMATION

living activity   Precision (%)   Recall (%)   F-measure (%)
cooking                  100.00        80.00          88.89
taking a meal             66.67       100.00          80.00
washing dishes            75.86        73.33          74.57
watching TV               81.08       100.00          89.55
reading a book            86.67        86.67          86.67
other                    100.00        50.00          66.67
average                   85.05        81.67          81.06

TABLE IV. EVALUATION RESULTS IN THE CASE OF USING ONLY THE POSITION INFORMATION

living activity   Precision (%)   Recall (%)   F-measure (%)
cooking                   84.85        93.33          88.89
taking a meal             69.77       100.00          82.19
washing dishes            85.71        60.00          70.59
watching TV               41.30        63.33          50.00
reading a book            42.86        30.00          35.30
other                    100.00        53.33          69.56
average                   70.75        66.67          66.09
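The F-measure values in Tables II and IV are the harmonic mean of precision and recall, F = 2PR/(P + R), computed per class from confusion-matrix counts. A minimal check in Python (the tp/fp/fn counts below are hypothetical, chosen only to reproduce the precision and recall of the "cooking" row of Table II):

```python
def prf(tp, fp, fn):
    """Per-class precision, recall, and F-measure, all in percent."""
    precision = 100.0 * tp / (tp + fp)
    recall = 100.0 * tp / (tp + fn)
    f_measure = 2 * precision * recall / (precision + recall)
    return precision, recall, f_measure

# Hypothetical counts giving precision 100% and recall 80%,
# matching the "cooking" row of Table II.
p, r, f = prf(tp=24, fp=0, fn=6)
print(round(p, 2), round(r, 2), round(f, 2))  # 100.0 80.0 88.89
```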