
HealthFog: An Ensemble Deep Learning based Smart Healthcare System for Automatic Diagnosis of Heart Diseases in Integrated IoT and Fog Computing Environments

Fog computing is a distributed computing paradigm in which data, computation, processing, and applications are placed somewhere between the cloud and the data source, bringing processing closer to where the data is generated. This is usually done to improve performance, but it can also serve protection and safety purposes. The fog and cloud computing paradigms have emerged as the backbone of modern technology, using the Internet to provide end users with on-demand services. Cloud computing alone, however, is not a good option for applications that need real-time responses, because of its high latency. Thanks to their robustness and their ability to provide different response characteristics depending on the target use case, technological developments such as edge computing, fog computing, the Internet of Things, and Big Data have been widely adopted. These technologies improve mobility, confidentiality, protection, bandwidth, and network capacity, making low-latency and real-time applications feasible.

Healthcare is one of the prominent application areas that demands precise and real-time results, and fog computing has been applied in this field with promising progress. By using fog computing, resources are brought closer to the users, latency decreases, and safety is thus improved.

Some past work needs to be mentioned in order to understand this paper and how it differs from these related works.

1- Low-cost fog-assisted health-care IoT system with energy-efficient sensor nodes:
This work presents a low-cost health monitoring system that provides continuous remote monitoring of ECG together with automatic analysis and notification. The system consists of energy-efficient sensor nodes and a fog layer, together taking advantage of IoT. The sensor nodes collect ECG, respiration rate, and body temperature and wirelessly transmit them to a smart gateway that can be accessed by the appropriate caregivers.
2- Design and Evaluation of a Person-Centric Heart Monitoring System over Fog Computing
Infrastructure:
This work proposed an ECG-based Healthcare (ECGH) system to diagnose cardiac
abnormalities.
3- Real-time Heart Attack Mobile Detection Service:
This work proposed an IoT e-health service which collects data through a smartphone, in the form of voice control, and determines the health status of patients. Further, the e-health service identifies the type of heart attack using a mobile-application-based conceptual model.
4- A Smart Fog Gateway for End-to-End Analytics in Wearable Internet of Things:
This work proposed a Smart Fog Gateway (SFG) model for end-to-end analytics in wearable IoT devices and demonstrated the role of the SFG in orchestrating data conditioning, intelligent filtering, smart analytics, and selective transfer to the cloud for long-term storage and temporal variability monitoring. The SFG model optimizes performance in terms of execution time and energy consumption.
As seen in the past work, current fog models still have many drawbacks: from a limited perspective, they target either accuracy or reduced response time, but not both. In this paper, not only quick response but also accuracy of the results is given high importance. Deep learning is used to predict the patient's status, and ensemble learning is used to obtain the best result from the multiple classifiers used for prediction. Several models are trained and tested on the given input; if the number of "true" predictions is higher than the number of "false" predictions, the ensemble technique returns true, meaning the patient has heart disease, otherwise the patient is considered safe and without heart disease.
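The majority-voting rule described above can be sketched in a few lines of Python; the function name and the representation of each classifier's prediction as a boolean are illustrative assumptions, not details taken from the paper.

```python
def majority_vote(predictions):
    """Return True (heart disease) if more classifiers predict True than False.

    predictions: list of booleans, one per trained classifier.
    """
    trues = sum(1 for p in predictions if p)
    return trues > len(predictions) - trues

# Example: three of four classifiers flag heart disease, so the ensemble says True.
print(majority_vote([True, True, False, True]))  # True
```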

The data for the model is collected from various IoT sensors, which can be healthcare sensors. Gateway devices (mobile phones, laptops, PCs) receive the data and send it to the fog worker nodes, which ensure data integrity, security, and privacy. FogBus uses blockchain, authentication, and encryption techniques, which increase the reliability and robustness of the fog environment.
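As a rough illustration of this data flow, the sketch below shows a gateway device packaging sensor readings and forwarding them to a broker node over HTTP; the endpoint URL, field names, and payload layout are hypothetical and are not taken from the FogBus implementation.

```python
import requests  # third-party HTTP client

# Hypothetical broker endpoint, for illustration only.
BROKER_URL = "http://fog-broker.local:8080/job"

def send_readings(patient_id, ecg, respiration_rate, body_temp):
    """Forward one batch of sensor readings from the gateway to the broker node."""
    payload = {
        "patient_id": patient_id,
        "ecg": ecg,                       # list of ECG samples
        "respiration_rate": respiration_rate,
        "body_temperature": body_temp,
    }
    response = requests.post(BROKER_URL, json=payload, timeout=5)
    response.raise_for_status()
    return response.json()                # e.g. diagnosis result or assigned node IP
```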

The whole system architecture consists of three parts: the sensors that generate data; the end users, who use the gateway devices to input the data generated by the IoT sensors, as mentioned above, and send it to the fog environment; and the FogBus-based fog computing environment itself, which consists of broker nodes, worker nodes, and a cloud data center. The broker node receives job requests and/or input data from the gateway devices. A task input module accepts job requests from gateway devices immediately prior to data transfer. A security management module provides secure communication between the different components and protects the collected data from unauthorized access. The worker node includes the deep learning models for processing the input data, analyzing it, and generating results. Modules for data processing, data sorting and extraction, Big Data analysis, and storage can also be included in the worker node. Worker nodes get input data directly from the gateway devices, produce results, and share them with the same devices. In the HealthFog model, the broker node can also behave as a worker node. The interesting part of the model is the cloud data center (CDC): if the fog network is overwhelmed, the tasks are latency tolerant, or the volume of input data is considerably higher than average, HealthFog uses the CDC's resources. This makes data processing more reliable, allows heavy workloads to be handled quickly, and makes data processing location independent.

The software part of the system consists of data filtering and pre-processing, a resource manager, a deep learning module, and an ensembling module. In the first step, the input data is processed and checked. Filters are used to reduce the data to a lower dimension using Principal Component Analysis (PCA), to compress it using the Set Partitioning In Hierarchical Trees (SPIHT) algorithm, and to encrypt it using the Singular Value Decomposition (SVD) technique, with the goal of extracting the key components of the data feature vectors that affect the health status of patients. The resource manager comprises two parts, a workload manager and an arbitration module. The workload manager handles job requests and work queues for data processing; it also manages the large amounts of data that need to be processed. The arbitration module schedules the fog or cloud services offered to handle the tasks that the workload manager has queued and managed. The deep learning module uses the dataset to train a neural network to identify data points, that is, the feature vectors collected after the Body Area Sensor Network data has been pre-processed. It also predicts and produces outcomes for the data obtained from the gateway devices, based on the role assigned by the resource manager. The last part of the software is the ensembling module, which collects the predictions from the different models and uses voting to decide the output category, i.e. whether or not the patient has heart disease.
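As a minimal illustration of the dimensionality reduction step, the snippet below applies PCA to a matrix of pre-processed feature vectors with scikit-learn; the number of samples and attributes, the synthetic input data, and the variance threshold are assumptions made only for the example, and the compression (SPIHT) and encryption (SVD) steps are omitted.

```python
import numpy as np
from sklearn.decomposition import PCA

# Assume each row is a pre-processed feature vector from the Body Area Sensor Network.
rng = np.random.default_rng(0)
features = rng.normal(size=(452, 13))       # 452 samples, 13 raw attributes (illustrative)

# Keep the principal components that together explain 95% of the variance.
pca = PCA(n_components=0.95)
reduced = pca.fit_transform(features)

print(reduced.shape)                         # fewer columns than the original 13
print(pca.explained_variance_ratio_.sum())   # fraction of variance retained
```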

The system has basically three possible sequences of communication with the gateway devices to generate the appropriate output for the end user (a minimal sketch of this routing logic is given below):

1- The gateway sends a job request to the broker node. The broker node sends the master IP, the gateway then sends the job (data), and the broker node queues it, runs the prediction, and returns the result to the gateway.
2- The gateway sends a job request to the broker node. The broker node sends a worker IP because the master has a workload. The gateway then sends the job to the worker node, which queues it, runs the prediction, and returns the result to the gateway.
3- The gateway sends a job request to the broker node. The broker sends the cloud IP because both the master and the workers have workloads. The gateway then sends the job to the cloud, which queues it, runs the prediction, generates the result, and sends it to the gateway.

In all of the above scenarios, the latency from the gateway to the broker is the lowest, compared to the worker and the cloud, because reaching the latter requires multiple hops of data transfer.
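The broker's routing decision described in the three scenarios above can be sketched as follows; the node descriptors, field names, and the simple busy/free flag are hypothetical and only illustrate the idea of falling back from broker to worker to cloud.

```python
# Hypothetical node state; real FogBus components report load differently.
BROKER = {"ip": "10.0.0.1", "busy": True}
WORKERS = [{"ip": "10.0.0.2", "busy": True}, {"ip": "10.0.0.3", "busy": False}]
CLOUD_IP = "203.0.113.10"

def choose_execution_node():
    """Return the IP the gateway should send its job to.

    Scenario 1: the broker is free and handles the job itself.
    Scenario 2: the broker is loaded, so a free worker handles the job.
    Scenario 3: the broker and all workers are loaded, so the job goes to the cloud.
    """
    if not BROKER["busy"]:
        return BROKER["ip"]
    for worker in WORKERS:
        if not worker["busy"]:
            return worker["ip"]
    return CLOUD_IP

print(choose_execution_node())  # 10.0.0.3: broker is busy, second worker is free
```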

The dataset for the healthcare models is the Cleveland dataset, on which the models are trained and then used to predict the outcome for the input data. The dataset consists of 1807 examples, out of which 1355 were used for training and 452 for testing; the data was divided into training, validation, and test sets in the ratio 70:10:20. The training examples were split equally across all worker/broker nodes to obtain their respective trained deep learning models; as the number of fog nodes increases, the dataset examples have to be distributed across all nodes in order to use all resources for training. The training set is used for training the model, the validation set is used for tuning the model, and the test set is used for checking how the model performs on new data. This work uses the scikit-learn library's BaggingClassifier to implement the voting scheme; the model takes a deep neural network base classifier and the number of classifiers as input.
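A minimal sketch of this ensembling step with scikit-learn is shown below; the use of MLPClassifier as a stand-in for the paper's deep neural network base classifier, the number of estimators, and the synthetic data are all assumptions made for illustration.

```python
import numpy as np
from sklearn.ensemble import BaggingClassifier
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split

# Synthetic stand-in for the pre-processed Cleveland feature vectors and labels.
rng = np.random.default_rng(42)
X = rng.normal(size=(1807, 13))
y = (X[:, 0] + X[:, 1] > 0).astype(int)   # 1 = heart disease, 0 = healthy (toy rule)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=42
)

# Bag several small neural networks; their predictions are combined by voting/averaging.
# Note: older scikit-learn versions use base_estimator= instead of estimator=.
ensemble = BaggingClassifier(
    estimator=MLPClassifier(hidden_layer_sizes=(32, 16), max_iter=500),
    n_estimators=5,
    random_state=42,
)
ensemble.fit(X_train, y_train)
print("test accuracy:", ensemble.score(X_test, y_test))
```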
