
Power Quality Monitoring of Distributed Generation Units Using a Web-based Application
S. Antila, K. Kivikko, P. Trygg, A. Mäkinen and P. Järventausta
Tampere University of Technology, Institute of Power Engineering
Abstract
This paper presents a power quality data management system developed in a research project at Tampere University of Technology (TUT), Finland. The system consists of a remote reading system, a database developed for managing measurement data, and a web-based application for power quality monitoring. The system is implemented using the application service provisioning (ASP) model. The suitability of the system for power quality monitoring of distributed generation and a case study are also described in this paper.
1. INTRODUCTION

In the modern information society, requirements and expectations associated with power quality have become increasingly important. The reasons are the increased power quality requirements set by network utilities, customers and regulators. Many industrial and commercial customers have equipment that is sensitive to power disturbances, so it is increasingly important to understand the quality of the power being supplied. Regulation models, laws and guidelines relating to the network business also set requirements for developing power quality management.
Electricity is no longer a luxury article, as it was a few decades ago, but has become a necessity and a part of our everyday life. Even short interruptions and voltage sags can be harmful, as the number of computers, programmable logic devices, etc. in industry as well as in households has increased rapidly. The standard EN 50160 gives the main characteristics of the supply voltage at the customer's supply terminals under normal operating conditions with regard to frequency, magnitude, waveform and symmetry of the three-phase voltages. So far, continuous power quality monitoring on a larger scale is not very common, but the situation is changing quite rapidly.
Integrating distributed generation into the electricity distribution network also creates new needs for power quality monitoring, because distributed generation units are usually connected to a weak distribution network. In Finland these units (e.g. wind mills) are located mostly on the western coast or on the fells of Lapland, where the wind conditions are most favourable. These are partly very sparsely populated areas where energy consumption is low, the distribution network is weak and transmission distances are long. In these conditions, quick changes in wind speed or in power may produce remarkable voltage variations, which can lead to malfunction of devices and customer discomfort. That is why power quality should be monitored at distributed generation units to find and prevent possible problems.
Web-based applications have many advantages compared to ordinary applications, and they offer an easy way to monitor power quality. In addition to usability and functionality, web-based applications offer electricity utilities the possibility to outsource operations, with the service provided using the application service provisioning (ASP) model. Power quality monitoring is one example of an outsourced service implemented using the ASP model, where the service provider takes care of the entire supply chain of power quality information and allows the customer to focus on its core business, leaving the technological solutions to the provider.
Tampere University of Technology (TUT) has ongoing projects on power quality data management and on the integration of distributed generation into the electricity distribution network. In the power quality management project, the target is on the one hand to define and demonstrate power quality monitoring functions as a part of integrated automation systems, and on the other hand to consider power quality issues from the point of view of network planning and operation, including basic analyses, measurements, outage reporting, and issues of distributed generation. A wider concept for power quality monitoring is presented e.g. in reference [1]. The distributed generation project investigates the integration of distributed generation into an electricity distribution network from the planning and operation point of view. The project includes the development of network steady-state and stability analysis tools and methods, consideration of distribution network protection, power quality and reliability, and investigation of the basis for connection fees and transfer charges of distributed generation.
This paper introduces a power quality data management and web-based monitoring system, which is suitable for continuous power quality monitoring, also of distributed generation. The developed system consists of power quality meters (Quality Guards), their remote reading software, a database server and database structure, a web application for presenting and analysing data, a web server, and a client (i.e. a web browser). In addition to power quality monitoring functions, functions for energy consumption and building automation (i.e. water and heat consumption, state information, and temperatures) can be implemented in the system. The developed web-based power quality monitoring system also makes it possible to gather together all measured power quality data from the various distributed generation units of the same producer. New technologies, which make the development of web and mobile applications possible, are also presented.
Power quality monitoring has also been studied and applications developed worldwide. Reference [3] describes a low-cost smart web sensor, which is designed and implemented to acquire, process and transmit data over an 802.3 network and to construct dynamic HTML pages. Reference [4] presents a web-based multi-channel power quality monitoring system, which consists of PQ meters, data transmission and displaying in the Internet. Reference [5] depicts a concept of a database management system for storing power quality data and a web-based user interface for the system.

2. THE CONCEPT OF THE POWER QUALITY DATA MANAGEMENT AND MONITORING SYSTEM

Power quality monitoring is a process of gathering, analyzing and interpreting raw measurement data into useful information. The process of gathering data is usually carried out by continuous measurement of voltage and current over an extended period. The processes of analyzing and interpretation have traditionally been performed manually, but recent advances in signal processing and artificial intelligence have made it possible to design and implement intelligent systems that automatically analyze and interpret raw data into useful information with minimum human intervention. [6]
The power quality data management and monitoring system presented in this paper consists of power quality meters (Quality Guards), their remote reading software, a database server, a web application, a web server and a client (i.e. a web browser). The purpose of the system development has been to illustrate and demonstrate an outsourced service relating to power quality. Figure 1 illustrates the architecture of the system.
Quality Guard is a commercial product of MX Electrix Oy, a Finnish energy and power quality meter manufacturer. Quality Guard is a fairly cheap smart kWh meter with power quality monitoring functions, and it is read remotely using a GSM modem. The measuring functions are implemented using the sparse sampling methods described in reference [7]. Most of the power quality quantities described in the standard EN 50160 can be measured with Quality Guard. In addition, data on powers and currents can be measured.

Figure 1: System architecture of the web-based power quality monitoring system. (Quality Guards are read over GSM modems by the modem reading applications into the Transmit database; SQL procedures transfer the data to the Power Quality Database (PQDB); a web server at the application service provider serves the web-based application over the Internet to the customer's web browser.)

The quantities measured by Quality Guard are:
- Voltage level, current, real power, total reactive power, fundamental frequency reactive power, apparent power, DC offset, total harmonic distortion (THD) of the supply voltage, and power factor for each of the three phases
- Total real power in both directions
- Total capacitive and inductive reactive power
- Total apparent power
- Frequency of the supply voltage
- The ratio of the negative- and zero-sequence components to the positive-sequence component
- Harmonics of voltage (3rd, 5th, 7th, 9th, 11th and 13th)
- Maximum values of total harmonic distortion (THD) of voltage, DC offset and power factor
Quality Guard also records the timestamp, length, minimum/maximum value and rms value of voltage dips and swells, and the starting and ending times of interruptions. In addition, quantities for energy consumption and building automation (i.e. water and heat consumption, state information, and temperatures) can be delivered with Quality Guard. The recording density can be set to some seconds, minutes or hours. A maximum of 24 quantities can be measured at a time, and the measured values are stored in the memory of Quality Guard. With a recording density of 10 minutes and 24 measured quantities, the memory of Quality Guard can store data for a period of approximately three days. Setup and remote reading of Quality Guard are carried out via GSM modems or the serial port of the meter using the Transmit system developed by Enersoft Oy or the EQL Online application developed by Enease Oy.
The Transmit system is a remote reading system for Quality Guards consisting of reading applications, timer applications and the Transmit database. With the timer applications, remote reading can be set to launch automatically, e.g., once a day. Measurements are stored in the Transmit database, which is a relational database requiring MS SQL Server. Each measured value generates one row in the Transmit database consisting of the name of the measuring point, quantity, recording density, timestamp, measured value and profile. This means that with a recording density of 10 minutes and 24 quantities, each measuring point generates 24*6*24 = 3456 rows in the Transmit database per day. It follows that ten measuring points generate over 1,000,000 rows in one table in the Transmit database in a period of one month.
Consequently, queries to the Transmit database become slow and ineffective, which means that the structure of the Transmit database is not suitable for database applications.
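
To make the data volume concrete, the following is a minimal sketch of a narrow, row-per-value table in the style of the Transmit database. The table and column names are illustrative assumptions made for this sketch, not the actual Transmit schema.

```sql
-- Hypothetical narrow (row-per-value) layout in the style of the Transmit database.
-- Table and column names are illustrative assumptions only.
CREATE TABLE TransmitMeasurement (
    MeasuringPoint   varchar(50) NOT NULL,  -- name of the measuring point
    Quantity         varchar(50) NOT NULL,  -- e.g. 'U1', 'THD_U1', 'P_tot'
    RecordingDensity int         NOT NULL,  -- recording interval in minutes
    MeasurementTime  datetime    NOT NULL,  -- timestamp of the measured value
    MeasuredValue    float       NOT NULL,
    Profile          varchar(20) NULL
);

-- With a 10-minute recording density and 24 quantities, one measuring point adds
-- 24 * 6 * 24 = 3456 rows per day, so ten points exceed 1,000,000 rows in a month,
-- and even a simple trend query must filter this single, ever-growing table:
SELECT MeasurementTime, MeasuredValue
FROM   TransmitMeasurement
WHERE  MeasuringPoint = 'WindMill_1'
  AND  Quantity = 'U1'
  AND  MeasurementTime BETWEEN '2003-03-03' AND '2003-03-10';
```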
Because of these shortcomings in the structure of the Transmit database, it has been necessary to develop a database called the Power Quality Database (PQDB) with a new database structure. The structure of PQDB is suitable for long-term data storage and for applications to use. Each measuring point has its own table, where all measured values with the same timestamp are stored in the same row. As a result, the number of rows stored per day diminishes from 3456 to 144, and the number of rows stored in one table is independent of the number of measuring points. Data on voltage dips/swells and interruptions is also stored in PQDB. Measurements are transferred from the Transmit database to PQDB by automatically called stored procedures of MS SQL Server.
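
The following sketch illustrates the idea behind the PQDB layout and the transfer step. The table, column and procedure names are assumptions made for illustration, and the sketch is simplified (only a few of the up-to-24 quantities, no duplicate handling); it is not the actual PQDB schema or stored procedure.

```sql
-- Hypothetical wide (row-per-timestamp) layout in the style of PQDB:
-- one table per measuring point, all quantities of a timestamp in one row.
CREATE TABLE PQ_WindMill_1 (
    MeasurementTime datetime NOT NULL PRIMARY KEY,
    U1     float NULL,   -- phase 1 voltage level
    U2     float NULL,   -- phase 2 voltage level
    U3     float NULL,   -- phase 3 voltage level
    P_tot  float NULL,   -- total real power
    Freq   float NULL,   -- supply voltage frequency
    THD_U1 float NULL    -- THD of phase 1 voltage (further columns omitted)
);
GO

-- Simplified transfer step: pivot narrow Transmit rows into one wide PQDB row per
-- timestamp. In the real system this is done by automatically called stored procedures.
CREATE PROCEDURE TransferToPQDB AS
BEGIN
    INSERT INTO PQ_WindMill_1 (MeasurementTime, U1, U2, U3, P_tot, Freq, THD_U1)
    SELECT MeasurementTime,
           MAX(CASE WHEN Quantity = 'U1'     THEN MeasuredValue END),
           MAX(CASE WHEN Quantity = 'U2'     THEN MeasuredValue END),
           MAX(CASE WHEN Quantity = 'U3'     THEN MeasuredValue END),
           MAX(CASE WHEN Quantity = 'P_tot'  THEN MeasuredValue END),
           MAX(CASE WHEN Quantity = 'Freq'   THEN MeasuredValue END),
           MAX(CASE WHEN Quantity = 'THD_U1' THEN MeasuredValue END)
    FROM   TransmitMeasurement
    WHERE  MeasuringPoint = 'WindMill_1'
    GROUP BY MeasurementTime;
END
GO
```

In such a layout, one day at a 10-minute recording density produces 144 rows per measuring point instead of 3456, which matches the figures quoted above.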
Data stored in PQDB is analyzed and presented using a web-based application. The application is implemented using the Microsoft Visual Studio .NET application development tool and the C# programming language, which are parts of the Microsoft .NET architecture. The functionality of the web application consists of:
- Presenting measurement data graphically in chart and grid mode.
- Presenting voltage dips/swells in grid mode.
- Presenting pulse measurements (e.g. water and heat consumption) and energy consumption calculated from the power measurements, as sketched below.
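
Energy is not stored as a separate quantity in the sketch above but can be derived from the stored power values. A hedged illustration, reusing the hypothetical PQDB table from the previous sketch and assuming each stored value is the average real power (in kW) over its 10-minute interval:

```sql
-- Illustrative only: daily energy derived from 10-minute average real power values.
-- Each row contributes P_tot * (10/60) kWh, assuming P_tot is the interval-average power in kW.
SELECT CONVERT(date, MeasurementTime) AS MeasurementDay,
       SUM(P_tot * 10.0 / 60.0)       AS Energy_kWh
FROM   PQ_WindMill_1
WHERE  MeasurementTime >= '2003-03-03'
  AND  MeasurementTime <  '2003-03-10'
GROUP BY CONVERT(date, MeasurementTime)
ORDER BY MeasurementDay;
```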
The graphical user interface of the web application is presented in Figure 2. Authorization is required to access the application, and the measuring points available to the user are determined at login. After logging in, the main page with basic information on the measuring point is presented.

Figure 2: Graphical user interface of the web application for presenting power quality data.

The user can select the quantities to examine from the list and the time period from the calendars on the right-hand side of the user interface. The time period can be further specified from the drop-down lists below the calendars. After making the selections, the user can examine trends of the selected power quality measurements in chart mode or examine individual measured values in grid mode. The visible chart and grid are updated automatically when the quantity or calendar selections are changed. Voltage dips/swells and interruptions are presented in both chart and grid mode. The screen is printable, and charts can be saved to the user's computer for further study. Changing the measuring point results in presenting the main page, resetting the calendar components to today's date, and updating the list of quantities with those measured at the selected measuring point.
3. ADVANTAGES OF WEB APPLICATIONS

The development of advanced Internet technologies has influenced trends in software production. Using Internet techniques in application development provides the following benefits [5]:
- Fairness: Users do not need to purchase any expensive hardware equipment or software application to gain access to the Internet throughout the world. Various communication media, e.g. a normal telephone line, an ISDN line or a cable modem, can be used to retrieve information or update data remotely.
- Supporting cross-platform architecture: In a standardized Internet browser environment with HTML and TCP/IP protocols, users have the same look and feel for web displays on different hardware platforms. Hence, the graphical user interface is developed once and can be presented on Windows and UNIX platforms.
- Supporting open system architecture: Internet applications follow open system standards, e.g. HTML, HTTP, TCP/IP, FTP, PPP and SQL. Data exchange and system expansion are done with minimum effort. Application and database interfaces are defined consistently according to industrial guidelines, e.g. Open Database Connectivity (ODBC).
- Providing a user-friendly and consistent human-machine interface: Working in a well-behaved Internet browser, users do not need to learn difficult commands or mouse operations to use any newly developed application. Hence, training efforts can be minimized.
- Minimization of installation and maintenance efforts: New applications can be downloaded directly through the Internet to the client side. Software upgrades can be done automatically when new versions are available on the server side. There is no need for a system administrator to install the software on each PC in the company. The time-consuming software upgrade process is eliminated.
- Maximization of system scalability: Web pages can be developed gradually. There is no need to recompile or relink the whole application to present updated information in the Internet.
- Supporting multimedia, video conferencing, etc.: Internet browsers have the capability to show pictures, charts, graphics, audio and video clips. Therefore, interactive and effective two-way communication can be achieved.
- Providing relational database access: Internet applications can be database-driven to present the latest information in the database at all times without any web page changes. Data stored in a popular relational database can be dynamically displayed on web pages and updated through the Internet.
4. POWER QUALITY MONITORING AS APPLICATION SERVICE PROVISIONING (ASP)

Concentration on core business is currently a trend among Finnish distribution companies, and a similar development can be found in other countries, too. Companies are outsourcing their functions, and only the most important business (i.e. managing the network assets) is left to the distribution company. For example, network planning, construction and maintenance are increasingly given to other companies providing these kinds of services. One example of future trends could be outsourcing power quality data management and monitoring, where the advantages of web applications are undeniable.
A company supplying software applications and/or software-related services over the Internet is called an application service provider (ASP). An ASP owns and operates a software application. The ASP also owns, operates and maintains the servers that run the application, employs the people needed to maintain it, and makes the application available to customers everywhere via the Internet, either in a browser or through some sort of thin client. The ASP bills for the application either on a per-use basis or on a monthly or annual fee basis.
The ASP model has evolved because it offers some significant advantages over traditional approaches. Especially for small businesses and start-ups, the biggest advantage of an ASP is the low cost of entry and the short setup time. Furthermore, the ASP model, as with any outsourcing arrangement, eliminates headcount; IT headcount tends to be very expensive and very specialized, so this is frequently advantageous. Moreover, the ASP model also eliminates the need for specialized IT infrastructure for the application as well as for supporting applications and licences. Another important factor leading to the development of ASPs has been the growing complexity of software and software upgrades. Distributing huge, complex applications to the end user has become extremely expensive from a customer service standpoint, and upgrades make the problem even worse. The ASP model eliminates most of these headaches through short implementation times and centralized updates.
When a distribution company buys power quality monitoring services from an ASP, the power quality data is collected to the ASP company's database server. The data is accessed through the Internet, and power quality reports can be made with the web application in the user interface of a web browser. The distribution company benefits from easy-to-use power quality monitoring applications, which do not consume the company's human resources the way ordinary applications, with all their maintenance and updating tasks, do. On the other hand, the ASP company benefits from fewer application and database updates, since all the applications and databases are located in the same place.
Case of distributed generation
In the case of distributed generation, it is also possible to gather together all measured power quality data from the various distributed generation units of the same producer. In addition to power quality monitoring, e.g. the energy production of each distributed generation unit can be examined easily at the same time. The ASP model also allows presenting the data to every party needing the information, e.g. the network company, the customer, consultants, electricity contractors and researchers.

5. IMPLEMENTATION AND EXPERIENCES OF A CASE STUDY
Power quality monitoring of a distributed generation unit has been demonstrated at a 750 kW wind mill in Närpiö. Närpiö is located on the western coast of Finland, and power quality monitoring with Quality Guard has been in use there since January 2003. The Quality Guard is read remotely at Tampere University of Technology (TUT), and the power quality data is served to all parties that need it with a web application on a web server located at TUT. The parties having access to the power quality data in this case are the network company (which in this case also owns the unit) and the researchers of the project.
For the first two months, power quality was monitored with a recording density of ten minutes, which is also the recording density prescribed for most of the quantities in the standard EN 50160. Nevertheless, to obtain more accurate measurements of the power quality, the recording density was reduced to one minute. Because of this, the meter has to be read remotely every eight hours. The measurement data is transferred into the power quality database (PQDB) once a day. Figure 3 illustrates the trends of real powers over a one-week period with a recording density of one minute.

Figure 3: Trends of real powers over a one-week period at the Närpiö wind mill.
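
As an illustration of the kind of data behind such a trend, queries of the following form against the hypothetical PQDB table sketched in Section 2 could retrieve and summarize a one-week real power series; the table and column names, as before, are assumptions made for illustration.

```sql
-- Illustrative query for a one-week real power trend such as the one shown in Figure 3.
SELECT MeasurementTime, P_tot
FROM   PQ_WindMill_1
WHERE  MeasurementTime >= '2003-03-03'
  AND  MeasurementTime <  '2003-03-10'
ORDER BY MeasurementTime;

-- Illustrative summary of the same one-week period: average, minimum and maximum real power.
SELECT AVG(P_tot) AS AvgPower_kW,
       MIN(P_tot) AS MinPower_kW,
       MAX(P_tot) AS MaxPower_kW
FROM   PQ_WindMill_1
WHERE  MeasurementTime >= '2003-03-03'
  AND  MeasurementTime <  '2003-03-10';
```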
With the demonstrated system, the owner of the wind mill can easily check the power quality level of the generation unit. The fact that Quality Guard is basically a kWh meter automatically provides the amount of energy produced by the generation unit. The owner can also check whether there have been interruptions or voltage dips/swells. There should be some logic to conclude the source of a voltage dip/swell, and in the case of a small dip the distributed generation unit has to support the voltage level. In the case of auto-reclosing, the distributed generation unit has to be automatically disconnected from the network. Power quality monitoring offers the owner statistics of these disconnections.
The measured data also allows researchers to study and understand the behavior of the distributed generation unit and to investigate the effects of integrating distributed generation into an electricity distribution network from the planning and operation point of view. In the future, one Quality Guard will also be installed at the feeding primary substation connected to the wind mill. This allows the network company and the other parties of the research project to analyze the effects of the distributed generation unit on power quality more widely.
6. CONCLUSION

This paper presented solutions and service models that distribution companies can use in power quality data management and monitoring. One goal of our research was to develop a system consisting of remote data reading, a database structure for data storage and management, and a web application for presenting and analysing power quality data.
One big challenge in our research has been testing this system in a real working environment. The real working environment usually brings out new problems that were not considered in the research environment, which is why these kinds of applications should always be tested in the environment in which they will actually be used.
Concentration on core business is currently a trend among distribution companies. The application service provisioning (ASP) model gives distribution companies new possibilities to monitor power quality continuously using an outsourced service provider. The development and commercialisation of these applications is about to be started by a company founded by the researchers.
7. REFERENCES

[1] A. Mäkinen, M. Parkki, P. Järventausta, M. Kortesluoma, P. Verho, S. Vehviläinen, R. Seesvuori, A. Rinta-Opas. Power Quality Monitoring as Integrated with Distribution Automation. Proceedings of the CIRED 2001 Conference, Amsterdam, Netherlands. 6 p.

[2] P. Verho, P. Järventausta, M. Kärenlampi, J. Partanen. Intelligent support system for distribution network management. International Conference on Intelligent System Application to Power Systems, Montpellier, France, September 1994. 8 p.

[3] G. Bucci, I. Caschera, E. Fiorucci, C. Landi. The Monitoring of Power Quality Using Low-cost Smart Web Sensors. Proceedings of the IEEE Instrumentation and Measurement Technology Conference, Anchorage, AK, USA, 2002. pp. 1753-1756.

[4] R. P. K. Lee, L. L. Lai, N. Tse. A Web-based Multi-channel Power Quality Monitoring System for a Large Network. Proceedings of the Power System Management and Control Conference, 2002. Conference Publication No. 488, IEE 2002. pp. 112-117.

[5] Rong-Ceng Leou, Ya-Chin Chang, Jen-Hao Teng. A Web-based Power Quality Monitoring System. Power Engineering Society Summer Meeting, 2001. IEEE, Volume 3, 2001. pp. 1504-1508.

[6] R. Dugan, M. McGranaghan, S. Santoso, H. Beaty. Electrical Power Systems Quality. Second edition. McGraw-Hill. 2002.

[7] P. Koponen. Sparse sampling methods for power quality monitoring. Dissertation. Tampere University of Technology, Finland, 2002. 217 p.
