
Business Analytics - Meaning, Importance and its Scope

Introduction

The word analytics has come into the foreground in the last decade or so. The proliferation of the internet and information technology has made analytics very relevant in the current age. Analytics is a field which combines data, information technology, statistical analysis, quantitative methods and computer-based models. These are combined to give decision makers all the possible scenarios for a well-thought-out and researched decision. The computer-based models ensure that decision makers are able to see how a decision would perform under various scenarios.

Application

Business analytics has a wide range of applications, from customer relationship management, financial management, marketing, supply-chain management, human-resource management and pricing, to sports, where it informs team game strategies.

Importance of Business Analytics

 Business analytics is a methodology or tool for making sound commercial decisions. Hence it impacts the functioning of the whole organization. Business analytics can therefore help improve the profitability of the business, increase market share and revenue, and provide a better return to shareholders.
 Facilitates better understanding of available primary and secondary data, which in turn affects the operational efficiency of several departments.
 Provides a competitive advantage to companies. In this digital age the flow of information is almost equal for all players; it is how this information is utilized that makes a company competitive. Business analytics combines available data with various well-thought-out models to improve business decisions.
 Converts available data into valuable information. This information can be presented in any format the decision maker finds comfortable.

Evolution of Business Analytics

Business analytics has been in existence for a very long time and has evolved with the availability of newer and better technologies. It has its roots in operations research, which was used extensively during World War II. Operations research was an analytical way of looking at data to conduct military operations. Over a period of time, this technique started being utilized for business, and operations research evolved into management science. The basis of management science remained the same as that of operations research: data, decision-making models, etc.

As economies started developing and companies became more and more competitive, management science evolved into business intelligence, decision support systems and PC-based software.

Scope of Business Analytics

Business analytics has a wide range of applications and usages. It can be used for descriptive analysis, in which data is utilized to understand the past and present situation. This kind of descriptive analysis is used to assess the company's current market position and the effectiveness of previous business decisions.

It is also used for predictive analysis, which typically forecasts future business performance based on past data. Business analytics is further used for prescriptive analysis, which is utilized to formulate optimization techniques for stronger business performance.

For example, business analytics is used to determine the pricing of various products in a departmental store based on past and present sets of information.
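
To make these three kinds of analysis concrete, here is a minimal Python sketch of the departmental-store pricing example. All figures, the naive trend forecast and the assumed price elasticity are hypothetical, purely for illustration.

```python
# Illustrative sketch only: hypothetical monthly data for one product in a store.
# Shows descriptive, predictive and prescriptive analysis in miniature.

monthly_units = [120, 115, 130, 128, 140, 138]   # past six months (assumed data)
price = 25.0                                      # current unit price (assumed)

# Descriptive: summarise past and present performance.
avg_units = sum(monthly_units) / len(monthly_units)
print(f"Average monthly units sold: {avg_units:.1f}")

# Predictive: naive trend forecast for next month (average month-on-month change).
changes = [b - a for a, b in zip(monthly_units, monthly_units[1:])]
forecast = monthly_units[-1] + sum(changes) / len(changes)
print(f"Forecast units next month: {forecast:.1f}")

# Prescriptive: choose the candidate price that maximises expected revenue,
# assuming each 5% price cut lifts demand by 8% (a made-up elasticity).
candidates = [price * (1 - 0.05 * k) for k in range(4)]

def expected_units(p):
    cuts = (price - p) / (0.05 * price)
    return forecast * (1 + 0.08 * cuts)

best = max(candidates, key=lambda p: p * expected_units(p))
print(f"Suggested price: ${best:.2f}")
```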

Data for Analytics

Business analytics uses data from three sources to construct its business models: business data such as annual reports, financial ratios and marketing research; databases containing the organization's various computer files; and information derived from previous data analysis.

Challenges

Business analytics is possible only with a large volume of data. It is sometimes difficult to obtain such a volume of data without questioning its integrity.

Business Intelligence - Architecture, Components and its Benefits


Introduction

The current business environment is constantly evolving. The global economic scenario is providing
opportunities as well as challenges. The factors affecting the business environment include consumer needs,
globalization and government policies.

In such a business environment, an organization basically has four possible stances: it can be reactive, anticipative, adaptive and/or proactive. To act on these, an organization can develop a new strategy, enter into partnerships, etc.

Today most businesses have computerized business support, in the form of decision support systems, business analysis tools, etc.

The main objective of business intelligence is to bridge the gap between an organization's current status and its desired position. Business intelligence helps an organization achieve commercial success along with sound financial management.

Business intelligence is a framework designed to support the decision-making process. This framework combines architecture, databases, analytical tools and applications. Business analytics forms an integral part of business intelligence.

Framework of Business Intelligence

More and more businesses are moving towards business intelligence. The reason for this movement is the
business environment. Organizations are forced to capture, store and interpret data. This data is at the core
of business success. Organizations require correct information for any decision-making process.

Business intelligence combines data warehousing, business analytics, performance management, strategy and the user interface. A business receives data from various sources. This data is captured in the data warehouse, where it is stored, organized and summarized for further utilization. Authorized users can access this data and work on it to get the desired results. These results are then shared with executives for the decision-making process and can be published through dashboards or SharePoint sites.

Business Intelligence Architecture and Components

The main components of business intelligence are the data warehouse, business analytics, business performance management and the user interface.

Data warehouse holds data obtained from internal sources as well as external sources. The internal sources
include various operational systems.

Business analytics creates reports as and when required through queries and rules. Data mining is another important aspect of business analytics.

Business performance management links data with business objectives for efficient tracking. This performance information is then broadcast to the executive decision-making body through dashboards and SharePoint.
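
As a rough illustration of how these components fit together, the sketch below mimics the flow in miniature: rows pulled from a data warehouse are aggregated by an analytics query, then linked to business objectives for a performance-management dashboard. The table, regions and targets are invented.

```python
# Minimal sketch (hypothetical data and targets): warehouse records are
# summarised by an analytics step, then compared against business objectives
# for a performance-management dashboard.
from collections import defaultdict

warehouse_rows = [
    {"region": "North", "revenue": 120_000},
    {"region": "North", "revenue": 95_000},
    {"region": "South", "revenue": 140_000},
    {"region": "South", "revenue": 150_000},
]
targets = {"North": 250_000, "South": 275_000}   # business objectives (assumed)

# Business analytics: aggregate the warehouse data per region.
revenue_by_region = defaultdict(float)
for row in warehouse_rows:
    revenue_by_region[row["region"]] += row["revenue"]

# Business performance management: link results to objectives for the dashboard.
for region, actual in revenue_by_region.items():
    attainment = actual / targets[region]
    print(f"{region}: {actual:,.0f} of {targets[region]:,.0f} ({attainment:.0%})")
```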

Benefits of Business Intelligence

The benefits of business intelligence are as follows:

 Business intelligence provides a faster, more accurate process of reporting critical information.
 Business intelligence facilitates better and efficient decision-making process.
 Business intelligence provides timely information for better customer relationship management.
 Business intelligence improves profitability of the company.
 Business intelligence provides a facility for assessing the organization’s readiness to meet new
business challenges.
 Business intelligence supports usage of best practices and identifies every hidden cost.

Business intelligence usage can be optimized by identifying the key projects on which the company would like to
focus. This process of highlighting key projects is called business intelligence governance.

The importance of business intelligence is growing, and its usage has proliferated across various types of users. Earlier, it was in the domain of IT staff, but now business teams also handle business intelligence independently.

The Role of Data and Analytics in Insurance Fraud Detection

The rise of analytics presents a world of almost limitless potential for industries such as insurance where
companies have long held a foundation of information. The industry at large has had a slow adoption of
new Big Data analytics because of cost concerns, and regulation may be the limiting pressure of the future.

In the past, fraud detection was relegated to claims agents who had to rely on a few facts and a large amount of intuition. New data analysis has introduced tools to make fraud review and detection possible in other areas such as underwriting, policy renewals, and in periodic checks that fit right in with modelling.
The role this data plays in today’s market varies by insurer as each weighs the cost of improving its information systems against the losses caused by current fraud. This often comes down to the question: is fraud creating a poor enough customer experience that infrastructure investments will improve fraud detection and improve honest customers’ claims processes?
Protection of personal information is paramount, but fraud pattern recognition requires a large amount of data from underwriting, claims, law enforcement and even other insurers. Each new piece of legislation has only made the protection hurdle higher when integrating these sources.
Once this data is collected and properly utilised, insurers must consider if it is accurate. Modelling often
relies on past behaviours for fraud predictions, but criminal practices change quickly enough to make some
of this analysis worthless. Assessing data quality has become a struggle.
While analysis has proven a difficult task to master, today’s insurers are seeing many benefits. Fraud
detection has improved and systems are now robust enough to provide analytics in real-time. Some insurers
have gained the ability to scan for fraud before a policy or claim is approved, pushing Big Data from a
siloed fraud unit all the way to agents in the field.
The future of fraud detection, however, cannot be via a pure analytics approach. The human element in
assessing risk will remain a vital piece of proper detection. Data can hasten the detection of fraudulent
activity and patterns, but people will always be required to turn reports into actionable intelligence.
Setting the Stage of Today’s Market
The illegal activities that encompass fraud are first and foremost a detriment to the financial stability of
each insurer, but the harm caused is much more far-reaching. These deliberate acts have a long-term impact
on all operations of an insurer. Fraud losses and risks can lead to price increases for loyal customers as well
as introduce additional time and review before insurers pay legitimate claims. This increased scrutiny of
honest customers is only visible when they feel most vulnerable and are in the greatest need of the insurer’s
services.
Pressing customers in such a vulnerable position can create significant harm to reputation and trust, risking
increased policy turnover.
Fraud detection units and internal auditors typically manage most of the data and systems used to store and
process fraud detection. As automated processes become more in-demand, IT has a bigger role to play
within the fraud unit. The availability of real-time services will further the importance of IT in budgets and
decision making. Regardless of an IT or fraud background, team members must be well-trained to
understand the modern threat. As many units are still growing to scale, team members are pulling double-
duty as both IT experts and fraud analysts.
The Face of Today’s Fraud
In Europe, fraud is largely gang-related, so the focus is typically on third-party instead of first-party fraud.
Fraudsters pursue the path of least resistance and this eventually shifts to areas where there is less fraud
detection. Analytics engines that aren’t applied across an entire organisation may indicate where fraud will
shift to, such as pet care divisions for some insurers. Ghost-broking is also a growing area of fraud and
tends to stick to one type of insurance product. Analytics engines can help identify some of these areas and
establish patterns to help the market identify concerns before paying a claim and potentially before a claim
is filed or policy issued.
“The success of an individual fraudulent claim depends on the fraudster’s ability to present that as a genuine,
unique occurrence. Obviously frauds have common traits, and these can be determined through data sharing
and analytics,” said Ben Fletcher, Director of the Insurance Fraud Bureau.
The What and When of Data Availability
Most insurers have a huge repository of existing data in terms of historic claims and policy information
plus a steady stream of new claims and application data. Insurers work with law enforcement to share some
information; however EU law as well as country laws significantly limit what information can be shared
among insurers.
Much of this data is typically used to validate what’s being told by the claimant and what is being processed.
Insurers not only look for red flags in terms of conflicts but they also look for connections to organised
crime. Insurers today look for fraud in new policies and then review information when there are policy
changes. Touch points that cause a review include coverage shifts by insurers, new claims, changes by the
policy holder, and policy renewals.
“Sometimes not all needed data is available and the quality of the existing data is partly poor. We have to
find the right balance in reducing data volumes and gathering the best data for effective analysis,” said
Roland Woerner, Global Head of Counter Fraud at Zurich Insurance Group.
However, the market is improving. Unstructured data has become an opportunity instead of a problem.
Many insurers have the ability to change unstructured information into structured data and actively mine
this for the opportunities available therein.
“The challenge with some data is that some brokers are not always willing to give all of the information
that insurers’ fraud detection units would like, such as contact information. Email addresses and phone
numbers can be essential to identifying links to fraudulent activity,” said Steve Jackson, Head of Financial
Crime for Covea Insurance.
Existing Operations and Obstacles
To a large extent, Big Data analysis is being driven by IT imperatives and not mainline business operations.
Analytics are often introduced on a project basis and, if benefit is shown, then analytics platforms are
expanded to more divisions.
Insurers may implement these techniques in marketing or other customer service areas first, but fraud
detection units benefit from the tools and analysis just as much. The main point for the introduction of
analytics in a business sense is determining its present value and building the case for a consistent return.
It becomes a people plus power equation.
“For claims fraud prevention and detection, an insurer needs a highly professional organisation, and the
best people capabilities supported by excellent data analytics,” said Woerner.
These professionals can help companies make full use of core systems and external sources such as the
common fraud database provided by Insurance Information Centre. To avoid data concerns, “required fields
should be matched and accuracy of the fields examined step by step,” said Taşkın Kayıkcıoğlu, AGM, CIO
and Member of the Executive Committee at Groupama.
Fraud Systems from Silos to Ever-Present
In the past, systems were unable to speak together and often were siloed because integration technology
wasn’t available. Today, every insurer will be slightly different as they move to new services, so some
insurers have legacy problems while many others have robust systems that can pull data from multiple
sources.
Unfortunately, even insurers who have made significant investments are still operating with some silos
because of concerns over improper information sharing within departments. For many customers, the
information they provide can only be used by the department responsible for their policy. This means an
auto policy division cannot access much information collected by a homeowner’s insurance division. While
data sometimes may be collected and processed en masse, insurers must make sure that results and other
information are not passed along improperly or without consent.
“Many legacy systems lack detail and this is compounded by the fact that some departments still work in
silos. This means that disparate pieces of useful information about an entity are rarely pooled; but if they
could be, we would create a single accurate impression. Many analytics solutions use mapping layers, which
helps fraud departments pull in multiple data streams, either internally or externally, into a consolidated
view. Of course, this does nothing to ensure that data is no longer siloed by other departments,” said Jackson.
The Holistic Fiefdom
Claims investigation units typically hold the data for fraud detection, so they have a necessity for systems
integration. Unfortunately, many organisations still have a fiefdom mentality and this precludes a more holistic view
of the complete fraud threat that exists today.
Data-focused insurers are struggling to unify information around the touch-points of claims and
underwriting. This operational convergence is of the utmost importance. The conversation still comes back
to three main questions that insurers must answer for their business models:
■ What are the costs of advancing data analytics to the organisation?
■ Are fraud losses today creating a significant burden for current or future operations?
■ Is fraud creating bad press or making the customer experience poor?
As a whole, insurers believe they have control over the industry and its fraud, even with the slow pace of
adopting new technologies. Insurers who adopt a sense of urgency around data diligence are finding it to
be a significant point of distinction for their customers and their bottom line.
“It comes down to: how little can they spend to give the impression of excellent customer service and
maintain capabilities,” said Richard Collard, WW Business Development IBM i2 Fraud Analytics.
“Insurers too often take a Band-Aid approach to an infrastructural concern.”
It may take pressure from state regulators for the industry to adopt new services on a broad level.
Model Citizens and Model Concerns
“I think today we’re looking at a change in behaviour and the propensity to commit fraud,” said
Collard. Behaviour changes represent a challenge to insurers because behaviour modelling currently trains
and bases predictions on past, identified fraud practices. Many of these models have not been relevant in
recent years because the prediction data they’re using is simply too old.
The established belief that models must train for future behaviour based on past experience took a
significant hit during the financial crisis. However, it has been beneficial for insurers and other industries
to have this gap exposed.
“There are always challenges around data quality. It’s a perennial problem,” said Jackson.
Insurers and their fraud teams are starting to regain ground and learn what new behaviours look like to
respond to fraud. Predictive analytics is playing a stronger role as is entity analytics, the understanding of
who an individual is and if they are who they claim to be. Analytics engines can now run these checks and
raise concerns during the on-boarding process.
“The single biggest challenge is putting in the appropriate controls and team to ensure that you find the
fraud but that you don’t disrupt the customer experience in the process,” said Fletcher. Beyond speed, the
safety and security of the information itself is paramount.
Data Safety and Disclosure
“Everything we do is through a secure connection. I wouldn’t say we’re paranoid but we’re very conscious
about data security. Anything that leaves us goes through a secure connection,” said Jackson. A separate
fraud department exists in today’s insurer and this unit typically holds all of the data being used for
detection. Data from multiple sources, such as claims and underwriting, are syndicated and sent to the fraud
team that then does its analysis on-site.
Holding the data in a separate location allows the fraud team to enhance, modify, and update data safely
and securely. This also helps a fraud team keep data only on internal systems and away from Web-based
risks. Insurers take a significant blow to credibility when any data is lost or stolen.
While the data is being managed by fraud detection, it is up to individual agents throughout a policy’s
lifecycle to ensure that policy holders give their consent for data to be analysed. This has led to overt
disclosure that data will be monitored for fraud and that any discoveries will be shared with authorities.
“Transparency is important for credibility of anti-fraud activities. It’s one of our fundamental priorities to
keep our honest customers informed and it’s part of our fraud prevention approach,” said Woerner.
The industry is hoping to expand this type of sharing to new data as it is collected. For fraud detection,
“image recognition and voice analytics will be used in near future. For example, one photo can be used for
multi-claims, it should be prevented technically,” said Kayıkcıoğlu. Overt disclosure has also had a chilling
effect on some fraudulent activity.
“Now, there’s a strong chance that they’re not going to commit fraud unless they’re organised criminals –
then they don’t care,” said Jackson.
Does Fraud Detection Get in the Way of Other Business?
Fraud units have three main goals:
■ Detect fraud and pull potential fraudulent claims for in-depth review.
■ Return non-fraudulent claims back into the claims cycle so honest customers are not upset.
■ Perform the first two operations as seamlessly in the business cycle as possible.
Many insurers are now capable of performing analysis with Big Data to quickly flag or validate claims. The
automation process focuses on this speed and, overall, the industry is at a place where it can claim that very
little gets in the way.
“On the whole we don’t face any real problems with interrupting the cycle on a genuine claim,” said
Jackson. “Nothing gets in the way of the claim when we can help it.”
“Taking an attentive approach to fraud and associated costs means we are able to protect our honest
customers and continue to provide them with the best possible insurance cover now and in the future,” said
Woerner.
New innovation is helping to speed up the fraud processing of data and other services. Some providers can
even process information and provide an initial analysis while a person is in an office signing up for a
policy. Agents can often get a real-time approval or denial from an initial claims unit review.
Is Real-Time a Necessity?
When discussing Big Data and analytics in a broad sense, there is typically a business-case emphasis on
real-time functionality. In the insurance world, real-time processes are the preferred approach for
operations, but they are not a necessity for analysis once potential fraud is determined.
Insurers have to balance speed with thoroughness. The ultimate goal is to avoid the need to look for fraud after an insurer has made a sale.
However, this is mainly a propensity modelling concern, not a complete search for fraud. This modelling is used to determine the likelihood of a new policy holder committing a fraudulent act, and it can be done in real-time. Routine checks don’t have any need for lightning-fast speed, reducing the computing requirement and overall cost of analytics programs. Again, insurers are likely deploying propensity models as new information is uncovered or databases are updated.
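
The following is a hedged sketch of what such a propensity model might look like, here using scikit-learn's logistic regression. The features, synthetic labels and the 0.5 referral threshold are assumptions for illustration, not any insurer's actual model.

```python
# Toy propensity model: estimate how likely a new policy holder is to go on to
# commit fraud. All data, features and the threshold are invented.
import numpy as np
from sklearn.linear_model import LogisticRegression

# Columns: [prior_claims, months_since_last_policy_change, contact_details_supplied]
X = np.array([
    [0, 24, 1], [1, 12, 1], [4, 2, 0], [0, 36, 1],
    [3, 1, 0], [0, 18, 1], [5, 3, 0], [1, 30, 1],
])
y = np.array([0, 0, 1, 0, 1, 0, 1, 0])   # 1 = later confirmed fraud (synthetic labels)

model = LogisticRegression().fit(X, y)

# Score an application at onboarding; flag for review above an assumed threshold.
new_applicant = np.array([[2, 4, 0]])
propensity = model.predict_proba(new_applicant)[0, 1]
print(f"Fraud propensity: {propensity:.2f}",
      "-> refer for review" if propensity > 0.5 else "-> proceed")
```
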
In claims, insurers again want service to be as close to real-time as possible to maintain the best level of
customer service. Here and in policy origination, if fraud or a potential for fraud is detected, the need for
real-time decision making is reduced. Insurers want to take their time when reviewing cases for fraud, so it
is okay if the process becomes longer and more involved after a red flag is discovered. “We’ve found
quite a bit of fraud based on this kind of approach,” said Jackson.
Police Under the Insurance Umbrella
Insurers are taking a more prominent role in community monitoring by working with police to fund specific
units for fraud enforcement. Last year, the Association of British Insurers announced plans to invest £11.7
million over three years to help fund an expansion of the Insurance Fraud Enforcement Department within
the City of London Police.
Additionally, groups like the IFB help UK insurers to detect fraud rings and have reduced some
informational barriers. Information must flow directly to the IFB and not to other insurers, which some
insurers say dampens the ability of their in-house teams. However, this type of system-flow could be a
benefit to the industry as a whole.
Current fraud police units also have limited data-sharing back to the insurers. Much of their work is
predicated on information provided by the insurance companies, but English laws prevent a proper back-flow
of information to help all insurers learn new warning signs. Information resting solely in the hands
of law enforcement keeps a strong impetus out of the market. If all of this information were made accessible
to insurers, they would naturally write systems and software to share and collect what was available. This
sharing would be one of the strongest driving forces behind creating a common language for insurers’
systems.
It’s All Binary to Me
A system run by law enforcement is inherently rigid and the industry would need to conform to access data
it makes available. This could create a significant third-party market for software development and/or a rash
of in-house development that would potentially work across insurers.
“Understanding a common language will be an Esperanto for fraud investigation, which can only be a
good thing,” said Collard. As the language provided better channels to discovering new fraud, insurers
would focus on aligning more of their processes with this new language. “Success breeds success.”
The Acquisition Model
Many major insurers in the European Union have grown significantly by leveraging mergers and
acquisitions. This creates a unique problem for the adoption of big data initiatives by creating multiple
databases that an insurer has access to.
On its face, having multiple datasets seems like a boon. In fact, using multiple datasets is an established
best practice of fraud detection. However, the problem is that these datasets are not guaranteed to have a
similar architecture and may not integrate properly. Since these systems are typically not the focus of an
acquisition, they are often used in tandem instead of combined. This holds the insurer back by creating
multiple views of the customer.
“On this basis, it’s very difficult to create the ‘Holy Grail’ that is a single view of the customer,” said
Collard.
To address these issues, insurers must make fraud detection and analytics part of their core business rules
and development. “We combined all (business) rules in the company and put mathematical modelling on
this data, and got the necessary accuracy to find fraudulent cases with a 72% success ratio for 20% of all
claims,” said Kayıkcıoğlu.
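
As an illustration only (not the actual approach described in the quote), the snippet below shows one way business rules and a model score can be blended into a single fraud score used to refer a small share of claims for review; the rules, weights and claim fields are all invented.

```python
# Toy example of combining business-rule flags with a model probability.
# Every rule, weight and field name here is hypothetical.

def rule_flags(claim):
    """Return the business-rule flags a claim trips (all rules are invented)."""
    flags = []
    if claim["days_since_policy_start"] < 30:
        flags.append("new_policy")
    if claim["amount"] > 3 * claim["avg_claim_for_segment"]:
        flags.append("unusually_large")
    if not claim["contact_details_verified"]:
        flags.append("unverified_contact")
    return flags

def fraud_score(claim, model_probability):
    """Blend rule hits with a model probability into one score (weights assumed)."""
    rule_component = min(len(rule_flags(claim)) / 3, 1.0)
    return 0.6 * model_probability + 0.4 * rule_component

claim = {
    "days_since_policy_start": 12,
    "amount": 18_000,
    "avg_claim_for_segment": 4_500,
    "contact_details_verified": False,
}
score = fraud_score(claim, model_probability=0.35)
print(f"Flags: {rule_flags(claim)}, combined score: {score:.2f}")
```
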
Understanding Legacy Systems
Requirements of today’s data analytics often include an upgrade on some systems, but fraud detection units
have largely maintained an IT budget that has allowed them to stay up-to-date. The real concern in terms
of systems is the use of a third-party service or software, because privacy protections and concerns lie at
the feet of the insurer. Not having absolute control causes worry at the very least and represents a significant
liability at worst. Third-party systems also lack enough customisation to make insurers feel absolutely
comfortable.
“One would’ve hoped that the EU would have standardised approaches to data protection to actually share
data. It’s actually gone the other way for us. It has created a far greater protection of individuals’ rights.
This drives insurers and other institutions to continue to work in silos and that’s where the fraudsters pick
us off,” said Collard.
The Future
Why Is Today’s Fraud Detection Different?
Fraud detection has changed in its location relative to the insured. Insurers are now able to run predictive
and entity analytics during multiple touch points, essentially as each new piece of information is added. This
not only improves detection capabilities in the event of fraud, but it also allows an insurer to assess a fraud
risk. Some have begun providing risky policy holders with high-priced policies in order to drive them to
other service providers.
The insurer today has moved away from a purely reactionary stance to a proactive effort to keep bad
business off of its books. Insurers are seeing the financial benefit of enacting large efforts to keep fraudulent
activity completely out of the business cycle by identifying it during signup.
“The move from reactively looking at data and intelligence at a practitioner level, to using analytical tools
to proactively look for trends and patterns at an industry level has been the single biggest step forward from
the IFB’s point of view,” said Fletcher.
Beyond this shift, much of current evolution is around communication and it presents a clear opportunity
for moving forward. The future is about collaboration with brokers and other outside parties as much as
with other insurers.
“We need to be a lot more open about this information so we can do the proper analytics. The fact that we
haven’t got information isn’t an obstacle because most of it can be found with a little bit of research. But,
if it’s something that the policy holder is trying to conceal – such as publicly available phone numbers
being different from what they’ve given their broker – then it’s a potentially missed link or signal for fraud,”
said Jackson.
Blending the Art and Science
While analytics engines may get much of the coverage, the successful fraud detection unit of tomorrow
features a very well-educated staff.
“The more data we capture and the more detail we capture, the better we can refine these models. But,
there’s only so far we can go with probability,” said Jackson.
Fraud professionals are being asked to step up to the plate like never before. They have access to more data
and increasingly strong ways to manipulate it. Staff will need to be trained in these systems as well as new
fraud tactics.
“A strong emphasis on technical excellence guides us on how we approach fraud prevention and look after
the long term interests of Zurich and our customers,” said Woerner.
Insurers want to automate the fraud process as much as possible so that legitimate claims and false
positives are cleared as quickly as possible. At the end of the day, however, any flagged accounts still must be reviewed by
a person. A well-trained team can improve models by determining what normal behaviour is and what
fraudulent behaviour is. It’s the narrowing of the funnel from machine analytics on a large level to
individual attention for final review.
“We will never remove this from the human domain,” said Collard.
Where Is The Market Headed?
Use of analytics for fraud detection in insurance is essential to the future viability of the market. For new
technologies, there is a significant push in the underwriting process where rules and procedures can be
applied before a policy is issued. “Technically, handwriting scanning, image processing and smart phone
capabilities like geocoding and XDIF information can be used for advanced fraud solutions. We are
working with some R&D centres for these purposes,” said Kayıkcıoğlu.
However, there is no mad rush to adopt new third-party technologies or shift infrastructure. Recent market
events have made this image much clearer than many would have thought at the turn of 2014. Most notably,
Heartbleed poked a large security hole in OpenSSL, the widely used implementation of Transport Layer Security (TLS)
and its predecessor, Secure Sockets Layer (SSL). The consumer-facing Internet largely relied on SSL/TLS as a way to signify that a site
and its information were secure in the cloud. The flaw going unnoticed for years has likely caused a major
reduction in plans for insurers to move any part of operations to the cloud.
“We must protect data at all costs, no matter where it’s handled,” said Jackson.
The question has been: what is the potential benefit of the economies that come with adoption of the
cloud or a cloud-based platform? When the answer focused on reduced margins and increased competition,
cloud-based analytics were an easier case to make. Now, insurers must weigh the risks of criminals’ ability
to exploit the generosity of insurers who keep data siloed against the criminals’ potential ability to access
information if vulnerabilities arise from third-party processing power.
Where Should the Industry Look?
As austerity budgets continue in the UK and Europe, individuals, groups and gangs will look to the softest
option to make ends meet. Multiple insurers said their industry can learn a lot from credit card fraud
detection. These companies have adopted and invented new technologies to detect and deter fraud because
of a compelling business reason to act: regulators look heavily at money laundering.
“There is an overwhelming logic that says these technologies are absolutely relevant to what insurers should
be doing” and even without regulatory imperatives, these businesses should recognise the benefits available
to them, Collard said.
“Professional fraud analytics are crucial to bring fraud detection into the next level of excellence. At Zurich,
fraud detection analytics are there to support our people and to assure the highest level of objectivity,” said
Woerner.
The Future of Third-Party Data
Third-party data may play a role in fraud detection but it will likely reside in systems run by the IFB, police,
and other law enforcement for the near term. Major database providers don’t yet speak the same language
as insurers when it comes to privacy and value, so it’ll take a shift from the IT industry to start the adoption
of third-party data centres and fraud detection services.
In the UK, customer data is very strictly monitored. Similar protections are in place in France and Germany,
and EU nations are likely to move toward stricter data controls in the future. Privacy concerns will naturally
impact the data insurers use and own, so broad sharing will likely remain relegated to law enforcement
unless there is a significant shift in political climate.
Many insurers and other industries still feel burned from outsourcing and offshoring their customer service
to third-parties. Fraud detection systems become worthless when errors are introduced, so there is little
likelihood of complex systems being outsourced to anyone, even native developers. The potential of today’s
insurer lies in the realm of new data analysis, but its path is wholly determined by the human aspects present
in insurance.
The largest hurdle faced by insurers remains legislative barriers to sharing and pursuing information. Where
legislation allows, insurers are poised to collect and analyse new data and deliver better results. However,
tighter controls over an individual’s privacy may limit what analytics can do by stifling information pools.
The push toward Big Data and analytics for fraud is coming with a clarion call of automation and modelling.
Unfortunately, a pure automation operation can create as big an opportunity for fraud as already exists
in the market by producing exploitable data pattern recognition. Fraud detection still needs a human touch.
Even the most advanced systems still deliver a data product, not a finalised piece of information. “People
are still required to take this analysis and produce the final intelligence product that is useful to insurers,”
said Fletcher.
While data is at the core of the current revolution in insurance industry practices and advances, it must
inherently remain an industry that relies on gut feelings and human insight. A proper mix of machine and
human review can bring fraud detection to a new level, and an analytics backbone helps assure the highest
level of objectivity.
Ultimately, insurers face a choice of absorbing the cost to adopt these new fraud detection capabilities today
or of maintaining current operations in hopes that analytics will standardise and cheapen before increased
competition presses margins too thin.

Role of BA in Sensitivity Analysis:


What is a 'Sensitivity Analysis'

A sensitivity analysis is a technique used to determine how different values of an independent variable
impact a particular dependent variable under a given set of assumptions. This technique is used within
specific boundaries that depend on one or more input variables, such as the effect that changes in interest
rates have on bond prices.

BREAKING DOWN 'Sensitivity Analysis'


Sensitivity analysis, also referred to as what-if or simulation analysis, is a way to predict the outcome of a
decision given a certain range of variables. By creating a given set of variables, the analyst can determine
how changes in one variable impact the outcome.
Sensitivity Analysis Example

Assume Sue, a sales manager, wants to understand the impact of customer traffic on total sales. She
determines that sales are a function of price and transaction volume. The price of a widget is $1,000 and
Sue sold 100 last year for total sales of $100,000. Sue also determines that a 10% increase in customer
traffic increases transaction volume by 5%, which allows her to build a financial model and sensitivity
analysis around this equation based on what-if statements. It can tell her what happens to sales if customer
traffic increases by 10%, 50% or 100%. Based on 100 transactions today, a 10%, 50% or 100% increase in
customer traffic equates to an increase in transactions by 5, 25 or 50. The sensitivity analysis demonstrates
that sales are highly sensitive to changes in customer traffic.
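
A minimal what-if sketch of Sue's model, using only the figures from the example above (price $1,000, 100 transactions, and the stated 10% traffic to 5% volume relationship):

```python
# What-if sketch of Sue's sensitivity analysis. All figures come from the
# example above; the 10% traffic -> 5% volume relationship is taken as given.
price = 1_000          # price per widget ($)
base_volume = 100      # transactions last year

for traffic_increase in (0.10, 0.50, 1.00):
    volume_increase = 0.5 * traffic_increase          # 10% traffic -> 5% volume
    new_volume = base_volume * (1 + volume_increase)
    new_sales = price * new_volume
    print(f"Traffic +{traffic_increase:.0%}: "
          f"{new_volume:.0f} transactions, sales ${new_sales:,.0f}")
```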

Sensitivity vs. Scenario Analysis

In finance, a sensitivity analysis is created to understand the impact a range of variables has on a given
outcome. It is important to note that a sensitivity analysis is not the same as a scenario analysis. As an
example, assume an equity analyst wants to do a sensitivity analysis and a scenario analysis around the
impact of earnings per share (EPS) on the company's relative valuation by using the price-to-earnings
(P/E) multiple.
The sensitivity analysis is based on the variables impacting valuation, which a financial model can depict
using the variables' price and EPS. The sensitivity analysis isolates these variables and then records the
range of possible outcomes. A scenario analysis, on the other hand, is based on a scenario. The analyst
determines a certain scenario such as a market crash or change in industry regulation. He then changes the
variables within the model to align with that scenario. Put together, the analyst has a comprehensive picture.
He knows the full range of outcomes, given all extremes, and has an understanding for what the outcomes
would be given a specific set of variables defined by real-life scenarios.
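
A small sketch of the equity example: the sensitivity analysis sweeps EPS and the P/E multiple over ranges and records the resulting valuations, whereas a scenario analysis would instead plug in the specific EPS and P/E implied by a scenario such as a market crash. The ranges below are assumed.

```python
# Sensitivity sweep: valuation = EPS x P/E over assumed ranges of both inputs.
eps_range = [4.0, 4.5, 5.0]        # earnings per share ($), assumed
pe_range = [12, 15, 18]            # price-to-earnings multiple, assumed

print("P/E:    " + "  ".join(f"{pe:>6}" for pe in pe_range))
for eps in eps_range:
    row = "  ".join(f"{eps * pe:>6.1f}" for pe in pe_range)
    print(f"EPS {eps:.1f}: {row}")
```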

Sensitivity is the magnitude of a financial instrument's reaction to changes in underlying factors. Financial
instruments, such as stocks and bonds, are constantly impacted by many factors. Sensitivity accounts for
all factors that impact a given instrument in a negative or positive way in an attempt to learn how much a
certain factor impacts the value of a particular instrument.

BREAKING DOWN 'Sensitivity'


Sensitivity determines how an investment changes with fluctuations in outside factors. Stocks and bonds
are especially sensitive to interest rate changes.

Interest rates are one of the most important underlying factors in the movement of bond prices and are
closely watched by bond investors. These investors get a better idea of how their bonds will be affected
by interest rate movements by incorporating sensitivity into their analyses.

Examples of Sensitivity
Fixed-income investments are very sensitive to interest rate changes. A bond's duration reflects the change in
the bond's price for each 1% move in interest rates. For example, a bond with a duration of 4 loses roughly
4% of its value for every 1% rise in interest rates. A bond with a long maturity and low coupon
has a longer duration, is more sensitive to rate fluctuations and is more volatile in a changing market.
Buying a bond at a low interest rate means the bond will be less valuable when rates rise and other bonds'
yields are higher. Municipal bonds provide higher yields and are typically less sensitive to interest rate
changes than corporate bonds.
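
A worked version of the duration rule of thumb above (a linear approximation that ignores convexity), using the duration of 4 from the example:

```python
# Duration rule of thumb: percentage price change ~= -duration x change in yield.
duration = 4.0            # bond duration from the example above
for rate_change in (+0.01, -0.01, +0.02):   # yield moves of +1%, -1%, +2%
    price_change = -duration * rate_change
    print(f"Rates {rate_change:+.0%}: bond price moves roughly {price_change:+.0%}")
```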

Interest-sensitive stock prices also vary with interest rate fluctuations. Large share price changes are seen
with small rate changes. Stocks with large betas are extremely rate-sensitive. Utility stocks and
some preferred stocks are two examples of price-sensitive stocks.

Benefits of Sensitivity Analysis


Sensitivity analysis helps determine uncertainties in stock valuation and reduce risks in a private or
corporate portfolio. An investor needs to determine how the simplest changes in variables will affect
potential returns. Criteria for success, a set of input values, a range over which the values can move, and
minimum and maximum values for variables must be preset to determine whether a successful outcome has
been reached. After determining profitability forecasts, an investor can make better-educated decisions
regarding where to place assets while reducing risks and potential error.

Treasuries and finance departments are increasingly being required to disclose sensitivity analysis or other
risk measurement in financial statements. Every public or private company needs a method to analyze risks
and hedge against them appropriately.

Maximizing Business Value from Loyalty Analytics

Analytics can help you understand your customers better, identify new opportunities for your business,
measure effectiveness, and grow in ways you couldn’t with intuition and experience alone. Many executives
are familiar with stories that exemplify the promise of analytics. When I break these stories down, they boil
down to one of two themes: either predictive analytics found an opportunity that was compelling enough
to take action on, or more commonly, a business leader had key insights they decided to take action on
based on their experience or knowledge of the industry.

As data volumes increase, companies are investing more in analytics to arm executives with the best
knowledge available to make decisions. Analytics’ real value is in suggesting change to current processes
and/or measuring the results of that change. For instance, what is the impact of offering a short “double
rewards” campaign on your loyalty program? What was the sales lift created by the campaign, and what
was the additional cost for offering this promotion? Did it bring customers back who had been dormant for
more than six months? Analytics are the best way to measure the impact of such offers. Based on results,
future campaigns can be adjusted to achieve the results the business is trying to achieve.
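
A minimal sketch of that kind of measurement, with every figure invented: compare campaign-period sales against a baseline expectation, net off the extra reward cost, and count reactivated dormant customers.

```python
# Campaign measurement sketch (all figures are placeholders).
baseline_sales = 480_000        # expected sales without the campaign ($)
campaign_sales = 545_000        # observed sales during the campaign ($)
extra_reward_cost = 22_000      # cost of doubling the rewards ($)
reactivated = 310               # customers dormant > 6 months who purchased again

lift = campaign_sales - baseline_sales
net_impact = lift - extra_reward_cost
print(f"Sales lift: ${lift:,.0f}, net of promotion cost: ${net_impact:,.0f}")
print(f"Dormant customers reactivated: {reactivated}")
```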

A recent analysis from The Economist found that only 2% of business executives believe sales and
marketing analytics have achieved “broad, positive impact” at their companies, despite 70% of those
executives considering analytics “very” or “extremely important.” Are these results surprising to you? I
believe the disparity lies in the expectations. Analytics are great at measuring the impact of change, but for
the promise of analytics to be realized, business leaders must also be willing to try new ideas. When such
ideas are tried, we can measure the impact and provide insights on successful factors that help produce
positive results or discuss factors that show minimal or negative results.

How do you build a cycle of exploration you can use to drive success in your business? I have spoken about this
methodology at conferences across the US and Europe, receiving positive feedback from organizations that
decide to take a scientific approach to driving business change. I call this process PEAK:

Plan your campaign or business change:


 Focal Issue – What behavior are you trying to encourage? Increase volume of sales for products customers
already purchase? Introduce them to new categories or products? Cross-sell services based on product
sales?
 Scan – In order to build realistic expectations, collect historical information to gauge what success should
look like.
 Forecast – Based on your historical information, build a campaign forecast.
 Envision – Think of a future state that could drive the desired behavior long term if the campaign produces
the desired results.

Execute your campaign or business change:

 Release the campaign to your target audience.
 Communicate the advantages of taking action in the campaign.

Analyse campaign or business change results:

 Communicate results to those business leaders involved in the change. This is an area of opportunity for
most analytics teams. Use the analytics to create an outline for a compelling business story that is more
impactful than the numbers alone. In other words, the numbers should support the story rather than be the
story.
 Create an action agenda based on the results (continue, expand, change, stop).
 If the results are positive and you want to expand the program, automate the analytics to easily show
performance for future campaigns.

Keep adjusting based on the performance of the campaign or business change:

 Make changes from the action agenda in the prior step, adjust your Plan, and repeat the cycle.

With this easy-to-use framework, you harness the power of the scientific method, benefitting from a proven
structure that can help you identify methods that drive desired results.

human resources analytics (talent analytics)


Human resources analytics, also called talent analytics, is the application of sophisticated data mining and
business analytics (BA) techniques to human resources (HR) data.

The goal of human resources analytics is to provide an organization with insights for effectively managing
employees so that business goals can be reached quickly and efficiently. The challenge of human resources
analytics is to identify what data should be captured and how to use the data to model and predict
capabilities so the organization gets an optimal return on investment (ROI) on its human capital.

Although most organizations have enough data to make analytics useful, the data is often created and stored
in multiple places in multiple formats. There is no shortage of vendors who offer dedicated human resources
analytics software products, but many companies simply create a custom data warehouse for HR data and
leverage business intelligence (BI) applications on top. Other companies use data federation technology to
aggregate data from disparate sources in a virtual database.
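
A hedged sketch of the consolidation step such a custom warehouse or federation layer performs: HR records held in separate systems (here three hypothetical dictionaries keyed by employee ID) are merged into a single employee-level view that BI tools can then query.

```python
# Toy consolidation of HR data from disparate sources into one view.
# Source systems, field names and records are all hypothetical.
payroll = {
    "E001": {"salary": 58_000, "department": "Sales"},
    "E002": {"salary": 72_000, "department": "Engineering"},
}
performance = {
    "E001": {"last_review": 3.8},
    "E002": {"last_review": 4.4},
}
survey = {
    "E001": {"engagement": 0.62},
    "E002": {"engagement": 0.81},
}

# Build a single employee-level record from the three sources.
employees = {}
for emp_id in payroll:
    employees[emp_id] = {**payroll[emp_id],
                         **performance.get(emp_id, {}),
                         **survey.get(emp_id, {})}

for emp_id, record in employees.items():
    print(emp_id, record)
```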


HR analytics does not only deal with gathering data on employee efficiency. Instead, it aims to provide
insight into each process by gathering data and then using it to make relevant decisions about how to
improve the processes.

So how can companies use HR data analytics to make strategic personnel decisions? First, they’ll need
software, which isn’t hard to find. Huge vendors such as Oracle, IBM and SAP compete with many smaller
vendors to deliver the best HR analytics software as a service in the market.

But buying only HR analytics software won’t help at all if nobody in HR knows how to mine and interpret
data. Some larger companies are addressing this HR analytics talent shortage by hiring big data
architects, analysts or data scientists to work in human resources.

We have all come across the term “Big Data” many times – it always sounds big – but what is it? In
2012, Gartner updated its definition as follows: ‘Big data is high volume, high velocity, and/or high variety
information assets that require new forms of processing to enable enhanced decision making, insight
discovery and process optimization.’ In other words, we need new tools and technology because the data is
so big, fast changing and potentially unstructured.

So what does big data look like? Let’s see some examples.

 VISA processes more than 150 million transactions each day.
 500 million tweets are sent per day. That’s more than 5,700 tweets per second.
 Facebook has more than 1.19 billion active users generating social interaction data.

Too much, right? In fact, the big data and analytics market will reach $125 billion worldwide in 2015,
according to IDC and The International Institute of Analytics (IIA).

Analytics has always been an important topic and trend in every part of business, and HR is not far behind.
Today many organizations are looking for metrics or analytics in HR that relate not just to people
but also to processes such as recruitment, retention, compensation, succession planning, benefits, training
and development, performance and appraisal, and many others. In short, talent analytics is becoming more
popular these days as companies put a lot of effort into cultivating and aligning HCM with core business
objectives in order to achieve a competitive edge.

HR analytics is not only about gathering data on employees; it aims to provide insights into each process
by using data to make relevant decisions and improve processes and operational performance. HR collects
plenty of data on employees’ personal information, compensation, benefits, retirement, attrition,
performance and succession over time, so it is important to use it properly to interpret outcomes and spot
the trends.

Some typical benefits and use cases of analytics are as follows:

 Improve organizational performance through high-quality, talent-related decisions.
 Forecast workforce requirements and utilization for improved business performance.
 Optimize talent through development and planning.
 Identify the primary reasons for attrition and identify high-value employees at risk of leaving.
 Provide a source of competitive advantage for the organization.
 Manage applicants more effectively on the basis of their qualifications for a specific position.
 Recognize the factors that drive employee satisfaction and productivity.
 Determine individuals’ KPIs and their impact on the business.
 Enable HR to demonstrate its contribution to achieving corporate goals.

Analytics is also used in HR to track the cost of and investment in the talent pool, such as cost per hire, cost per
training participant, and revenue and expense per employee. It provides an opportunity to define strategies
for retention and hiring plans. It can also give a complete picture of an organization's headcount based on
demographics – age, gender, geography, department, qualifications, etc.
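
For example, a few of these cost and headcount metrics can be computed directly once the inputs are consolidated; the figures below are placeholders.

```python
# HR cost and headcount metrics sketch (all inputs are invented placeholders).
from collections import Counter

hires = 40
total_recruiting_spend = 180_000
revenue, total_expense, headcount = 12_500_000, 9_800_000, 260

print(f"Cost per hire: ${total_recruiting_spend / hires:,.0f}")
print(f"Revenue per employee: ${revenue / headcount:,.0f}")
print(f"Expense per employee: ${total_expense / headcount:,.0f}")

# Headcount by demographic slice, e.g. by department.
departments = ["Sales"] * 90 + ["Engineering"] * 110 + ["HR"] * 20 + ["Finance"] * 40
print(Counter(departments))
```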

The facts back this up as well. A survey by MIT and IBM reported that companies with
a high level of HR analytics had:

 8% higher sales growth
 24% higher net operating income
 58% higher sales per employee

As mentioned before, there are currently many analytics options in HR, but a few of them are
becoming especially popular these days.
One such option is talent analytics, which is more qualitative and basically covers talent-management
processes such as personal development, recruitment, succession planning and retention. It can help
organizations better analyze turnover, identify top performers, identify gaps and develop the
proper training to address them. It can also find the reasons for attrition and provide options for taking
strategic decisions on retention.

Workforce analytics is another common one, which is more quantitative; it helps leaders develop
recruiting methods and make specific hiring decisions, optimize the organization structure, identify and
quantify the factors that drive job satisfaction, and determine the need for new departments and positions. It
also helps the organization to identify, motivate and prepare its future leaders, align and motivate the
workforce, and continuously improve the way of working.

Workforce analytics and planning is the systematic identification and analysis of what an
organization is going to need in terms of the size, type, experience, knowledge, skills and quality of its
workforce to ensure that it has suitable access to talent for future business success.
Workforce planning is a set of procedures that an organization can implement to maintain the most efficient
employee/management team possible, maximizing profits and ensuring long-term success. Workforce
planning falls into two broad categories: operational and strategic.

Almost every organization does conduct some form of workforce planning, like headcount/FTE planning
and workforce analysis. They recognize the need to transform that planning. What they may not know is
that most barriers are tactical in nature and require small hops and not a big leap.
Traditional workforce analytics is largely intuitive, based on knowledge transfer rather than on data. But every
organization wants to back up its decisions with data and facts. Predictive analytics, based on statistics and
data, is therefore becoming more attractive. It helps leaders take more strategic decisions based on the facts. Data
is generally presented in graphics, statistical reports and dashboards, which are easy for leaders to understand.
It offers leaders solutions to some complex decision-making processes and helps them in
handling critical situations such as tackling pay gaps, identifying the set of workers who are most at risk of
resigning, understanding the psychographics (personality, interests, work styles, etc.) of
employees, the behavioral qualities of applicants, and many more.

What is Big Data for Human Resources?

By definition, Big Data in HR refers to the use of the many data sources available to your organization,
including those not traditionally thought of in HR; advanced analytic platforms; cloud-based services; and
visualization tools to evaluate and improve practices including talent acquisition, development, retention,
and overall organizational performance. This involves integrating and analyzing internal metrics, external
benchmarks, social media data, and government data to deliver a more informed solution to the business
problem facing your organization. Using these tools, HR organizations are able to perform analytics and
forecasting to make smarter and more accurate decisions, better measure inefficiencies and identify
management “blind spots”.

The ability to capture and analyze big data has enabled many companies to both increase revenues by better
understanding and more accurately targeting customers and cut costs through improved business processes.

Big data also has attracted the attention of human resource managers who now can analyze mountains of
structured and unstructured data to answer important questions regarding workforce productivity, the
impact of training programs on enterprise performance, predictors of workforce attrition, and how to
identify potential leaders.

Trends are changing rapidly, so it is very hard to keep up with all the new technologies, especially in analytics,
where none of it is easy: big data lakes, Hadoop, NoSQL, predictive analytics, in-memory analytics and many more.

According to the Cornell study, many organizations use dashboards to collect and share this engagement
information. HR staff should be encouraged to use dashboards as tools to plan for the future, rather than
just reviewing the data before heading into a meeting. One way companies could use these dashboards is
to predict potential problems or monitor how HR practices affect the workforce.

Most HR leaders understand the importance of HR analytics. Now they have to figure out how to use
analytics to enable their organizations to thrive because doing that will give their companies a leg up on the
competition.

So organizations are now involving themselves more in data management, analysis and further interpretation of data. To get this complex analysis working they need, of course, mastery of data science and statistics. Organizations that have already taken this step understand the benefit that data brings to their decisions and the value that those decisions bring to the organization.

We should also keep in mind that analytics is measured not by its size but by the impact that its results have on decisions. Its purpose is to increase organizational effectiveness, so creating reports that add no value to decision making and optimization will only increase cost for the company.

Workforce Analytics:

Workforce analytics can help enterprise leaders to develop and improve recruiting methods, make general
and specific hiring decisions, and keep the best workers with the company. In addition, workforce analytics
can help management personnel to:

 Predict the probability of an individual employee's success.

 Identify the need for new departments and positions.

 Determine which departments or positions can be reassigned or eliminated.

 Identify and quantify physical risks to employees in specific positions.

 Identify and quantify factors that influence employee job satisfaction.

 Analyze and predict current and future technological needs.

 Assign and delegate responsibility for tasks and goals.

 Optimize the enterprise's organizational structure.

 Help the enterprise to identify, encourage, and cultivate its future leaders.

What are Workforce Analytics?

Analytics aren’t just for statisticians anymore. A growing number of organizations are using analytics to
examine and act upon data about their people in the workplace. Known as workforce analytics, these
sophisticated tech tools are changing the HR game, and those companies willing and able to harness the
power of Big Data for analytics are seeing great gains and meaningful advantages over their competition.

HR leaders use workforce analytics in various forms, such as predictive and prescriptive analytics. The former helps managers comb through mounds of data about their people to determine information such as which employees have the greatest potential for success with the organization or are most likely to leave in the near future. The latter takes analytics a step further and prescribes actions HR leaders can take to help develop and retain key members of the workforce, based on data uncovered by predictive analytics.

HR analytics works by gathering workforce data, from work history to satisfaction scores, and feeding this information into advanced computer models. Using sophisticated algorithms, these models churn out insights that HR leaders can use to make critical decisions, such as whether to tweak commission structures to drive sales or invest more heavily in training to curb high attrition rates.
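
To make that predictive-to-prescriptive hand-off concrete, here is a hedged sketch that maps a predicted attrition probability to a recommended action; the thresholds and action labels are illustrative assumptions, not an established rule set.

# Hypothetical prescriptive layer on top of predictive attrition scores;
# thresholds and recommended actions are illustrative assumptions.
def recommend_action(attrition_risk: float, is_top_performer: bool) -> str:
    if attrition_risk >= 0.7 and is_top_performer:
        return "retention interview and compensation review"
    if attrition_risk >= 0.7:
        return "manager check-in within two weeks"
    if attrition_risk >= 0.4:
        return "offer targeted training or a career-path discussion"
    return "no action needed"

# Example usage with scores that a predictive model might have produced.
employees = [
    {"name": "A", "attrition_risk": 0.82, "is_top_performer": True},
    {"name": "B", "attrition_risk": 0.55, "is_top_performer": False},
    {"name": "C", "attrition_risk": 0.10, "is_top_performer": True},
]
for e in employees:
    print(e["name"], "->", recommend_action(e["attrition_risk"], e["is_top_performer"]))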

Corporate executives are fast jumping on the HR analytics bandwagon and for good reason. HR analytics
marks a new frontier: No longer are senior managers responsible for making workforce changes based on
hunches and past history. Rather, HR leaders can analyze data to inform hiring strategies, highlight business
opportunities and forge the best career paths for top performers.

“HR is very eager to take advantage of the ability to forecast talent demand, gauge talent supply, predict
retention and to anticipate HR-related outcomes,” says Elizabeth Craig, a research fellow with Accenture
Institute for High Performance.

Technology providers are eager to help, too. In August 2012, IBM bought talent management software
provider Kenexa for a whopping $1.3 billion. SAP (with SuccessFactors) and Oracle (with Taleo) also
made acquisitions to enter this field while smaller players like Visier and Evolv gain ground with highly
scalable, cloud-based tools. Taken as a whole, HR analytics is an expanding field: Researchers from Bersin
& Associates project that the global market for integrated talent management technologies will grow 22
percent to nearly $4 billion this year—almost double the growth rate of 12 percent in 2011-2012.

Benefits and Use Cases


Making the most of HR analytics requires connecting HR data with a company’s strategic objectives. Today’s HR leaders “face pressure to demonstrate the ROI” of an HR analytics system, according to Wendy Hirsh, a principal at the HR consulting firm Mercer. Fortunately, Hirsh says, “analyzing data so that it helps the company make the right business decisions” is a step in the right direction.

The right direction depends on the enterprise’s specific circumstances. For example, a Silicon Valley start-up that’s having a tough time retaining tech-savvy talent may use HR analytics to better anticipate employee turnover and provide incentives to curb attrition. A sales-driven agency, on the other hand, is more likely to examine its data to differentiate a high sales performer from an under-achiever.

HR leaders that successfully match data elements to their human capital needs in a way that can impact
decision making can expect a number of key benefits:

Reduce attrition. By identifying, in the nick of time, top employees who are about to leave the company, or by sweetening the compensation pot for Baby Boomers considering early retirement, an effectively deployed HR analytics application can save a company millions of dollars in lost talent. Factors such as location, pay scale and personality type can all be fed into an HR analytics system to retain the best people in a talent pool. Consider, for example, The Results Companies based in Dania Beach, Fla. Using Evolv’s workforce-analytics solution, the business-process-outsourcing company has been able to improve its hiring practices, thereby reducing attrition rates by nearly 35 percent in less than two years. That’s a savings of hundreds of dollars per employee.

Anticipate performance. Unfortunately, no amount of glowing references or impressive credentials can truly predict a candidate’s on-the-job performance. HR analytics can address this gap by identifying workers with strong leadership qualities and flagging those who are unlikely to mesh with a company’s corporate culture. By better matching job applicants to the right positions, The Results Companies enhanced performance rates by 20 percent and boosted revenue per agent by 4 percent.

Compensation efficiency. Sales analytics applications are most commonly used by sales managers, but HR leaders can also take advantage of them: software that can compare and select from a variety of sales compensation models before putting them into production. For example, a company may have a compensation structure that rewards the acquisition of new accounts by granting salespeople a 10 percent cut of the account’s estimated worth. Through analytics, however, it may be discovered that rewarding top performers with a predetermined annual bonus is a more cost-effective tactic for driving sales and stoking competition among sales reps. Such questions are important: according to a 2011 study from Compensation Analytics, 70 percent of respondents identified improving compensation design processes as a top priority.
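
The kind of comparison described above can be sketched with a simple back-of-the-envelope simulation; the account values, commission rate and bonus figures below are illustrative assumptions, not data from the study.

# Hypothetical comparison of two sales compensation models; all figures
# (account values, commission rate, bonus size, threshold) are assumptions.
new_accounts_per_rep = [120_000, 80_000, 200_000, 150_000, 60_000]  # estimated worth of accounts won by each rep

# Model 1: 10 percent commission on each new account's estimated worth.
commission_cost = sum(0.10 * value for value in new_accounts_per_rep)

# Model 2: a fixed annual bonus paid only to reps above a performance threshold.
bonus_per_top_rep = 12_000
top_rep_threshold = 100_000
bonus_cost = sum(bonus_per_top_rep for value in new_accounts_per_rep if value >= top_rep_threshold)

print(f"Commission model cost: ${commission_cost:,.0f}")   # $61,000
print(f"Fixed-bonus model cost: ${bonus_cost:,.0f}")       # $36,000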

Enhance employee morale. It can cost upwards of $10,000 in recruiting and training costs to replace a single $8-per-hour employee, according to The Sasha Corporation, a Cincinnati-based consultancy. Analytics tools can gauge signs of dissatisfaction and point to ways of retaining individual workers, boosting employee morale. Career-development planning, connecting high performers with training programs, gathering information from employee surveys: these are all ways HR analytics tools can measure an employee’s satisfaction and willingness to stay on board.
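
Using the replacement-cost figure cited above, a quick back-of-the-envelope calculation shows how modest retention gains translate into savings; the headcount and attrition numbers are illustrative assumptions.

# Rough savings estimate from reducing attrition, using the ~$10,000
# replacement cost cited above; headcount and attrition rates are assumptions.
replacement_cost = 10_000      # recruiting and training cost per replaced employee
headcount = 500
baseline_attrition = 0.30      # assumed: 30% of staff leave each year
improved_attrition = 0.24      # assumed: after morale and retention initiatives

employees_retained = headcount * (baseline_attrition - improved_attrition)
annual_savings = employees_retained * replacement_cost
print(f"Additional employees retained per year: {employees_retained:.0f}")   # 30
print(f"Estimated annual savings: ${annual_savings:,.0f}")                   # $300,000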

Challenges to Making Analytics Work

As with any analytics implementation, the promise of these HR systems depends on an enterprise’s ability
to employ the right people, business processes and tools to make them effective.
