Abstract—Event processing is a technique for tracking and analyzing information about events as they happen and deriving conclusions from them. Complex Event Processing (CEP) combines data from multiple sources to infer events or patterns that correspond to more complicated circumstances. Complex events must be detected in real time so that there is time for an appropriate reaction. However, some events should be prevented rather than responded to after they have occurred; this can be achieved using predictive analysis. Predictive analysis (PA) thus enhances the performance of CEP. In this paper, we define CEP and PA technology and provide a conceptual framework that creates synergy between CEP and PA. This framework can serve as the basis of a general design pattern in the future.


Keywords—Complex Event Processing, Predictive Analytics



I. Introduction

This paper presents the synergy between two promising technologies: Complex Event Processing and Predictive Analysis. The two technologies can be integrated to produce results that are more efficient than either technology achieves alone. Complex Event Processing is the ability to import data from multiple sources, apply complex rules, and derive outbound actions. CEP can be employed in diverse applications ranging from financial analysis, healthcare, and real-time business intelligence to fraud detection. CEP processes streams of data to detect complex events in real time.

Predictive analytics is a methodology that uses data mining and machine learning techniques to predict future events. It deals with analyzing historical or current data to foresee the future. One example of PA is weather forecasting, where data from various sources must be analyzed and an appropriate predictive model applied to anticipate the weather. Credit card fraud detection is another example, in which fraudulent activities must be detected in real time.

CEP does not predict complex events; it requires events to occur before it can analyze them, whereas PA can predict events based on historical data. By combining these two methodologies, we aim to enhance both the analysis and the prediction process.


II. BACKGROUND


A. Basic Definitions

An event is anything that happens, or is contemplated as happening. In computing, an event can be visualized as an object that represents, encodes, or records such an occurrence, generally for the purpose of computer processing. An event may correspond to a virtual or a real-world scenario, and it can be either simple or complex. A simple event is one that is not a composition or abstraction of other events; it cannot be broken down into simpler events. A complex event, on the other hand, is an abstraction of simple events, i.e., a complex event is composed of several simple events. An event source provides events as input to the event processing system; examples of event sources are RFID sensors and actuators, business flows, and CICS applications. An event cloud is a partially ordered set of events (poset), either bounded or unbounded, where the partial orderings are imposed by causal, timing, and other relationships between the events. An event processing network is a set of event processing agents (EPAs) and the channels they use to communicate, where an event processing agent is a software module that processes events.
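The distinction between simple and complex events can be made concrete with a small sketch. The class and field names below are illustrative, not part of any standard CEP API: a simple event records one observation, and a complex event is an abstraction over several simple events.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class SimpleEvent:
    source: str       # the event source that produced it (e.g., an RFID sensor)
    timestamp: float  # when the event occurred
    payload: dict     # the recorded observation

@dataclass
class ComplexEvent:
    name: str
    members: List[SimpleEvent] = field(default_factory=list)

    def span(self) -> float:
        """Time window covered by the constituent simple events."""
        times = [e.timestamp for e in self.members]
        return max(times) - min(times)

# Two simple events from different sources abstract into one complex event.
readings = [
    SimpleEvent("rfid-gate-1", 10.0, {"tag": "A17"}),
    SimpleEvent("rfid-gate-2", 12.5, {"tag": "A17"}),
]
movement = ComplexEvent("tag-moved", readings)
print(movement.span())  # 2.5
```

Here the complex event "tag-moved" cannot be broken down further without losing its meaning, yet it exists only as a composition of the two simple gate readings.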


B. Complex Event Processing

Complex Event Processing, or CEP, is an event processing technique that takes data from various sources as input and analyzes and processes it to detect more complicated circumstances as output. The aim of Complex Event Processing is to identify meaningful and relevant events (such as opportunities or threats) and respond to them as quickly as possible. CEP has the potential to define, manage, and predict events or situations that can represent an opportunity or a threat in complex networks, and it deals with detecting complex events in real time.

Unlike traditional database systems, where queries are executed on stored data, CEP executes stored (continuous) queries on flowing data. CEP queries are applied to potentially infinite streams of data: inputs are processed as they arrive, and data that is not relevant is discarded. Once the system matches the input events to a predefined sequence or pattern, an output is generated. This aspect gives CEP its real-time analytics capability, which makes it suitable for high-demand continuous intelligence applications that enhance situation awareness and real-time decision making. CEP finds application in algorithmic trading, security and fraud detection, healthcare, and other domains.
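The idea of a stored query running continuously over a stream can be sketched in a few lines. The example below is a minimal illustration, not a production CEP engine: a pattern is registered once, each arriving event is checked against it, events that do not advance the pattern are discarded, and an output fires only when the full sequence occurs within a time window.

```python
from collections import deque

def make_sequence_detector(pattern, window):
    """Return a callback that consumes (timestamp, event_type) pairs and
    reports True when `pattern` occurs in order within `window` seconds."""
    matched = deque()  # timestamps of the events matched so far

    def on_event(ts, etype):
        # Expire partial matches that fell out of the time window.
        while matched and ts - matched[0] > window:
            matched.popleft()
        # Only events that advance the pattern are retained;
        # irrelevant events are discarded, not stored.
        if etype == pattern[len(matched)]:
            matched.append(ts)
        if len(matched) == len(pattern):
            matched.clear()
            return True  # complex event detected
        return False

    return on_event

# Register the "query" once; the stream then flows through it.
detect = make_sequence_detector(["fail", "fail", "ok"], window=5.0)
stream = [(1.0, "fail"), (2.0, "fail"), (3.0, "ok")]
print([detect(ts, e) for ts, e in stream])  # [False, False, True]
```

The fraud-style pattern here (two failed logins followed by a success) is purely illustrative; the point is that the query persists while the data flows past it, inverting the traditional database model described above.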

In the algorithmic trading space, an algorithm is a sequence of steps by which patterns in real-time market data can be recognized and responded to, in order to detect trading opportunities and to place and manage orders in the market. Before algorithmic trading, traders manually carried out the process of building and managing a trading strategy. Traders at trading stations had to watch four to eight screens of real-time market data, manually analyze patterns (possibly in a spreadsheet), and work out where to place buy and sell orders. With the advent of algorithmic trading, a trader can initiate and manage hundreds of algorithms without being directly involved, and can therefore be far more productive.

Algorithmic trading is most advantageous when the time between an algorithm's conception and its implementation can be kept short. The traditional approach of developing trading strategies in C++ or Java takes a long time; using CEP can reduce it. CEP processes streams of continuously changing data and produces results in near real time, which significantly increases the efficiency of a strategy.


CEP also finds application in the IoT space. With the advance of the Internet of Things, more and more data is collected from sensors and smart devices, and this huge amount of data should be analyzed and responded to as quickly as possible. Since CEP can analyze data in real time, integrating CEP with IoT can result in efficient systems.
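A typical IoT use of CEP can be sketched as follows. The scenario, threshold, and function name are hypothetical: a temperature sensor streams readings, and an "overheating" complex event is raised when several consecutive readings exceed a threshold.

```python
from collections import deque

def overheating_events(readings, threshold=80.0, run_length=3):
    """Scan a stream of sensor readings and report the indices at which
    `run_length` consecutive readings above `threshold` are observed."""
    alerts = []
    recent = deque(maxlen=run_length)  # sliding window over the stream
    for i, value in enumerate(readings):
        recent.append(value)
        if len(recent) == run_length and all(v > threshold for v in recent):
            alerts.append(i)  # index at which the complex event fired
            recent.clear()    # start looking for the next episode
    return alerts

print(overheating_events([70, 82, 85, 90, 75, 81, 83, 86]))  # [3, 7]
```

Each individual reading is a simple event; the run of high readings is the complex event that warrants a reaction, illustrating how CEP turns raw sensor data into actionable situations.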


C. Predictive Analytics

Predictive analytics extracts useful information from past or present data and analyzes it to predict future events. The patterns found in historical data can be used to identify and anticipate risks and opportunities. The task of a predictive model is to estimate how likely a similar unit in a different sample is to exhibit the same behavior. Predictive analytics is used in several fields: financial services, insurance, telecommunications, retail, travel, healthcare, the pharmaceutical industry, etc. The techniques used in PA include regression techniques, such as linear regression and logistic regression, and machine learning techniques, which enable computers to learn from the data fed into the system; machine learning is widely used in PA because of its self-learning capability. A predictive process follows four steps:

1. First, the raw data is collected from the sources and pre-processed.

2. The pre-processed data is transformed into an appropriate structure so that it can be given as input to a machine learning technique.

3. The learning model, or training set, is constructed based on the transformed data.

4. The predicted output is reported using the previously created training set.
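The four steps above can be sketched end to end. This is a toy illustration under stated assumptions: the "machine learning technique" is stood in for by simple one-variable least squares, and the raw data format (comma-separated pairs) is invented for the example.

```python
def preprocess(raw_lines):
    """Steps 1-2: collect raw records, discard malformed ones, and
    transform the rest into (x, y) pairs suitable for model input."""
    rows = []
    for line in raw_lines:
        try:
            x, y = line.split(",")
            rows.append((float(x), float(y)))
        except ValueError:
            continue  # pre-processing: drop records that do not parse
    return rows

def train(rows):
    """Step 3: fit slope and intercept by ordinary least squares."""
    n = len(rows)
    mx = sum(x for x, _ in rows) / n
    my = sum(y for _, y in rows) / n
    slope = (sum((x - mx) * (y - my) for x, y in rows)
             / sum((x - mx) ** 2 for x, _ in rows))
    return slope, my - slope * mx

def predict(model, x):
    """Step 4: report the predicted output from the trained model."""
    slope, intercept = model
    return slope * x + intercept

raw = ["1,2.0", "2,4.0", "oops", "3,6.0"]   # one malformed record
model = train(preprocess(raw))               # steps 1-3
print(predict(model, 4))                     # step 4 → 8.0 on this linear toy data
```

In a real PA pipeline each step is far richer (feature engineering, a proper learning algorithm, validation), but the data flow from raw records to a reusable fitted model is the same.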

Predictive modeling describes a methodology in which the underlying relationships in historical data are recognized and analyzed in order to make predictions. Predictive models analyze, test, and validate the data in order to make the best possible estimate of the probability of an outcome. A model is reusable: it is created by a training algorithm that analyzes the historical data, and the model is then saved. The model is saved because there is always a fair probability that it can be applied to similar data in the future. Predictive models use certain algorithms to achieve this task; some of these algorithms are:

Time series model: Here predictions are time-based, i.e., a time series model helps in predicting the behavior of a variable at equally spaced points in time. Time series models are used in forecasting short-term events from previous patterns. This model is