Analyse your data stream in real time and make the most of it

With a Complex Event Processing platform you can analyse your stream of information and automatically take action in real time


How does Complex Event Processing work?


Data Source

Your IT systems exchange vast amounts of information. This includes technical messages, such as a form being opened on your website, network traffic information and sensor data, but also more meaningful information like new orders from your customers. You obviously have access to most of that information in dedicated systems, in a more aggregated manner and on demand. But what if you could combine messages from different systems and react on the spot, just after they were generated? Event processing systems are designed to analyse messages in real time, enrich them with external information, combine them into more complex events, analyse them for patterns and trigger actions.


Continuous Data Collection

Continuously collect data from various sources, such as transactional databases, application log files, messaging queues and IoT APIs. Ingest it in multiple formats, like CSV, JSON, XML or Avro, and over many protocols, like HTTP, FTP, NFS or AMQP. Use CDC (Change Data Capture) to receive a stream of changes from databases. Turn batch sources into streams of events too, so you can work on them in the same streaming manner.
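The idea of collecting data in multiple formats into one stream can be sketched as follows. This is a minimal, hypothetical illustration: the record fields and the two parsers are assumptions for the example, not part of any specific platform, and real ingestion would read from queues or files rather than in-memory strings.

```python
import csv
import io
import json

def parse_csv(payload: str):
    """Yield dict records from a CSV payload (header in the first row)."""
    reader = csv.DictReader(io.StringIO(payload))
    for row in reader:
        yield dict(row)

def parse_json_lines(payload: str):
    """Yield dict records from a newline-delimited JSON payload."""
    for line in payload.splitlines():
        if line.strip():
            yield json.loads(line)

PARSERS = {"csv": parse_csv, "jsonl": parse_json_lines}

def collect(sources):
    """Merge heterogeneous sources into a single stream of records."""
    for fmt, payload in sources:
        yield from PARSERS[fmt](payload)

# Two batches from different systems become one stream of records.
sources = [
    ("csv", "order_id,amount\n1,9.99\n2,24.50"),
    ("jsonl", '{"order_id": "3", "amount": 5.0}'),
]
records = list(collect(sources))
```

Downstream processing then only sees uniform records, regardless of how each source delivered its data (a real pipeline would also normalize field types, which this sketch skips).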


Analyst workbench

The analyst workbench is the user interface analysts use to create, browse, update or delete the configurations executed by the streaming platform. Streaming jobs are configured in Python, Java or SQL, depending on the required level of abstraction and performance goals. The workbench is integrated with the CI/CD pipeline, version control and the test-and-review process for a smooth deployment workflow.
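A streaming job kept under version control and validated in CI could look like the sketch below. The job fields, the `kafka://` and `jdbc://` URIs and the validation rules are all illustrative assumptions; the point is only that job configurations are plain, reviewable data.

```python
# Hypothetical job definition as plain data, the kind of configuration
# an analyst workbench could version-control and validate in CI.
REQUIRED_KEYS = {"name", "source", "sink", "query"}

def validate_job(job: dict) -> list:
    """Return a list of validation errors (an empty list means valid)."""
    errors = [f"missing key: {k}" for k in sorted(REQUIRED_KEYS - job.keys())]
    if "query" in job and not job["query"].strip().lower().startswith("select"):
        errors.append("query must be a SELECT statement")
    return errors

job = {
    "name": "orders-per-minute",
    "source": "kafka://orders",
    "sink": "jdbc://metrics",
    "query": "SELECT window_start, COUNT(*) FROM orders GROUP BY window_start",
}
errors = validate_job(job)
```

A CI step would run such checks on every change before the job is deployed to the streaming platform.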


Continuous processing

Event processing is the core of the whole solution. This component can filter single events, remove duplicates, enrich events with external sources and sessionize them into business events. It can then aggregate the events, find complex patterns or detect anomalies. Finally, it can score the events with Machine Learning models and, when an event meets the required conditions, trigger the appropriate action or alert.
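The stages above (filter, deduplicate, aggregate, trigger) can be sketched in a few lines. The event fields, the "three failed logins per user" rule and the `lock_account` action are invented for the example; a production pipeline would also bound its state with time windows.

```python
from collections import defaultdict

def process(events, threshold=3):
    """Filter, deduplicate and aggregate events; emit alerts on a pattern."""
    seen_ids = set()
    per_user = defaultdict(int)
    alerts = []
    for event in events:
        if event.get("type") != "login_failed":   # filter irrelevant events
            continue
        if event["id"] in seen_ids:               # drop duplicates
            continue
        seen_ids.add(event["id"])
        per_user[event["user"]] += 1              # aggregate per user
        if per_user[event["user"]] >= threshold:  # complex condition met
            alerts.append({"user": event["user"], "action": "lock_account"})
    return alerts

events = [
    {"id": 1, "type": "login_failed", "user": "alice"},
    {"id": 1, "type": "login_failed", "user": "alice"},  # duplicate, dropped
    {"id": 2, "type": "page_view",    "user": "alice"},  # filtered out
    {"id": 3, "type": "login_failed", "user": "alice"},
    {"id": 4, "type": "login_failed", "user": "alice"},  # third failure
]
alerts = process(events)
```

Here a single low-level stream yields one derived business event: the third distinct failed login for `alice` triggers an action.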


External Data Source

You may want to use data that is not available in your Data Lake, e.g. for data enrichment. Our design allows you to access data from multiple systems, such as external databases, files and data stores, within a single query. You do not need to load data from different sources into one place to use it in your event processing.
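Enrichment from an external source amounts to a lookup joined onto the stream in flight. In this minimal sketch a dict stands in for the external database; the customer fields and the `unknown` fallback are assumptions for the example.

```python
# A dict standing in for an external reference database.
CUSTOMER_DB = {
    "c-1": {"segment": "premium"},
    "c-2": {"segment": "basic"},
}

def enrich(events, lookup):
    """Attach external reference data to each in-flight event."""
    for event in events:
        extra = lookup.get(event["customer_id"], {"segment": "unknown"})
        yield {**event, **extra}

events = [{"customer_id": "c-1", "amount": 100},
          {"customer_id": "c-9", "amount": 7}]
enriched = list(enrich(events, CUSTOMER_DB))
```

The stream never stops: unknown keys get a default rather than blocking processing, and the external store is consulted per event instead of being copied into the pipeline first.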


ML Models

Including Machine Learning models in your real-time decision-making process can be very beneficial. In our design you can enrich your events with the output of external models, for example by scoring your users in real time, serving recommendations or performing fraud detection.
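Scoring events with a model in flight can be sketched as below. The `fraud_score` function is a stub standing in for a call to an external model-serving endpoint, and its rules and threshold are invented for the example.

```python
def fraud_score(event) -> float:
    """Stub model: large amounts from new accounts look risky."""
    risk = 0.0
    if event["amount"] > 1000:
        risk += 0.6
    if event.get("account_age_days", 0) < 7:
        risk += 0.3
    return risk

def score_stream(events, threshold=0.5):
    """Attach a model score to each event and flag those above threshold."""
    for event in events:
        score = fraud_score(event)
        yield {**event, "fraud_score": score, "flagged": score >= threshold}

events = [{"amount": 1500, "account_age_days": 2},
          {"amount": 20, "account_age_days": 400}]
scored = list(score_stream(events))
```

Flagged events would then flow to the action stage, for example to hold a transaction for review, while the rest pass through untouched.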


Data Lake

Store the structured (like transactions from an e-commerce system), semi-structured (e.g. XML or JSON files) and unstructured data (images, but also documents) that is collected and transformed in the ESP platform. Make it accessible for reporting and analytics purposes. The ESP platform archives processed messages in the Data Lake and makes them available for further Big Data analytics.
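Archiving a stream into a Data Lake typically means writing events into date-partitioned paths, so batch analytics can prune partitions later. This sketch builds the partition layout in memory; the `events/dt=...` path scheme is a common convention assumed for the example, and a real archiver would write to object storage.

```python
import json
from collections import defaultdict

def partition_events(events):
    """Group events into date-partitioned file paths with JSON-lines content."""
    partitions = defaultdict(list)
    for event in events:
        path = f"events/dt={event['date']}/part-0.jsonl"
        partitions[path].append(json.dumps(event, sort_keys=True))
    return {path: "\n".join(lines) for path, lines in partitions.items()}

events = [{"date": "2024-05-01", "order_id": 1},
          {"date": "2024-05-01", "order_id": 2},
          {"date": "2024-05-02", "order_id": 3}]
files = partition_events(events)
```

Each partition can then be queried independently by downstream analytics, reading only the dates a query actually needs.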


Continuous Data Delivery

Instantly deliver the results of event processing to the destination systems, in the format and over the protocol they expect. Events can be published to a message queue, pushed to an HTTP API or inserted into a database. They can also be batched and sent as a file to a file broker.
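Routing results to per-destination sinks can be sketched as a small dispatch table. The sinks here are in-memory lists standing in for a message queue, an HTTP API and a file drop, and the event types and routing rules are assumptions for the example.

```python
import json

queue_sink, api_sink, file_sink = [], [], []

SINKS = {
    "queue": lambda e: queue_sink.append(json.dumps(e)),  # serialized for a queue
    "api":   lambda e: api_sink.append(e),                # would be an HTTP POST
    "file":  lambda e: file_sink.append(e),               # batched into files later
}

def deliver(events, routing):
    """Send each event to every sink configured for its type."""
    for event in events:
        for destination in routing.get(event["type"], []):
            SINKS[destination](event)

deliver([{"type": "alert", "user": "alice"},
         {"type": "metric", "value": 42}],
        routing={"alert": ["queue", "api"], "metric": ["file"]})
```

One event can fan out to several destinations, each receiving it in its expected format.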


Security

The security and access management tool controls user access to data and to components of the environment. It provides audit capabilities for verifying who has access to specific resources.


Automation

Deployment automation with proper configuration management is key to ensuring high-quality software delivery and to reducing the risk of production deployments. All our code is stored in a version control system. We design tests to be part of the Continuous Integration and Continuous Deployment pipelines.


Monitoring

A comprehensive monitoring and observability solution gives detailed information on the state and performance of all components. You can also deploy custom metrics to observe application processing behaviour. Monitoring also includes the alerting capabilities needed for reliability and supportability.
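Application-level metrics for a processing loop can be as simple as the sketch below: a few counters and per-event latencies, the kind of signals a monitoring stack would scrape and alert on. The metric names and the failure condition are invented for the example.

```python
import time
from collections import Counter

metrics = Counter()
latencies = []

def handle(event):
    """Process one event while recording throughput, failures and latency."""
    start = time.perf_counter()
    metrics["events_total"] += 1
    if event.get("bad"):
        metrics["events_failed"] += 1  # would also fire an alert rule
    latencies.append(time.perf_counter() - start)

for e in [{"id": 1}, {"id": 2, "bad": True}, {"id": 3}]:
    handle(e)
```

An alert rule could then watch the ratio of `events_failed` to `events_total`, or the tail of the latency distribution.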


Orchestration

Originally, the components of the Hadoop ecosystem were installed with YARN as the orchestrator, to achieve scalability and manage infrastructure resources. Nowadays Kubernetes is becoming the new standard for managing resources in distributed computing environments. We design our applications and workloads to run directly on Kubernetes.


Use Cases

Imagine you stream all the events generated by users (like clickstream), all the messages sent by different systems, even internal ones, and all transactional data changes (like those coming from CDC) through one huge pipeline. Imagine that you are able to pick out the ones that have semantic meaning for you, derive other complex events from them, and take a certain action on that basis. The action could be sending a message to the customer, personalizing the website or computing a real-time metric that half of the company looks at.


Get Free White Paper

Read more about Complex Event Processing. Find out why analysing events is important for your business, what you can get out of events and how to implement CEP in your organization.


We build the solution together with you, so you can learn how to maintain and extend it in the future

How do we work with customers?

Our team of experienced architects, developers and data engineers has already completed a number of projects involving real-time event processing. We regularly give presentations at international and local conferences and events.

  • Big Data focused

    Big Data is not about technologies, but about building a culture of collecting, analysing and using data in a structured way, in an innovation-friendly environment. We can help you start this journey.

  • Technology agnostic

    Our solutions are designed around best practices and our vast experience in Big Data, not around specific technologies. This gives us the flexibility to adjust the design to the project's specifics and the current state of the art, to better serve the goal.

  • Open source or native cloud services

    We build our solutions with openness in mind, so we use open source software extensively; however, in justified cases we suggest using managed services offered by public cloud providers.

  • On-premise or in public cloud

    Our solutions are designed to be deployed on your local infrastructure, in a hybrid cloud or fully in the public cloud.

  • DataOps principles

    Our code is versioned, unit tested and deployed using CI/CD. We also design unit tests for data to measure its quality in large data sets.

  • Hadoop distribution

    For customers who want to stick to the free, open source version of Hadoop, we have prepared our own distribution built from the latest packages.

Ready to build Complex Event Processing Platform?

Please fill out the form and we will get back to you as soon as possible to schedule a meeting to discuss your event processing needs.

What did you find most impressive about GetInData?

GetInData is a relatively small agency with experienced professionals that enjoy and perform their job exceptionally well. Their attentiveness and code quality are impressive.
Our project with GetInData was very focused on our core problem statement. We had a very clear bullseye to target. The team worked on a regular basis without needing too much involvement from our side.
We were super impressed with the quality of their work and the knowledge of their engineers. They have very high standards in terms of code quality and organisational skills, and are always willing to contribute their best. They are also very friendly and easy-going people, which made our collaboration more fun.

Let's start a project together

Fill out the form or send an e-mail: hello@getindata.com
By submitting this form, you agree to our Terms & Conditions