Every second, your IT systems exchange millions of messages. This information flow includes technical messages about a form being opened on your website, network traffic information and sensor data, but also more meaningful pieces of information, like new orders from your customers. You obviously have access to most of that information in dedicated systems, in a more aggregated manner, on demand. However, what would you do if you had the chance to combine messages from different systems and react on the spot? Imagine that you stream all the messages sent by different systems, even internal ones, through one huge pipeline, pick up the ones that have some semantic meaning for you, and take a certain action on them - like sending a message to your customer, updating your website or computing a real-time metric that half of your company looks at. This is what event processing is actually about! Let's go through some use cases where hidden value is locked away in real-time streaming analysis.
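To make this pattern a bit more concrete, here is a minimal sketch of that "one huge pipeline" idea in Python, assuming events arrive as JSON on a Kafka topic and using the kafka-python client. The topic names and event fields ("type", "customer_id") are illustrative assumptions, not a reference to any particular system.

```python
# A minimal sketch of the pattern described above: consume everything from one
# stream, pick up the semantically meaningful events and react on the spot.
import json
from kafka import KafkaConsumer, KafkaProducer

consumer = KafkaConsumer(
    "all-events",                        # the "one huge pipeline" with every message
    bootstrap_servers="localhost:9092",
    value_deserializer=lambda raw: json.loads(raw.decode("utf-8")),
)
producer = KafkaProducer(
    bootstrap_servers="localhost:9092",
    value_serializer=lambda obj: json.dumps(obj).encode("utf-8"),
)

for message in consumer:
    event = message.value
    # Pick up only the events that carry semantic meaning for us...
    if event.get("type") == "order_created":
        # ...and react immediately, e.g. by notifying the customer.
        producer.send("customer-notifications", {
            "customer_id": event["customer_id"],
            "text": "Thanks for your order!",
        })
```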
When you think about making decisions on the spot, marketing is probably the most natural use case, as it is one where you can easily calculate the business profit. If you could shape the customer journey based on their behaviour, you would be able to upsell or cross-sell your services. In this case we would definitely look at clickstream data: everything that is happening on the website, in the online shop or in the mobile application. Based on that, we could engage machine learning models on the spot to recommend better content or new products in a personalised feed. This applies to e-commerce, but also to banking and telco - wherever you want to help your customers spot a product they might be interested in. Banking and telco have another interesting source of events: their transactional and infrastructure systems. If you see a certain type of transaction or action happening on a customer's account, you can send a personalised message, activate an additional service, send an appointment request to your CRM, or simply deactivate the account if you get wind of suspicious behaviour.
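As a rough illustration of scoring clickstream events on the spot, the sketch below keeps a short history of pages per customer and asks a placeholder recommend() function for products to show. The function name, its trivial rule and the event fields are hypothetical stand-ins for a real recommendation model.

```python
# A hedged sketch of engaging a model on clickstream events.
from typing import List

def recommend(customer_id: str, recent_pages: List[str]) -> List[str]:
    # Placeholder for a real model call; a trivial rule keeps the sketch runnable.
    return ["product-42"] if any("pricing" in p for p in recent_pages) else []

def on_click_event(event: dict, session_pages: dict) -> List[str]:
    """Update the customer's session and return products for the personalised feed."""
    pages = session_pages.setdefault(event["customer_id"], [])
    pages.append(event["page"])
    return recommend(event["customer_id"], pages[-10:])  # score on the last 10 clicks

# Example: a customer browsing the pricing page triggers a recommendation.
sessions: dict = {}
print(on_click_event({"customer_id": "c-1", "page": "/pricing"}, sessions))
```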
Listening to the transactional events happening in your systems opens up another opportunity: fraud and anomaly detection. When you recognise an event that could potentially be fraud, you can put that transaction on hold and check with the customer whether they really intended it. Assuming you do this smartly, your customers will definitely feel safer. The overall experience can also be improved if you start reacting to anomalies before your customers tell you that something is going wrong with your service - a network outage at a mobile provider or an online gaming platform being just one example.
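A heavily simplified, rule-based sketch of the "put the transaction on hold" idea could look like the snippet below; real fraud detection would use far richer features and models, and the country check and 5x threshold here are purely illustrative assumptions.

```python
# A simple rule-based sketch of flagging suspicious transactions for confirmation.
from dataclasses import dataclass

@dataclass
class Transaction:
    customer_id: str
    amount: float
    country: str

def looks_suspicious(tx: Transaction, usual_country: str, daily_avg: float) -> bool:
    # Flag transactions from an unusual country that are far above the customer's average.
    return tx.country != usual_country and tx.amount > 5 * daily_avg

def handle(tx: Transaction) -> str:
    if looks_suspicious(tx, usual_country="PL", daily_avg=120.0):
        # Put the transaction on hold and ask the customer to confirm it.
        return "HOLD_AND_CONFIRM"
    return "APPROVE"

print(handle(Transaction("c-1", amount=2500.0, country="BR")))  # -> HOLD_AND_CONFIRM
```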
Analysing the internal communication between your systems opens up another set of use cases: business process automation and monitoring. You can combine information from your supply chain, transactional systems and ERP to check whether orders are being processed according to the expected timeline. Stream processing can also play the role of an integration layer, as events can be enriched with additional data and connected based on logical identifiers, like a licence plate number or a customer number, launching actions in many systems at the same time. A single event can end up as a set of requests: to the CRM to launch customer communication, to the billing system, to the ERP system for invoicing and to the field management system for your technicians.
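The sketch below illustrates that fan-out: one incoming event is enriched from a reference table keyed by customer number and turned into requests for several downstream systems. The topic names, payload fields and the in-memory CUSTOMERS lookup are assumptions made for the example.

```python
# Stream processing as an integration layer: enrich one event, fan it out to many systems.
CUSTOMERS = {"c-1": {"name": "Acme Sp. z o.o.", "technician_region": "north"}}

def fan_out(event: dict) -> list:
    customer = CUSTOMERS[event["customer_id"]]   # enrichment by customer number
    return [
        ("crm-requests",     {"customer": customer["name"], "action": "send_shipping_update"}),
        ("billing-requests", {"customer_id": event["customer_id"], "order_id": event["order_id"]}),
        ("erp-invoicing",    {"order_id": event["order_id"]}),
        ("field-management", {"region": customer["technician_region"], "order_id": event["order_id"]}),
    ]

# One "order shipped" event becomes four requests, each destined for a different system.
for topic, payload in fan_out({"customer_id": "c-1", "order_id": "o-7"}):
    print(topic, payload)
```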
There is further value in analysing machine-to-machine communication, including sensor data, vision and telematics from IoT devices. Starting with plain monitoring, we can compute metrics and publish them in real time for a better understanding of what is happening in our facilities. Once we can measure processes, we can use the data to optimise production and tweak our operations on the fly. There may also be value in predicting our operations based on external conditions, like traffic or weather. Finally, event processing systems are a perfect place to run predictive maintenance models, so that we can fix things before they break our operations and impact customers.
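As a small example of real-time metric computation, the sketch below maintains a sliding-window average over sensor readings and raises an alert when it crosses a threshold; the window size, threshold and temperature values are arbitrary illustrative choices.

```python
# A sliding-window average over sensor readings with a simple maintenance alert.
from collections import deque

class SlidingAverage:
    def __init__(self, window: int = 60):
        self.readings: deque = deque(maxlen=window)   # keep only the last `window` values

    def add(self, value: float) -> float:
        self.readings.append(value)
        return sum(self.readings) / len(self.readings)

monitor = SlidingAverage(window=5)
for temperature in [70.1, 70.4, 71.0, 74.8, 79.5, 83.2]:
    avg = monitor.add(temperature)
    if avg > 75.0:
        print(f"alert: average temperature {avg:.1f} above threshold, schedule maintenance")
```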
If the communication between our software systems is like the bloodstream of our company, then event processing is its nervous system, making decisions in real time and at scale. Such automation can not only generate a new value stream from personalised offerings, but also improve the overall customer experience and save money on potential failures and anomalies.
If you want to explore any of the above use cases, or have one of your own to implement, contact us!