Plenty of phrases about data have been thrown around in the business press: data is the new currency of business, enterprises have become data driven, we are entering the age of insight. Whichever phrase best represents your perspective on IT, the outcomes are the same. Not only is data playing an important role in business, but it is on the move and being combined with data arriving from many sources, such that integration throughout the enterprise is both preceded and followed by increased data analysis.
Behind this focus on data, integration and analysis lies an increased awareness that customer experiences need to be watched, that supply chains need to be more accommodating and, perhaps most important of all, an overriding need to improve business competitiveness. Markets are fickle, and in this “new normal” world business simply cannot stand still, nor can it keep on doing what it has always done. Behind every chatbot trying its best to help us and every medical practitioner trying to diagnose us remotely is artificial intelligence in one form or another.
This was the topic of a June 2021 post to the Data Science Blogathon by Artificial Intelligence (AI) blogger Abhishek Kumar, Do Artificial Intelligence and Business Analytics Complement Each Other? “Many consumers are drawn to proactive analytics, which delivers real-time warnings and insights. Businesses may make greater use of their operational data as a result of this.” Proactive analytics is all about “taking an active approach to active business monitoring in order to prevent incidents from escalating,” according to Google.
In its Top 10 Trends in Data and Analytics for 2020, under Trend 7: Data and Analytics Worlds Collide, Gartner recognized just how quickly the transition to proactive analytics was happening. “The collision of data and analytics will increase interaction and collaboration between historically separate data and analytics roles. This impacts not only the technologies and capabilities provided, but also the people and processes that support and use them,” Gartner predicted. “To turn the collision into a constructive convergence, incorporate both data and analytics tools and capabilities into the analytics stack.”
And when it comes to operational data having an impact on active business monitoring, it’s hard to ignore just how important a role NonStop plays in ensuring fresh data is always on tap for whatever analysis may be performed. For many NonStop users this collision of data and analytics is already taking place. When you consider the solutions already available to NonStop users in transaction analytics, data integration and moving data created on NonStop into stacks specifically designed for analytics, there is much to be said about the diversity of offerings already on hand.
When application developers first address a business need following an understanding of the business requirements, there is always an opportunity to build the analytics into the application itself. Tools exist to do this, and for some market verticals where the analytics required is well known to the business unit, this is perhaps the optimal approach. Being able to predict unseasonal usage in real time has always proved invaluable to the NonStop user, even if it just leads to a better configuration of the processes making up their NonStop system. However, for many of today’s NonStop users where the application comes from a vendor, digging this deep into the code may not be an option.
And yet, some NonStop vendors have already made moves to exploit their own analytics as part of their solution. Other vendors have made it possible to readily pass information to designated analytics products running off-platform. There are even those vendors providing NonStop users with a complete platform that can be tapped for easy integration with the enterprise as a whole. Irrespective of the path chosen, the value that analytics is providing cannot be ignored. The days when it was simply about the data – verified and uncompromised – are long gone. So much more needs to happen once data is created on NonStop.
When IR unveiled the Prognosis Platform, functionality was spread across three distinct components – Prognosis Server, Prognosis Edge and Prognosis Cloud. Whereas Prognosis Edge includes new intelligent software agents, it’s Prognosis Server that NonStop users will be most familiar with. It is the central real-time data processing and analytics engine for IR’s Transact (Payments) and Infrastructure (NonStop) solutions that NonStop users have relied upon since Prognosis first appeared on the scene.
Prognosis Server is mainly about the analytics developed by IR. “Some of this analytics is enabled on NonStop, as previously noted, while some of the newer analytics processing is now happening off-board (e.g. with Windows/SQL),” said IR Product Manager, Jamie Pearson. “Under consideration for future development (and driven by customer demand),” said IR Senior Product Manager Jason Krebs, “is the inclusion of relevant complementary data sources; one such possibility could be the data / metrics provided by cloud vendors.” The relationship between NonStop and cloud services is becoming more apparent, as is the acceptance of the cloud experience by NonStop users, and that acceptance is opening the door to pursuing analytics almost anywhere there are suitable product offerings.
“Even as there are myriads of reasons for doing so, we find that more and more customers are standing up Prognosis Server in their public and private clouds to provide that circle of visibility around the NonStop, regardless of who owns the data center or hardware servers,” said Jim Bowers, IR Payments and Infrastructure Senior Solution Engineer, in his presentation at the NonStop Technical Boot Camp 2021.
This last reference to the cloud is indicative of what many NonStop users view as the ultimate destination for data created on NonStop. Recently NTI announced how it was capitalizing on the benefits that come with data replication technology based on Change Data Capture (CDC) to better integrate with external product offerings that include analytics. DRNet®/Unified for HPE Customers provides integration with Splunk>, an industry-leading platform that arguably “turns data into doing.”
NTI’s support for Splunk> focuses on data and on ensuring that data created on NonStop and stored in databases, files and tables can be passed to popular analytics and presentation solutions using conventional CDC technologies. This goes further than simply feeding events created by popular NonStop utilities, as it ensures fresh data created by mission-critical transactions makes it to where it is most needed for analytics.
“The message today is all about the arrival of the age of insight; with CDC we can replicate and transform to input to any popular analytics process – adjacent, peer or hybrid,” said NTI’s Global Director Worldwide Sales, Tim Dunne. “We have demonstrated this already with our introduction of JSON support for replicating from NonStop to Splunk> and this is only the beginning.”
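NTI does not publish the wire format behind its JSON support, but the general idea of replicating to Splunk> – wrapping a CDC change record in JSON for Splunk’s HTTP Event Collector (HEC) – can be sketched as follows. The change-record fields, host name and endpoint shown here are illustrative assumptions, not DRNet® specifics:

```python
import json

def cdc_to_hec_event(change: dict, source_host: str = "nonstop-prod") -> str:
    """Wrap a CDC change record in the envelope Splunk's HTTP Event
    Collector (HEC) expects: an 'event' payload plus routing metadata.
    The change-record fields here are illustrative, not DRNet's format."""
    return json.dumps({
        "host": source_host,
        "sourcetype": "_json",
        "event": {
            "operation": change["op"],          # e.g. INSERT, UPDATE, DELETE
            "table": change["table"],           # source file or SQL table
            "timestamp": change["ts"],
            "after_image": change.get("after"), # row state after the change
        },
    })

# Delivery to Splunk would then be an authenticated POST to the HEC
# endpoint, e.g. with the `requests` library:
#   requests.post("https://splunk.example.com:8088/services/collector/event",
#                 headers={"Authorization": "Splunk <hec-token>"},
#                 data=cdc_to_hec_event(change))
```

The point of the sketch is that once changes are captured, the transform to any JSON-consuming target is a thin, mechanical layer – which is what makes CDC-based replication so adaptable.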
Following support for business integration with Splunk>, NTI has also responded to NonStop user requests and added support for both Kafka and ELK. This is perhaps the most important consideration for NonStop users running third-party solutions today: rather than diving deep into the application code, it is of course preferable to have a solution based on industry standards look after the business integration required of the enterprise.
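Why does a Kafka target matter for transactional data? A minimal sketch, assuming hypothetical change-record fields rather than any NTI format, shows the key design point: keying each message by table and primary key keeps every change to a given row in one partition, so downstream consumers see those changes in commit order:

```python
import json

def to_kafka_message(change: dict) -> tuple:
    """Serialize a CDC change as a Kafka (key, value) pair. Keying by the
    source table and primary key routes all changes for one row to the
    same partition, preserving per-row ordering for consumers.
    Field names are illustrative, not a vendor format."""
    key = f"{change['table']}:{change['pk']}".encode("utf-8")
    value = json.dumps({
        "op": change["op"],
        "table": change["table"],
        "after": change.get("after"),
    }).encode("utf-8")
    return key, value

# With the confluent-kafka client, the pair would feed a producer:
#   key, value = to_kafka_message(change)
#   producer.produce("nonstop.cdc", key=key, value=value)
#   producer.flush()
```

The topic name `nonstop.cdc` is likewise an assumption; the takeaway is that ordering guarantees come from the keying scheme, not from the producer code.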
No discussion of vendors providing solutions for business integration would be complete without a reference to Striim. At the forefront of making sure data created on NonStop reaches every possible target that would benefit from it, Striim supports a real-time data streaming platform in which NonStop participates as a source.
As Striim’s Director of Product Growth, John Kutay, expressed in commentaries posted last year on the benefits of leveraging the Striim solution, “BI tools that analyze data from only some of these sources aren’t helpful. And waiting days, weeks, or even months to make sense of information is unacceptable as well. Market opportunities quickly pass by.”
Striim is yet another NonStop vendor capitalizing on CDC technologies and as Kutay confirmed in his post, “What BI tools need are effective data integration solutions that continuously deliver data in a consumable format. Real-time data integration can also help companies speed their business intelligence operations by continuously replicating all their data to their respective cloud BI tool. While traditional data integration tools can slow down business intelligence operations, real-time data integration platforms allow companies to accelerate their insights.”
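Kutay’s point about continuously delivering data in a consumable format can be reduced to a simple apply loop: each change is applied to the target copy as it arrives, so the replica – and any BI query running against it – is always current. This is a toy illustration of the principle, not Striim’s implementation; the field names and in-memory target are assumptions:

```python
def apply_change(replica: dict, change: dict) -> None:
    """Apply one CDC change to a replica keyed by primary key. A real
    target would be a cloud warehouse table; this just shows why
    continuous apply keeps analytics fresh. Fields are illustrative."""
    pk = change["pk"]
    if change["op"] in ("INSERT", "UPDATE"):
        replica[pk] = change["after"]   # upsert the after-image
    elif change["op"] == "DELETE":
        replica.pop(pk, None)           # remove the row if present

replica = {}
stream = [
    {"op": "INSERT", "pk": 1, "after": {"bal": 50}},
    {"op": "UPDATE", "pk": 1, "after": {"bal": 75}},
    {"op": "DELETE", "pk": 1, "after": None},
]
for change in stream:                   # in production, an unbounded stream
    apply_change(replica, change)
```

Contrast this with batch extract-transform-load, where the replica is only correct immediately after each run – the gap Kutay describes when he says traditional data integration tools slow business intelligence down.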
What does the future hold for analytics, and what might the impact be on NonStop applications? According to the Gartner report already referenced, “By the end of 2024, 75% of enterprises will shift from piloting to operationalizing AI, driving a 5X increase in streaming data and analytics infrastructures.”
In other words, if you are not already well down the path to making sure data created on NonStop is beneficial to all aspects of the business, you may very well lose your competitive edge. But here is the good news for the NonStop community. No matter where you might be in your own journey to greater utilization of analytics, there are many members of the NonStop vendor community already committed to providing you with off-the-shelf product offerings.
Ignoring the context that contributed to the movie title, Analyze This, the message should resonate well across the community. Nothing today will do more to sustain business expansion in fickle marketplaces than what can be revealed through analytics, and fortunately this is one area where NonStop continues to go from strength to strength.