Simplification: avoiding complexity and ensuring NonStop data “plays well with others!”


Let’s talk about simplification. If you have not yet had a chance to attend major HPE events and have missed the opportunity to hear HPE executives talk strategy, you may have missed the emphasis HPE is placing on Hybrid IT. Perhaps more importantly, you may have missed how that emphasis has sharpened to where HPE today talks about the need for IT to simplify the transformation to Hybrid IT. Almost by definition, hybrids involve more than one platform and, equally by definition, complexity increases.

For the NonStop community the good news is that NonStop has been ported to the Intel x86 architecture, which opened the door for NonStop to be offered as virtualized NonStop (vNS). This makes it possible for NonStop to run on traditional systems as well as virtually on x86 server farms and within clouds. However, given such options, there is no hiding that complexity seems to be mounting; fortunately, this hasn’t gone unnoticed by the NonStop vendor community.

A major factor driving continued vendor interest in NonStop is that mission-critical applications running on NonStop tend to create data – information that reflects the real world in which enterprises are engaged. No real news here, but all the same, enterprises are beginning to express concerns over how best to go about getting at this data. Should analysis of this data be done on the NonStop, or should the data be exported to data warehouses or data lakes? Should infrastructure be acquired that displays data transformed into information at the very source where it is created, or should it be enriched with other enterprise data on systems apart from NonStop? Is the cloud the ideal place to house data, given the number of cloud-resident analytics options?

 

To view or to move?

There are many factors influencing this decision, not the least being that in some instances NonStop may be the sole enterprise class server in the data center. While there are examples of just this (and some financial institutions come to mind), perhaps a more prevalent scenario involves additional resources (processing and storage) apart from NonStop which is where the message of simplification of the transformation to Hybrid IT comes into play.

Capitalizing on the data to be found on NonStop quickly breaks down into two discussions – the infrastructure best suited to displaying data as meaningful information, and the utilities best suited for collecting and then moving the data. For the NonStop community, connecting NonStop to something else has always been a viable option, as for a very long time files and databases were held on systems other than NonStop. This evolved considerably following the introduction of NonStop SQL, and with the latest releases of NS SQL/MX, what were once considered the de facto databases of the enterprise have seen NS SQL/MX join their ranks. Equally surprising for many, NS SQL/MX has been the recipient of data once destined elsewhere, as enterprises have come to recognize the value of having their data resident on NonStop. HPE’s own IT is perhaps the best example of this: HPE moved data from many Oracle databases to NS SQL/MX. This transition, which began more than a decade ago, has made it even more important for data to be viewed, distributed, integrated and transformed.

 

Complexity Resolved?

Recently, the head of design for auto manufacturer Ferrari referenced sculptor Constantin Brancusi when he declared that “Simplicity for me is complexity resolved.” Should our intention be simply to view the data on NonStop, then perhaps the best approach is to look for infrastructure that displays the data we need in a form beneficial to our business leadership. Anyone who has seen recent presentations by OmniPayments, LLC that featured the company’s big win at JCPenney will recall how much of OmniPayments’ success was attributed to the dashboards it created for JCPenney.

As later reported in various publications, one of the key benefits of the new OmniPayments solution for JCPenney is the OmniPayments Analytics Dashboard. With OmniPayments deployed, the dashboard provides real-time information on business transactions as well as the performance and general health of the system. This example illustrates just one of the challenges facing all solutions providers today – they not only have to report on their solutions and the supporting infrastructure those solutions depend upon, but increasingly they have to dive deep into the data and provide insights for business managers in order for the enterprise to remain competitive.

Perhaps no NonStop vendor comes with a better pedigree in this regard than IR. With Prognosis, IR has been a stalwart among operations personnel for more than three decades. I just happened to be in the Sydney offices of Tandem Computers when the original IR developers “borrowed” the newly installed VLX system to pull together the initial product offering. Today, via partnerships with companies like ACI Worldwide and even VMware, Prognosis can be seen turning data into information and providing insight into the performance of applications and systems. “Yes, you can look at data being generated as transactions occur and yes, you can load data into analytics platforms for analysis of trends, but in reality, being able to answer the simple questions in real time is just as important, if not more!”

Kevin Johnson, VP Sales – Payments Performance Monitoring at IR, developer of the IR Prognosis performance management and monitoring software, explained that when it comes to financial institutions, this can mean answering questions like, “Why did the transaction just fail this very minute? Why is this region seeing performance degradation? Why are so many transactions being declined?” Jason Krebs, Senior Product Manager at IR, recently posted to the IR blog, “Deep visibility also enables detailed analytics to be performed on performance and transaction flows.”

The challenge facing enterprises embracing Hybrid IT, where NonStop plays a critical role, is often the need for a lot more than just dashboards. More often than not, there are data warehouses and data lakes as well as a raft of analytics programs running off-platform. Networking has always been a strong point for NonStop systems, and this tradition of connectivity is being put to the test as more NonStop customers find that they must meet the requirement of integrating data created on NonStop with data being captured elsewhere. More often than not, Hybrid IT is about clouds and exploiting the APIs and services on offer from cloud service providers.

 

Unnecessary complexity?

As IBM noted in an August 9, 2019 post to the blog Seeking Alpha promoting its recent acquisition of Red Hat, “The cloud is a potentially huge and already exciting market … but it is perhaps easy to forget that there needs to be infrastructure to support it somewhere and that many businesses will still require their own even if they migrate a chunk of the workflow to the cloud.” HPE agrees with these observations, but in an August 16, 2019 article published on CIO.com, HPE blogger Chris Purcell added a note of caution: “As IT operators and developers become more experienced, they realize more cloud options means they can place workloads where they are best suited – on premises, off premises, in private clouds, or in public clouds … (but) can an enterprise have too many clouds – creating more complexity where IT is trying to simplify?”

“Sophistication: unnecessary complexity,” or so I have heard said many times. There is truth to this, and the effort HPE is putting into simplifying the transformation to Hybrid IT resonates well with organizations looking at their options. When it comes to NonStop, there are already several NonStop vendors providing solutions that not only capture data as it is being created but tightly integrate it with the analytics products running elsewhere in IT. These vendors are well known both for making it easy to capture data and for simplifying the way integration can be accomplished.

As one example, the Striim platform makes it easy to ingest, process, and deliver real-time data across diverse environments in the cloud (public or private / on-prem), which helps NonStop users rapidly adopt modern data architectures. “The fast adoption of cloud solutions requires building real-time data pipelines from in-house databases, in order to ensure the Cloud systems are continually up to date,” said Irem Radzik of Striim Product Marketing. “There are many reasons why streaming data integration is more common, but the main reason is quite simple: This is a relatively new technology, and you cannot do streaming analytics without first sourcing real-time data,” posted Striim CTO Steve Wilkes. “This is known as a ‘streaming first’ data architecture, where the first problem to solve is obtaining real-time data feeds.” More importantly, according to Wilkes (and something that sounds familiar to all NonStop users), “Organizations can be quite pragmatic about this and approach stream-enabling their sources on a need-to-have, use-case-specific basis. This could be because batch ETL systems no longer scale or batch windows have gone away in a 24/7 enterprise.”
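To make the “streaming first” idea concrete, here is a minimal Python sketch of the pattern Wilkes describes: change events are consumed and delivered one at a time as they occur, rather than being accumulated for a nightly batch ETL window. The event shape and the feed itself are hypothetical stand-ins for illustration only, not Striim’s actual API.

```python
import json
from typing import Iterator

def change_feed() -> Iterator[dict]:
    # Hypothetical change events, as a CDC feed might deliver them.
    # In a real deployment these would stream continuously from the
    # source database's change log rather than from a fixed list.
    yield {"op": "INSERT", "table": "TXN", "after": {"id": 1, "amount": 25.00}}
    yield {"op": "UPDATE", "table": "TXN", "after": {"id": 1, "amount": 27.50}}

def process(event: dict) -> str:
    # Deliver each change as it arrives (streaming-first), serialized
    # as JSON for whatever downstream consumer is listening.
    return json.dumps(event["after"])

for event in change_feed():
    print(process(event))
```

The point of the pattern is simply that nothing waits for a batch window: each change is sourced, transformed, and delivered the moment it occurs.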

To discover the value locked up in data on NonStop systems today and to make that data available to data analytics solutions, another NonStop vendor, NTI, has begun transitioning beyond just data replication. Capitalizing on features already part of its solution, NTI shares a similar data ingestion methodology with Striim in that NTI, too, can capitalize on Change Data Capture (CDC). NTI has recently enhanced its DRNet® product offerings to include data integration, distribution and transformation. PROVisioning with DRNet®/Vision addresses the need to have data on NonStop integrated with tools such as Splunk, Elasticsearch, etc. NTI has just successfully wrapped up an early Proof of Concept at a customer site utilizing JSON messages out of DRNet®/Vision; this eliminates the need for any staging server, as it feeds NonStop data directly to the products of choice within the enterprise. “Simply stated, with support for JSON messages,” said Tim Dunne, NTI Senior Vice President, Worldwide Sales, “we open the doors for the NonStop community to better integrate NonStop data with any data lake, warehouse or analytic process required of them by their organization.”
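As a rough illustration of what feeding NonStop change data directly to a product like Elasticsearch can look like without a staging server, the sketch below renders one CDC record as the action/document line pair that Elasticsearch’s bulk API expects. The record layout is invented for illustration; it is not DRNet®/Vision’s actual message format.

```python
import json

# Hypothetical shape of a CDC record; the field names here are
# illustrative only, not NTI's documented message layout.
change = {"table": "ACCOUNTS", "op": "UPDATE",
          "key": {"acct_no": "1001"}, "after": {"balance": 512.75}}

def to_bulk_lines(record: dict, index: str) -> str:
    """Render one CDC record in Elasticsearch bulk-API form: an action
    line naming the target index, then the JSON document itself."""
    action = json.dumps({"index": {"_index": index}})
    doc = json.dumps({**record["key"], **record["after"], "op": record["op"]})
    return action + "\n" + doc + "\n"

# Each change arriving from the source can be posted straight to the
# bulk endpoint of the target product, with no intermediate staging.
print(to_bulk_lines(change, "nonstop-cdc"))
```

Because the output is plain JSON, the same record could just as easily be directed at Splunk, a data lake, or any other JSON-consuming endpoint the enterprise has chosen.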

Perhaps you haven’t had an opportunity to hear HPE executives lay out their strategy for simplifying the transformation to Hybrid IT, or to hear about the value they are placing on clouds, public and private. Perhaps, too, you haven’t had a real opportunity to look at the products already available from the NonStop vendor community that help surface the data on NonStop that business needs to remain competitive. There are many more NonStop vendors active in this space that, for reasons of space, haven’t been covered here, but I anticipate there will be articles by some of them in this issue of The Connection. Yes, it’s all out there and available to the NonStop community.

At a time when we are being reminded that data is the new currency of business, isn’t it good to see so big a commitment from the NonStop community to deliver products? When it comes to answering the simplest of questions – viewing the data on NonStop or moving it to where further analytics can be performed – it all comes back to the real business need, and for many, the answer may involve a little of each. In all reality, it’s not complex and it’s not complicated: it only takes a phone call to your favorite vendor to find out just how accommodating your NonStop system truly is when it’s asked to play well with others!

Author

  • Richard Buckle

    Richard Buckle is the founder and CEO of Pyalla Technologies, LLC. He has enjoyed a long association with the Information Technology (IT) industry as a user, vendor and more recently as a thought leader, industry commentator, influencer, columnist and blogger. Well-known to the user communities of HP and IBM, Richard served on the board of the HP user group, ITUG (2000-2006), as its Chairman (2004-2005), and as the Director of Marketing on the board of the IBM user group, SHARE, (2007-2008).
