Perhaps the most enduring character from the Star Trek franchise is Data. With his positronic brain he could out-compute the likes of Vulcans and, despite his total lack of emotion, he endeared himself to viewers worldwide. Before Data there had been Spock, but in Data we saw a significant upgrade, albeit one with almost no understanding of humanity. In many ways, the mere thought of having someone like Data aboard the starship USS Enterprise gave the scriptwriters that all-important way out whenever situations escalated beyond belief. With Data, there was always an option.
In the Information Technology (IT) world where the majority of us in the NonStop community work, our reliance on data cannot be overstated. Whether it’s a simple request, “Show me the data,” or something more compelling, “What does the analysis of the data tell me?”, business has come to rely on data unlike in any other era. Without data we couldn’t possibly plot a course to the International Space Station or, more commonly, navigate our way home. Simply put, processing without data might be viewed as just marching in place.
Before IT there was Data Processing (DP), and with that label came the appreciation that it was the role of DP professionals to ensure orders were processed, materials shipped and prescriptions filled. In the decade prior, it wasn’t an uncommon sight to see business directories pointing us to the Electronic Data Processing (EDP) department. Fortunately, it was recognized very early on that including the word Electronic was a little redundant, but the message conveyed was inescapable. It’s all about the data, isn’t it, Data?
As the 1960s drew to a close it wasn’t uncommon to see large corporations showcasing their computers behind floor-to-ceiling glass structures, frequently at ground level. Long before security became a concern, these corporations were proud of their investments in computing and wanted the world to see how advanced they were. The magic that came from crunching the data let even the most uninformed know that the business carried out across the corporation was cutting edge. Needless to say, a few bricks thrown through the glass as the turbulence of the 1970s unfolded meant that most of society became oblivious to the many transformations the industry was going through – if we grabbed a Kleenex to wipe clean our Xerox, then we processed data on our IBM!
Before we began exploring the capabilities of grouping data hierarchically, adding pointers to network data, or testing the waters with relational databases – remember the third normal form that Dr. Codd espoused in 1971 – no IT professional could have imagined where this data journey would take the industry. Now the relational model is viewed by some as somewhat passé, but it always intrigued me that the seminal work by Dr. Codd led to further explanation by C.J. Date, an overenthusiastic proponent of all things relational whom I once had the opportunity to meet. If only his name had been C.J. Data, I mused, then the pieces of the puzzle would have all fallen into place. For more years than I care to own up to, I kept copies of Date’s books in my office.
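For readers who never met third normal form firsthand, a minimal sketch of the idea – the table and column names here are hypothetical, invented purely for illustration. The rule of thumb is that every non-key column depends on the key, the whole key, and nothing but the key, so a customer’s city lives once in a customers table rather than being repeated on every order row:

```python
import sqlite3

# A denormalized orders table would repeat the customer's city on every row;
# third normal form factors that transitive dependency out into its own table.
con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE customers (
    customer_id INTEGER PRIMARY KEY,
    name        TEXT NOT NULL,
    city        TEXT NOT NULL          -- depends only on customer_id
);
CREATE TABLE orders (
    order_id    INTEGER PRIMARY KEY,
    customer_id INTEGER NOT NULL REFERENCES customers(customer_id),
    item        TEXT NOT NULL          -- depends only on order_id
);
""")
con.execute("INSERT INTO customers VALUES (1, 'Acme Corp', 'Cupertino')")
con.executemany("INSERT INTO orders VALUES (?, ?, ?)",
                [(100, 1, 'tape drive'), (101, 1, 'disk pack')])

# The city is stored exactly once; a join recovers it for every order.
rows = con.execute("""
    SELECT o.order_id, c.city
    FROM orders AS o JOIN customers AS c USING (customer_id)
    ORDER BY o.order_id
""").fetchall()
print(rows)  # [(100, 'Cupertino'), (101, 'Cupertino')]
```

Change the city in one place and every order reflects it – which is precisely the update-anomaly problem Codd’s normalization was designed to remove.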
Once we reached relational database management systems, not only did the pieces fall into place but business as a whole began to understand the value databases provided. But as is so often the case, some of that understanding bordered on the religious – if only we could add every piece of data processed by the corporation into one giant database, we would have a complete picture of the business on a day-by-day (or even hour-by-hour) basis. And so began the pursuit of the all-knowing, all-seeing corporate database that reminded many of us of that other handbook on data, The Hitchhiker’s Guide to the Galaxy, which told us the answer to everything was 42.
As the 1980s came to a close it was left to Tandem Computers to launch a truly revolutionary approach to databases – NonStop SQL. What made it so revolutionary for the times is that you could build mission-critical database applications around the product. This at a time when many IT professionals still considered databases to be batch-focused, subject to serious downtime for maintenance and simply too slow to provide answers in a real-time transaction processing environment. Tandem Computers proved the naysayers wrong, however, and though it took time to develop traction, NonStop SQL has become the standard go-to offering for applications that cannot tolerate downtime. For many CIOs it was a case of data finally having a home they could trust.
It has been a decade since I published a white paper, NonStop SQL – The path to the always-on, easily administered, out-of-the-box clustered, database server! You may still be able to find it on the web, but even with the passage of time, one thing should still resonate with anyone interested in data:
Customers do not have to trade off between linearly scaling their database platform to handle growth in their business volumes, the 24×7 availability required by global enterprises, or restricting users to certain database actions at certain times of day. Neither must customers weigh compromises between the lower costs of database ownership and the higher costs of database administration. Today, customers get all of this with NonStop SQL to support their demanding, always-on environments. And CIOs have started to wrap their heads around it.
Before cloud computing and the explosion in offerings from cloud service providers, there were numerous attempts to create within IT a “single source of truth”. That all-knowing, all-seeing corporate database did take shape, after a fashion, even as it remained burdened with communications bandwidth issues that didn’t help in keeping the data fresh. CIOs did switch gears and proposed that business would benefit if it could go to a database assured in the knowledge that it reflected the state of the business that very hour. But the sheer immensity of it all staggered the minds of the rest of the corporation when it came to funding it. Even as CIOs were wrapping their heads around NonStop SQL – and yes, even as HPE looked to NonStop SQL and Neoview for answers – along came clouds offering unlimited data storage options.
When America’s NSA first unveiled its plans for a data center in Utah and talked of storing yottabytes of data in gigantic databases, imagination ran wild. How big is that, exactly? As CNET published at the time, “As well as scoring 17 points in Scrabble, a yottabyte is equal to 1,000,000,000,000,000GB. We don’t know how to even say that out loud.” Today, cloud service providers talk of data lakes and data warehouses where numbers like this are routinely thrown around. Yes, our creation of data and our need to analyze it all is driving databases, whether SQL, NoSQL, NewSQL, or anything else. As the conversation turns to the real-world problems of IoT, IIoT, V2V, the Edge and more, it may be time to step back and appreciate just how well NonStop SQL has stood the test of time.
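A quick sanity check on that quote’s arithmetic: a yottabyte is 10^24 bytes and a gigabyte is 10^9 bytes, so the ratio does indeed come out to the quadrillion gigabytes CNET cited:

```python
# A yottabyte is 10**24 bytes; a gigabyte is 10**9 bytes.
YOTTABYTE = 10**24
GIGABYTE = 10**9

gb_per_yb = YOTTABYTE // GIGABYTE
print(f"{gb_per_yb:,} GB")  # 1,000,000,000,000,000 GB, matching the CNET figure
```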
Who could have imagined that we would welcome the latest iteration of NonStop SQL as the NonStop team brings to market NonStop SQL Cloud Edition? Yes, from data processing to databases to data in the cloud, the answer may not be as simple as 42, nor is it to be found coming from the android, Data, but rather only a few keystrokes away as NonStop continues to deliver on the database needs of corporations today. And still, even as 2021 draws to a close, there will be many more CIOs trying to wrap their heads around NonStop SQL and the value it affords every corporation.