Don't Blame Batch Processing for Your Business Process Failures

Never take shortcuts on adequate testing of changes and test planning

The recent train wreck that was the CA batch process failure at RBS should be ringing alarm bells for anyone running batch processes in their systems; more importantly, it should remind us of a couple of things:

  • Batch processing is real and present in many facets of IT data processing and systems at large
  • Never take shortcuts on adequate testing of changes and test planning

A call I participated in a couple of months back with some students from an esteemed college in North America ended with the conclusion that what I was talking about and showing them wasn't very interesting, because much of what I was focusing on was batch processing of data.

For some naïve and deluded reason they were of the opinion that real-time OLTP was the more interesting story, and so they wanted to focus on that aspect of the Winshuttle technology stack rather than on the mass- and batch-related activities.

In reality, our world of data processing and systems relies heavily on both. I have to confess that I hardly ever give the concepts of batch vs. real-time much thought, but this recent failure in the payment runs for thousands of banking customers brought home the importance of batch processing and reminded us that it is alive and well everywhere.

Payroll, transfers, interest calculations, reporting, diary actions, archiving, amendments, deletions, applications, returns, balancing and the like are all activities that, in the banking industry, are often handled in batch processing cycles. In the longer term many of these batch processes may move to real-time, but there are plenty of benefits associated with the current batched approach.
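
To make the batch cycle concrete, here is a minimal sketch in Python of a nightly interest run over staged account records. The account numbers, rates and function names are hypothetical and are not drawn from any bank's actual system; the point is simply that a whole staged file is processed in one scheduled pass rather than transaction by transaction.

```python
from dataclasses import dataclass
from datetime import date


@dataclass
class Account:
    number: str
    balance: float       # end-of-day balance from the staged extract
    annual_rate: float   # e.g. 0.015 for 1.5% per annum


def nightly_interest_run(accounts: list[Account], run_date: date) -> list[dict]:
    """Process the whole staged file in one pass and return one posting per account."""
    postings = []
    for acct in accounts:
        daily_interest = acct.balance * acct.annual_rate / 365
        postings.append({
            "account": acct.number,
            "date": run_date.isoformat(),
            "interest": round(daily_interest, 2),
        })
    return postings


if __name__ == "__main__":
    staged = [Account("1001", 2500.00, 0.015), Account("1002", 10000.00, 0.012)]
    for posting in nightly_interest_run(staged, date(2012, 6, 25)):
        print(posting)
```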

Practically speaking, batch processing often represents a better value proposition for activities that involve pre-staged or staged data. Even if your system is a genuine real-time OLTP system, its statistical and key-figure reporting is likely still bound up in batched processing. Beyond this summarization, aggregation reporting and data staging, there is the simple fact that OLTP resources capable of supporting everything you could conceivably want to do are expensive, and some processes really are not so urgent that they demand massively capable infrastructure be available 24x7.
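
To illustrate the summarization and key-figure point, here is a hedged sketch of an overnight aggregation step that rolls individual transactions up into per-branch totals for reporting; the branch codes and field names are invented for the example.

```python
from collections import defaultdict


def summarize_by_branch(transactions: list[dict]) -> dict[str, float]:
    """Roll raw transactions up into per-branch totals for overnight reporting."""
    totals: dict[str, float] = defaultdict(float)
    for txn in transactions:
        totals[txn["branch"]] += txn["amount"]
    return dict(totals)


daily_txns = [
    {"branch": "EDI-01", "amount": 120.00},
    {"branch": "EDI-01", "amount": -45.50},
    {"branch": "GLA-02", "amount": 300.00},
]
print(summarize_by_branch(daily_txns))  # {'EDI-01': 74.5, 'GLA-02': 300.0}
```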

Just the notion of point-in-time recovery of your system, for example, can be incredibly expensive in terms of system resources and equipment. From a hardware perspective we have features like database mirroring, hardware redundancy, multithreading operating systems and built-in resource redundancy for failover contingency. For this reason too, database software companies have developed technologies such as archive logs that facilitate point-in-time recovery of the database without having to revert to a state that only reflects the system as it was at the last backup. This capability has improved disaster recovery and the confidence that systems can be restored to an accurate state even after unexpected mishaps.
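
The archive-log idea can be sketched conceptually. The Python below is not any particular database vendor's API; it simply illustrates restoring from the last full backup and then replaying an append-only change log only up to the requested recovery point.

```python
from datetime import datetime


def point_in_time_recover(backup: dict, change_log: list[dict], recover_to: datetime) -> dict:
    """Start from the last full backup, then replay logged changes up to recover_to."""
    state = dict(backup)
    for entry in sorted(change_log, key=lambda e: e["ts"]):
        if entry["ts"] > recover_to:
            break  # ignore changes made after the recovery point
        state[entry["key"]] = entry["value"]
    return state


backup = {"acct-1001": 2500.00}
log = [
    {"ts": datetime(2012, 6, 20, 9, 0), "key": "acct-1001", "value": 2600.00},
    {"ts": datetime(2012, 6, 21, 9, 0), "key": "acct-1001", "value": 1900.00},
]
# Recover to end of day on 20 June: the later change is not replayed.
print(point_in_time_recover(backup, log, datetime(2012, 6, 20, 23, 59)))
```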

As I said earlier, batch processes aren't going to go away any time soon. In fact, major analyst firms estimate that some 70-90 percent of enterprise integration requirements are for batch processes, and analyst research further suggests that batch processes often represent a significant contribution to planned system downtime. Moving that figure will take considerable cost and effort, and it doesn't necessarily make sense to invest in changing it even where there is the will to commit to it.

If one considers that batch processes are often bound up in automation tools, one becomes acutely aware that these batched automation procedures actually provide greater visibility into the circumstances of the business and can provide reassurance about the integrity of processes. Compliance strictures introduced by legislation like Sarbanes-Oxley and HIPAA are more easily addressed in particular, since processing activity is readily identifiable through batch processing audit reports that are easy to find and consume.
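
On the audit point, here is a small, hedged sketch of how a batch run might append a per-run audit record so that what was processed, when, and by which job remains easy to identify afterwards; the field names are illustrative and not drawn from any particular compliance standard or scheduler.

```python
import json
import uuid
from datetime import datetime, timezone


def run_batch_with_audit(job_name: str, records: list[dict], audit_file: str) -> None:
    """Process the records, then append a single audit entry describing the run."""
    run_id = str(uuid.uuid4())
    processed = 0
    for _record in records:
        processed += 1  # the real processing step would go here
    audit_entry = {
        "run_id": run_id,
        "job": job_name,
        "completed": datetime.now(timezone.utc).isoformat(),
        "records_processed": processed,
    }
    with open(audit_file, "a", encoding="utf-8") as fh:
        fh.write(json.dumps(audit_entry) + "\n")


run_batch_with_audit("payroll-nightly", [{"id": 1}, {"id": 2}], "batch_audit.log")
```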

My second point around this whole fiasco concerns testing and test planning. In an ideal environment, any change you plan to institute in a system should be tested in a development environment, performance- and regression-tested in a QA or pre-production environment, and only after all testing is done and all issues are addressed should it be scheduled for application against productive systems.
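
In that spirit, here is a hedged sketch of the kind of regression test that might gate a batch change before it is promoted out of the development environment; the interest routine and the expected values are hypothetical, and the test uses the pytest convention of plain assert statements.

```python
# test_interest_run.py -- run with `pytest` before the change is promoted
def daily_interest(balance: float, annual_rate: float) -> float:
    """The routine under change: daily interest on an end-of-day balance."""
    return round(balance * annual_rate / 365, 2)


def test_daily_interest_matches_known_good_values():
    # Expected values agreed with the business before the change was scheduled.
    assert daily_interest(2500.00, 0.015) == 0.10
    assert daily_interest(10000.00, 0.012) == 0.33
    assert daily_interest(0.00, 0.015) == 0.00
```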

The RBS incident reminds us that even the most mundane of changes can have very far-reaching implications. There will no doubt be a lot of finger pointing and deconstruction of the events that led up to the problem, but the most important questions to ask are: why did this change happen mid-week, and was it tested adequately before it was agreed to?

As the post mortem of the event progresses we will no doubt learn more. For ourselves at least, we should take away the lesson and make sure that our own batch and non-batch processes are not put at risk by poorly planned, poorly tested and ill-prepared-for changes to our own systems.

More Stories By Clinton Jones

Clinton Jones is a Product Manager at Winshuttle. He is experienced in international technology and business process with a focus on integrated business technologies. Clinton also serves as a technical consultant on technology and quality management as it relates to data and process management and governance. Before coming to Winshuttle, Clinton served as a Technical Quality Manager at SAP. Twitter @winshuttle
