Get More Out of Your Batch Window
The typical batch window is finite, but your business needs aren’t.
Mainframe Batch Processing
Batch processing is used for some of an organization’s most critical and time-sensitive business operations. It is also used across industries for database maintenance, bulk database updates, ETL for populating data warehouses, running analytics on those warehouses, creating backups, archiving historical data, and so on. Batch jobs perform many read, write, and sort operations on sequential files without manual intervention or input, unless, of course, a job does not end successfully. Batch processing often takes place during off-hours or non-peak hours, and it is not unusual for OLTP to wait on its completion, since online transactions require files and tables to be current.
Pressures on the batch window
The timing of execution and completion within a predefined time window (or windows) is critical to businesses. This is made difficult in today’s connected world, where demand for 24/7 OLTP is the norm and global operations have removed any breathing room from the batch window. Mobile and e-commerce have also put tremendous pressure on batch processing with their need to be responsive and handle transactions at all hours. Other pressures on the batch window come from the need to handle larger volumes of data and to incorporate additional functions. Naturally, with these changes, the processing time required to complete batch jobs increases, sometimes exceeding the available batch windows and often leading to extreme congestion within them.
Failure to complete batch processing on time can adversely impact the company’s ability to deliver value and even force changes to the company’s business model. To complicate matters, there can be statutory limitations associated with completing these activities, such as crediting interest to customers, producing paychecks for payees, and generating payments to business partners. Additionally, there can be financial penalties when Service Level Agreements (SLAs) are not met. Therefore, it is vital that batch processing complete as efficiently and as quickly as possible so that companies can maintain their operations, fulfill their commitments to customers, employees, and business partners, and meet their legal obligations.
Some estimates show that up to 50% of mainframe workloads are batch, and that means these problems are front-and-center.
Mainframe batch performance and cost optimization techniques
High-Performance In-Memory Technology
High-performance mainframe in-memory technology can be used to accelerate your existing batch applications, particularly those in environments with ultra-high transaction processing rates. It augments the database, as well as existing batch optimization techniques such as data buffering. In-memory technology works by allowing select, frequently referenced data to be accessed through a much shorter code path than conventional database access.
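The general pattern can be illustrated outside the mainframe context. The sketch below is a hypothetical Python analogy (real products such as tableBASE work with z/OS batch programs, not Python): reference data that a batch job looks up once per record is preloaded into memory, replacing a long per-record database code path with a short in-memory one. All names and timings here are illustrative assumptions.

```python
import time

# Hypothetical reference table; in a real batch job this would live in a DBMS.
REFERENCE_TABLE = {code: f"desc-{code}" for code in range(10_000)}

def db_lookup(code):
    """Simulated per-record database call (the long code path)."""
    time.sleep(0.0001)  # stand-in for I/O and DBMS call overhead
    return REFERENCE_TABLE[code]

def run_batch(records, lookup):
    """Process every record through the supplied lookup function."""
    return [lookup(r) for r in records]

records = list(range(1_000))

# Long path: one simulated database call per record.
start = time.perf_counter()
out_db = run_batch(records, db_lookup)
slow = time.perf_counter() - start

# Short path: preload the reference data once, then hit memory directly.
cache = dict(REFERENCE_TABLE)
start = time.perf_counter()
out_mem = run_batch(records, cache.__getitem__)
fast = time.perf_counter() - start

print(f"db path:   {slow:.4f}s")
print(f"in-memory: {fast:.4f}s")
```

The results are identical either way; only the code path per lookup changes, which is where the batch-window savings come from.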
IT Business Intelligence for Batch
IT organizations collect tremendous amounts of data about their own computing resources every day, from mainframes and midrange servers, whether on-premises or in third-party datacenters. So much data is collected that you could call it their own “IT Big Data.” With the right toolsets, this IT data can be used to reduce the cost of batch running on your mainframe and can help identify low-priority batch candidates to offload to other platforms.
IT business intelligence identifies lower-priority batch workloads that are potential candidates for reprioritization, replatforming, or even elimination. This can directly contribute to improved performance for business-critical batch workloads, especially during peak hours.
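As a rough sketch of the idea, the snippet below filters hypothetical job-accounting records to flag low-priority jobs that burn significant CPU inside the peak window. The job names, fields, and threshold are all invented for illustration; a real IT business intelligence toolset would work from SMF-type accounting data and business metadata.

```python
# Hypothetical job-accounting records (names and fields are illustrative).
jobs = [
    {"name": "PAYROLL01", "priority": "critical", "cpu_sec": 5400, "peak_window": True},
    {"name": "RPTGEN07",  "priority": "low",      "cpu_sec": 3200, "peak_window": True},
    {"name": "ARCHV02",   "priority": "low",      "cpu_sec": 900,  "peak_window": False},
    {"name": "ETLLOAD3",  "priority": "medium",   "cpu_sec": 2100, "peak_window": True},
]

def offload_candidates(jobs, cpu_threshold=1000):
    """Flag low-priority jobs consuming significant CPU during the peak window."""
    return [
        j["name"]
        for j in jobs
        if j["priority"] == "low"
        and j["peak_window"]
        and j["cpu_sec"] >= cpu_threshold
    ]

print(offload_candidates(jobs))  # → ['RPTGEN07']
```

Jobs flagged this way become candidates for rescheduling outside the peak window, replatforming, or elimination, freeing capacity for business-critical batch.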
Soft Capping Automation for Batch
Automated soft capping addresses two powerful needs on the mainframe side of every data center: the need to control costs, and the absolute need to avoid performance capping of critical work. This is accomplished by automatically adjusting the capacity settings of individual LPARs and borrowing capacity between LPARs.
In this way, it is possible to ensure that critical processing will not be capped, while allowing capping on lower-priority LPARs, such as those dedicated to development, test, and staging. Using this technology, it is possible to reduce resource usage by 10–15% or more without capping your mission-critical workloads.
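The borrowing logic can be sketched as follows. This is a simplified, hypothetical Python model, not vendor code: protected LPARs get their caps raised to meet demand, and the overshoot is reclaimed from the unused headroom of non-protected LPARs so the group total never exceeds the overall limit. LPAR names, MSU figures, and the rebalancing policy are assumptions for illustration.

```python
def rebalance(lpars, group_limit):
    """Redistribute Defined Capacity (MSUs) toward protected LPARs.

    Protected LPARs are raised to cover their demand; non-protected
    LPARs donate unused headroom so the group total stays within
    group_limit (keeping software charges bounded).
    """
    # Step 1: raise each protected LPAR's cap to at least its demand.
    for lp in lpars:
        if lp["protected"]:
            lp["cap"] = max(lp["cap"], lp["demand"])

    # Step 2: shrink donor caps (down to their demand) to absorb overshoot.
    excess = sum(lp["cap"] for lp in lpars) - group_limit
    for lp in lpars:
        if excess <= 0:
            break
        if not lp["protected"]:
            headroom = max(lp["cap"] - lp["demand"], 0)
            give = min(headroom, excess)
            lp["cap"] -= give
            excess -= give
    return lpars

lpars = [
    {"name": "PROD", "protected": True,  "cap": 400, "demand": 480},
    {"name": "DEV",  "protected": False, "cap": 200, "demand": 90},
    {"name": "TEST", "protected": False, "cap": 150, "demand": 60},
]
rebalance(lpars, group_limit=750)
for lp in lpars:
    print(lp["name"], lp["cap"])  # PROD rises to 480; DEV donates 80 MSUs
```

Here PROD’s cap rises from 400 to 480 to meet demand, and DEV’s cap drops from 200 to 120, so the group total remains at 750 MSUs while the critical LPAR is never capped.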
Want to Learn More?
Mainframe batch optimization
DataKinetics Z Performance and Optimization solutions can help large IT organizations running mission-critical batch processing to reduce their execution times by anywhere from 2× to two orders of magnitude, depending on their business type and the specific characteristics of their batch applications. Each solution can make a significant difference by itself; together, they can make a large dent in batch processing resource usage, reduce batch run times, and even help reduce operations costs. All this without the need for a system upgrade, using the mainframe assets that you have now.
Learn more about the technologies that support mainframe batch processing optimization.
tableBASE can lower your costs and mainframe TCO, dramatically reduce batch processing time, and optimize your MIPS usage, delivering performance benefits and more powerful, efficient applications.
IT Business Intelligence
IT Business Intelligence collects capacity and performance data from the IT infrastructure and combines it with business information such as costs, which applications and organizational units are using the resources, and for what activities.
Soft Capping Automation
With Soft Capping Automation you can control your workload license charges by dynamically modifying each LPAR’s Defined Capacity limit, taking into account the behavior and needs of all your LPARs.