Tuesday, April 17, 2018

Compuware continues to lead in Agile DevOps for the mainframe

By Rich Ptak


Image courtesy of Compuware, Inc.

Compuware continues to add to and extend its mainframe solutions as it advances its campaign to mainstream the mainframe. This time it delivers two major innovations that help customers preserve, advance and protect their mainframe investments.

Before we get into the innovations, we want to mention Electric Cloud, a new partner that proactively integrated its service through the Compuware open API. This is the latest example of Compuware’s open-borders approach: integrating with a variety of solutions to help customers build out their DevOps toolchains.


Now, onto the announcements. First, a new product, Compuware zAdviser, leverages machine learning and intelligent analysis for continuous mainframe DevOps improvement. It provides development managers with multi-level analysis of tool usage and performance data, focused on the critical DevOps KPIs (key performance indicators) of application quality, development team efficiency and velocity. All are also key to agile development. Even better, the product is free to Compuware customers.

Second is a new GUI for Compuware’s ThruPut Manager, which provides intuitive, actionable insight into how batch jobs are being initiated and executed, as well as their impact on cost. Users can leverage graphical visualizations of the batch jobs waiting to execute and when they might run. In-depth detail on why a job has been waiting is also easily obtained.

zAdviser + KPIs + Measurement = Success
Mainframe KPIs are a must if organizations want to compete successfully in the digital age. After all, you can’t improve what you can’t measure, and if you’re not continuously improving, you are wasting your time and, worse, your customers’ time. Teams must also be able to prioritize and measure the KPIs that will directly impact development and business outcomes.

A Forrester Consulting study conducted on behalf of Compuware found that over 70% of responding firms had critical customer-facing services reliant on mainframe operations. Providing the customer with an exceptional experience, not simply good, clean code, has become the new measure of operational success.
The same study found that enterprises are doing a good job of tracking application quality but are considerably less attentive to efficiency and velocity. Yet, to modernize their application development strategies and keep pace with changing market conditions, firms must place as much focus on velocity and efficiency as they do on quality.

Compuware zAdviser uses machine learning to identify patterns that impact the quality, velocity and efficiency of mainframe development by exposing correlations between a customer’s Compuware product usage and the KPIs. Equipped with empirical data, IT leadership can identify which capabilities within the tools developers should exploit to improve. The days of simply beating the drum to go faster are gone; machine learning shows where improvement will actually come from.
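To illustrate the kind of correlation analysis described above, here is a minimal sketch in Python. The metric names and data are invented for illustration; zAdviser’s actual data model and algorithms are proprietary and certainly far richer than this.

```python
# Hypothetical illustration: correlating a tool-usage metric with a KPI.
# Metric names and data are invented; this is not zAdviser's actual model.

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

# Weekly counts of automated test runs vs. escaped defects (made-up data).
test_runs       = [12, 15, 22, 30, 34, 41]
escaped_defects = [ 9,  8,  6,  5,  4,  2]

r = pearson_r(test_runs, escaped_defects)
print(f"correlation: {r:.2f}")  # strongly negative: more testing, fewer defects
```

A strong negative correlation like this is the sort of empirical signal that lets a manager argue from data, rather than intuition, that a particular tool capability is worth pushing.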

ThruPut Manager: Visualization for Batch Execution
Compuware’s ThruPut Manager brought automated optimization to batch processing. It automates resource-allocation decisions by balancing the needs of multiple interested parties. That involves cost-benefit tradeoffs between risks and costs, such as risking SLA (service level agreement) violations of timely service delivery to avoid a costly increase in software MLC (monthly license charge) fees.
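A toy model makes the tradeoff concrete. All figures, weights and function names below are invented for illustration; ThruPut Manager’s actual decision logic is proprietary and weighs far more factors than this.

```python
# Toy model of the batch-scheduling tradeoff described above: running a job
# now may push up software MLC charges, while deferring it risks an SLA
# violation. All figures are invented for illustration.

def schedule_now(mlc_cost_increase, sla_penalty, sla_violation_risk):
    """Return True if running the job now is the cheaper expected choice.

    mlc_cost_increase  -- estimated extra software charge if run now
    sla_penalty        -- cost of missing the SLA if the job is deferred
    sla_violation_risk -- probability (0..1) that deferral misses the SLA
    """
    expected_deferral_cost = sla_penalty * sla_violation_risk
    return mlc_cost_increase < expected_deferral_cost

# Deferring risks a $50,000 SLA penalty with 30% probability ($15,000
# expected); running now adds an estimated $8,000 in MLC charges.
print(schedule_now(8_000, 50_000, 0.30))   # True: run now
print(schedule_now(20_000, 50_000, 0.30))  # False: defer
```

Even this two-variable caricature shows why the real decision, multiplied across hundreds of jobs with interacting deadlines, takes years of experience to make well by hand.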

Compuware reports that batch processing jobs account for about 50% of mainframe workloads!

Today’s complex environments compound the problem with a bewildering number of choices, combinations and alternatives to consider in making these decisions. The amount of data, the competing interests and the number of options mean it takes years of experience to achieve even a reasonable level of competence at this task. Further, the shortage of such seasoned staff means that these operationally critical decisions are increasingly left to new-to-the-mainframe staff lacking that experience.

ThruPut Manager’s new web interface gives operations staff an intelligible, visual representation of the cost/benefit tradeoffs as they work to optimize workload timing and resource performance.

In combination with Compuware Strobe, ops staff can more easily identify potential issues. They can manage and balance competing metrics relating to cost, resource allocation, service policies and customer interests to make the best decisions for optimizing the workloads, as well as application performance.

A big part of ThruPut Manager’s advantage is the multiple drill-down views it provides. Starting with an overview, which displays data about the General Services and Production Services queues, users can drill down to a detailed view of specific job data and job history, as well as where work is getting selected. The GUI also collects and displays the R4HA (rolling four-hour average) information for the last eight hours. And if the Automated Capacity Management feature is constraining less important workload to mitigate the R4HA, this is displayed on the graph.
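The R4HA itself is conceptually simple: a rolling average of MSU consumption over the trailing four hours, which is what sub-capacity MLC pricing is billed against. A minimal sketch of the computation, with invented sample values (real figures come from z/OS SMF data, not from code like this):

```python
# Minimal sketch of a rolling four-hour average (R4HA) over MSU samples
# taken every 5 minutes, i.e. a trailing window of 48 samples. The sample
# values are invented; real R4HA data comes from z/OS SMF records.

from collections import deque

def rolling_r4ha(msu_samples, interval_minutes=5, window_hours=4):
    """Yield the average over the trailing four-hour window of samples."""
    window_len = (window_hours * 60) // interval_minutes  # 48 samples
    window = deque(maxlen=window_len)                      # drops oldest
    for msu in msu_samples:
        window.append(msu)
        yield sum(window) / len(window)

# Six hours of flat 100-MSU load followed by a 30-minute 400-MSU burst
# shows how the four-hour window smooths short spikes.
samples = [100] * 72 + [400] * 6
r4ha = list(rolling_r4ha(samples))
print(f"peak R4HA: {max(r4ha):.1f} MSU")  # well below the 400-MSU burst
```

This smoothing is exactly why deferring or constraining low-priority batch work for even a short time can hold the billing peak down, which is the lever Automated Capacity Management pulls.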

The Final Word
Mainframe workloads continue to increase even as experts steadily leave the workforce and responsibilities shift to mainframe-inexperienced staff. Organizations must constantly work to modernize mainframe environments and remove impediments to innovation to not only increase their business agility, but also attract a new generation of staff to the platform.

Compuware zAdviser provides concrete data that lets mainframe staff link actions taken to improve performance with measured KPI results. DevOps management and staff have access to intelligible, visual detail on the impact of those changes.

Compuware ThruPut Manager provides much-needed clarity and insight to fine-tune batch execution for optimal value, easing budget stresses while fulfilling business imperatives.

These products provide strong evidence of Compuware’s ability to create innovative ways to identify and resolve challenges in mainframe development, management and operations that have long been barriers to its wider use. The entire team deserves a salute for their 14th consecutive quarter of very agile delivery of solutions that are driving the mainframe more and more into the mainstream of 21st century computing. Congratulations once again for your efforts.

Monday, April 16, 2018

Risky Data: GDPR outside the EU

By Bill Moran and Rich Ptak


Image courtesy European Commission
GDPR (the General Data Protection Regulation), the new privacy law enacted by the European Union, comes into full force in May 2018. The law is an attempt to enforce ownership rights over, and protect the use of, an individual's data collected by enterprises. This is the first in a series of articles on the concerns and impact of GDPR on companies not physically based in the EU but that deal with EU residents either directly (such as by selling services or products) or indirectly, by doing business with a firm with EU-resident customers. Note, we are not attempting to provide a detailed legal analysis. This is intended as advisory, awareness-raising commentary on what appears to us to be a potentially highly disruptive trend.

A major driving force behind the GDPR mandates has been documented abuse, along with the increasingly evident potential for misuse, of collected information, perhaps best represented by the highly profitable sale of access to customer data by social-media giants, Facebook[1] being just one example.

Added to this growing, widespread public awareness of data abuse is the exposure of the casual, if not callous, attitude of industry executives, data sellers and buyers alike, convinced that profitable exploitation of the data is their exclusive right.



It is very likely that GDPR-type restrictions will be initiated and imposed by the US and other non-EU national governments. The repeated disclosure of personal information obtained from corporate databases by hackers lends further impetus to such efforts. Anyone doubting the risk can easily find evidence with a simple internet search[2].


GDPR’s initial focus is on returning ownership and control of personal data to the individual. To that end, GDPR requires that the entity requesting data obtain explicit, informed consent from the individual[3] for both the collection and the use of the requested data. Both the request and the consent must be visible and explicit. Specifically, they cannot be buried in a long, detailed statement of intent, in a blanket user agreement, or in formal terms and conditions for licensing or other contractual arrangements. The expectation is that this will take significant effort. There are many more details, which we will discuss in upcoming reports. First, let’s look at plans for enforcement.

GDPR establishes severe penalties for companies violating an individual’s data rights, e.g. a fine of up to 4% of an enterprise’s worldwide revenue for repeat offenders. For corporations with hundreds of billions of euros in revenue, this could equal billions of euros. The law applies to the data of both EU citizens and EU residents. Accountability extends to any company, anywhere, that maintains personal information on EU residents and/or citizens in its systems. Personal information is very broadly defined as anything that allows identification of an individual person; this broad definition appears to include even a simple URL.
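The scale of the exposure is easy to work out. Taking an invented revenue figure purely for illustration, the 4% cap computes as:

```python
# Back-of-the-envelope GDPR exposure at the 4%-of-worldwide-revenue cap.
# The revenue figure is invented for illustration only.
worldwide_revenue_eur = 100e9                 # a firm with €100 billion revenue
max_fine = 0.04 * worldwide_revenue_eur       # 4% cap
print(f"maximum fine: €{max_fine / 1e9:.0f} billion")  # €4 billion
```

A potential multi-billion-euro fine is what turns GDPR from a compliance checkbox into a board-level risk.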

Our series of articles will focus on issues and actions of concern to companies that may or may not currently do business in the EU but have information on EU citizens/residents in their databases. There are also secondary players, such as suppliers to multinationals, that receive or exchange data about individual EU residents. Such suppliers will likely be asked to adhere to GDPR requirements or to implement GDPR-compliant data protection policies. An example is a US-based airline holding (TSA-mandated) information from a ticket purchase by an EU resident. Virtually any enterprise anywhere doing business with any EU resident falls under GDPR.

We will not focus on the issues of large multinationals with significant EU business, which have the staff, legal and technical, to address them. They are directly subject to EU law and have had several years to prepare.

Our next installment will discuss open questions and implementation risks. It will be posted in approximately two weeks.







[2] Searching “hacker obtain personal data from corporate information” returns 48.8M results; searching “hacker personal data corporate data” returns 127M.


[3] In the case of a minor, the parents or legal guardian must consent. Here you can see how GDPR requirements will spawn severe problems when an organization tries to implement them. We are not necessarily opposed to the concept, but significant effort may be required to implement it. What exactly is the process to contact the parents and obtain this consent? If you ask children who their parents are, will they tell the truth, or will they identify someone they know will give permission?
 

Tuesday, April 10, 2018

IBM Z Systems – for enterprises of all sizes

By Rich Ptak


Image courtesy of IBM, Inc.
When Ross Mauri, General Manager IBM Z, briefed us on their newest offering, he quoted Steve Jobs, “You’ve got to start with the customer experience and work back toward the technology – not the other way around.” Not bad advice.

We long ago learned that selling IT (both products and services) on the basis of technological “speeds ‘n feeds” was a non-starter for many buyers. We found success by listening to clients to understand what the client was trying to achieve, then identifying what they need to succeed. It is apparent that IBM is listening and agrees.

Announced were two new additions to the IBM Z® family. First is the IBM z14™ ZR1, built to enhance trust in a highly secure cloud. Next is the IBM LinuxONE™ Rockhopper II, which offers flexibility and speedy scale-up growth.

Prior to hearing any details on these new systems, we had a number of informal discussions with mainframe users attending Think 2018. Here is what they were hoping to hear from IBM about mainframes:
  • Significantly increased processing power with multiple configuration options,
  • More flexibility and simplicity in system infrastructure configuration,
  • Standardization that allows semi-customized systems,
  • Expanded I/O capability,
  • Smaller overall footprint,
  • Pricing transparency,
  • App security.
No real surprises. With this list in mind, let’s examine the market issues IBM is addressing with the newest additions to the Z product family. 

Digital Transformation hits every data center
Digital transformation forced enterprises to confront an increasing number of challenges: threats to security, extreme spikes in workloads, and more. The impact on data centers was significant, felt especially in the demand for strong, broad-based security; extensive, intelligent analytics; automated machine-learning capabilities; and open, connected, secure cloud services.

When IBM designed and introduced the Z family to address these challenges, they were primarily the concern of large-scale enterprises. Today, digital transformation continues to spread to the extent that these challenges are being experienced in enterprises and businesses of all sizes.

The new additions to the Z family are IBM’s response. While they share common family capabilities, such as pervasive encryption, Secure Service Containers, analytics, machine learning, etc., they also include extensive enhancements to address the most pressing customer and user needs.

New 19” Rack configuration
Design standardization in both the z14 ZR1 and the LinuxONE™ Rockhopper II, along with a smaller I/O configuration, means customers can choose server, switch and storage elements that fit their needs. For example, both fit in a standard 19” rack, leaving significant (16U) in-frame space available for other components. This gives maximum flexibility and scalability.

For system administrators, new mobile management software allows remote systems monitoring and management, including push notification of events, for more efficient operation.

For those worried about response times for I/O-sensitive workloads, the IBM zHyperLink Express offers a direct-connect, short-distance link between z14 servers and FICON storage. IBM has found it can cut response times by up to 50%. OLTP workloads get much faster access to data, and batch processing windows shrink as Db2 index splits go faster. The result is increased customer satisfaction and lower operational costs.

For those concerned with extra security for software virtual appliances, there is IBM Secure Service Container. Available on both Z and LinuxONE, it is a Docker-based container capability that serves as a secure platform to build and deliver remote services. Both data and execution code are isolated and protected from threats, internal or external, malicious or inadvertent.
 
Speeds n’ Feeds
This section is for the “speeds ‘n feeds” folks. Here are a few tech specs; you can find more from IBM. For the z14 ZR1, the number of processors (4, 12, 24 or 30) is fully configurable. The entry-level system provides a full 88 MIPS at capacity setting A01. RAIM memory runs from a minimum of 64 GB to a maximum of 8 TB. IBM expects the largest z14 ZR1 configuration to provide up to 13% more total z/OS capacity and up to 60% more Linux on Z capacity than the largest z13s.

For LinuxONE Rockhopper II the number of cores is also configurable (4, 12, 24, 30).
The statistics go on and on. In short, these new systems were designed from the very start to meet the demands and needs of real customers.

In Summary
IBM has indeed listened to its customers. Nearly every hot button item on the list we collected from clients has been addressed. Pricing was not mentioned in the session.

However, in a separate briefing just before the announcement, IBM indicated that the price points for these systems are set to keep current customers as well as attract new clients and workloads to mainframe platforms.

Other items touched on during the announcement, and apparent at THINK 2018, were aggressive efforts to add partners and alliances to the mainframe ecosystem. There is a much more visible focus on developers, with stronger DevOps products and API enhancements. Also apparent is the aggressive attention paid to enlarging the number of applications and open-source solutions running on the mainframe.

So, it was no surprise when Mr. Mauri indicated that last year, IBM had the largest number of new-to-the-mainframe customers in a decade. He also indicated that he now has a dedicated sales force pursuing opportunities.

For our part, we are seeing a lot more interest in the mainframe. From pervasive encryption to the extensive efforts in mainframe education to the increasing success in promoting the mainframe as a mainstream solution, the number of mainframe believers appears to be growing. Congratulations to Ross Mauri, his whole team and partners on their success so far. And, be sure to check out these additions to the Z family.