Wednesday, March 7, 2018

Speeding the advancement of quantum computing

By Rich Ptak

Google’s announcement of a 72-qubit chip has generated a lot of noise about Google’s supposed
“quantum supremacy” and quantum industry leadership. We think it’s time to take a breath, understand a few terms, and discuss what’s real (and really available) in the world of quantum computing.

First, strictly speaking, the term quantum supremacy refers to the potential ability of quantum computing devices to solve problems that classical computers practically cannot. It is also used informally to describe definitively demonstrating that quantum computers can outperform classical computers in problem-solving. It is about identifying the specific problem, or class of problems, for which quantum computers find solutions faster, more accurately, etc. than a classical computer can. That term most definitely does NOT simply refer to the chip with the highest qubit count.

Next, both quantum and classical computing solve problems using algorithms (step-by-step instructions taken to reach a solution). To date, efforts have failed to produce definitive proof of quantum superiority. Part of the problem lies in the limited ability to think quantum – that is, understanding and defining problems in quantum terms. One quantum computing developer who is helping in that arena is IBM, with IBM Q Experience (see below).
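To make “thinking quantum” a bit more concrete, here is a minimal, illustrative state-vector sketch in plain NumPy (not a real quantum SDK such as QISKit; the code is our own simplification): a Hadamard gate puts a single qubit into an equal superposition, and applying a second Hadamard interferes the amplitudes back into the starting state. This manipulation of amplitudes, rather than definite bit values, is the kind of step quantum algorithms are built from.

```python
import numpy as np

# A single qubit starts in |0>; a Hadamard gate creates an equal
# superposition; a second Hadamard interferes it back to |0>.
ket0 = np.array([1.0, 0.0])                   # the |0> state
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)  # Hadamard gate

superposed = H @ ket0             # amplitudes (1/sqrt(2), 1/sqrt(2))
probs = np.abs(superposed) ** 2   # measurement probabilities: 50/50

recombined = H @ superposed       # interference restores |0>

print(probs)       # [0.5 0.5]
print(recombined)  # [1. 0.] up to floating-point rounding
```

A classical bit has no analogue of the middle step: the qubit is not “0 or 1 with some probability” but a vector of amplitudes that can cancel or reinforce.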

Next, let’s dispense with the idea that computing performance is simply related to the number of qubits, i.e. that more qubits automatically equal more power. The NUMBER of qubits is less important than the QUALITY of the qubits. A high-quality 16-qubit array, i.e. stable, long-lived, controllable qubits, can be more powerful than a 100-, 500-, or 2,000-qubit device of less stable, shorter-lived, inaccessible, low-quality qubits. IBM’s proposed quantum volume metric is based on such characteristics.
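As a rough illustration of why quality can trump raw qubit count, here is a simplified sketch in the spirit of IBM’s quantum volume idea. The actual metric is defined differently; the function and all the depth figures below are invented for illustration only. The intuition: useful capability is limited by the smaller of circuit width (qubits used) and the circuit depth the device can sustain reliably at that width.

```python
# Simplified, illustrative take on the quantum-volume intuition:
# log2(QV) = max over width n of min(n, reliable depth at width n).
# The devices and depth numbers below are hypothetical.

def quantum_volume(width_to_depth):
    """width_to_depth maps circuit width -> reliable depth at that width."""
    return 2 ** max(min(n, d) for n, d in width_to_depth.items())

# Hypothetical device A: few qubits, but deep, reliable circuits.
device_a = {4: 40, 8: 30, 16: 20}
# Hypothetical device B: many qubits, but errors limit depth severely.
device_b = {16: 3, 100: 2, 2000: 1}

print(quantum_volume(device_a))  # 65536  (limited at min(16, 20) = 16)
print(quantum_volume(device_b))  # 8      (limited at min(16, 3) = 3)
```

On these invented numbers, the 16-qubit device vastly outscores the 2,000-qubit one, which is exactly the point: a metric built on qubit quality and sustainable depth tells a very different story than qubit count alone.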

Finally, the real interest is in proving the UTILITY of quantum computing, i.e. the ability to actually do work. To that end, in 2016 the IBM Q Experience[1] introduced free access to quantum computing to the interested public (students, researchers, individuals, etc.). It offers a complete system, including a developer’s kit (e.g. IBM’s QISKit[2]), not just a chip. Initially a 5-qubit processor, it added a 16-qubit processor in 2017. Participants include more than 1,500 universities, 300 high schools, and 300 private institutions worldwide. The results are impressive: more than 76,000 users have run 2.9 million experiments and delivered more than 60 research papers.

In addition, for commercially oriented clients, there is the 20-qubit-based IBM Q Network, announced in late 2017. Active participants include a dozen Fortune 500 companies, academic institutions, and national labs, among them JP Morgan Chase, Daimler AG, Samsung, Barclays, Honda, Keio University, Oxford University, the University of Melbourne, and Oak Ridge National Lab. A 50-qubit system is planned to be available by end-of-2018.

All this effort is to conclusively demonstrate quantum utility, not just quantum supremacy. Quantum computing will prove its value in implementation and application. That will happen as knowledge and the ability to think in quantum terms spreads into the community. IBM, with its open access to quantum systems, is helping that happen today. That is leadership.

You can find out more about IBM's activities in our September 2017 write-up on quantum computing and in the Featured Post (see column on the right) on the IBM Q Network. More is coming.

Tuesday, January 16, 2018

Mainstream the Mainframe with Automated Multi-platform Continuous Code Quality Management

By Rich Ptak

On January 4, 2018, Compuware marked its 13th quarterly announcement since embarking on its mission to mainstream the mainframe. 

Superstition holds that the 13th of just about anything is bad luck. (Doubt this? Look for the 13th floor in buildings![1]). Exceptions exist, e.g. the accordion was patented on Friday, January 13, 1854. The Los Angeles Hollywood sign was unveiled on July 13th, 1923.

Another such exception is Compuware's announcement of the integration of Topaz with SonarSource's SonarQube, a popular continuous code quality solution used by more than 900 digitally innovative enterprises, which provides continuous, automated inspection and analysis of COBOL code. Maintaining code quality through development and the inevitable subsequent changes during operations has been a major IT headache forever. In addition, the ability to automate code testing and analysis and get results that directly link errors to source code, highlight untested code, and present meaningful metrics for cross-platform code quality has been a next-generation dream for mainframe developers. Not any longer.

It's all here, and it is a significant advance for agile solutions development. Integrating Compuware Topaz for Total Test unit testing, which leverages Xpediter code coverage capabilities, with SonarQube will make life much easier for experienced as well as newcomer mainframe DevOps staff. Here's how we see it.

The Challenge
Despite advances in testing and structural coding processes, shipping error-free application code, consisting of thousands of lines written by an army of developers, is a practical impossibility. Frequent post-deployment updates and modifications further complicate the problem. Today’s blended IT environment means mainframe code must be continuously updated and integrated. This can involve work done by new-to-the-mainframe staff, which can increase risk due to unfamiliarity with complex and undocumented code, coupled with the integration and consolidation of IT operations. Code reviews, mentoring, and best practice training can lower bug counts and increase quality, but major problems remain in finding and correcting source code errors, not to mention poor programming practices.

Despite monumental efforts, human fallibility frustrates perfectibility, resulting in what has been a highly risky compromise: focusing on major problems while minimizing the rest. The dynamic and evolving IT ecosystem can change vulnerabilities and code priorities, making yesterday’s low-risk code a major liability, given the financial and operational penalties incurred when poor-quality code causes a public failure.

Compuware and SonarSource
The new integration automatically feeds code coverage results captured by Topaz for Total Test into SonarQube. The ability to automate code coverage tracking addresses a long-time, fundamental problem in COBOL code testing and management. Specifically, there existed no way to accurately track and validate code coverage of COBOL application testing—with the same ease and employing the same processes—as is available and done with Java and other more mainstream code. Even more significantly, it materially advances DevOps testing and coding processes with a new level of capability and precision in identifying errors, as well as weaknesses, in the code.

Topaz for Total Test’s automated unit testing and Xpediter’s code coverage collection capabilities coupled with SonarQube’s analysis and reporting enable developers to gain insight into the coverage of code being promoted for all application components across all platforms. Importantly, Compuware’s solution captures code coverage metrics directly from the source code itself, which is not only more accurate but also eliminates what was once a complex task predicated on knowing the idiosyncrasies of how source listing models map to the actual source code. Figure 1 below shows an example of the data provided in the SonarQube dashboard.

Further, the integration provides the capability to monitor and manage the overall quality not just of coding, but of testing as well. Developers get informative, data-driven feedback on code quality, along with a visual representation of the code being executed. They can see exactly what code is, and what is NOT, being tested.
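As a purely illustrative sketch (this is not Compuware's or SonarQube's actual mechanism; the function, line numbers, and data below are invented), per-line coverage of the kind described above reduces to a simple set computation: which executable lines did the tests actually exercise, and which did they miss?

```python
# Invented illustration of reducing per-line coverage data to the
# metrics a code-quality dashboard displays: percent covered, plus
# the exact lines no test ever exercised.

def coverage_report(executable_lines, executed_lines):
    """Return (coverage percentage, sorted list of untested lines)."""
    executable = set(executable_lines)
    covered = executable & set(executed_lines)
    untested = sorted(executable - covered)
    pct = 100.0 * len(covered) / len(executable)
    return pct, untested

# A program with 8 executable lines; the test run touched only 6.
pct, untested = coverage_report(
    executable_lines=[10, 11, 12, 20, 21, 22, 30, 31],
    executed_lines=[10, 11, 12, 20, 21, 30],
)
print(f"{pct:.1f}% covered; untested lines: {untested}")
```

The value in practice is less the percentage itself than the explicit list of untested lines: that is what lets a developer see precisely where test effort should go next.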

Figure 1 SonarQube Dashboard

In short, the developer has more information and better insight into code quality, along with the detailed data to decide how to adjust. And, organizations gain from getting better, more informative test results. An accelerated process and more detailed data reporting will help build stronger communications and operational links between development and operations staff.

The Final Word
Compuware’s mission to “mainstream the mainframe” has succeeded because of their ability to make mainframe utilization and COBOL code maintenance efficient and straightforward, as well as far less intimidating to new users. They have exposed as wrong or eliminated many of the arguments against the mainframe. Their strategy has mainframe DevOps and management transparently operating and participating in IT and business operations. 

Compuware has identified and solved problems that had been ignored or considered to be unsolvable. They integrated with multiple open systems tools and solutions for the mainframe environment. A series of alliances, add-ons, extensions, and new products improved and sped up mainframe developer productivity and simplified operations management. Bringing the latest DevOps tools to the mainframe is helping to attract a new generation of developers. Not incidentally, this has contributed significantly to the breakdown of organizational silos that historically have separated and isolated mainframe operations from enterprise IT and business centers. 

Compuware has attacked hardcore issues in multiple ways, including making mainframe capabilities accessible to the latest management, DevOps, and operations tools and solutions. They partnered with vendors to make mainframe solutions available on the AWS cloud and accessible via web interface services. They made open systems tools and solutions work with mainframe applications and data. They radically changed mainframe operations, speeding up, improving, and simplifying operations and management of the mainframe, all while bridging the gap between mainframe and open systems community activities.

Compuware delivered significant, game-changing products each quarter for the last three years. They have improved, simplified and sped up operations and management. They have introduced capabilities that were never thought possible, thus driving radical change in mainframe development and operations. With this latest release, they take another big leap to bring together automated unit testing and COBOL code coverage with the ability to get an accurate, unified view of quality metrics and milestones across platforms, thereby providing a data-driven basis for continuous improvement. These improvements and the resulting benefits will be realized in the quality of the code produced, in the testing process, and in the overall quality of the resulting services.

Compuware advertises itself as delivering “Agility without Compromise” with “simple, elegant solutions that enable a blended development ecosystem.” In our opinion, they are delivering everything that they promise. They are providing solutions that set the standard for IT DevOps for the next decade. Just what you would expect from “The Mainframe Partner for the Next 50 Years.” Kudos to them.

Friday, December 22, 2017

IBM Q Network – moving Quantum Computing from science to problem solver

By Rich Ptak

Image Courtesy of IBM, Inc.
On December 14th, IBM announced the IBM Q Network, a worldwide collaborative effort to create a connected community of individuals and organizations involved in quantum computing. Three relationship options are available to meet the varying interests and needs of potential members, which range from start-ups to F500 enterprises, universities, and research institutions, with provisions for interested individuals in engineering, science and business.
The globe-spanning cooperative network is linked by the IBM Cloud and enabled by IBM Q[1] to advance and accelerate progress in the race to realize Quantum Advantage (QA). QA occurs when quantum computing can deliver commercial value with demonstrably better, quicker and more accurate solutions than classical computing for substantive, real-life problems. What follows is an overview of, and our comments on, the IBM Q Network announcement.

IBM Q Experience – building the “Market” for Quantum 

For decades, quantum computing existed primarily as an esoteric exercise in theories and physics. Activities focused on developing quantum science theories were restricted to universities, research institutions and theoretical scientists. Quantum science was a necessary precursor to quantum computing as a technology that could be applied to problem-solving.
More recently, a combination of events (potential exhaustion of Moore’s Law, escalating problem complexity) drove investigation into alternatives to classical computing techniques. Some vendors, including IBM, made the decision to pursue commercially viable quantum computing. The race to develop quantum technology began.
In May 2016, IBM made basic quantum computers available in the IBM Cloud. This, along with other IBM contributions, such as IBM's proposal of Quantum Volume as a more functional metric of computational value than qubits, helped accelerate the evolution of quantum computing science.
More significant was when IBM became the first vendor to provide widespread, free, public access to quantum computers via IBM Q Experience[3]. This enabled a larger, diverse audience to acquire knowledge about and experience with quantum science methodologies, modeling, etc. Today, it provides free access to 5- and 16-qubit quantum computing prototypes residing in the IBM Cloud.
Additional available services, support and tools include QISKit[4]; an open source software development kit with examples tailored for specific types of problems, e.g. chemical structure modeling and simulation. These helped to advance quantum computing away from what was primarily a scientific exercise to a publicly accessible ‘sand-box’ where interested individuals, engineers, businesses and other groups could learn and experiment.
But, the overall quantum “market”, while intensely competitive, remained dispersed, unorganized and unfocused. It was difficult to dispassionately assess progress, compare machines or get a coherent sense of the state of quantum in its progress from science to technology.
With the introduction of IBM Q Network, IBM provides an organizing model that allows full flexibility for creativity and innovation while helping to focus collaborative efforts undertaken by a global community to achieve Quantum Advantage.

The IBM Q Network

IBM Q Network will be a global community of individuals and organizations acting as independent, but collaborating units focused on achieving common goals. It is designed to operate as an ecosystem of loosely linked, coordinated organizations without the constraint and difficulty of imposed overarching management. This is very similar (consciously, or not) to an organizational model first implemented by the early RAND organization, a uniquely successful incubator of innovation.
Relationships are grouped in three categories (Hubs, Partners and Members), each with specific relationship, responsibility and activity interests. See Figure 1 below. Hubs (IBM Research, Oak Ridge National Laboratory, Keio University, University of Oxford, University of Melbourne) act as regional centers for education, research, development and commercialization of quantum computing. Partners (Samsung, JPMorgan Chase & Co., Daimler) focus on a specific industry or academic field as pioneers in applying quantum computing. Members (Barclays, Honda, Materials Magic (Hitachi), NAGASE) build their own general knowledge of quantum computing while developing a strategy to become quantum ready.
All participants have unique access to the latest IBM Q quantum systems (now on a 20-qubit device, followed shortly by a 50-qubit device) via the IBM Cloud.
Figure 1 IBM Q Network - Organizational Collaborators. Image Courtesy of IBM, Inc.
Specific levels of support and involvement with IBM on projects vary by category. For example, Hubs provide access to IBM Q systems, tech support, educational and training resources, community workshops and events.
Partners have direct access to IBM Q system and work with IBM on joint training, development and other projects.
Members can access IBM Q and IBM Q Network community resources through IBM Research. Additional details about resources committed, project activities, funding, etc. are worked out with IBM and may vary based on agreements between participants and IBM. More details are available at the IBM Q Network website[5].
One more point to emphasize: the IBM Q Network is also targeted at, and provides support for, individuals wanting to share ideas, research plans, projects and proposals. It is meant to be an open network for creative innovation, collaboration and communication.

What does this all mean?

It’s our opinion that IBM Q Network marks a significant advance for quantum computing in general. To date, the quantum space has been chaotic with few standards or consistent benchmarks against which to evaluate competing offerings and claims.
The network leverages existing communications technology to create a global “commons” for sharing knowledge aimed at putting the science of quantum to work. The resulting global knowledge community resembles the Republic of Letters[6], the Europe-wide shared community of thinkers, scientists and innovators which led to an explosion in scientific development and entrepreneurial activity from the 1600s through the 1800s.
IBM’s move to build a global community, loosely coupled but with a clear set of goals and a definite, expanding support infrastructure, will be a significant advance on the road to fully commercialized quantum computing. It provides a much-needed structure for extensive collaboration, with minimal constraints, dedicated to advancing quantum technology.
IBM has invited all interested parties, including competing vendors, to participate, subject only to an interest in advancing the commercialization of quantum computing and use of the IBM Q resources. Undoubtedly, there will be contractual issues over intellectual property rights, information sharing, ownership, financial arrangements, investment, etc. These will be worked out individually with participants. IBM understands the issues; they are willing to be very flexible in setting terms and conditions.
Finally, even at its birth, the IBM Q Network carries impressive weight. IBM provided some Day 1 statistics: the IBM Q Experience, which provides access to the first quantum computers on the cloud, has over 60,000 users worldwide, located on all 7 continents (including Antarctica), and over 35 external papers have already been published. On Day 1, the network included F500 companies, research institutions in Europe, the US and Asia, as well as over 150 colleges and universities.
We encourage quantum-interested individuals and enterprises to examine the IBM Q Network. The network itself is a major effort to provide a working environment that will attract a diverse community of thought-leaders to advance quantum computing to a productive technology. It is a major gamble by IBM. But we think it will prove to be a decisive one on the road to a successful commercial quantum computing infrastructure. Look for yourself; you’ll see the benefit.

[4] IBM Quantum Information Software, go to link:
[6] Mokyr, Joel 2016. A Culture of Growth: The Origins of the Modern Economy, Princeton, N.J.: Princeton University Press