Pages

Friday, April 25, 2014

IBM Channel Perceptions

If perception really is reality, then if you want to be seen in the most desirable light you’d better take great care when focusing a spotlight on yourself.

Hence, the advertising industry.

Channel partners provide IBM with a large portion of its revenues these days, and as the company increasingly looks to midmarket and, eventually, smaller-than-midmarket sales, the channel is the only way it will ever have enough feet on the ground to support the revenue stream IBM requires. The good news is that many of these partners find IBM to be the jewel in the crown of their vendor relationships: compensation is quick, supportive collateral is often quite good, in several areas IBM’s products are top-notch, and channel conflicts have become increasingly rare.

That’s not the whole story, however. IBM is asking a lot from the channel these days, and in at least two cases there may be cause for concern. Here’s what I saw at PWLC (IBM’s PartnerWorld Leadership Conference), viewed in the light of what I subsequently heard both from IBM and from other vendors’ channel partners.

IBM’s channel ecosystem was presented with two major challenges in Q1. First was IBM’s PWLC announcement that partners should begin shifting their software business from product sales to selling software as a service (SaaS); new business models can be a challenge. Second was the sale of IBM’s System x (x86) server business to Lenovo, which caused significant consternation amongst resellers and cropped up in many conversations I had with partners at PWLC. These issues, if not properly thought through, represent significant discontinuities for the channel.
Here’s what I think:
  • Discontinuity #1: shifting software sales to a SaaS model. The challenge here has relatively little to do with salesmanship and a great deal to do with compensation. While there are lots of good reasons for moving to a services-based revenue flow – not the least of which is that this is increasingly how IT wants to consume things – the shift from working toward the “big hit” to getting paid on subscription sales is nontrivial, and is likely to be as challenging for many partners as it will be for IBM’s internal sales team.

A lot of guidance, and perhaps some direct handholding, will be necessary, particularly in the transitional quarters when partners who have not yet begun to think along these lines start dipping a toe into the service-provider waters. There is plenty of evidence that service consumption makes sense when it follows a utility model, and IT managers ought to be able to make a good case to their CFOs for shifting from a Capex model to one based on Opex. But moving a sales force to an “annuity”-based comp plan has so little history amongst IT vendors that it will make even a good reseller fidget if the company lacks implementation guidelines.
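The compensation gap behind that transition can be made concrete with a toy model. All figures below are invented for illustration – they are not IBM or partner terms – and the point is only the shape of the cash-flow problem a comp plan has to bridge:

```python
# Illustrative sketch (hypothetical numbers): compare a reseller's cumulative
# revenue from a one-time "big hit" license sale against the same deal sold
# as a monthly subscription. A $120,000 perpetual license vs. a $4,000/month
# subscription -- figures chosen purely for illustration.

PERPETUAL_DEAL = 120_000        # paid up front, month 0
MONTHLY_SUBSCRIPTION = 4_000    # recognized month by month

def cumulative_subscription(months: int) -> int:
    """Cumulative subscription revenue after a given number of months."""
    return MONTHLY_SUBSCRIPTION * months

# Find the month at which the annuity stream catches up with the big hit.
breakeven = next(m for m in range(1, 121)
                 if cumulative_subscription(m) >= PERPETUAL_DEAL)

print(f"Subscription matches the up-front deal after {breakeven} months")
# With these assumed figures, the reseller waits 30 months (2.5 years) to see
# the same revenue -- the cash-flow gap a transitional comp plan must bridge.
```

Under these assumptions the annuity takes two and a half years to match the single up-front payment, which is exactly why the transitional quarters are where guidance is most needed.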

  • Discontinuity #2: when System x goes over to Lenovo, where do my marketing dollars go? Channel partners expressed concern to me about the following scenario: “I used to buy $2 million worth of kit from IBM, but half of that was System x. Now what used to be a $2 million buy from IBM will be split between IBM and Lenovo. Will my marketing dollars come in at a lower rate because of that? And will my overall margins be affected because I am now selling only half of what I used to sell for IBM?”

It’s an old joke (but a truism nonetheless) that resellers are interested in three things – margin, margin, and margin. But these days margin doesn’t just mean the margin on a product. Take away marketing dollars and the VAR is left with two bad choices: either its total cost of goods sold increases, or it starts cutting back on the marketing effort.
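A rough sketch makes the partner’s arithmetic concrete. The accrual rates and purchase volumes below are hypothetical – not actual IBM or Lenovo co-op terms – but they show how a split buy can shrink marketing funds even when total purchases stay flat:

```python
# Hypothetical sketch of the partner's worry: co-op marketing funds often
# accrue as a percentage of purchases from a vendor. All rates and volumes
# here are invented for illustration, not actual IBM or Lenovo terms.

IBM_ACCRUAL = 0.03      # assumed co-op accrual rate on IBM purchases
LENOVO_ACCRUAL = 0.02   # assumed (lower) rate at the acquiring vendor

before = 2_000_000 * IBM_ACCRUAL                               # one $2M vendor
after = 1_000_000 * IBM_ACCRUAL + 1_000_000 * LENOVO_ACCRUAL   # split buy

print(f"Co-op funds before the split: ${before:,.0f}")
print(f"Co-op funds after the split:  ${after:,.0f}")
# Same $2M of kit, but accrued marketing dollars fall from $60,000 to
# $50,000 under these assumed rates -- margin pressure in another guise.
```

The same $2 million of purchases yields fewer marketing dollars the moment any of it accrues at a lower rate, which is precisely the scenario the partners described.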

The shift to selling services certainly appears to be on the right side of history, reflecting what many IT managers already feel is the consumption model of choice for the future. Handing System x off to Lenovo also seems to make sense, as it was a relatively low-margin business; at a minimum it is consistent with IBM’s strategy of moving from product-based to services-based businesses.

Even though the idea rings true, a question remains: will IBM make it possible for VARs, SIs and other partners to follow it down this new path?
None of this will happen in a vacuum. We know from Meg Whitman that HP will be taking aim at the IBM channel, and it’s a pretty good bet that Oracle will do likewise. Look for both to emphasize that they are not “retreating from the product business,” and to offer the System x sale as evidence for their interpretation of what IBM is doing.
Whether based on reality or on FUD, the channel has cause for concern.
IBM likely has at least six months to prepare for the System x handover, but the shift-to-services issue is front and center today. IBM is rolling out a sales training program to help its resellers prepare for the new selling model, and expects to hold as many as 400 sales training sessions worldwide by the end of the year. The sessions will be free, and will provide partners with 30-, 60- and 90-day milestones.
IBM has turned the spotlight on itself and its partners. What is interesting now is the work it does behind the scenes to prep for the new route to market.

#channel, #cloud, #distribution, #HP, #IBM, #Lenovo, #Oracle, #partners, #resellers, #SaaS, #sales, #SI, #VARs
Wednesday, April 23, 2014

POWER8 – IBM’s billion dollar bet on servers

Between the mainframe’s 50th birthday and the recent Power Systems announcements, IBM’s Systems & Technology Group has had a very busy couple of weeks. We expect this to continue with upcoming Watson announcements as IBM continues its significant investment in these platforms. “IBM as a Service” remains a major IBM marketing theme, but delivery of that service continues to be well-grounded in industry-leading infrastructure.

Today, though, IBM made major announcements about its plans for the Power Systems family and the new POWER8 chip – the basis for those efforts. You can read what we think of the announcement here: http://www.ptakassociates.com/content/.

Wednesday, April 16, 2014

IBM – The Mainframe at 50, still a leader

It is significant when any product survives 50 years of commercial life; to do so while remaining both technologically relevant and groundbreaking, amidst decades of predictions of its imminent demise, is nearly, but not quite, unbelievable. You can see a replay of the Mainframe50 event, along with mainframe details, as IBM celebrated the System/360 announcement of April 7, 1964. At the event, IBM described the mainframe’s early and ongoing impact on business, industry and society as it has driven change and permitted breakthroughs in capabilities and services. IBM discussed and introduced participants in its decade-long educational project and the Master the Mainframe competition for teaching, cultivating and recruiting “millennials”, the next generation of mainframers. Finally, IBM announced new mainframe products and solutions, including new utility-based pricing that reduces charges by lowering reported CPU utilization (by up to 65%) for mobile transactions.
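One way to read that mobile-pricing announcement is as a discount applied to the mobile-driven share of the reported peak utilization that drives software charges. The sketch below uses invented workload numbers alongside the quoted 65% figure purely for illustration; actual IBM sub-capacity pricing is considerably more involved:

```python
# Simplified sketch of the mobile-pricing idea: mainframe software charges
# key off reported peak CPU utilization, so discounting the portion driven
# by mobile transactions lowers the bill. The workload split and the way
# the 65% figure is applied here are illustrative assumptions only.

total_peak_msus = 1_000     # assumed reported monthly peak (MSUs)
mobile_share = 0.40         # assumed fraction driven by mobile transactions
MOBILE_DISCOUNT = 0.65      # discount figure quoted in the announcement

mobile_msus = total_peak_msus * mobile_share
adjusted_peak = total_peak_msus - mobile_msus * MOBILE_DISCOUNT

print(f"Reported peak drops from {total_peak_msus:.0f} to {adjusted_peak:.0f} MSUs")
# Under these assumptions, a 1,000 MSU peak reports as 740 MSUs: a 26% cut
# in the figure that drives charges, with no change to the workload itself.
```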
IBM is introducing new mainframe solutions and capabilities that bring analytics to where the data resides, realizing significant savings in time-to-value while increasing the effectiveness of analysis. It introduced a tightly integrated, workload-optimized System z system for business analytics. Building on past success with Linux, for example, is the first commercial implementation of Hadoop for Linux on System z – processing 2 billion records in 2 hours using 2 IFLs (Integrated Facility for Linux specialty processors). IBM is also speeding data and file transfer rates with the high-performance flash enclosure on the IBM DS8870; the result is vastly improved response times when accessing and analyzing data. For MSPs and ESPs, IBM announced a specially priced IBM Enterprise Cloud System – a factory-integrated Linux cloud environment packaged with a fully automated cloud management suite, including support and financing.
The mainframe architecture reached 50 by surviving market fads, setbacks and radical changes in styles of computer processing and access. It has adapted and improved by driving change and delivering high-reliability, high-volume computing without failure for years on end. It still does. Today, the mainframe is the backend supporting a multiplicity of workloads ranging from global gaming to mobile apps for financial services, logistics and planning. Mainframe customers from around the world told their stories of the mainframe’s contribution and impact. At the Mainframe50 Engines of Progress website, IBM customers such as Africa’s Business Connexion, the UK Met Office, Swiss Re, Walmart, VISA and others make a compelling case for mainframe power and capability.
Mainframe innovations have been adopted and have influenced the styles of computing (often without acknowledgement), infrastructure virtualization being just one recent example. It is the workhorse for cloud services, including analytics and transaction processing. It is bringing unprecedented efficiency, reduced costs and agility for enterprise operations taking place on a massive scale.
Skilled staff are necessary to support, maintain and use the mainframe. Along with stories of the end of the mainframe, a staple among IT journalists and pundits has been stories about an aging, disappearing mainframe workforce and a lack of new talent knowledgeable about the intricacies of mainframe development, operations, management and administration. IBM, along with other mainframers, pursued two paths to address these concerns. One path focused on automated solutions for optimized operations and management of the mainframe. We’ve documented a number of these, available from multiple vendors, in blogs and commentaries on our website (www.ptakassociates.com).
The second approach involved a program of interaction and education with universities worldwide. Started a decade ago, programs are now in place at high schools, colleges and universities in over 67 countries. A comprehensive program of support and infrastructure has helped professors and institutions build programs and attract students to develop mainframe skills. Shortly after its inauguration, the program expanded to include the Master the Mainframe competition – a focused, time-intensive worldwide challenge in which students develop and submit a mainframe application. Since 2005, over 68,000 students from 30 countries on 6 continents have competed. In 2013, in the U.S. and Canada alone, over 2,000 high school students competed. Having met and spoken with several of the more successful contestants, we can attest that the program provides significant benefits to everyone involved.
At 50, the mainframe remains the benchmark for reliability, security and stability even as infrastructure technology evolves and improves. The mainframe isn’t right for every task or workload, just as a jumbo jet or a bus isn’t right for every transportation need. But for suitable workloads and tasks, it is unbeatable. IBM’s Steve Mills, SVP and Group Executive for Software and Systems, reports that IBM continues to attract 40 to 60 brand-new (to the mainframe) customers each year. This may not sound like much, but added to the existing base it drives a significant and healthy portion of the business. We don’t expect to be at the 75th anniversary, but we are willing to bet the mainframe story will be even more interesting then.

BMC ADDM Goes Big


New technology trends are changing the scale at which IT operates. As enterprises move forward with their initiatives in Cloud, mobile, Big Data and the Internet of Things, managing their ever-expanding and dynamic environments becomes a bigger challenge. The challenges and demands are many, including the exploding number of “things” that must be managed, the accelerating velocity of changes due to the frequency of updates and new applications, widely dispersed locations and the demand for timely IT responsiveness to business needs.

Most of the attention for Cloud, mobile, Big Data and the Internet of Things focuses on the “latest and greatest” innovations and applications. However, the success of these new technologies hinges on the ability to keep them up and performing as expected. And as IT infrastructures increase in scale, IT management systems must also scale and adapt to keep up. IT Operations requires fresh, accurate data to optimize its effectiveness, decisions and efforts in these increasingly dynamic environments. Yet many IT staffs have had to make tradeoffs in how frequently they collect discovery data in order to balance the enterprise’s demands on resources, cost and time. The latest release of BMC Atrium Discovery and Dependency Mapping (ADDM), version 10, delivers capabilities to help IT address these challenges.
Big Discovery
BMC calls ADDM’s new, increased scalability Big Discovery. ADDM is deployed as a virtual appliance, which can scale horizontally by clustering as many virtual appliances as needed for discovery performance and speed. For ease of management, the cluster is managed collectively as a single appliance, and it can be configured for fault tolerance to improve reliability.

In conjunction with ADDM’s scalable NoSQL graph database architecture, the new clustering capability enables ADDM to scale its discovery and increase scan speed – an improvement some customers have reported at as much as 50% with only two cluster members.
This faster, scalable discovery enables IT staffs to perform discoveries more frequently and more broadly, ensuring they make operational decisions on fresh, accurate data. The performance enhancements let IT capture more data in shorter scan windows, keeping up with the needs of dynamic cloud and mobile environments. With these new capabilities, customers can minimize the compromises in infrastructure data accuracy previously forced by limitations of time, resources and cost.
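A back-of-the-envelope calculation shows why that speedup matters for scan windows. The endpoint count and per-endpoint scan time below are invented for illustration; only the roughly 50% two-node speedup comes from the customer reports above:

```python
# Back-of-the-envelope sketch of why horizontal scaling shortens scan
# windows. The ~50% speedup with two cluster members comes from customer
# reports; endpoint counts and per-endpoint scan cost are assumptions.

ENDPOINTS = 20_000
SECONDS_PER_ENDPOINT = 2.0           # assumed single-appliance scan cost

single_hours = ENDPOINTS * SECONDS_PER_ENDPOINT / 3600
two_node_hours = single_hours / 1.5  # ~50% faster, per the quoted reports

print(f"Single appliance: {single_hours:.1f}h, two-node cluster: {two_node_hours:.1f}h")
# Roughly 11.1h shrinks to 7.4h: the same discovery fits a markedly shorter
# window, so scans can run more often and the data stays fresher.
```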

Time to Value
In ADDM release 10, BMC also focused on time to value by simplifying deployment. Packaged as a virtual appliance and coupled with BMC’s out-of-the-box open pattern library, ADDM is easy to deploy, and the new cluster technology stays with that philosophy. During the beta, testers got clusters deployed and running in just a few minutes; they reported using only the UI to create a Big Discovery cluster, without seeking additional instruction.

ADDM release 10 includes other enhancements, such as a new Knowledge Management user interface, a new user interface for faster upgrades, and more.

 
Response from a beta test user:
“I can only say great stuff!   Testing in lab with scanning cluster and consolidation appliance, everything working fine so far, good performance, clustering gave me nearly 50% more speed.”
 
Roland Pocek
NTT Data
 
The Final Word
The latest release of BMC’s ADDM, with its increased speed and scalability, focuses on helping IT keep its infrastructure management data as accurate as possible. At first glance that may not be as alluring as the latest technology trends like Cloud or mobile, but the accuracy and timeliness of infrastructure data is foundational to ensuring that enterprise applications keep running and performing optimally.

Initial BMC customer reaction to ADDM release 10 seems to be positive. Although the first reactions mention the speed and operational aspects of the new capabilities, the real benefit is that IT can place more trust in the data it uses to make decisions and do its job effectively and efficiently.

Monday, April 7, 2014

HP's Journey Forward

At the recent HP industry analyst conference, a theme woven through Meg Whitman’s presentation was that HP is on a journey. She repeated several times during her speech that they’ve made progress but still have a lot of work to do. To CEO Whitman’s credit, early in her tenure at HP she clearly set public expectations for HP’s turnaround by stating that it would be a 5-year effort. This bought the company the time and space to work on the turnaround while temporarily keeping investors and media at bay.

The strategies and progress-to-date shared by HP at the analyst conference confirmed that HP’s turnaround plan is still “in progress”. We saw evidence of both the progress and the work that HP still has to do. HP has made progress since Ms. Whitman took the helm in Fall 2011. Today, HP has settled down from the very public “dramas” and strategy shifts that unfolded before Whitman took over. Led by Whitman, the restructuring and refocusing efforts have stabilized HP, putting it in a position to move forward. Getting its financial house in order enables HP to invest in R&D and strategic acquisitions; Whitman stated publicly that HP is already doing the former and intends to pursue the latter. Both are necessary investments for HP’s future growth and competitiveness.

However, as challenging as it was for Whitman and her team to get HP to this point, the next steps will be just as hard, if not harder to achieve successfully. Several of the challenges HP faces are part of the turnaround effort, while others are due to market and technology trends. Because of space limitations, a full discussion of all HP’s challenges isn’t possible but here are two of them.  

Growth:

Investors and the market are relentless in demanding continuous growth from public companies. Now that the company is stabilized, the demands for growth cannot be far behind. HP’s size and breadth of products make this a complex issue. However, for discussion’s sake let’s look at one area where HP will be challenged – its x86 server line. 

The challenge is and will be delivering revenue and profit growth from its x86 line, despite the downward pressures on profit margins and increased competition. This challenge is not unique to HP; it is faced by all U.S. server companies. IBM’s solution was to get out of the business by selling its x86 server line to Lenovo. HP will certainly face fierce competition from Lenovo and other low-cost Chinese server manufacturers, as well as white-box servers. Downward price and margin pressures will continue and possibly intensify.

However, HP Moonshot’s lower-power, lower-cost servers could be one of HP’s answers to this challenge. Moonshot is an example of HP using innovation to change the economics and form factor of x86 servers. But HP must improve Moonshot’s traction in the market by more effectively communicating its value to customers.

Technology Trends: Cloud, Mobile, Security, Big Data

Two years ago, when HP began its turnaround journey with Meg Whitman, several technology trends were on the rise. Out of necessity and practicality, HP scaled back its acquisition activities during this period. In those same two years, however, HP’s competitors were, and continue to be, very active with acquisitions to build up their Cloud, Mobile, Security and Big Data capabilities.

With its financial footing firmly re-established, Ms. Whitman indicated that HP will be engaging in acquisition activity once again. HP’s temporary hiatus from acquisitions could mean that HP is now playing “catch up” in these trending areas, unless HP Labs has been working overtime developing new Cloud, Mobile, Security and Big Data capabilities. It’s not clear which of these scenarios HP finds itself in today – on par or behind. The good news for HP is that most of the large vendors’ Cloud, Mobile, Security and Big Data initiatives are still “in progress” and the end results are still to be determined.

Moving Forward

HP is in a much better position than it was two years ago - finding itself on firmer footing. Yes, HP has made progress and is moving forward. But the progress must continue as it heads into the next phases of its turnaround.

The question is whether HP can effectively transcend product and organizational structures to build cross-HP initiatives, strategies and solutions that address customer needs. During her address to the industry analysts, Meg Whitman mentioned that they are shifting the company from a product and technology focus to a solutions focus. Although this is easier said than done[1], and the objective is no easy slam dunk, it appears that HP is headed in the right direction.

In any case, it is interesting to watch HP and its competitors building their new initiatives. But the ultimate judges of the success of HP's efforts are and will be their customers.




[1] Over the years, many vendors have said and attempted this product to solution transformation without achieving it.

Wednesday, April 2, 2014

HP talks to the Analysts

By Rich Ptak



HP held its annual analyst event, following earlier similarly-focused events by IBM at PULSE 2014 and CA Technologies in NYC (blogs available about IBM[1] and CA Technologies[2]). HP event details were under non-disclosure, so we’ll focus on broad outlines and our impressions.

All three vendors see Cloud, Big Data/Analytics, Mobility and Security technologies as key influences on IT and business operations. Each has a slightly different view of how the effect is made manifest, identifying them as enablers (CA), drivers (HP) or a combination of both (IBM) of a fundamental transformation in the way things get done.

HP CEO Meg Whitman gave a keynote speech documenting HP’s progress and discussing strategy and plans going forward. HP executives then gave an overview of their activities and plans for the coming year. During breakout sessions, each business group provided details on products and strategies. Meg and her team have accomplished a lot to get HP back on track; they have restored confidence, coherence, focus and enthusiasm at HP.

Meg quoted key success measurements, including net debt reduced to $0 and free cash flow of $1.7B as of the last fiscal quarter. HP stock trades at around $32/share (3/24/14), up from just over $10/share when Meg joined. Meg visited customers, finding them “incredibly loyal” and supportive of HP “winning in the market”. No surprise, but it is reassuring that, eighteen months into Meg’s 5-year plan, things appear to be working.

Every serious IT solution provider recognizes that disruptive transformation is taking place. Waves of change from Cloud, Mobility, Big Data (and Analytics) and Security underlie an upheaval in IT and business operational models akin to what occurred during the moves from mainframe to distributed servers/PCs to internet apps and web-based services. It is an inflection point impacting business and IT, fundamentally transforming everything from business models to operational strategies. IT vendors must help customers deal with that transformation.

HP has identified a New Style of business powering a: “New Style of IT (that) promises simplicity, greater agility, speed and lower costs…customers are looking for help from trusted advisors to understand how they navigate this brave new world…they need comprehensive solutions that solve their toughest business problems, not just a set of disparate IT assets.” 

In describing how to do this, HP focused on the details and prowess of its portfolio. It provided an abundance of technical specifications about a very nice collection of well-thought-out tools, and had some customers briefly describe their success with HP products. While that is interesting and important, most customers care more about the specifics of ‘navigation’ than about ‘speeds ’n’ feeds’. HP failed to discuss how it helps customers prioritize, plan for, acquire, implement and use the New IT.

HP clearly understands significant changes are occurring in decision making. They gave a good description of the new “Millennial” consumer/decision-maker. However, they gave no indication of how they will attract them. Customers are left to figure out for themselves what to do with HP’s tools. Without programs to attract those millennials, HP suffers in comparison to its competition.

For example, for one vendor, Next Generation IT means extending IT’s impact into every business function. Each product release has innovation to achieve that. Another vendor recognized customers got more benefit quicker as product familiarity increased. They focus on providing easier, broader access to solution suites, including free on-line trials, ‘try and buy’, increased customer collaboration in design and development[3], etc.
We did enjoy the demo area. Project Moonshot is HP’s high-density, high-efficiency, low-footprint rack-sized appliance for managing and delivering multitudes of software-defined servers. Our interest was also piqued by HP HAVEn, a Big Data management and analysis platform that leverages other products (Vertica, Autonomy) to collect, index and query data in multiple forms and formats.

This is clearly a new HP, however….

HP is clearly improving. Its products are technologically sound, some industry-leading. The product details and long-range R&D efforts were interesting. If we weren’t in the midst of an inflection point amid an uncertain economy, this approach might have been adequate. Unfortunately, as things stand today, HP’s presentations fell short of what is needed. Customers need guidance and structure on how to maximize their return from technology, and HP revealed no plans to provide it.

HP sees itself as an integrated company, but did not present itself as one. Products were covered in isolated silos; neither business nor IT operates that way today. HP’s differentiation from competitors was never made clear. HP failed to say how it is uniquely able to help customers be more effective at using HP solutions to solve their problems.

IT vendors are notoriously inept at marketing. The consumerization of IT was supposed to change that. Demonstrating how a vendor is making it easier for customers to continually get more value from IT is more persuasive than chest-beating. HP has a story for solution-focused customers interested in immediate benefit; it just needs to tell it. Until then, HP will get less attention than it deserves.