New Synergy Research report finds enterprise data center market is strong for now

Conventional wisdom would suggest that in 2019, the public cloud dominates and enterprise data centers are becoming an anachronism of a bygone era, but new data from Synergy Research finds that the enterprise data center market had a growth spurt last year.

In fact, Synergy reported that overall spending on enterprise infrastructure, which includes elements like servers, switches, routers and network security, grew 13 percent last year and represents a $125 billion business — not too shabby for a market that is supposedly on its deathbed.

Overall, these numbers show that the market is still growing, although certainly not nearly as fast as the public cloud. Synergy was kind enough to provide a separate report on the cloud market, which grew 32 percent last year to $250 billion annually.

As Synergy analyst John Dinsdale pointed out, the private data center is not the only buyer here. A good percentage of sales is likely going to public cloud providers, which are building data centers at a rapid rate these days. “In terms of applications and levels of usage, I’d characterize it more like there being a ton of growth in the overall market, but cloud is sucking up most of the growth, while enterprise or on-prem is relatively flat,” Dinsdale told TechCrunch.

Perhaps the most surprising nugget in the report is that Cisco remains the dominant vendor in this market, with 23 percent share over the last four quarters. That's true even as it tries to pivot to being more of a software and services vendor, spending billions on companies such as AppDynamics, Jasper Technologies and Duo Security in recent years. Yet the data shows it still dominates the traditional hardware sector.

Cisco remains the top vendor in the category despite losing a couple of percentage points of market share over the last year, primarily because it doesn't fare well in the server part of the market, which happens to be the biggest overall slice. The next-closest vendor, HPE, is far behind at just 11 percent across the six segments.

While these numbers show that companies are continuing to invest in new hardware, the growth is probably not sustainable long term. At AWS re:Invent in November, AWS CEO Andy Jassy pointed out that the vast majority of data remains in private data centers, but that we can expect it to begin moving more briskly to the public cloud over the next five years. And web-scale companies like Amazon often don't buy hardware off the shelf, opting instead to develop custom tools they can understand and configure at a highly granular level.

Jassy said that outside the U.S., companies are one to three years behind this trend, depending on the market, so the shift is still underway, as the much bigger growth in the public cloud numbers indicates.

AWS wants to rule the world

AWS, once a nice little side hustle for Amazon’s e-commerce business, has grown over the years into a behemoth that’s on a $27 billion run rate, one that’s still growing at around 45 percent a year. That’s a highly successful business by any measure, but as I listened to AWS executives last week at their AWS re:Invent conference in Las Vegas, I didn’t hear a group that was content to sit still and let the growth speak for itself. Instead, I heard one that wants to dominate every area of enterprise computing.

Whether it was hardware like the new Inferentia chip, Outposts (the new on-prem servers), blockchain or a base station service for satellites, if AWS saw an opportunity, they were not ceding an inch to anyone.

Last year, AWS announced an astonishing 1,400 new features, and word was that they are on pace to exceed that this year. They get a lot of credit for not resting on their laurels and continuing to innovate like a much smaller company, even as they own gobs of market share.

The feature inflation probably can’t go on forever, but for now at least they show no signs of slowing down, as the announcements came at a furious pace once again. While they will tell you that every decision they make is about meeting customer needs, it’s clear that some of these announcements were also about answering competitive pressure.

Going after competitors harder

In the past, AWS kept criticism of competitors to a minimum, perhaps giving a little jab to Oracle, but this year they seemed to ratchet it up. In their keynotes, AWS CEO Andy Jassy and Amazon CTO Werner Vogels continually flogged Oracle, a competitor in the database market but hardly a major threat as a cloud company right now.

They went right after Oracle’s market, though, with a new on-prem system called Outposts, which allows AWS customers to operate on premises and in the cloud using a single AWS control panel, or one from VMware if customers prefer. That is the kind of cloud vision Larry Ellison might have put forth, but Jassy didn’t necessarily see it as going after Oracle or anyone else. “I don’t see Outposts as a shot across the bow of anyone. If you look at what we are doing, it’s very much informed by customers,” he told reporters at a press conference last week.

AWS CEO Andy Jassy at a press conference at AWS re:Invent last week.

Yet AWS didn’t reserve its criticism just for Oracle. It also took aim at Microsoft, jabbing at Microsoft SQL Server and announcing Amazon FSx for Windows File Server, a tool specifically designed to move Microsoft files to the AWS cloud.

Google wasn’t spared either: with the launch of Inferentia and Elastic Inference, AWS put Google on notice that it wasn’t going to yield the AI market to Google’s TPU infrastructure. All of these tools, and many more, were about more than answering customer demand; they were about putting the competition on notice in every aspect of enterprise computing.

Upward growth trajectory

The cloud market is continuing to grow at a dramatic pace, and as market leader, AWS has been able to take advantage of its dominance to this point. Jassy, echoing Google’s Diane Greene and Oracle’s Larry Ellison, says the industry as a whole is still really early in terms of cloud adoption, which means there is still plenty of market share left to capture.

“I think we’re just in the early stages of enterprise and public sector adoption in the US. Outside the US I would say we are 12-36 months behind. So there are a lot of mainstream enterprises that are just now starting to plan their approach to the cloud,” Jassy said.

Patrick Moorhead, founder and principal analyst at Moor Insights & Strategy says that AWS has been using its market position to keep expanding into different areas. “AWS has the scale right now to do many things others cannot, particularly lesser players like Google Cloud Platform and Oracle Cloud. They are trying to make a point with the thousands of new products and features they bring out. This serves as a disincentive longer-term for other players, and I believe will result in a shakeout,” he told TechCrunch.

As for the frenetic pace of innovation, Moorhead believes it can’t go on forever. “To me, the question is, when do we reach a point where 95% of the needs are met, and the innovation rate isn’t required. Every market, literally every market, reaches a point where this happens, so it’s not a matter of if but when,” he said.

Certainly announcements like AWS Ground Station showed that AWS is willing to expand beyond the conventional confines of enterprise computing, even into outer space, to help companies process satellite data. This ability to think beyond traditional uses of cloud computing resources shows a level of creativity that suggests there could be other untapped markets for AWS that we haven’t yet imagined.

As AWS moves into more areas of the enterprise computing stack, whether on premises or in the cloud, they are showing their desire to dominate every aspect of the enterprise computing world. Last week they demonstrated that there is no area that they are willing to surrender to anyone.

More AWS re:Invent 2018 coverage

Amazon Elastic Inference will reduce deep learning costs by ~75%

Amazon Web Services today announced Amazon Elastic Inference, a new service that lets customers attach GPU-powered inference acceleration to any Amazon EC2 instance and reduces deep learning costs by up to 75 percent.

“What we see typically is that the average utilization of these P3 instances’ GPUs is about 10 to 30 percent, which is pretty wasteful. With Elastic Inference, you don’t have to waste all that cost and all that GPU,” AWS chief executive Andy Jassy said onstage at the AWS re:Invent conference earlier today. “[Amazon Elastic Inference] is a pretty significant game changer in being able to run inference much more cost-effectively.”

Amazon Elastic Inference will also be available for Amazon SageMaker notebook instances and endpoints, “bringing acceleration to built-in algorithms and to deep learning environments,” the company wrote in a blog post. It will support machine learning frameworks TensorFlow, Apache MXNet and ONNX.

It’s available in three sizes:

  • eia1.medium: 8 TeraFLOPs of mixed-precision performance.
  • eia1.large: 16 TeraFLOPs of mixed-precision performance.
  • eia1.xlarge: 32 TeraFLOPs of mixed-precision performance.
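As a rough illustration of where the "up to 75 percent" figure can come from, here is a back-of-the-envelope sketch. The hourly rates below are illustrative assumptions, not published AWS prices; the point is simply that a small CPU instance paired with a right-sized accelerator can undercut a dedicated GPU instance that sits mostly idle:

```python
# Back-of-the-envelope cost comparison: dedicated GPU instance vs.
# small CPU instance plus an Elastic Inference-style accelerator.
# All hourly rates are illustrative assumptions, not actual AWS pricing.

GPU_INSTANCE_PER_HR = 3.00   # hypothetical dedicated GPU (P3-class) instance
CPU_INSTANCE_PER_HR = 0.60   # hypothetical small CPU instance
ACCELERATOR_PER_HR = 0.15    # hypothetical eia1.medium-style accelerator

def monthly_cost(hourly_rate: float, hours: int = 730) -> float:
    """Cost of running continuously for one month (~730 hours)."""
    return hourly_rate * hours

gpu_only = monthly_cost(GPU_INSTANCE_PER_HR)
cpu_plus_accel = monthly_cost(CPU_INSTANCE_PER_HR + ACCELERATOR_PER_HR)
savings = 1 - cpu_plus_accel / gpu_only

print(f"GPU instance:      ${gpu_only:,.2f}/month")
print(f"CPU + accelerator: ${cpu_plus_accel:,.2f}/month")
print(f"Savings:           {savings:.0%}")  # 75% with these assumed rates
```

With these assumed numbers the pairing comes out 75 percent cheaper; in practice the saving depends on how badly the GPU would have been underutilized, which is exactly the 10-to-30-percent utilization problem Jassy described.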

AWS launches Security Hub to help customers manage security & compliance

Amazon Web Services (AWS) unveiled its latest security updates for its cloud platform today at AWS re:Invent, the company’s annual cloud computing conference.

AWS Security Hub is a new place for businesses to centrally manage security and compliance across their AWS environments, said AWS chief executive officer Andy Jassy. The service will help AWS users derive insights from attack patterns and techniques so they can take action more quickly.

“This is going to pretty radically change how easy it is to look at what’s happening security-wise across … AWS,” Jassy said. “Whether you’re using AWS security services like Inspector for vulnerability scanning or GuardDuty for network intrusion or Macie for anomalous data patterns or whether you’re using a very large number of third-party software security services in our ecosystem.”

AWS has signed up a number of its partners for the initial rollout, including CrowdStrike, McAfee, Symantec and Tenable.
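For a sense of what "centrally managing" findings might look like programmatically, here is a minimal sketch using boto3, the AWS SDK for Python. The filter shape follows the Security Hub GetFindings API, but the severity threshold, region and function names are illustrative choices, not anything AWS announced:

```python
def build_filters(severity: str = "HIGH") -> dict:
    """Security Hub GetFindings filter for active findings at a given severity.

    The field names (SeverityLabel, RecordState) follow the GetFindings API;
    the chosen values are illustrative.
    """
    return {
        "SeverityLabel": [{"Value": severity, "Comparison": "EQUALS"}],
        "RecordState": [{"Value": "ACTIVE", "Comparison": "EQUALS"}],
    }

def high_severity_findings(region: str = "us-east-1") -> list:
    """Page through matching findings (requires AWS credentials and boto3)."""
    import boto3  # AWS SDK for Python

    client = boto3.client("securityhub", region_name=region)
    findings = []
    for page in client.get_paginator("get_findings").paginate(Filters=build_filters()):
        findings.extend(page["Findings"])
    return findings
```

Because Security Hub aggregates findings from Inspector, GuardDuty, Macie and third-party tools into one format, a single query like this would cover all of them at once.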

VMware pulls AWS’s Relational Database Service into the data center

Here’s some unusual news: AWS, Amazon’s cloud computing arm, today announced that it plans to bring its Relational Database Service (RDS) to VMware, no matter whether that’s VMware Cloud on AWS or a privately hosted VMware deployment in a corporate data center.

While some of AWS’s competitors have long focused on these kinds of hybrid cloud deployments, AWS never really put the same kind of emphasis on this. Clearly, though, that’s starting to change — maybe in part because Microsoft and others are doing quite well in this space.

“Managing the administrative and operational muck of databases is hard work, error-prone and resource intensive,” said AWS CEO Andy Jassy. “It’s why hundreds of thousands of customers trust Amazon RDS to manage their databases at scale. We’re excited to bring this same operationally battle-tested service to VMware customers’ on-premises and hybrid environments, which will not only make database management much easier for enterprises, but also make it simpler for these databases to transition to the cloud.”

With Amazon RDS on VMware, enterprises will be able to use AWS’s technology to run and manage Microsoft SQL Server, Oracle, PostgreSQL, MySQL and MariaDB databases in their own data centers. The idea here, AWS says, is to make it easy for enterprises to set up and manage their databases wherever they want to host their data — and to then migrate it to AWS when they choose to do so.
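Since AWS says the experience will match RDS in the cloud, the familiar RDS API shape presumably carries over to on-premises instances. A hedged sketch with boto3 (the identifier, instance class and credentials below are placeholder assumptions, not values from the announcement):

```python
def rds_instance_params(engine: str = "mysql") -> dict:
    """Parameters for an RDS CreateDBInstance call.

    Field names follow the RDS API; the specific values are hypothetical
    placeholders for illustration only.
    """
    return {
        "DBInstanceIdentifier": "example-db",  # hypothetical name
        "DBInstanceClass": "db.t3.medium",     # hypothetical instance class
        "Engine": engine,                      # mysql, postgres, mariadb, etc.
        "AllocatedStorage": 20,                # storage in GiB
        "MasterUsername": "admin",
        "MasterUserPassword": "change-me",     # placeholder credential
    }

def create_instance() -> dict:
    """Issue the CreateDBInstance call (requires AWS credentials and boto3)."""
    import boto3  # AWS SDK for Python

    return boto3.client("rds").create_db_instance(**rds_instance_params())
```

The pitch, in effect, is that this same call-and-manage workflow would apply whether the database lands in an AWS region or in a VMware cluster in your own data center.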

This new service is currently in private preview, so we don’t know all that much about how this will work in practice or what it will cost. AWS promises, however, that the experience will pretty much be the same as in the cloud and that RDS on VMware will handle all the updates and patches automatically.

Today’s announcement comes about two years after the launch of VMware Cloud on AWS, which was pretty much the reverse of today’s news: with VMware Cloud on AWS, enterprises can take their existing VMware deployments and move them to AWS.