How open source software took over the world

Mike Volpi is a general partner at Index Ventures. Before co-founding the firm’s San Francisco office with Danny Rimer, Volpi served as the chief strategy officer at Cisco Systems.

It was just five years ago that there was an ample dose of skepticism from investors about the viability of open source as a business model. The common thesis was that Red Hat was a snowflake and that no other open source company would be significant in the software universe.

Fast forward to today and we’ve witnessed the growing excitement in the space: Red Hat is being acquired by IBM for $34 billion (roughly 3x its market cap from 2014); MuleSoft was acquired after going public for $6.5 billion; MongoDB is now worth north of $4 billion; Elastic’s IPO now values the company at $6 billion; and, through the merger of Cloudera and Hortonworks, a new company with a market cap north of $4 billion will emerge. In addition, there’s a growing cohort of impressive OSS companies working their way through the growth stages of their evolution: Confluent, HashiCorp, Databricks, Kong, Cockroach Labs and many others. Given the relative multiples that Wall Street and private investors are assigning to these open source companies, it seems pretty clear that something special is happening.

So, why did this movement that once represented the bleeding edge of software become the hot place to be? There are a number of fundamental changes that have advanced open source businesses and their prospects in the market.


From Open Source to Open Core to SaaS

The original open source projects were not really businesses; they were revolutions against the unfair profits that closed-source software companies were reaping. Microsoft, Oracle, SAP and others were extracting monopoly-like “rents” for software that the top developers of the time didn’t believe was world class. So, beginning with the most broadly used components of software – operating systems and databases – progressive developers collaborated, often asynchronously, to author great pieces of software. Not only could everyone see the software in the open but, through a loosely knit governance model, they added to, improved and enhanced it.

The software was originally created by and for developers, which meant that at first it wasn’t the most user-friendly. But it was performant, robust and flexible. These merits gradually percolated across the software world and, over a decade, Linux became the second most popular OS for servers (next to Windows); MySQL mirrored that feat by eating away at Oracle’s dominance.

The first entrepreneurial ventures attempted to capitalize on this adoption by offering “enterprise-grade” support subscriptions for these software distributions. Red Hat emerged as the winner of the Linux race, and MySQL (the company) won in databases. These businesses had some obvious limitations – it was harder to monetize software with just support services – but the market size for operating systems and databases was so large that, in spite of more challenged business models, sizeable companies could be built.

The successful adoption of Linux and MySQL laid the foundation for the second generation of open source companies – the poster children of this generation were Cloudera and Hortonworks. These open source projects and businesses were fundamentally different from the first generation on two dimensions. First, the software was principally developed within an existing company and not by a broad, unaffiliated community (in the case of Hadoop, the software took shape within Yahoo!). Second, these businesses were based on the model that only part of the software in the project was licensed for free, so they could charge customers for the rest under a commercial license. The commercial components were specifically built for enterprise production use and thus easier to monetize. These companies, therefore, had the ability to capture more revenue even if the market for their product didn’t have quite as much appeal as operating systems and databases.

However, there were downsides to this second generation model of open source business. The first was that no company singularly held ‘moral authority’ over the software – so the contenders competed for profits by offering ever-larger parts of their software for free. Second, these companies often balkanized the evolution of the software in an attempt to differentiate themselves. To make matters more difficult, these businesses were not built with a cloud service in mind. Cloud providers were therefore able to use the open source software to create SaaS businesses on top of the same software base. Amazon’s EMR is a great example of this.

The latest evolution came when entrepreneurial developers grasped the business model challenges inherent in the first two generations – Gen 1 and Gen 2 – of open source companies, and evolved their projects with two important elements. The first is that the open source software is now developed largely within the confines of businesses; often, more than 90% of the lines of code in these projects are written by employees of the company that commercialized the software. Second, these businesses offer their own software as a cloud service from very early on. In a sense, these are open core / cloud service hybrid businesses with multiple pathways to monetize their product. By offering the products as SaaS, these businesses can interweave open source software with commercial software so customers no longer have to worry about which license they should be taking. Companies like Elastic, Mongo and Confluent, with services like Elastic Cloud, Confluent Cloud and MongoDB Atlas, are examples of this Gen 3. The implication of this evolution is that open source software companies now have the opportunity to become the dominant business model for software infrastructure.

The Role of the Community

While the products of these Gen 3 companies are definitely more tightly controlled by the host companies, the open source community still plays a pivotal role in the creation and development of the open source projects. For one, the community still discovers the most innovative and relevant projects. Its members star the projects on GitHub, download the software in order to try it, and evangelize what they perceive to be the better projects so that others can benefit from great software. Much like how a good blog post or a tweet spreads virally, great open source software leverages network effects. It is the community that is the source of promotion for that virality.

The community also ends up effectively being the “product manager” for these projects. It asks for enhancements and improvements; it points out the shortcomings of the software. The feature requests are not in a product requirements document, but on GitHub, in comment threads and on Hacker News. And, if an open source project diligently responds to the community, it will shape itself to the features and capabilities that developers want.

The community also acts as the QA department for open source software. It will identify bugs and shortcomings in the software; test 0.x versions diligently; and give the companies feedback on what is working and what is not. The community will also reward great software with positive feedback, which will encourage broader use.

What has changed though, is that the community is not as involved as it used to be in the actual coding of the software projects. While that is a drawback relative to Gen 1 and Gen 2 companies, it is also one of the inevitable realities of the evolving business model.

Linus Torvalds was the designer of the open-source operating system Linux.

Rise of the Developer

It is also important to recognize the increasing importance of the developer to these open source projects. The traditional go-to-market model of closed source software targeted IT as the purchasing center. While IT still plays a role, the real customers of open source are the developers, who often discover the software, then download and integrate it into the prototype versions of the projects that they are working on. Once “infected” by open source software, these projects work their way through the development cycles of organizations: from design, to prototyping, to development, to integration and testing, to staging, and finally to production. By the time the open source software gets to production it is rarely, if ever, displaced. Fundamentally, the software is never “sold”; it is adopted by developers who appreciate the software more because they can see it and use it themselves, rather than being subject to it based on executive decisions.

In other words, open source software spreads through the true experts, and makes the selection process much more grassroots than it has ever been historically. The developers vote with their feet. This is in stark contrast to how software has traditionally been sold.

Virtues of the Open Source Business Model

The resulting business model of an open source company looks quite different from that of a traditional software business. First of all, the revenue line is different. Side-by-side, a closed source software company will generally be able to charge more per unit than an open source company; even today, customers have some resistance to paying a high price per unit for software that is theoretically “free.” But even though open source software earns less per unit, it makes up for it in total market size by leveraging the elasticity of the market: when something is cheaper, more people buy it. That’s why open source companies see such massive and rapid adoption when they achieve product-market fit.
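To put numbers on that elasticity argument (a textbook illustration with hypothetical figures, not data from the article): write revenue as a function of price and ask what happens when price falls.

```latex
% Revenue R at price p, with q(p) the quantity demanded and
% \varepsilon the price elasticity of demand:
R(p) = p\,q(p), \qquad
\varepsilon = -\frac{dq}{dp}\cdot\frac{p}{q}, \qquad
\frac{dR}{dp} = q(p)\,(1 - \varepsilon)
```

When the elasticity exceeds 1, dR/dp is negative, so cutting the price raises total revenue: dropping a unit price from $100 to $25 while unit volume grows from 1,000 to 10,000 takes revenue from $100,000 to $250,000.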

Another great advantage of open source companies is their far more efficient and viral go-to-market motion. The first and most obvious benefit is that a user is already a “customer” before she even pays for the software. Because so much of the initial adoption of open source software comes from developers organically downloading and using it, the companies themselves can often bypass both the marketing pitch and the proof-of-concept stage of the sales cycle. The sales pitch is more along the lines of, “you already use 500 instances of our software in your environment; wouldn’t you like to upgrade to the enterprise edition and get these additional features?” This translates to much shorter sales cycles, the need for far fewer sales engineers per account executive, and much quicker payback periods on the cost of selling. In fact, in an ideal situation, open source companies can operate with favorable account executive to systems engineer ratios and can go from sales qualified lead (SQL) to closed sale within one quarter.

This virality allows open source software businesses to be far more efficient than traditional software businesses on a cash-consumption basis. Some of the best open source companies have been able to grow their business at triple-digit growth rates well into their life while maintaining moderate cash burn rates. This is hard to imagine in a traditional software company. Needless to say, less cash consumption equals less dilution for the founders.


Open Source to Freemium

One last aspect of the changing open source business that is worth elaborating on is the gradual movement from true open source to community-assisted freemium. As mentioned above, the early open source projects leveraged the community as key contributors to the software base, and even slight elements of commercially licensed software drew significant pushback from the community. These days the community and the customer base are much more knowledgeable about the open source business model, and there is an appreciation for the fact that open source companies deserve to have a “paywall” so that they can continue to build and innovate.

In fact, from a customer perspective, the two value propositions of open source software are that you can a) read the code and b) treat it as freemium. The notion of freemium is that you can use the software for free until it’s deployed in production or at some degree of scale. Companies like Elastic and Cockroach Labs have gone as far as open sourcing all of their software while applying a commercial license to parts of the software base. The rationale is that real enterprise customers will pay whether the software is open or closed, and that they are more incentivized to use commercial software if they can actually read the code. Indeed, there is a risk that someone could read the code, modify it slightly and fork the distribution. But in developed economies – where much of the rents exist anyway – it’s unlikely that enterprise companies will elect the copycat as a supplier.

A key enabler of this movement has been the more modern software licenses that companies have either originally embraced or migrated to over time. Mongo’s new license, as well as those of Elastic and Cockroach, are good examples. Unlike the Apache license – which was often the starting point for open source projects a decade ago – these licenses are far more business-friendly, and most modern open source businesses are adopting them.

The Future

When we originally penned this article on open source four years ago, we aspirationally hoped that we would see the birth of iconic open source companies. At a time when there was only one model – Red Hat – we believed that there would be many more. Today, we see a healthy cohort of open source businesses, which is quite exciting. I believe we are just scratching the surface of the kind of iconic companies that we will see emerge from the open source gene pool. From one perspective, these companies valued in the billions are a testament to the power of the model. What is clear is that open source is no longer a fringe approach to software. When top companies around the world are polled, few of them intend to have their core software systems be anything but open source. And if the Fortune 5000 migrate their spend on closed source software to open source, we will see the emergence of a whole new landscape of software companies, with the leaders of this new cohort valued in the tens of billions of dollars.

Clearly, that day is not tomorrow. These open source companies will need to grow and mature and develop their products and organization in the coming decade. But the trend is undeniable and here at Index we’re honored to have been here for the early days of this journey.

Amid a legal fight in LA, IBM’s Weather Company launches hyperlocal weather forecasts globally


While IBM is getting sued by the city of Los Angeles, accusing it of covertly mining user data in the Weather Channel app in the US, it’s testing the waters for another hyperlocal weather feature that — coincidentally — relies on data that it picks up from sensors on app users’ smartphones, among other devices, combined with AI at IBM’s end to help model the information.

Today at CES, the company announced a new service called the Global High-Resolution Atmospheric Forecasting System — GRAF for short — a weather forecasting system that IBM says will provide the most accurate forecasts available anywhere in the world, updating every hour at a resolution of three kilometers globally, by crunching around 10 terabytes of data every day.

The new hyperlocal weather data will start to become available in 2019.

This is a key piece of news particularly for the developing world. There has been some effort already to create and use hyperlocal weather information in the US market using things like in-built sensors that can pick up information on, for example, barometric pressure — the very feature that is now the subject of a lawsuit — but there have been fewer efforts to bring that kind of service to a wider, global audience.

“If you’re a farmer in Kenya or Kansas, you will get a way better weather prediction,” said Ginni Rometty, the CEO of IBM, announcing the service today at CES.

She added that other potential end users of the data could include airlines to better predict when a plane might encounter turbulence or other patterns that could affect a flight; insurance companies managing recovery operations and claims around natural disasters; and utility companies monitoring for faults or preparing for severe weather strains on their systems.

Rometty said that the Weather Channel app’s 100 million users — and, in an estimation from Mary Glackin, the Weather Channel’s VP of business solutions, 300 million monthly active users when considering the wider network of places where the data gets used including Weather.com and Weather Underground — will be providing the data “with consent”. Data sourced from businesses will be coming from customers that are partners and are also likely to become users of the data.

That data in turn will be run through IBM’s Power9 systems, the same processors that power the US Department of Energy’s Summit and Sierra supercomputers, and modelled using supplementary data from the National Center for Atmospheric Research (NCAR).

The news represents a big step change for the Weather Company and for meteorology research, Glackin said in an interview.

“This is going to be the first significant implementation of GPUs at the Weather Company,” she told me. “The weather community has been slow to adapt to technology, but this is providing much improved performance for us, with higher resolutions and a much finer scale and focus of short-term forecasts.”

The new hyperlocal data service also underscores an interesting shift for IBM as it turns its efforts to building the Weather Channel business into a more global operation, and one that helps deliver more business returns for IBM itself.

Glackin said the Weather Channel app was the most-downloaded weather app in India last year, underscoring how it, like other consumer apps, is seeing more growth outside of the US at the moment after already reaching market saturation in its home market.

Saturation, and some controversy. It’s not clear how the lawsuit in LA will play out, but the fact that it’s been filed definitely points to changing opinions and sensibilities around the use of personal data, and more generally to how consumers and authorities are starting to think about how all the data we generate every day on our connected devices gets used.

IBM is far from the only company, nor the most vilified, when it comes to this issue, but at a time when the company is still trying to capitalise on the trove of information and customer connections in its wider business network, this will be something that impacts it as well.

Notably, Rometty closed off her keynote today at CES with a few parting words that reference that.

“As we work on these technologies, all that data that we talked about, that ownership, they belong to the user, and with their permission, we use that,” she said, adding, “These technologies also need to be open and explainable.”

IBM unveils its first commercial quantum computer


At CES, IBM today announced its first commercial quantum computer for use outside of the lab. The 20-qubit system combines into a single package the quantum and classical computing parts it takes to use a machine like this for research and business applications. That package, the IBM Q system, is still huge, of course, but it includes everything a company would need to get started with its quantum computing experiments, including all the machinery necessary to cool the quantum computing hardware.

While IBM describes it as the first fully integrated universal quantum computing system designed for scientific and commercial use, it’s worth stressing that a 20-qubit machine is nowhere near powerful enough for most of the commercial applications that people envision for a quantum computer with more qubits — and qubits that are useful for more than 100 microseconds. It’s no surprise, then, that IBM stresses that this is a first attempt and that the systems are “designed to one day tackle problems that are currently seen as too complex and exponential in nature for classical systems to handle.” Right now, we’re not quite there yet, but the company also notes that these systems are upgradable (and easy to maintain).

“The IBM Q System One is a major step forward in the commercialization of quantum computing,” said Arvind Krishna, senior vice president of Hybrid Cloud and director of IBM Research. “This new system is critical in expanding quantum computing beyond the walls of the research lab as we work to develop practical quantum applications for business and science.”

More than anything, though, IBM seems to be proud of the design of the Q system. In a move that harkens back to Cray’s supercomputers with their expensive couches, IBM worked with design studios Map Project Office and Universal Design Studio, as well as Goppion, the company that has built, among other things, the display cases that house the U.K.’s crown jewels and the Mona Lisa. IBM clearly thinks of the Q system as a piece of art and, indeed, the final result is quite stunning. It’s a nine-foot-tall and nine-foot-wide airtight box, with the quantum computing chandelier hanging in the middle and all of the parts neatly hidden away.

If you want to buy yourself a quantum computer, you’ll have to work with IBM, though. It won’t be available with free two-day shipping on Amazon anytime soon.

In related news, IBM also announced the IBM Q Network, a partnership with ExxonMobil and research labs like CERN and Fermilab that aims to build a community that brings together the business and research interests to explore use cases for quantum computing. The organizations that partner with IBM will get access to its quantum software and cloud-based quantum computing systems.


IBM was awarded the most patents in 2018, but overall grants declined by 3.5 percent


We may have passed the peak of the “patent war” in the mobile industry, but the concept of patents as power is far from disappearing, since they continue to be a strong marker for how a company is breaking new ground in technology, and do sometimes help to safeguard an inventor’s or company’s original work — even if the legal enforcement around patents sometimes gets abused.

Patent research firm IFI Claims today published its annual report on how patent grants played out in the tech industry in the past year, and it’s a mixed picture. It found that IBM has once again, for the 26th year running, topped the list, with 9,100 patents, followed by Samsung, Canon, Intel and LG — also the same top five as a year ago. Forty-six percent of all applications came from the US, with Asia accounting for 31 percent and Europe for 15 percent.

But overall, the number of patents granted in 2018 dropped 3.5 percent from 2017, with many companies in the top 50 showing declines in their grants.

Notable declines included Sony (ranking at 15) down 21 percent; Google (number 11) down 16 percent, and Qualcomm (number 8) down 12 percent. Facebook, which last year made it into the top 50 for the first time, dropped out of the shortlist altogether. On the other hand, companies out of China on average saw overall gains across their patent portfolios.

IFI’s Larry Cady said that it’s not clear why so many companies collectively saw significant declines — whether it was due to a lull in innovation (something that I’d argue might actually be happening in the wider industry), a new approach to how companies safeguard their intellectual property, or even a more conservative process at the USPTO.

What he does note is that there is an average cycle of two years between pre-grant applications and grants, and applications were down in 2016 and 2017, meaning 2019 may also see some declines. (Applications were up in 2018, to 374,763, meaning grant numbers should bounce back after that.)

Other notable trends: Ford has really been driving up its tech cred with its turn to autonomous vehicle technology, jumping up five spots to become the only car company in the top 10.

Apple also moved back into the top 10 ranking, even as its overall patents declined by three percent.

And IFI notes that if you combine all the subsidiaries of Samsung, it actually surpassed IBM this year for overall patents held, or “ultimate patent ownership,” in the words of IFI.

Samsung Electronics has 61,608 active patent families, with Canon in second position with 34,905 and IBM rounding out the top list with 34,376.

Unlike IFI’s annual Top U.S. Patent Recipients, this broader ranking measures the size of a patent owner’s global portfolio based on the number of active patent families. A patent family is a set of patent publications filed around the world to cover a single invention.

 

IBM Research develops fingerprint sensor to monitor disease progression


IBM today announced that it has developed a small sensor that sits on a person’s fingernail to help monitor the effectiveness of drugs used to combat the symptoms of Parkinson’s and other diseases. Together with custom software that analyzes the data, the sensor measures how the nail warps as the user grips something. Since virtually any activity involves gripping objects, that creates a lot of data for the software to analyze.

Another way to get this data would be to attach a sensor to the skin and capture motion, as well as the health of muscles and nerves that way. The team notes that skin-based sensors can cause plenty of other problems, including infections, so it decided to look at using data from how a person’s fingernails bend instead.

For the most part, though, fingernails don’t bend all that much, so the sensor had to be rather sensitive. “It turns out that our fingernails deform – bend and move – in stereotypic ways when we use them for gripping, grasping, and even flexing and extending our fingers,” the researchers explain. “This deformation is usually on the order of single-digit microns and not visible to the naked eye. However, it can easily be detected with strain gauge sensors. For context, a typical human hair is between 50 and 100 microns across and a red blood cell is usually less than 10 microns across.”

In its current version, the researchers glue the prototype to the nail. Since fingernails are pretty tough, there’s very little risk in doing so, especially when compared to a sensor that sits on the skin. The sensor then talks to a smartwatch that runs machine learning models to detect tremors and other symptoms of Parkinson’s disease. Those models can detect what a wearer is doing (turning a doorknob, using a screwdriver, etc.), and the data and models are accurate enough to track when wearers write digits with their fingers.
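IBM hasn’t published the models themselves, so as a rough, hypothetical illustration of the kind of signal processing involved, here is a minimal sketch in Go: it picks the dominant frequency of a synthetic nail-strain trace with an FFT and checks whether it falls in the 4–6 Hz band typical of Parkinsonian rest tremor. The 50 Hz sample rate and the signal itself are invented for the example.

```go
package main

import (
	"fmt"
	"math"
	"math/cmplx"

	"gonum.org/v1/gonum/dsp/fourier"
)

// dominantFrequency returns the peak frequency (in Hz) of a signal
// sampled at sampleRate Hz, skipping the DC component at index 0.
func dominantFrequency(signal []float64, sampleRate float64) float64 {
	fft := fourier.NewFFT(len(signal))
	coeffs := fft.Coefficients(nil, signal)
	maxIdx, maxMag := 1, 0.0
	for i := 1; i < len(coeffs); i++ {
		if mag := cmplx.Abs(coeffs[i]); mag > maxMag {
			maxIdx, maxMag = i, mag
		}
	}
	return float64(maxIdx) * sampleRate / float64(len(signal))
}

func main() {
	// Two seconds of synthetic nail-strain samples at 50 Hz: a strong
	// 5 Hz tremor-like oscillation plus a slower 1 Hz movement.
	const sampleRate = 50.0
	signal := make([]float64, 100)
	for i := range signal {
		t := float64(i) / sampleRate
		signal[i] = math.Sin(2*math.Pi*5*t) + 0.2*math.Sin(2*math.Pi*1*t)
	}

	f := dominantFrequency(signal, sampleRate)
	fmt.Printf("dominant frequency: %.1f Hz\n", f)
	if f >= 4 && f <= 6 {
		fmt.Println("frequency is in the typical rest-tremor band")
	}
}
```

A production system would of course classify far richer features than a single spectral peak, but the sketch shows why a continuous strain signal is such useful raw material.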

Over time, the team hopes that it can extend this prototype and the models that analyze the data to recognize other diseases as well. There’s no word on when this sensor could make it onto the market, though.

IBM Africa and Hello Tractor pilot AI/blockchain agtech platform


IBM Research and agtech startup Hello Tractor have developed an AI and blockchain-driven platform for Africa’s farmers. The two companies will pilot the product in 2019 through an ongoing partnership co-financed by IBM.

Dubbed Digital Wallet during its beta, the cloud-based service aims to support Hello Tractor’s business of connecting small-scale farmers to equipment and data analytics for better crop production.

“Agriculture is a complex industry that can have so many different variables. We’re bringing a decision tool to the Hello Tractor ecosystem powered by AI and blockchain,” Hello Tractor CEO Jehiel Oliver told TechCrunch.

The startup joined IBM Research to demo the new service at Startup Battlefield Africa in Lagos.

Available to Hello Tractor clients, the online platform will use a digital ledger and machine learning to capture, track, and share data, while “creating end-to-end trust and transparency across the agribusiness value chain,” according to an IBM release.

Digital Wallet will draw on remote and IoT-based weather-sensing methods and AI to help farmers determine which crops and inputs to use, choose when to plant, and optimize and predict crop yields.

The cloud-based dashboard also employs a blockchain ledger to improve multiple points of Hello Tractor’s business.

“We’re an agricultural technology company. Our platform connects farmers who need tractor services to tractor owners who own these assets as a business. We create that marketplace to bring supply and demand together,” said Oliver.

The demand stems from the 80 percent of Sub-Saharan Africa’s crops harvested without tractors or machinery and the 50 percent of the continent’s farmers who suffer post-harvest losses annually, according to IBM and the Food and Agricultural Organization.

IBM and Hello Tractor’s Digital Wallet will also loop in data from fleet owners regarding tractor use, track and predict repairs and servicing, and build credit profiles to open up bank financing for farmers.

Hello Tractor is a connecting service — neither the startup nor its farming clients own tractors. Founded in 2014, the venture began operations in Nigeria and has expanded into Kenya, Mozambique, Senegal, Tanzania and Bangladesh within the last year, according to its CEO. A for-profit entity, Hello Tractor has raised funding from private investors, DFI grants and a seed round.

The company currently generates revenue by selling the tractor-monitoring devices and software subscriptions for its app, according to Oliver. Hello Tractor doesn’t yet charge transactional fees for connecting tractors to farmers, “but we’ll be testing that next year,” he said.

The startup also plans to create broader revenue opportunities from data analytics.

“At this phase we focus primarily on mechanization, but coupling the insights being generated through that device with the IBM platform solutions specifically for agriculture can extend the value we offer our customers and…be monetized,” said Oliver.

He estimates the business of connecting small-scale farmers to tractors as a “multi-billion market” globally and pointed to Nigeria as the African nation with “the largest inventory of arable-uncultivated farmland,” 37 percent of the country, according to World Bank data.

IBM Research’s co-financing to build Digital Wallet does not include any equity stake in Hello Tractor, IBM confirmed.

The collaboration aligns with IBM’s global agricultural strategy, embedded largely in its Watson AI business platform and global agtech partnerships. As TechCrunch covered, IBM partnered with Kenyan agtech startup Twiga earlier this year to introduce to Twiga’s network of vendors a blockchain-enabled working capital platform.

IBM Research views the partnership as a “scientific research collaboration,” according to VP Solomon Assefa.

“Through all its touch points — farmers, machinery, dealers, crop yields, data inputs — Hello Tractor is convening the whole agricultural ecosystem,” he said.

As discussed at Startup Battlefield Africa, Africa is shaping its own blockchain-focused startups and use cases — characterized more by utility than speculation. On the crypto side, there were several 2018 ICOs, including remittance startup SureRemit’s $7 million token launch, payments venture Wala’s $1 million offering and one by South African solar energy startup Sun Exchange.

IBM Research and Hello Tractor teams will continue to build out the blockchain-enabled Digital Wallet on a lab, engineer and business level throughout 2019.

“We’re cultivating the partnership… including the executive and go-to-market side. You also have to focus on how you scale,” said Assefa.

Google launches Istio on GKE


Google today announced an update to GKE, the Google Kubernetes Engine, that brings integrated support for the Istio service mesh to the service. Istio support is currently in beta.

While Istio isn’t yet the household name that Kubernetes has become in recent years, for many enterprises it’s an important building block for building their cloud-native platforms.

At its core, Istio is an open-source service mesh that helps you connect, monitor and secure microservices on a variety of platforms — one of them being Kubernetes. Istio, and sub-components of it like the Envoy proxy, offer a way to integrate microservices, secure them and aggregate log data, while providing an additional abstraction layer over orchestration services like Kubernetes.
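To give a flavor of the core idea (this is a toy sketch of the sidecar-proxy pattern, not Istio code; the ports and the stand-in backend are invented for the example): a mesh puts a proxy in front of each service, so traffic can be observed and controlled without modifying the service itself.

```go
package main

import (
	"log"
	"net/http"
	"net/http/httputil"
	"net/url"
	"time"
)

func main() {
	// A stand-in "application" service. In a real mesh this runs in its
	// own container, with the sidecar proxy injected next to it.
	backend, err := url.Parse("http://localhost:9000")
	if err != nil {
		log.Fatal(err)
	}
	go func() {
		log.Fatal(http.ListenAndServe(":9000", http.HandlerFunc(
			func(w http.ResponseWriter, r *http.Request) {
				w.Write([]byte("hello from the service\n"))
			})))
	}()

	// The "sidecar": a reverse proxy that intercepts every request,
	// forwards it to the service and records telemetry. Istio's Envoy
	// proxies do this and much more (mTLS, retries, traffic routing).
	proxy := httputil.NewSingleHostReverseProxy(backend)
	log.Fatal(http.ListenAndServe(":8080", http.HandlerFunc(
		func(w http.ResponseWriter, r *http.Request) {
			start := time.Now()
			proxy.ServeHTTP(w, r)
			log.Printf("%s %s -> %s (%v)", r.Method, r.URL.Path,
				backend.Host, time.Since(start))
		})))
}
```

Hitting localhost:8080 returns the service’s response while the proxy logs the call; the service never knows the mesh is there, which is exactly the property that makes a mesh attractive for retrofitting existing microservices.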

“We truly believe that Istio will play a key role in helping you make the most of your microservices,” write Chen Goldberg, Google Cloud director of Engineering, and Jennifer Lin, Google Cloud director of Product Management, in today’s announcement. “One way Istio does this is to provide improved visibility and security, making working with containerized workloads easier. With Istio on GKE, we are the first major cloud provider to offer direct integration to a Kubernetes service and simplified lifecycle management for your containers.”

Goldberg and Lin also stress that Istio allows developers and operators to manage applications as services and not as lots of different infrastructure components. In addition, they note that Istio allows you to encrypt all your network traffic. Unsurprisingly, Istio on GKE also comes with an integration into Stackdriver, Google Cloud’s monitoring and logging service.

Istio first launched in the middle of 2017. The project is the result of a collaboration between Google, IBM and Lyft. It hit its 1.0 release this summer, at the end of July, and companies like Datadog and SolarWinds have since built plugins to integrate it into their services. The Cloud Foundry project, too, is making Istio a central part of its platform by using it as the core of its new traffic-routing stack.

The Cloud Native Computing Foundation adds etcd to its open-source stable


The Cloud Native Computing Foundation (CNCF), the open-source home of projects like Kubernetes and Vitess, today announced that its technical committee has voted to bring a new project on board. That project is etcd, the distributed key-value store that was first developed by CoreOS (now owned by Red Hat, which in turn will soon be owned by IBM). Red Hat has now contributed this project to the CNCF.

Etcd, which is written in Go, is already a major component of many Kubernetes deployments, where it functions as a source of truth for coordinating clusters and managing the state of the system. Other open-source projects that use etcd include Cloud Foundry, and companies that use it in production include Alibaba, ING, Pinterest, Uber, The New York Times and Nordstrom.
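For readers unfamiliar with etcd, it exposes a simple key-value API, and Kubernetes stores its cluster state through exactly these primitives. A minimal sketch using etcd’s official Go client, assuming a local etcd instance listening on its default port 2379:

```go
package main

import (
	"context"
	"fmt"
	"log"
	"time"

	"go.etcd.io/etcd/clientv3"
)

func main() {
	// Connect to a local etcd instance on the default client port.
	cli, err := clientv3.New(clientv3.Config{
		Endpoints:   []string{"localhost:2379"},
		DialTimeout: 5 * time.Second,
	})
	if err != nil {
		log.Fatal(err)
	}
	defer cli.Close()

	ctx, cancel := context.WithTimeout(context.Background(), time.Second)
	defer cancel()

	// Store a piece of desired state, then read it back. Kubernetes
	// persists its cluster state with the same put/get/watch primitives.
	if _, err := cli.Put(ctx, "/config/replicas", "3"); err != nil {
		log.Fatal(err)
	}
	resp, err := cli.Get(ctx, "/config/replicas")
	if err != nil {
		log.Fatal(err)
	}
	for _, kv := range resp.Kvs {
		fmt.Printf("%s = %s\n", kv.Key, kv.Value)
	}
}
```

The key path and value here are arbitrary; the point is the consistency guarantee, since etcd replicates every write across the cluster via the Raft consensus protocol before acknowledging it.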

“Kubernetes and many other projects like Cloud Foundry depend on etcd for reliable data storage. We’re excited to have etcd join CNCF as an incubation project and look forward to cultivating its community by improving its technical documentation, governance and more,” said Chris Aniszczyk, COO of CNCF, in today’s announcement. “Etcd is a fantastic addition to our community of projects.”

Today, etcd has well over 450 contributors and nine maintainers from eight different companies. The fact that it ended up at the CNCF is only logical, given that the foundation is also the host of Kubernetes. With this, the CNCF now plays host to 17 projects that fall under its “incubated technologies” umbrella. In addition to etcd, these include OpenTracing, Fluentd, Linkerd, gRPC, CoreDNS, containerd, rkt, CNI, Jaeger, Notary, TUF, Vitess, NATS, Helm, Rook and Harbor. Kubernetes, Prometheus and Envoy have already graduated from this incubation stage.

That’s a lot of projects for one foundation to manage, but the CNCF community is also extraordinarily large. This week alone about 8,000 developers are converging on Seattle for KubeCon/CloudNativeCon, the organization’s biggest event yet, to talk all things containers. It surely helps that the CNCF has managed to bring competitors like AWS, Microsoft, Google, IBM and Oracle under a single roof to collaboratively work on building these new technologies. There is a risk of losing focus here, though, something that happened to the OpenStack project when it went through a similar growth and hype phase. It’ll be interesting to see how the CNCF will manage this as it brings on more projects (with Istio, the increasingly popular service mesh, being a likely candidate for coming over to the CNCF as well).

IBM selling Lotus Notes/Domino business to HCL for $1.8B


IBM announced last night that it is selling the final components from its 1995 acquisition of Lotus to Indian firm HCL for $1.8 billion.

IBM paid $3.5 billion for Lotus back in the day. The big pieces here are Lotus Notes, Domino and Portal. These were a big part of IBM’s enterprise business for a long time, but last year Big Blue began to pull away, selling the development part to HCL, while maintaining control of sales and marketing.

This announcement marks the end of the line for IBM involvement. With the development of the platform out of its control, and in need of cash after spending $34 billion for Red Hat, perhaps IBM simply decided it no longer made sense to keep any part of this in-house.

As for HCL, it sees an opportunity to continue to build the Notes/Domino business, and it’s seizing it with this purchase. “The large-scale deployments of these products provide us with a great opportunity to reach and serve thousands of global enterprises across a wide range of industries and markets,” C Vijayakumar, president and CEO at HCL Technologies, said in a statement announcing the deal.

Alan Lepofsky, an analyst at Constellation Research who keeps close watch on the enterprise collaboration space, says that the sale could represent a fresh start for software that IBM hasn’t really been paying close attention to for some time. “HCL is far more interested in Notes/Domino than IBM has been for a decade. They are investing heavily, trying to rejuvenate the brand,” Lepofsky told TechCrunch.

While this software may feel long in the tooth, Notes and Domino are still in use in many corners of the enterprise, and this is especially true in EMEA (Europe, Middle East and Africa) and AP (Asia Pacific), Lepofsky said.

He added that IBM appears to be completely exiting the collaboration space with this sale. “It appears that IBM is done with collaboration, out of the game,” he said.

This move makes sense for IBM, which is moving in a different direction as it develops its cloud business. The Red Hat acquisition in October, in particular, shows that the company wants to embrace private and hybrid cloud deployments, and older software like Lotus Notes and Domino doesn’t really play a role in that world.

The deal, which is subject to regulatory approval processes, is expected to close in the middle of next year.

Red Hat acquires hybrid cloud data management service NooBaa


Red Hat is in the process of being acquired by IBM for a massive $34 billion, but that deal hasn’t closed yet and, in the meantime, Red Hat is still running independently and making its own acquisitions, too. As the company today announced, it has acquired Tel Aviv-based NooBaa, an early-stage startup that helps enterprises manage their data more easily and access their various data providers through a single API.

NooBaa’s technology makes it a good fit for Red Hat, which has recently emphasized its ability to help enterprises more effectively manage their hybrid and multicloud deployments. At its core, NooBaa is all about bringing together various data silos, which should make it a good fit in Red Hat’s portfolio. With OpenShift and the OpenShift Container Platform, as well as its Ceph Storage service, Red Hat already offers a range of hybrid cloud tools, after all.

“NooBaa’s technologies will augment our portfolio and strengthen our ability to meet the needs of developers in today’s hybrid and multicloud world,” writes Ranga Rangachari, the VP and general manager for storage and hyperconverged infrastructure at Red Hat, in today’s announcement. “We are thrilled to welcome a technical team of nine to the Red Hat family as we work together to further solidify Red Hat as a leading provider of open hybrid cloud technologies.”

While virtually all of Red Hat’s technology is open source, NooBaa’s code is not. The company says that it plans to open source NooBaa’s technology in due time, though the exact timeline has yet to be determined.

NooBaa was founded in 2013. The company has raised some venture funding from the likes of Jerusalem Venture Partners and OurCrowd, with a strategic investment from Akamai Capital thrown in for good measure. The company never disclosed the size of that round, though, and neither Red Hat nor NooBaa is disclosing the financial terms of the acquisition.