Peltarion raises $20M for its AI platform

Peltarion, a Swedish startup founded by former execs from companies like Spotify, Skype, King, TrueCaller and Google, today announced that it has raised a $20 million Series A funding round led by Euclidean Capital, the family office for hedge fund billionaire James Simons. Previous investors FAM and EQT Ventures also participated, and this round brings the company’s total funding to $35 million.

There is obviously no dearth of AI platforms these days. Peltarion focuses on what it calls “operational AI.” The service offers an end-to-end platform that lets you do everything from pre-processing your data to building models and putting them into production. All of this runs in the cloud, and developers get access to a graphical user interface for building and testing their models. This, the company stresses, means Peltarion’s users don’t have to deal with any of the low-level hardware or software and can instead focus on building their models.

“The speed at which AI systems can be built and deployed on the operational platform is orders of magnitude faster compared to the industry standard tools such as TensorFlow and require far fewer people and decreases the level of technical expertise needed,” Luka Crnkovic-Friis, Peltarion’s CEO and co-founder, tells me. “All this results in more organizations being able to operationalize AI and focusing on solving problems and creating change.”
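
Crnkovic-Friis’s comparison point is TensorFlow, so as rough context, here is a minimal sketch of the kind of hand-written pipeline a team would otherwise maintain: loading data, defining a small Keras model and training it. The dataset path and column names are placeholders of my own, and Peltarion’s actual interface is graphical rather than code-based.

```python
# Minimal sketch of the hand-rolled TensorFlow/Keras workflow that an end-to-end
# platform aims to abstract away. The CSV path and column names are hypothetical.
import pandas as pd
import tensorflow as tf

# Pre-processing: load a tabular dataset and split features from the label column.
df = pd.read_csv("training_data.csv")            # placeholder dataset
features = df.drop(columns=["label"]).values
labels = df["label"].values

# Model building: a small fully connected binary classifier.
model = tf.keras.Sequential([
    tf.keras.layers.Dense(64, activation="relu"),
    tf.keras.layers.Dense(32, activation="relu"),
    tf.keras.layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])

# Training; serving, scaling and monitoring would still be separate work.
model.fit(features, labels, epochs=10, batch_size=32, validation_split=0.2)
model.save("model_dir")  # export for deployment, e.g. with TensorFlow Serving
```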

In a world where businesses have a plethora of choices, though, why use Peltarion over more established players? “Almost all of our clients are worried about lock-in to any single cloud provider,” Crnkovic-Friis said. “They tend to be fine using storage and compute as they are relatively similar across all the providers and moving to another cloud provider is possible. Equally, they are very wary of the higher-level services that AWS, GCP, Azure, and others provide as it means a complete lock-in.”

Peltarion, of course, argues that its platform doesn’t lock in its users and that other platforms take far more AI expertise to produce commercially viable AI services. The company rightly notes that, outside of the tech giants, most companies still struggle with how to use AI at scale. “They are stuck on the starting blocks, held back by two primary barriers to progress: immature patchwork technology and skills shortage,” said Crnkovic-Friis.

The company will use the new funding to expand its development team and its teams working with its community and partners. It’ll also use the new funding for growth initiatives in the U.S. and other markets.

Amazon’s NYC educational investments will continue, despite cancellation of New York HQ2

Amazon’s plans to invest in New York area engineering training programs and other local educational initiatives are not being canceled, despite Amazon’s announcement today that it will no longer open one of its HQ2 locations in New York City. The retailer decided to end its plans for the New York headquarters after significant backlash from local politicians and citizens alike who, as Amazon put it, “have made it clear that they oppose our presence.”

The deal Amazon had brokered with New York politicians had included up to $1.5 billion in grants and tax breaks in the state, in exchange for bringing 25,000 new jobs to the NYC area.

But jobs weren’t all the company was investing in; Amazon had also recently said it would fund educational programs and training at New York-area high schools and colleges.

Specifically, Amazon said it would fund computer science classes in more than 130 New York City area high schools, including both introductory and Advanced Placement (AP) classes. The classes would be offered across all five NYC boroughs, including more than 30 schools in Queens – the planned location for the new headquarters.

These classes were to be funded by Amazon’s Future Engineer program, which works to bring computer science courses to over 100,000 underprivileged kids in 2,000 low-income high schools in the U.S.

In addition, Amazon said it was teaming up with area colleges and universities, including LaGuardia Community College (LAGCC), the City University of New York (CUNY) and the State University of New York (SUNY) to create a cloud computing certificate program for students across New York.

This program was supported by Amazon’s AWS Educate program.

The Educate program is currently used by more than 1,500 institutions to train students in cloud computing by offering them hands-on experience in AWS technology. The students can then apply for jobs at Amazon and elsewhere, upon completion.

Amazon has not officially commented on how the HQ2 news will impact these programs in New York, but sources familiar with the situation told TechCrunch that both educational programs are continuing – regardless of what’s happened with HQ2.

Though obviously meant to help build a pipeline for the NYC HQ2, the programs have a larger goal: creating new engineering talent that knows how to work with Amazon’s cloud computing platform, AWS.

Though these students will no longer have a direct path to a New York-area HQ2, Amazon still has over 5,000 employees in Brooklyn, Manhattan, and Staten Island, the company said today in its HQ2 announcement, and it plans to grow those teams in the years ahead.

That means it can’t hurt to continue to build the talent pipeline in New York. After all, Amazon could still woo program grads to other East Coast locations, including Northern Virginia and Nashville, as well as to its other 17 offices and hubs across the U.S. and Canada.

It isn’t just apps. China’s cinemas broke records during Lunar New Year

China celebrated Lunar New Year last week as hundreds of millions of people travelled to their hometowns. While many had longed to see their separated loved ones, others dreaded the weeklong holiday as relatives awkwardly caught up with them with questions like: “Why are you not married? How much do you earn?”

Luckily, there are ways to survive the festive time in this digital age. Smartphone usage during this period has historically surged. Douyin, the Chinese version of short video app TikTok, noticeably took off by acquiring 42 million new users over the first week of last year’s holiday, a report from data analytics firm QuestMobile shows. Tencent’s mobile game blockbuster Honor of Kings similarly saw its daily active users jump 76 percent during that time, according to another QuestMobile report.

People also hid away by immersing themselves in the cinema during the Lunar New Year, a movie-going period akin to the American holiday season. This year, China wrapped up the first six days of the New Year with a record-breaking 5.8 billion yuan ($860 million) box office, according to data collected by Maoyan, the Tencent-backed movie ticketing service that recently went public in Hong Kong.

The new benchmark, however, did not reflect an expanding viewership. Rather, it came from price hikes in movie tickets, market research firm EntGroup suggests. On the first day of the Year of the Pig, tickets sold at an average of 45 yuan ($6.65), up from 39 yuan last year. That certainly put off some price-sensitive moviegoers, though not by a huge margin, as there wasn’t much else to do. (Shops were closed. Fireworks and firecrackers, which are traditionally set off during the New Year to drive bad spirits away, are also banned in most Chinese cities over safety concerns.) Cinemas across China sold 31.69 million tickets on the first day, a slight decline from last year’s 32.63 million.

Dawn of Chinese sci-fi

Image source: The Wandering Earth via Weibo

Many Chinese companies don’t return to work until this Thursday, so box office results are still coming in. Investment bank Nomura put the estimated total at 6.2 billion yuan. What’s also notable about this year’s holiday peak is the fervor that the sci-fi film The Wandering Earth whipped up.

American audiences may find in the Chinese film elements of Interstellar’s space adventures, but The Wandering Earth will likely resonate better with Chinese viewers. Adapted from a novel by Hugo Award-winning Chinese author Liu Cixin, the film tells the story of the human race seeking a new home as the aging sun is about to devour the earth. A group of Chinese astronauts, scientists and soldiers eventually works out a plan to postpone the apocalypse, a plot deemed to have stoked Chinese viewers’ sense of pride, though the rescue also involves participation from other nations.

The film, featuring convincing special effects, is also widely heralded as the dawn of Chinese-made sci-fi films. The sensation gave rise to a wave of patriotic online reviews like “If you are Chinese, go watch The Wandering Earth,” though it’s unclear whether the discourse was genuine or had been manipulated.

Alibaba’s movie powerhouse

This record-smashing holiday has also been a big win for Alibaba, the Chinese internet outfit best known for ecommerce and, increasingly, cloud computing. Its content production arm Alibaba Pictures backed five of the movies screened during the holiday, including the blockbuster The Wandering Earth, which also counts Tencent as an investor.

Tech giants with online streaming services are on course to upend China’s film and entertainment industry, a sector traditionally controlled by old-school production houses. In its most recent quarter, Alibaba increased its stake to take majority control of Alibaba Pictures, the film production business it acquired in 2014. Tencent and Baidu have also spent big bucks on content creation. While Tencent focuses on video games and anime, Baidu’s Netflix-style video site iQiyi has received wide acclaim for in-house dramas like Yanxi Palace, a smash hit about backstabbing concubines that was streamed over 15 billion times.

Seeing all the entertainment options on the table, the Chinese government made a pre-emptive move against the private players by introducing a news app designed for propaganda purposes in the weeks leading up to the holiday.

“The timing of the publishing of this app might be linked to the upcoming Chinese New Year Festival, which the Chinese Communist Party sees as an opportunity and a necessity to spread their ideology,” Kristin Shi-Kupfer, director of German think tank MERICS, told TechCrunch earlier. “[It] may be hoping that people would use the holiday season to take a closer look, but probably also knowing that most people would rather choose other sources to relax, consume and travel.”

IBM brings Watson to any cloud

IBM today announced that it is freeing its Watson-branded AI services, like Watson Assistant for building conversational interfaces and Watson OpenScale for managing the AI lifecycle, from its own cloud, allowing enterprises to take its platform and run it in their own data centers. In a way, you can think of this as Watson as a managed service.

“Clients are really struggling with infusing AI into their applications because the data is distributed in multiple places,” IBM Watson CTO and chief architect Ruchir Puri told me when I asked him for IBM’s reasoning behind this move. “It’s in these hybrid environments, they’ve got multiple cloud implementations, they have data in their private cloud as well. They have been struggling because the providers of AI have been trying to lock them into a particular implementation that is not suitable to this hybrid cloud environment.”

So by bringing Watson to any cloud, IBM wants to give these businesses the option to bring AI to their data, which is significantly harder and costlier to move, after all. Puri also stressed that many enterprises have long wanted to use AI to make their operations more efficient, but they needed to run their AI tools in an environment that they control and feel comfortable with.

At the core of running Watson in a public or private cloud is IBM Cloud Private, the company’s private cloud platform, which uses open source technologies like Kubernetes and Cloud Foundry. That’s the platform that allows enterprises to run Watson, which itself runs in containers.
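
To make the container angle concrete, here is a minimal, purely illustrative sketch of deploying a containerized service onto a Kubernetes cluster (the substrate IBM Cloud Private provides) using the official Kubernetes Python client; the image name, labels and namespace are hypothetical, and IBM’s actual Watson packaging is not spelled out at this level of detail.

```python
# Minimal sketch: deploying a containerized service onto a Kubernetes cluster
# with the official Python client. The image, labels and namespace are
# hypothetical placeholders, not IBM's actual Watson packaging.
from kubernetes import client, config

config.load_kube_config()  # read cluster credentials from ~/.kube/config

deployment = client.V1Deployment(
    metadata=client.V1ObjectMeta(name="watson-assistant-demo"),
    spec=client.V1DeploymentSpec(
        replicas=2,
        selector=client.V1LabelSelector(match_labels={"app": "watson-assistant-demo"}),
        template=client.V1PodTemplateSpec(
            metadata=client.V1ObjectMeta(labels={"app": "watson-assistant-demo"}),
            spec=client.V1PodSpec(containers=[
                client.V1Container(
                    name="assistant",
                    image="example.registry.local/watson-assistant:placeholder",
                    ports=[client.V1ContainerPort(container_port=8080)],
                )
            ]),
        ),
    ),
)

client.AppsV1Api().create_namespaced_deployment(namespace="default", body=deployment)
```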

Right now, the focus of this first launch is on Watson Assistant and Watson OpenScale. “The capabilities we are releasing right now are based on our two flagship products. That addresses a very large domain of use cases that we come across,” said Puri. “In the remaining part of the year, we will bring the rest of the capabilities [to the platform]. For example, Watson Knowledge Studio will come along with it as well, as well as Watson’s natural language understanding capabilities that we currently have available in our public cloud environment will be ported on to it as well.”

With that, Puri argues, IBM will offer enterprises a full spectrum of tools for developing and running AI models using structured and unstructured data, as well as a full monitoring and lifecycle management suite.

In addition, IBM today announced that it is launching a new version of its Watson Machine Learning Accelerator, which brings high-performance GPU clustering to Power Systems and x86 systems and promises to accelerate AI performance by up to 10x.

The company also today announced IBM Business Automation Intelligence with Watson, though it didn’t quite delve into the details. This new service, the company says, will give business leaders the ability “to apply AI directly to applications, strengthening the workforce, from clerical to knowledge workers, to intelligently automate work from the mundane to the complex.” I’m not really sure what that means, but I’m sure the business leaders who will buy this service will figure it out.

Microsoft Azure sets its sights on more analytics workloads

Enterprises now amass huge amounts of data, both from their own tools and applications, as well as from the SaaS applications they use. For a long time, that data was basically exhaust. Maybe it was stored for a while to fulfill some legal requirements, but then it was discarded. Now, data is what drives machine learning models, and the more data you have, the better. It’s maybe no surprise, then, that the big cloud vendors started investing in data warehouses and lakes early on. But that’s just a first step. After that, you also need the analytics tools to make all of this data useful.

Today, it’s Microsoft’s turn to shine the spotlight on its data analytics services. The actual news here is pretty straightforward. Two of the announcements cover services moving into general availability: the second generation of Azure Data Lake Storage for big data analytics workloads and Azure Data Explorer, a managed service that makes ad-hoc analysis of massive data volumes easier. Microsoft is also previewing a new feature in Azure Data Factory, its graphical no-code service for building data transformations: Data Factory now supports mapping data flows.
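
For a sense of what ad-hoc analysis against Azure Data Explorer looks like, here is a minimal sketch using the azure-kusto-data Python client to run a Kusto (KQL) query; the cluster URL, database and table names are placeholders I have invented.

```python
# Minimal sketch of an ad-hoc query against Azure Data Explorer via the
# azure-kusto-data client. Cluster URL, database and table are placeholders.
from azure.kusto.data import KustoClient, KustoConnectionStringBuilder

kcsb = KustoConnectionStringBuilder.with_az_cli_authentication(
    "https://mycluster.westus.kusto.windows.net"  # placeholder cluster
)
client = KustoClient(kcsb)

# Kusto Query Language (KQL): count events per day over the last week.
query = """
Events
| where Timestamp > ago(7d)
| summarize Count = count() by bin(Timestamp, 1d)
| order by Timestamp asc
"""
response = client.execute("telemetry_db", query)  # placeholder database
for row in response.primary_results[0]:
    print(row["Timestamp"], row["Count"])
```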

Those individual news pieces are interesting if you are a user or are considering Azure for your big data workloads, but what’s maybe more important here is that Microsoft is trying to offer a comprehensive set of tools for managing and storing this data — and then using it for building analytics and AI services.

“AI is a top priority for every company around the globe,” Julia White, Microsoft’s corporate VP for Azure, told me. “And as we are working with our customers on AI, it becomes clear that their analytics often aren’t good enough for building an AI platform.” These companies are generating plenty of data, which then has to be pulled into analytics systems. She stressed that she couldn’t remember a customer conversation in recent months that didn’t focus on AI. “There is urgency to get to the AI dream,” White said, but the growth and variety of data presents a major challenge for many enterprises. “They thought this was a technology that was separate from their core systems. Now it’s expected for both customer-facing and line-of-business applications.”

Data Lake Storage helps with managing this variety of data since it can handle both structured and unstructured data (and is optimized for the Spark and Hadoop analytics engines). The service can ingest any kind of data — yet Microsoft still promises that it will be very fast. “The world of analytics tended to be defined by having to decide upfront and then building rigid structures around it to get the performance you wanted,” explained White. Data Lake Storage, on the other hand, wants to offer the best of both worlds.
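
To make the Spark optimization point concrete, here is a minimal PySpark sketch that reads structured and semi-structured data straight out of a Data Lake Storage Gen2 account over the abfss:// scheme; the storage account, container, paths and column names are placeholders, and authentication setup is omitted.

```python
# Minimal PySpark sketch: reading structured (Parquet) and semi-structured (JSON)
# data from an Azure Data Lake Storage Gen2 account via the abfss:// scheme.
# Storage account, container, paths and column names are placeholders; auth config omitted.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("adls-gen2-demo").getOrCreate()

base = "abfss://analytics@mystorageaccount.dfs.core.windows.net"

orders = spark.read.parquet(f"{base}/curated/orders/")      # structured data
events = spark.read.json(f"{base}/raw/clickstream/*.json")   # semi-structured data

# Join the two sources and aggregate, letting Spark handle the heavy lifting.
daily = (orders.join(events, "user_id")
               .groupBy("order_date")
               .count())
daily.show()
```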

Likewise, White argued that many enterprises used to keep these services on their own on-premises servers, and many of those deployments are still appliance-based. But she believes the cloud has now reached the point where the price/performance calculation is in its favor. It took a while to get to this point, though, and to convince enterprises. White noted that for the longest time, enterprises looking at analytics saw $300 million projects that took forever, tied up lots of people and were frankly a bit scary. “But also, what we had to offer in the cloud hasn’t been amazing until some of the recent work,” she said. “We’ve been on a journey, as well as the other cloud vendors, and the price performance is now compelling.” And it sure helps that if enterprises want to meet their AI goals, they’ll now have to tackle these workloads, too.

Google’s still not sharing cloud revenue

Google has shared its cloud revenue exactly once over the last several years, and silence tends to invite speculation to fill the information vacuum. Luckily, there are analyst firms that try to fill the void, and it looks like Google’s cloud business is actually trending in the right direction, even if the company isn’t willing to tell us an exact number.

When Google last reported its cloud revenue, about this time last year, it indicated it had earned $1 billion in revenue for the quarter, a figure that included Google Cloud Platform and G Suite combined. Diane Greene, who was head of Google Cloud at the time, called it an “elite business,” but in reality it was pretty small potatoes compared to Microsoft’s and Amazon’s cloud numbers, which were pulling in $4-$5 billion a quarter between them at the time. Google was looking at a $4 billion run rate for the entire year.

Google apparently didn’t like the reaction it got from that disclosure so it stopped talking about cloud revenue. Yesterday when Google’s parent company, Alphabet, issued its quarterly earnings report, to nobody’s surprise, it failed to report cloud revenue yet again, at least not directly.

Google CEO Sundar Pichai gave some hints, but never revealed an exact number. Instead he talked in vague terms calling Google Cloud “a fast-growing multibillion-dollar business.” The only time he came close to talking about actual revenue was when he said, “Last year, we more than doubled both the number of Google Cloud Platform deals over $1 million as well as the number of multiyear contracts signed. We also ended the year with another milestone, passing 5 million paying customers for our cloud collaboration and productivity solution, G Suite.”

OK, it’s not an actual dollar figure, but it gives a sense that the company is actually moving the needle in its cloud business. A bit later in the call, CFO Ruth Porat threw in this cloud revenue nugget: “We are also seeing a really nice uptick in the number of deals that are greater than $100 million and really pleased with the success and penetration there. At this point, not updating further.” She is not updating further. Got it.

That brings us to Canalys, a firm that took a guess for us. While it didn’t share its methodology, it came up with a figure of $2.2 billion for the quarter. Given that the company is closing larger deals and was at a billion last year, this figure feels like it’s probably in the right ballpark, but of course it’s not from the horse’s mouth, so we can’t know for certain.

Frankly, I’m a little baffled that Alphabet’s shareholders let the company get away with this complete lack of transparency. You would think they would want to know exactly what it is making on that crucial part of the business, wouldn’t you? As a cloud market watcher, I know I would. So we’re left to companies like Canalys to fill in the blanks, but it’s certainly not as satisfying as Google actually telling us. Maybe next quarter.

BetterCloud can now manage any SaaS application

BetterCloud began life as a way to provide an operations layer for G Suite. More recently, after a platform overhaul, it began layering on a handful of other SaaS applications. Today, the company announced, it is now possible to add any SaaS application to its operations dashboard and monitor usage across applications via an API.

As founder and CEO David Politis explains, a tool like Okta provides a way to authenticate your SaaS app, but once an employee starts using it, BetterCloud gives you visibility into how it’s being used.

“The first order problem was identity, the access, the connections. What we’re doing is we’re solving the second order problem, which is the interactions,” Politis explained. In his view, companies lack the ability to monitor and understand the interactions going on across SaaS applications, as people interact and share information, inside and outside the organization. BetterCloud has been designed to give IT control and security over what is occurring in their environment, he explained.

He says they can provide as much or as little control as a company needs, and they can set controls by application or across a number of applications without actually changing the user’s experience. They do this through a scripting library. BetterCloud comes with a number of scripts and provides log access to give visibility into the scripting activity.

If a customer is looking to use this data more effectively, the solution includes a Graph API for ingesting data and seeing the connections across the data that BetterCloud is collecting. Customers can also set up triggers that fire actions when certain conditions in the collected data are met.
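
BetterCloud hasn’t published its schema here, so the following is a purely hypothetical sketch of what pulling interaction data from a graph-style API could look like in Python; the endpoint, token, query fields and response shape are all invented for illustration and do not describe BetterCloud’s actual API.

```python
# Purely hypothetical sketch of querying a graph-style API over HTTPS for
# cross-application sharing activity. Endpoint, token, fields and response
# structure are invented placeholders, not BetterCloud's real schema.
import requests

API_URL = "https://api.example-saas-ops.com/graph"   # placeholder endpoint
HEADERS = {"Authorization": "Bearer <API_TOKEN>"}     # placeholder credential

# Ask for files shared outside the organization and who they were shared with.
query = """
{
  externallySharedFiles(app: "drive", last: "7d") {
    name
    owner { email }
    sharedWith { email domain }
  }
}
"""
resp = requests.post(API_URL, json={"query": query}, headers=HEADERS, timeout=30)
resp.raise_for_status()
for f in resp.json()["data"]["externallySharedFiles"]:
    print(f["name"], f["owner"]["email"])
```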

All of this is possible because the company overhauled the platform last year to allow BetterCloud to move beyond G Suite and plug other SaaS applications into it. Today’s announcement is the ultimate manifestation of that capability. Instead of BetterCloud building the connectors, it’s providing an API to let its customers do it.

The company was founded in 2011 and has raised over $106 million, according to Crunchbase.

Tencent moves into automotive with $150M joint venture

China’s internet firms are getting pally with giant state-owned automakers as they look to deploy their artificial intelligence and cloud computing services across traditional industries. Ride-hailing startup Didi Chuxing, which owns Uber China, announced a new joint venture with state-owned BAIC earlier this week. Hot on its heels came another entity, set up between Tencent and GAC Group.

GAC, which is owned by the Guangzhou municipal government in southern China, announced Thursday in a filing that it will jointly establish a mobility company with social media and gaming behemoth Tencent and Guangzhou Public Transport Group, alongside other investors.

The announcement followed an agreement between Tencent and GAC in 2017 to team up on internet-connected cars and smart driving, a deal that saw the carmaker tapping into Tencent’s expertise in mobile payments, social networking, big data and cloud services. Tencent, which is most famous for its instant messenger WeChat, went through a major restructuring last October to place more focus on enterprise-facing services, and the GAC tie-up appears to fit nicely into that pivot.

The fresh venture will bank a capital infusion of 1 billion yuan ($149 million) with GAC owning a 35 percent stake. Tencent and Guangzhou Public Transport will take up 25 percent and 10 percent, respectively.

A flurry of Chinese internet service providers have made forays into the automotive industry, marketing their digital and machine learning capabilities to old-school automakers. Besides Tencent, GAC has also recruited telecommunications equipment maker Huawei and voice intelligence company iFlytek to upgrade its vehicles. Search titan Baidu, on the other hand, operates an open platform for autonomous driving and has chosen state-owned Hongqi to test its autonomous driving solutions. Ecommerce behemoth Alibaba has also set foot in transportation with a smart sedan jointly developed with state-owned SAIC.