As people increasingly use their mobile phones and other devices to shop, it has become imperative for vendors to improve the shopping experience, making it as simple as possible, given the small footprint. One way to do that is using artificial intelligence. Today, Salesforce announced some AI-enhanced APIs designed to keep us engaged as shoppers.
For starters, the company wants to keep you shopping, and that means providing an intelligent recommendation engine. If you searched for a particular jacket, you might like these similar styles, or this scarf and gloves. That’s fairly basic as shopping experiences go, but Salesforce didn’t stop there. It’s letting developers embed this product-recommendation ability in any app, whether that’s maps, social or mobile.
That means shopping recommendations could pop up anywhere developers think they make sense, such as in your maps app. Whether or not consumers see this as a positive thing, Salesforce says that adding intelligence to the shopping experience increases sales anywhere from 7-16 percent, so however you feel about it, it seems to be working.
The company also wants to make it simpler to shop. Instead of entering a long faceted search — footwear, men’s, sneakers, red — as has traditionally been the way of shopping, you can take a picture of a sneaker (or anything you like) and the visual search algorithm should recognize it and make recommendations based on that picture. That reduces data entry for users, which is typically a pain on a mobile device, even when it has been simplified by checkboxes.
Salesforce has also turned inventory availability into a service, letting shoppers know exactly where in the world the item they want is available. If they want to pick it up in-store that day, it shows where the store is on a map and could even embed that destination into a ride-sharing app. The idea is to create a seamless experience between consumer desire and purchase.
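A consumer of such a service might do something like the following: a minimal Python sketch that filters an inventory feed down to nearby stores with stock. The record shape and field names here are assumptions for illustration, not Salesforce’s actual API.

```python
def nearest_in_stock(stores, sku, limit=3):
    """Return up to `limit` stores that have `sku` on hand, nearest first.

    Each record is a dict like:
      {"store_id": "SF-01", "sku": "SNKR-RED-10", "qty": 4, "distance_km": 1.2}
    (a hypothetical response shape; a real inventory API will differ).
    """
    # Keep only stores carrying the item with at least one unit available
    in_stock = [s for s in stores if s["sku"] == sku and s["qty"] > 0]
    # Nearest stores first, capped at `limit` results for display on a map
    return sorted(in_stock, key=lambda s: s["distance_km"])[:limit]
```

The result list is exactly what a map or ride-sharing integration would need: an ordered set of destinations with distances attached.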
Finally, Salesforce has added some goodies to make developers happy too, including the ability to browse the Salesforce API library and find the APIs that make the most sense for what they are creating, complete with code snippets to get started. It may not seem like a big deal, but as companies the size of Salesforce expand their API catalogs (especially with the MuleSoft acquisition), it gets harder to know what’s available. The company has also created a sandboxing capability that lets developers experiment and build with these APIs safely.
Plaid, a startup that has made a name building APIs for consumer-facing financial services apps, expanded its horizons today with the purchase of Quovo, a similar company with a focus on the investment side of the financial services business.
Bloomberg reported the price tag could be as much as $200 million, but the company told TechCrunch that it is not sharing the price. It seems that Plaid, which has raised almost $310 million, including $250 million on a $2.65 billion valuation just last month, was eager to put the money to work.
While Plaid helps link your checking and savings account to modern financial apps like Venmo, Acorns and Robinhood, Quovo’s APIs are aimed at the investment side of the financial services market, with up-and-coming customers like Betterment, Wealthfront and SoFi, and established players like Stifel, Vanguard, Empower Retirement and John Hancock.
The combined companies plan to offer a full range of financial services APIs. “Together, we’ll build a single platform that developers and large companies alike can use to build any financial application—from payments to lending to wealth management,” the company wrote in a blog post announcing the deal.
Each company helps developers build financial services applications by providing the tools to integrate with accounts, so that developers don’t have to build these links from scratch. Much like Twilio helps build communications into an app and Stripe helps add payments functionality, Plaid builds easy integration with checking and savings accounts. Now it will also provide similar integration for investments.
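To make the integration pattern concrete, here is a minimal Python sketch of a Plaid-style client: exchange a short-lived public token for an access token, then pull transactions. The endpoint paths and field names are illustrative assumptions, not Plaid’s or Quovo’s documented API; the transport is injectable so the sketch can be exercised without a network.

```python
import json
from urllib import request


class AggregationClient:
    """Sketch of a Plaid-style account-aggregation client (hypothetical API)."""

    def __init__(self, base_url, client_id, secret, transport=None):
        self.base_url = base_url
        self.auth = {"client_id": client_id, "secret": secret}
        # The transport is injectable so tests can substitute a fake for HTTP.
        self.transport = transport or self._http_post

    def _http_post(self, path, payload):
        # Plain JSON-over-HTTPS POST, the common shape for this kind of API
        req = request.Request(
            self.base_url + path,
            data=json.dumps(payload).encode(),
            headers={"Content-Type": "application/json"},
        )
        with request.urlopen(req) as resp:
            return json.load(resp)

    def exchange_token(self, public_token):
        # Trade the token the user's browser produced for a server-side token
        body = dict(self.auth, public_token=public_token)
        return self.transport("/item/token/exchange", body)["access_token"]

    def transactions(self, access_token, start, end):
        # Fetch normalized transactions for the linked account
        body = dict(self.auth, access_token=access_token,
                    start_date=start, end_date=end)
        return self.transport("/transactions/get", body)["transactions"]
```

The point of the pattern is in the last method: the developer never parses a bank’s website or file formats; they get back normalized JSON, whether the underlying account is a checking account (Plaid’s specialty) or a brokerage account (Quovo’s).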
The deal is expected to close this week, and the Quovo team will join Plaid shortly after. Quovo CEO Lowell Putnam will continue to run the Quovo team and lead strategy for that part of the product set, according to a company spokesperson.
Quovo was founded in 2010 and had raised $21 million, including a $4.8 million Series B last May, according to Crunchbase data. If the reported price is accurate, the company delivered a nice return on the money its investors put in.
Today, many companies provide developer access to their services via APIs. Moesif, a San Francisco startup, wants to help these companies gain insight into their customers’ API usage patterns. Today, the company announced a $3.5 million seed round led by Merus Capital, with participation from Heavybit, Fresco Capital and Zach Coelius.
Moesif co-founder and CEO Derric Gilling says Moesif is akin to Mixpanel or Google Analytics, except instead of tracking web or mobile analytics, it looks at API usage. “As more and more companies are using and creating these APIs, there comes a point where you need to understand how your customers are using them, any problems they are running into and how do you actually decrease developer churn.”
Heat map showing API usage by region. Screenshot: Moesif
The company is aiming at two primary types of users. First of all, there are developers who can use the monitoring features to understand when there are issues with the API. These folks have access to the free tier.
Moesif also targets business units like product management, sales and marketing, who use the tool to understand who’s using the API, how often, and with machine learning, understand who is likely to stop using the product based on how they are using it. The tool can tie into other business systems like Mailchimp or a CRM tool to get a more complete picture of customers as they use the API.
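The event capture underlying this kind of analytics can be sketched in a few lines of Python: a decorator records who called which endpoint and how long it took, and a small aggregate turns the raw events into per-customer usage counts. This is a toy stand-in for what a tool like Moesif does, not its actual SDK.

```python
import time
from collections import Counter

API_LOG = []  # in-memory stand-in for an analytics pipeline


def track(endpoint):
    """Decorator: record endpoint, caller and latency for every invocation."""
    def wrap(fn):
        def inner(user_id, *args, **kwargs):
            start = time.perf_counter()
            try:
                return fn(user_id, *args, **kwargs)
            finally:
                # Capture the event even if the handler raised an error
                API_LOG.append({
                    "endpoint": endpoint,
                    "user": user_id,
                    "ms": (time.perf_counter() - start) * 1000,
                })
        return inner
    return wrap


def calls_per_user(log):
    """Aggregate raw events into per-customer usage counts."""
    return Counter(event["user"] for event in log)


@track("/v1/quotes")
def get_quote(user_id, symbol):
    # A pretend API handler; real handlers would do real work here
    return {"symbol": symbol, "price": 101.5}
```

A customer whose count in `calls_per_user` trends toward zero week over week is exactly the churn signal the product-management side would want surfaced.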
The product was released last year, and Gilling says the company already has 2,000 customers across the free and paid tiers. He said they have had particular success with SaaS and fintech companies, both of which make heavy use of APIs. Customers include PowerSchool, Schwab and InsideSales.
The company currently consists of just the three founders, but flush with the seed investment, it intends to hire around 10 people in the next six months, including a VP of engineering, additional developers, and sales and marketing staff.
Anirudh Pandit is head of Solution Engineering, NA-West, office of the CTO, MuleSoft.
Technology has been the cornerstone of economic growth around the world for hundreds of years. It has underpinned the last three industrial revolutions and is now the driving factor in today’s Fourth Industrial Revolution — marked by emerging technologies in a variety of fields.
Unsurprisingly, artificial intelligence is one of the key technologies driving this new revolution. As described in the 1950s by the father of modern computer science, Alan Turing, “What we want is a machine that can learn from experience.” His paper, “Computing Machinery and Intelligence,” is the earliest description of neural networks and how computer intelligence should be measured. While the concept of AI isn’t new, we’re only on the cusp of seeing AI drive real business value in the enterprise.
Businesses today are trying to augment and improve their customer, partner and employee experiences by leveraging AI. However, what many have yet to realize is that AI is only as good as the APIs that support it.
For example, we’re seeing the rise of conversational commerce, where consumers can interact with businesses and their services via digital voice assistants such as Alexa and Siri. Two very important things occur here. First, the voice assistant uses AI and machine learning technology — or algorithms that are trained using massive amounts of existing data — to understand voice commands. Second, the voice assistant acts on those commands by calling back-end services with APIs that do the actionable work. This can include getting product information from a database or placing an order with the order management system. APIs truly bring AI to life and, without them, the value of AI models cannot be unlocked for the enterprise.
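That two-step flow (understand the command, then call a back-end API) can be sketched in Python. The intent names, slot shapes and service callables below are hypothetical; a real assistant skill would receive parsed intents from the voice platform and call real HTTP APIs for the catalog and order management system.

```python
def handle_utterance(intent, slots, services):
    """Dispatch a parsed voice command to the appropriate back-end API.

    `services` maps capability names to callables, standing in for the real
    APIs (product catalog, order management) an assistant would call.
    """
    if intent == "GetProductInfo":
        # Step 2a: look up product details via the catalog API
        return services["catalog"](slots["product"])
    if intent == "PlaceOrder":
        # Step 2b: place the order via the order management API
        return services["orders"](slots["product"], slots.get("qty", 1))
    raise ValueError(f"unhandled intent: {intent}")
```

Step one (turning audio into `intent` and `slots`) is the machine-learning part; everything after that is plain API calls, which is the point the article is making.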
However, many businesses are adopting AI as a point solution to help customers with queries via a chatbot or with making recommendations via an AI and machine learning-based platform. These point solutions don’t have the ability to influence the entire customer journey. The customer journey in today’s digital world is complex, with interactions spanning many different applications, data sources and devices. It is very hard for businesses to unlock and integrate data across all the application silos in their enterprise (e.g. ERP, CRM, mainframes, databases) to create a 360-degree view of the customer.
So, how do businesses go about unlocking these information systems to make AI a reality? The answer is an API strategy. With the ability to securely share data across systems regardless of format or source, APIs become the nervous system of the enterprise. As a result of making appropriate API calls, applications that interact with AI models can now take actionable steps, based on the insights provided by the AI system — or the brain.
How APIs can bring AI to life
The key to building a successful AI-based platform is to invest in delivering consistent APIs that are easily discoverable and consumable by developers across the organization. Fortunately, with the emergence of API marketplaces, software developers don’t have to break a sweat to create everything from scratch. Instead, they can discover and reuse the work done by others internally and externally to accelerate development work.
Additionally, APIs help train the AI system by enabling access to the right information. APIs also provide the ability for AI systems to act across the entire customer journey by enabling a communication channel — the nervous system — with the broader application landscape. By calling appropriate APIs, developers can act on insights provided by the AI system. For example, Alexa or Siri cannot place an order for a customer directly in the back-end ERP system without a bridge. An API can serve as that bridge, as well as be reused for other application interactions to that ERP system down the road.
At their core, APIs are developed to play a specific role — unlocking data from legacy systems, composing data into processes or delivering an experience. By unlocking data that exists in siloed systems, businesses end up democratizing the availability of data across the enterprise. Developers can then choose information sources to train the AI models and connect the AI systems into the enterprise’s broader application network to take action.
Using AI to enhance the customer journey
As AI systems and APIs get leveraged together to build adaptive and actionable platforms, the customer journey changes dramatically. Consider this scenario: A bank offers a mobile app that targets customers looking to buy or sell a home. In the app, customers can simply point at the property they are interested in and immediately rich data comes together via APIs to provide historical information on property sales, nearby listings and market trends. Customers can then interact with an AI-powered digital assistant on the app to start the loan application process, including getting lender approval and mortgage rates. All the data captured from the mobile app can then feed the mortgage origination process to reduce errors and provide a fast and superior experience to the customer.
Businesses haven’t truly realized the full potential of AI systems at a strategic level, where they are building adaptive platforms that truly create differentiated value for their customers. Most organizations are leveraging AI to analyze large volumes of data and generate insights on customer engagement, though it’s not strategic enough. Strategic value can be realized when these AI systems are plugged into the enterprise’s wider application network to drive personalized, 1:1 customer journeys. With an API strategy in place, businesses can start to realize the full potential AI has to offer.
When Salesforce bought Mulesoft last spring for the tidy sum of $6.5 billion, it looked like money well spent for the CRM giant. After all, it was providing a bridge between the cloud and the on-prem data center and that was a huge missing link for a company with big ambitions like Salesforce. When you […]
When Salesforce bought Mulesoft last spring for the tidy sum of $6.5 billion, it looked like money well spent for the CRM giant. After all, it was providing a bridge between the cloud and the on-prem data center and that was a huge missing link for a company with big ambitions like Salesforce.
When you want to rule the enterprise, you can’t be limited by where data lives and you need to be able to share information across disparate systems. Partly that’s a simple story of enterprise integration, but on another level it’s purely about data. Salesforce introduced its intelligence layer, dubbed Einstein, at Dreamforce in 2016.
With Mulesoft in the fold, it’s got access to data across systems wherever it lives, in the cloud or on-prem. Data is the fuel of artificial intelligence, and Salesforce has been trying desperately to get more data for Einstein since its inception.
It lost out on LinkedIn to Microsoft, which flexed its financial muscle and reeled in the business social network for $26.5 billion a couple of years ago; it was undoubtedly the kind of rich data source Salesforce longed for. Next, it set its sights on Twitter, but after board and stockholder concerns, the company walked away (and Twitter was ultimately never sold, of course).
Each of these forays was all about the data, and, frustrated, Salesforce went back to the drawing board. While Mulesoft did not supply the direct cache of data that a social network would have, it did provide a neat way to get at back-end data sources, the very type of data that matters most to its enterprise customers.
Today, Salesforce has extended that notion beyond pure data access to a graph. You can probably see where this is going. The idea of a graph (the connections between, say, a buyer and the things they tend to buy, or a person on a social network and the people they interact with) can be extended even to the network/API level, and that is precisely the story Salesforce is trying to tell this week at the Dreamforce customer conference in San Francisco.
Visualizing connections in a data integration network in Mulesoft. Screenshot: Salesforce/Mulesoft
Maureen Fleming, program vice president for integration and process automation research at IDC says that it is imperative that organizations view data as a strategic asset and act accordingly. “Very few companies are getting all the value from their data as they should be, as it is locked up in various applications and systems that aren’t designed to talk to each other. Companies who are truly digitally capable will be able to connect these disparate data sources, pull critical business-level data from these connections, and make informed business decisions in a way that delivers competitive advantage,” Fleming explained in a statement.
Configuring data connections on Mulesoft Anypoint Platform. Gif: Salesforce/Mulesoft
It’s hard to overstate how valuable this type of data is to Salesforce, which has already put Mulesoft to work internally to help build the new Customer 360 product announced today. It can point to how it’s providing the very type of data integration Fleming is referring to on its own product set.
Bret Taylor, president and chief product officer at Salesforce, says that for his company all of this is ultimately about enhancing the customer experience. You need to be able to stitch together these different computing environments and data silos to make that happen.
“In the short term, [customer] infrastructure is often fragmented. They often have some legacy applications on premise, they’ll have some cloud applications like Salesforce, but some infrastructure on Amazon or Google and Azure, and to actually transform the customer experience, they need to bring all this data together. And so it’s really a unique time for integration technologies like Mulesoft, because it enables you to create a seamless customer experience, no matter where that data lives, and that means you don’t need to wait for infrastructure to be perfect before you can transform your customer experience.”
Twilio, a company best known for supplying communications APIs to developers, has a product called Twilio Flex for building sophisticated customer service applications on top of those APIs. Today, it announced it was acquiring Ytica (pronounced Why-tica) to provide an operational and analytical layer on top of the customer service solution.
The companies would not discuss the purchase price, but Twilio indicated it does not expect the acquisition to have a material impact on its “results, operations or financial condition.” In other words, it probably didn’t cost much.
Ytica, which is based in Prague, has actually been a partner with Twilio for some time, so coming together in this fashion really made a lot of sense, especially as Twilio has been developing Flex.
Twilio Flex is an app platform for contact centers, which offers a full stack of applications and allows users to deliver customer support over multiple channels, Al Cook, general manager of Twilio Flex explained. “Flex deploys like SaaS, but because it’s built on top of APIs, you can reach in and change how Flex works,” he said. That is very appealing, especially for larger operations looking for a flexible, cloud-based solution without the baggage of on-prem legacy products.
What the product was lacking, however, was a native way to manage customer service representatives from within the application, and understand through analytics and dashboards, how well or poorly the team was doing. Having that ability to measure the effectiveness of the team becomes even more critical the larger the group becomes, and Cook indicated some Flex users are managing enormous groups with 10,000-20,000 employees.
Ytica provides a way to measure the performance of customer service staff, allowing management to monitor and intervene and coach when necessary. “It made so much sense to join together as one team. They have huge experience in the contact center, and a similar philosophy to build something customizable and programmable in the cloud,” Cook said.
While Ytica works with other vendors beyond Twilio, CEO Simon Vostrý says that they will continue to support those customers, even as they join the Twilio family. “We can run Flex and can continue to run this separately. We have customers running on other SaaS platforms, and we will continue to support them,” he said.
The company will remain in Prague and become a Twilio satellite office. All 14 employees are expected to join the Twilio team and Cook says plans are already in the works to expand the Prague team.
Facebook users are complaining the company has removed the cross-posted tweets they had published to their profiles as Facebook updates. The posts’ removal took place following the recent API change that prevented Twitter users from continuing to automatically publish their tweets to Facebook. According to the affected parties, both the Facebook posts themselves, as well as the conversation around those posts that had taken place directly on Facebook, are now gone. Reached for comment, Facebook says it’s aware of the issue and is looking into it.
TechCrunch was alerted to the problem by a reader who couldn’t find any information about the issue in Facebook’s Help Center. We’ve since confirmed the issue ourselves with several affected parties and confirmed it with Facebook.
Given the real-time nature of social media — and how difficult it is to pull up old posts — it’s possible that many of the impacted Facebook users have yet to realize their old posts have been removed.
In fact, we only found a handful of public complaints about the deletions, so far.
@facebook I used the Twitter for Facebook app for years, and I realize it's not working and isn't going to. But I just discovered all the Facebook updates it put have been deleted and dissappeared from my timeline! Is there a way to retrieve this?
A recent update to the Facebook Platform Policies ended the ability to automatically post Tweets to our Facebook profile or page and all of our previous Twitter posts were deleted by Facebook. #dfwwx#txwx#planohttps://t.co/sAOsbdBjVO
Above: selected complaints from Twitter about the data loss
Above: a comment on TechCrunch following our post on the API changes
Some of those who were impacted were very light Facebook users and had heavily relied on the cross-posting to keep their Facebook accounts active. As a result of the mass removals, their Facebook profiles are now fairly empty.
TechCrunch editor Matthew Panzarino is one of those who was impacted. He points out that the ability to share tweets to Facebook was a useful way to reach people who weren’t on Twitter in order to continue a discussion with a different audience.
“I’ve had tweet cross-posting turned on for years, from the early days of it even existing. This just removed thousands of posts from my Facebook silently, with no warning,” Matthew told me. “Even though the posts didn’t originate on Facebook, I often had ongoing conversations about the posts once my Facebook friends (and audience) saw them. Many of them would never see them on Twitter either because they don’t follow me or they don’t use it,” he said.
“It’s wild to have all of that context just vanish,” he added.
Facebook has been trying to plug the holes in its platform to prevent further data misuse. One of the changes it made was to stop third parties from being able to post to Facebook as the logged-in user.
For existing apps, like Twitter, that permission was revoked on August 1, 2018.
Above: Twitter’s cross-posting feature, on the day it was disabled by the Facebook API change
Before the API changes, Twitter users were able to visit the “Apps” section from Twitter on the web, then authenticate with Facebook to have their tweets cross-posted to Facebook’s social network. Once enabled, the tweets would appear on the user’s page as a Facebook post they had published, and their friends could then like and comment on the post as any other.
In theory, the API changes should only have prevented Twitter users from continuing to cross-post their tweets to Facebook automatically. It shouldn’t have also deleted the existing posts from Facebook users’ profiles and business users’ Facebook Pages.
This is a breach of trust from a company that’s in the process of trying to repair broken trust with its users on a number of fronts, including data misuse. Regardless of whatever new policy is in effect around apps and how they can post to Facebook, no one would have expected Facebook to actually remove their old posts without warning.
We’re hoping that the problem is a bug that Facebook can resolve, and not something that will result in permanent data loss.
Facebook tells us while it doesn’t have further information about the problem at this time, it should have more to share tonight or tomorrow about what’s being done.
Nylas, a startup that helps developers integrate email content into applications via an API, announced a $16 million Series B today led by Spark Capital.
Other investors joining in included Slack Fund, Industry Ventures, and ScaleUp along with existing investors 8VC, Great Oaks Capital, Rubicon Capital and John Chambers’ personal fund. Today’s investment brings the total raised to $30 million.
The Nylas API works in a similar way to Stripe or Twilio, but instead of helping developers connect to payments or communications with a couple of lines of code, Nylas helps them connect to email, calendar and contact information. The idea behind any API like this is to give developers who lack expertise in a particular area outside the core purpose of their application easy access to that type of functionality.
Company CEO Gleb Polyakov says that prior to Nylas, there really wasn’t an effective way to connect to email systems without a lot of technical wrangling. “Every person who is using the Internet has an email address, and there’s an immense amount of data that lives in the mailbox, in the calendar, in your address book. And up until now, companies have been unable to effectively use that data,” he told TechCrunch.
It seems like a must-have kind of ability to connect to this type of information from any application, but most companies have shied away from a comprehensive approach because it’s hard to do, says company co-founder and CTO Andrea Spanger.
“We have essentially built adapters for the native protocols for each email system: Gmail, Microsoft Exchange, open source IMAP servers and all the different extensions that are available on the different IMAP implementations. And the key part is that with these adapters, we can talk to backend providers like Google, GoDaddy and Yahoo,” Spanger explained.
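The adapter idea can be sketched with Python’s standard-library email parsing: a thin class that speaks one protocol (here, the IMAP select/search/fetch sequence) and normalizes the result. The connection object is injected, so the same adapter could front an `imaplib.IMAP4_SSL` session against Gmail or Yahoo, or a test double. This is an illustrative sketch, not Nylas’s implementation.

```python
import email


class ImapAdapter:
    """Normalize one mail protocol (IMAP) behind a simple interface.

    `conn` is any object with the imaplib-style select/search/fetch methods,
    e.g. a logged-in imaplib.IMAP4_SSL session or a test double.
    """

    def __init__(self, conn):
        self.conn = conn

    def subjects(self, mailbox="INBOX"):
        """Return the Subject line of every message in `mailbox`."""
        self.conn.select(mailbox)
        # SEARCH returns message sequence numbers as a space-separated blob
        _, data = self.conn.search(None, "ALL")
        subjects = []
        for num in data[0].split():
            # FETCH returns the raw RFC 822 message bytes
            _, parts = self.conn.fetch(num, "(RFC822)")
            msg = email.message_from_bytes(parts[0][1])
            subjects.append(msg["Subject"])
        return subjects
```

Everything above the protocol layer sees one uniform `subjects()` call; supporting another provider means writing another adapter, not touching the application.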
This capability could be useful for developers in lots of scenarios such as pulling data for a CRM tool from an email exchange between a salesperson and a customer, or to coordinate meetings around the calendars of several individuals and an open meeting room that works for all of their schedules.
The company, which has been around for five years, currently has 35 employees with offices in New York and San Francisco. With the new funding, it expects to double that number by the end of the year as it adds engineers and builds out its sales and marketing team. While much of the marketing up to now has been inbound from developers, the company wants to expand its customer base by marketing directly to companies.
It currently counts 200 customers and thousands of developers using the product. Customers include Comcast, Hyundai, News Corp, Salesloft and Dialpad.
Twitter tried to downplay the impact deactivating its legacy APIs would have on its community and the third-party Twitter clients preferred by many power users, by saying that “less than 1%” of Twitter developers were using these old APIs. Twitter is correct in its characterization of the size of this developer base, but it’s overlooking millions of third-party app users in the process. According to data from Sensor Tower, 6 million App Store and Google Play users installed the top five third-party Twitter clients between January 2014 and July 2018.
Over the past year, these top third-party apps were downloaded 500,000 times.
This data is largely free of reinstalls, the firm also said.
The top third-party Twitter apps users installed over the past three and a half years have included: Twitterrific, Echofon, TweetCaster, Tweetbot, and Ubersocial.
Of course, some portion of those users may have since switched to Twitter’s native app for iOS or Android, or they may run both a third-party app and Twitter’s own app in parallel.
Even if only some of these six million users remain, they represent a small, vocal, and – in some cases, prominent – user base. It’s one that is very upset right now, too. And for a company that just reported a loss of 1 million users in its latest earnings, it seems odd that Twitter would not figure out a way to accommodate this crowd, or even bring them onboard its new API platform to make money from them.
Twitter, apparently, weighed data and facts, not user sentiment and public perception, when it made this decision. But some things have more value than numbers on a spreadsheet. They are part of a company’s history and culture. Of course, Twitter has every right to blow all that up and move on, but that doesn’t make it the right decision.
To be fair, Twitter is not lying when it says this is a small group. The third-party user base is tiny compared with Twitter’s native app user base. During the same time that 6 million people were downloading third-party apps, the official Twitter app was installed a whopping 560 million times across iOS and Android. That puts the third-party apps’ share of installs at about 1.1% of the total.
That user base may have been shrinking over the years, too. During the past year, while the top third-party apps were installed half a million times, Twitter’s app was installed 117 million times. This made third-party apps’ share only about 0.4% of downloads, giving the official app a 99% market share.
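Those shares follow directly from the install counts; a quick back-of-the-envelope check, treating each period’s third-party plus official installs as the whole market:

```python
# Install counts cited above (Sensor Tower): Jan 2014 - Jul 2018, and the
# trailing year, for third-party clients vs. the official Twitter app.
third_party_4yr, official_4yr = 6_000_000, 560_000_000
third_party_1yr, official_1yr = 500_000, 117_000_000

share_4yr = third_party_4yr / (third_party_4yr + official_4yr)
share_1yr = third_party_1yr / (third_party_1yr + official_1yr)

print(f"2014-2018: {share_4yr:.1%}, past year: {share_1yr:.1%}")
```

That works out to roughly 1.1% over the longer window and roughly 0.4% over the past year, matching the figures above.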
But third-party app developers and the apps’ users are power users. Zealots, even. Evangelists.
Twitter itself credited them with pioneering “product features we all know and love” like the mute option, pull-to-refresh, and more. That means the apps’ continued existence brings more value to Twitter’s service than numbers alone can show.
They are part of Twitter’s history. You can even credit one of the apps for Twitter’s logo! Initially, Twitter only had a typeset version of its name. Then Twitterrific came along and introduced a bird for its logo. Twitter soon followed.
These third-party apps also play a role in retaining users who struggle with the new user experience Twitter has adopted – its algorithmic timeline. Instead, the apps offer a chronological view of tweets, as some continue to prefer.
Twitter’s decision to cripple these developers’ apps is shameful.
It shows a lack of respect for Twitter’s history, its power user base, its culture of innovation, and its very own nature as a platform, not a destination.