Lunewave is pitching a new sensor offering better vision for autonomous vehicles

The investment arms of BMW and the Chinese search technology giant, Baidu, along with a large original equipment manufacturer for the auto industry and a slew of technology investors have all come together to back Lunewave, a startup developing new sensor technologies for autonomous vehicles.

The $5 million seed round which the company just closed will serve as a launching pad to get its novel radar technology, based on the concept of a Luneburg antenna, to market.

First developed in the 1940s, the Luneburg antenna gets a new spin from Lunewave, which leverages 3D printing to create new architectures that enable more powerful antennas with greater range and accuracy than the sensing technologies currently on the market, according to the company’s chief executive, John Xin.

Lunewave was co-founded by brothers John and Hao Xin and is based on research that Hao had been conducting as a professor at the University of Arizona. Hao previously spent years working in the defense community for companies like Raytheon and Rockwell Scientific after graduating with a doctorate from the Massachusetts Institute of Technology in 2000.

Younger brother John took a more entrepreneurial approach, working in consulting and financial services for companies like PricewaterhouseCoopers and Liberty Mutual.

Lunewave represents the culmination of nine years of research the elder Xin spent at the University of Arizona applying 3D printing to boost the power of the Luneburg antenna. With so much intellectual firepower behind it, Hao was able to convince his younger brother to join him on the entrepreneurial journey.

“He has a strong desire to commercialize his inventions,” John Xin said of his older brother. “He wants to see it in everyday life.”


Now the company has $5 million in new funding to bring to market the technology that Hao Xin has dedicated so much time and effort to developing.

“With a single 3D printer in the laboratory version we can produce 100 per day,” John Xin told me. “With an industrial printer you can print 1000 per day.”

The first market for the company’s new technology will be autonomous vehicles — and more specifically autonomous cars.

Lunewave is focused on the eyes of the vehicle, says John Xin. Currently, autonomous technologies rely on a few different sensing systems. There are LIDAR technologies which use lasers to illuminate a target and measure the reflected pulses with a sensor; camera technologies which rely on — well — camera technologies; and radar which uses electromagnetic waves to detect objects.
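Both LIDAR and radar work on the same time-of-flight principle: emit a pulse, time the echo and halve the round trip at the speed of propagation. A minimal sketch of that distance calculation (not from the article, just the underlying physics):

```python
# Time-of-flight ranging: a pulse travels to the target and back,
# so the one-way distance is half the round trip at the speed of light.
SPEED_OF_LIGHT = 299_792_458.0  # meters per second

def range_from_time_of_flight(round_trip_seconds: float) -> float:
    """Return the distance to a target given the echo's round-trip time."""
    return SPEED_OF_LIGHT * round_trip_seconds / 2.0

# A pulse that echoes back after roughly 667 nanoseconds hit a target
# about 100 meters away.
print(round(range_from_time_of_flight(667e-9)))  # ~100
```

The wavelength of the pulse is what differs between the two modalities, which is why (as Xin argues below) radar degrades less in rain and fog than the much shorter-wavelength lasers LIDAR relies on.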

Startups developing and refining these technologies have raised hundreds of millions of dollars to tackle the autonomous vehicle market. In June, the camera sensing technology developer Light raised over $120 million from SoftBank. Meanwhile, LIDAR technology developers like Quanergy and LeddarTech have raised $134 million and $117 million, respectively, and some studies have claimed that the market for LIDAR technologies was already worth $5.2 billion last year alone.

Most companies working with autonomous cars these days use some combination of these technologies, but the existing products on the market have significant limitations, according to Lunewave’s chief executive.

John Xin argues that the Lunewave technology can detect more objects in a wider field of view and at greater distances than existing products thanks to the unique properties of the Luneburg antenna.

Think of the antenna as a giant golf ball with a 360-degree field of “view” that can detect objects at greater distances than existing LIDAR technologies, which are constrained by the range limits of laser-based sensing.

Xin with a Lunewave prototype

“LIDAR right now, at the end of the day, because of its short wavelength, does not function as well in poor weather conditions. Penetration of shorter wavelengths would be very difficult in poor weather conditions,” Xin said. “Our radar technology has the ability to function across all weather conditions. The hardware architecture of our Luneburg antenna has the best distance, and the spherical nature of the device has the 360-degree detection capacity.”

The company came out with its minimum viable product in 2017 — the same year that it launched. It was one of the early companies in the UrbanX accelerator — a collaboration between Mini and Urban.us — and is part of BMW’s startup garage program.

The company raised its $5 million in two parts. Its seed financing was a $3.75 million equity round led by the automotive investment specialist McCombs Fraser, with participation from Ekistic Ventures, Urban.us, Plug and Play, Shanda Capital, Lighthouse Ventures, Baidu Ventures and BMW i Ventures. The remaining $1.25 million came in the form of a non-dilutive government grant through the National Science Foundation. “In late 2016 that’s what helped us to jumpstart the company,” said Xin.

Now, the company just needs to fulfill Hao Xin’s dream of taking the product to market.

“We have the product,” John Xin said. “It’s not just taking in money. Now it’s about [proof of concepts] and pilots.”

Mercedes-Benz’s vision for autonomy is flexible and fugly

Mercedes-Benz shared on Monday its vision for how people and packages will someday move in dense urban environments. It’s called Vision Urbanetic—an all-electric autonomous concept vehicle that can change from a toaster-looking cargo van to a dung beetle-esque (or is it a bike helmet?) people mover.

The Vision Urbanetic joins a growing list of fugly autonomous vehicle concepts to debut in the past two years. But that’s not really the point here.

Moving past the hot takes on its looks, the Urbanetic shows where Mercedes and other automakers are headed. This is a concept, not plans for a production vehicle, after all.

Mercedes-Benz Vision Urbanetic.

Mercedes’s vision of a powertrain platform that can house several different vehicle bodies is not unique. Automakers are increasingly moving toward universal powertrain platforms for some of their production vehicles to improve manufacturing efficiencies and reduce costs.

The difference here is that the vehicle bodies could be changed on the fly by a team of workers back at a mobility hub, as depicted in the video below.

The system is based on an autonomous driving platform onto which the respective bodies (people mover or cargo) are fixed. The underlying platform incorporates all the driving functions, which means the autonomous chassis could make its way to its next job location without a body attached, the company said.

The people-mover body type has space for up to 12 passengers, while the cargo module has a storage volume of 353 cubic feet, can be divided into two levels and can transport up to 10 pallets.

The idea presents new logistics and infrastructure challenges that any company with plans to deploy a commercial autonomous vehicle ride-hailing fleet will also face. If this vision were ever to become reality, Mercedes would need hubs located near urban centers, where the Urbanetic vehicles would be housed, maintained and charged. This is also where the body type would be swapped out, depending on needs at that time.

Mercedes seems to have thought through some of this. The vehicle bodies could be swapped out automatically or manually, and a swap would take only a few minutes, Mercedes said. It also outlined a dynamic communications system that would be able to capture and process data in real time to determine what kinds of vehicles are needed, and where. For instance, it could identify a crowd of people gathered in a certain area or capture local information that a concert would soon be over and then deploy more ride-hailing vehicles to that location.

Mercedes said the vehicles could also be used in restricted areas such as a factory site or an airport.

AutoX is using its self-driving vehicles to deliver groceries

Autonomous vehicle startup AutoX has launched a grocery delivery and mobile store pilot in a partnership with GrubMarket.com and local high-end grocery store DeMartini Orchard.

The pilot will initially be limited to an area of about 400 homes in north San Jose. The company, which employs nearly 90 people, has just two autonomous vehicles that will be used for the initial launch. Eventually, AutoX aims to expand the pilot west to Mountain View and Palo Alto, with more delivery partners joining soon.

Once customers in the prescribed area have downloaded the app, they can place an order. For now, orders must be placed the day before delivery. Alternatively, when the AutoX car arrives, its window rolls down to reveal AutoX’s selections, from which customers can choose.

The idea is to offer two shopping experiences with self-driving cars, AutoX COO Jewel Li explained in a statement. Customers can order goods from an app and get them delivered by a self-driving vehicle. Or the self-driving vehicle can bring a shelf of goods that customers can pick and choose from right outside their house.

Unlike many other startups racing to deploy autonomous vehicles, AutoX is focused on delivering things, not people.

“We don’t think it makes sense for people to drive around these two-ton vehicles to go pick up an apple,” AutoX director of business and operations Hugo Fozzati told TechCrunch. “These errands are creating congestion and a ton of pollution. We want to focus on something that’s going to have a lot of impact.”

The company, which launched in September 2016, has raised $43 million from strategic and financial investors. AutoX is based in San Jose and also has offices in China.

Of course, AutoX is hardly the only autonomous vehicle delivery company to emerge in the past two years. Starship Technologies, Mountain View, Calif.-based Nuro, Robomart and Chinese retail powerhouse Alibaba are just a few that have unveiled their own visions for autonomous delivery.

The pilot is the first step in AutoX founder and CEO Jianxiong Xiao’s mission to open up autonomous vehicles to everyone. It’s a goal the company contends can be reached using economical (and better) hardware. The company does use light detection and ranging sensors, known as LiDAR. But instead of loading up its self-driving vehicles with numerous expensive LiDAR units, AutoX relies more on cameras, which it argues have better resolution. The company’s proprietary AI algorithms tie everything together.

“It’s the first step of our mission to democratize autonomy, also a testament to our cutting edge AI and all its potential capabilities,” Xiao said about the pilot program. “We believe self-driving car technologies will fundamentally change people’s daily lives for the better.”

Carmera, the mapping startup for autonomous vehicles, raises $20 million

Autonomous vehicles need more than a brain to operate safely in a world filled with obstacles. They need maps. Or more specifically, self-driving vehicles need maps that constantly refresh and can deliver important information — like that sudden lane closure due to construction or a double-parked vehicle — so they can take the safest and most efficient route possible.

This specific need has provided an opening for startups in what once looked like a locked-up mapping market dominated by a few giants.

Carmera, a New York-based mapping and data analytics startup, is one of them. The company, which came out of stealth two years ago, has now raised $20 million in a Series B funding round led by GV, formerly known as Google Ventures. Carmera previously raised $6.5 million.

The company announced the funding raise Thursday along with a few other updates, including a new feature on its autonomous mapping product and a partnership with New York City. The capital will be used to hire more talent and expand.

“We’ll be doing the most aggressive hiring we’ve ever done this next year,” Carmera co-founder and CEO Ro Gupta told TechCrunch, adding that the company will mostly focus on building out its New York and Seattle offices. Carmera, which has about 25 employees, plans to have more than 50 by the end of next year.

“The money also allows us to be more prospective than simply reacting to customer needs,” Gupta added.

In other words, Carmera can move into new markets where it suspects there will be a need in the future, not just wait for a call from its customers. One of those customers is Voyage, the autonomous driving startup that currently operates self-driving cars in retirement communities.

Carmera has an interesting business model, and one that’s likely attractive to investors looking for startups with a present-day revenue stream. The company describes itself as a street intelligence platform for autonomy. Its main product is the Carmera autonomous map, a high-definition map for autonomous vehicle customers like automakers, suppliers and robotaxis.

The twist here is that the company uses data gleaned from its other product — a fleet-monitoring service used by commercial customers with vehicles driven by humans — to keep those AV maps fresh. The fleet product is a telematics and video monitoring service used by professional fleets that want to manage risk with their vehicles and drivers.

These fleets of camera-equipped human-driven vehicles deliver new information to the autonomous map as they go about their daily business in cities. Carmera calls this a “pro-sourcing” swarm.

The startup has now added a real-time events and change-management engine to its autonomous map, which Gupta contends is a major leap forward because it not only provides more detailed information to self-driving vehicles but also gives those driverless vehicles a suggested path.

In some mapping products, there’s generally a base map and then a dynamic overlay. The problem, Gupta explains, is that when things change, like a lane closure, the dynamic map only flags it, leaving it up to the vehicle to figure out what to do next.

“That works fine when humans are driving, it just doesn’t go far enough for AVs,” Gupta said. “What they need to know is how do I path plan around it?”

Carmera’s real-time events and change-management feature

The map will detect a change in milliseconds, classify it within seconds and then validate and redraw the base map within minutes, according to Carmera. Carmera is giving companies that deploy autonomous vehicles API access to this data at every stage.
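Carmera hasn’t published the API itself, so the shape below is purely illustrative: a sketch of how a vehicle’s planner might react to a change event at each of the three stages described above (detected, classified, validated). All type, field and function names are assumptions, not Carmera’s real interface:

```python
from dataclasses import dataclass
from enum import Enum
from typing import List, Optional, Tuple

class ChangeStage(Enum):
    DETECTED = "detected"      # milliseconds: something changed, details unknown
    CLASSIFIED = "classified"  # seconds: e.g. tagged as a lane closure
    VALIDATED = "validated"    # minutes: base map redrawn, path suggested

@dataclass
class MapChangeEvent:
    event_id: str
    stage: ChangeStage
    change_type: Optional[str]                        # unknown until classified
    suggested_path: Optional[List[Tuple[float, float]]]  # waypoints once validated

def plan_around(event: MapChangeEvent) -> str:
    """Decide what an AV planner might do at each stage of a change event."""
    if event.stage is ChangeStage.DETECTED:
        return "slow down: unverified change ahead"
    if event.stage is ChangeStage.CLASSIFIED:
        return f"prepare reroute: {event.change_type}"
    return "follow suggested path from updated base map"
```

The point of the staged design is that a vehicle gets something actionable at every latency tier, rather than waiting minutes for a fully validated redraw before reacting at all.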

Carmera also has a “site intelligence product,” a jargon term that means the company provides spatial data and street analytics (like how pedestrians move within a particular intersection) to urban planners.

Carmera announced Thursday it will begin sharing data such as historical pedestrian analytics and real-time construction detection with New York City’s Department of Transportation. Carmera will get access to key city data sets in return. The partnership with NYC DOT follows an earlier data-sharing initiative with the Downtown Brooklyn Partnership.