Figure rides the humanoid robot hype wave to $2.6B valuation and OpenAI/Microsoft partnerships

Today Figure confirmed long-standing rumors that it’s raising more money than God. The Bay Area–based robotics firm has announced a $675 million Series B at a $2.6 billion valuation. The lineup of investors is equally impressive, including Microsoft, OpenAI Startup Fund, Nvidia, Amazon Industrial Innovation Fund, Jeff Bezos (through Bezos Expeditions), Parkway Venture Capital, Intel Capital, Align Ventures and ARK Invest. It’s an absolutely mind-boggling sum of money for what remains a small startup, with an 80-person headcount. That last bit will almost certainly change with this round.

That’s not to say that Figure didn’t already have a lot to work with. Founder Brett Adcock notably bootstrapped the company to the tune of $100 million. Last May, it added $70 million in the form of a Series A. I used to think “Figure” was a reference to the robot’s humanoid design and perhaps an homage to a startup that’s figuring things out. Now it seems it’s a reference to the astronomical funding figure it’s raised thus far.

Figure is very much a product of its time. It’s a young company, having only launched in 2022. It did so, however, with the ambitious goal of creating a walking bipedal robot in a year’s time. The company told TechCrunch that it hit that target. We didn’t see video of the robot walking at the time, but footage has since surfaced.


Humanoid robots are having a moment. They’ve been showcased by Tesla (though I’d temper your expectations somewhat on that), Apptronik and 1X, among others. Amazon recently began a small pilot with Agility’s Digit robot, which seems to have found its groove supplementing human labor in brownfield warehouses and fulfillment centers.

Most — including Figure — are working toward that same goal. Upfront costs are just one reason it makes a lot more sense to focus on the workplace before the home. It’s also one of many reasons it’s important to properly calibrate your expectations of what a system like this can — and can’t — do. Some companies (namely Tesla again) have perhaps set unrealistic expectations about the current state of the art. I’m speaking primarily of generalized AI, which many roboticists believe is, say, five years out (though that could well prove optimistic).

“General purpose” gets tossed around a lot when discussing these robots. In essence, it refers to systems that can quickly pick up tasks the way humans do. Traditional robotics systems are single-purpose, meaning they do one thing really well, over and over. Multipurpose systems are certainly out there, and APIs like the kind Boston Dynamics provides for Spot will go a long way toward expanding that functionality.

The eventual goal of generalized AI is, in fact, a big driver for the humanoid form factor. Robots built for a single function are difficult to adapt, while, in theory, a robot built like us can do anything we can.

When I visited Figure’s HQ last year, the company had recently built a demo area in the center of the office. The space’s primary use was showcasing the robot for potential clients and investors. Tellingly, it was set up to resemble a warehouse or factory. Most people believe that warehouse work is the first step toward broader adoption and, perhaps eventually, a home robot. After all, corporations will happily invest a good chunk of money into a product they believe will save them money in the long run. Also, it’s much easier to fill a day’s work with one or two extremely repetitive tasks. Consumers will almost certainly demand something indistinguishable from generalization before paying the equivalent of a new car for one.

It’s worth noting that today’s news also finds Figure signing a partnership with generative AI pioneer OpenAI. The goal of the deal is to “develop next generation AI models for humanoid robots,” according to Figure. The near-term application for LLMs is creating more natural methods of communication between the robots and their human colleagues. The company notes, “The collaboration aims to help accelerate Figure’s commercial timeline by enhancing the capabilities of humanoid robots to process and reason from language.”

Natural language allows people to give the systems commands and gives humans a better understanding of what the robot is doing (hence the ability to “reason” in language). These are, after all, much more complex systems than a human-piloted forklift, for example. If they’re going to operate autonomously, you’re going to need a more direct method of communication — especially on a busy warehouse or factory floor. Language processing also allows humans to step in and correct the robot’s mistakes.
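To make that loop a bit more concrete, here is a minimal Python sketch of the idea described above: an instruction goes in as plain language, a model turns it into a structured action, and the model’s explanation is surfaced so a human colleague can follow (and correct) what the robot is doing. Everything here is hypothetical: the action schema, the function names and the llm_call hook are illustrative assumptions, not Figure’s or OpenAI’s actual interface.

```python
import json

# Hypothetical action schema: action name -> required argument names.
ACTION_SCHEMA = {
    "pick": ["object", "location"],
    "place": ["object", "destination"],
    "report": ["message"],
}

def interpret_command(command: str, llm_call) -> dict:
    """Ask a language model to turn a plain-language instruction into a
    structured action plus an explanation the robot can report back."""
    prompt = (
        "Convert this warehouse instruction into JSON with keys 'action', "
        f"'arguments', and 'explanation'. Allowed actions: {list(ACTION_SCHEMA)}.\n"
        f"Instruction: {command}"
    )
    # llm_call is any function that takes a prompt string and returns text;
    # a real integration would call a chat-completion endpoint here.
    return json.loads(llm_call(prompt))

def execute(plan: dict, robot) -> None:
    """Dispatch the structured action and surface the model's explanation
    so nearby humans can see why the robot is doing what it's doing."""
    print(f"Robot intends to: {plan['explanation']}")
    robot.run(plan["action"], **plan["arguments"])
```

The point of the sketch is the feedback path: because the model returns an explanation alongside the action, a person on the floor can spot a misinterpretation and issue a correction in plain language rather than reprogramming anything.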


“We’ve always planned to come back to robotics and we see a path with Figure to explore what humanoid robots can achieve when powered by highly capable multimodal models,” says OpenAI VP Peter Welinder. “We’re blown away by Figure’s progress to date and we look forward to working together to open up new possibilities for how robots can help in everyday life.”

Another thing that makes the deal interesting is OpenAI’s investment in direct competitor 1X. One wonders whether such a deal means OpenAI is rethinking its investments, or if the company is simply playing the field. My guess at the moment is the latter. If you’re in OpenAI’s position, you might as well work with as many promising companies as you can, and Figure has certainly demonstrated some real progress in the eight months since its robot took its first steps.

Take this video posted a little over a week ago. Figure says the robot’s operations are roughly 16.7% the speed of a human doing the same task. That is, it’s very slow and methodical — deliberate, even. That much is clear from the video. And it’s always good to see a robot operating at actual speed in a demo video, no matter how well produced it happens to be. People have told me in hushed tones that some folks try to pass off sped-up videos without disclosing as much. It’s the kind of thing that feeds into consumers’ already unrealistic expectations of what robots can do.

Microsoft’s investment finds the company utilizing Azure for storage, training and AI infrastructure. “We are excited to collaborate with Figure and work towards accelerating AI breakthroughs,” says Microsoft Corporate VP Jon Tinter. “Through our work together, Figure will have access to Microsoft’s AI infrastructure and services to support the deployment of humanoid robots to assist people with real world applications.”

Somewhat interestingly, Figure was not included in Bill Gates’ recent list of exciting robotics startups, though two other humanoid companies (Agility and Apptronik) were.

The Amazon Innovation Fund’s participation in this round is also particularly notable, as it can often serve as a pipeline to real-world deployment in fulfillment centers — take Agility as a key example.

The autonomous part is important as well, given the propensity to pass off tele-op as autonomy. One of the reasons autonomy is so difficult in cases like this is all the variations you can’t account for. While warehouses tend to be fairly structured environments, any number of things can occur in the real world that will knock a task off-kilter. And the less structured these tasks become, the larger the potential for error. A lot of questions remain, including how many takes it took to get this right. One thing this absolutely has going for it is the fact that the action is captured in one continuous shot, meaning the company didn’t cobble together a series of actions through creative editing.


Mechatronics is easier to judge in a short video than AI and autonomy, and from that perspective, the Figure 01 robot appears quite dexterous. In fact, if you look at the angle and positioning of the arms, you’ll notice that it’s performing the carry in a manner that would be quite uncomfortable for most people. It’s important to note that just because the robot looks like a person doesn’t mean that it has to behave exactly like one. My educated guess is that the positioning of the tote has to do with the robot’s center of gravity and perhaps the fact that it appears to be extremely top heavy.

Figure says the money will go toward accelerating its go-to-market. The company has already signed a deal with BMW for robotics deployment.
