Posts

Zoom to start first phase of E2E encryption rollout next week

Zoom will begin rolling out end-to-end encryption to users of its videoconferencing platform from next week, it said today.

The platform, whose fortunes have been supercharged by the pandemic-driven boom in remote working and socializing this year, has been working since April to reboot its battered reputation on security and privacy, after it was called out for misleading marketing claims of having E2E encryption (when it did not). E2E encryption is now finally on its way, though.

“We’re excited to announce that starting next week, Zoom’s end-to-end encryption (E2EE) offering will be available as a technical preview, which means we’re proactively soliciting feedback from users for the first 30 days,” it writes in a blog post. “Zoom users — free and paid — around the world can host up to 200 participants in an E2EE meeting on Zoom, providing increased privacy and security for your Zoom sessions.”

Zoom acquired Keybase in May, saying then that it was aiming to develop “the most broadly used enterprise end-to-end encryption offering”.

Initially, CEO Eric Yuan said this level of encryption would be reserved for fee-paying users only. But after facing a storm of criticism, the company enacted a swift U-turn, saying in June that all users would be provided with the highest level of security, regardless of whether they are paying to use its service or not.

Zoom confirmed today that Free/Basic users who want to get access to E2EE will need to participate in a one-time verification process — in which it will ask them to provide additional pieces of information, such as verifying a phone number via text message — saying it’s implementing this to try to reduce “mass creation of abusive accounts”.

“We are confident that by implementing risk-based authentication, in combination with our current mix of tools — including our work with human rights and children’s safety organizations and our users’ ability to lock down a meeting, report abuse, and a myriad of other features made available as part of our security icon — we can continue to enhance the safety of our users,” it writes.

Next week’s rollout of a technical preview is phase 1 of a four-stage process to bring E2E encryption to the platform.

This means there are some limitations — including on the features that are available in E2EE Zoom meetings (you won’t have access to join before host, cloud recording, streaming, live transcription, Breakout Rooms, polling, 1:1 private chat, and meeting reactions); and on the clients that can be used to join meetings (for phase 1 all E2EE meeting participants must join from the Zoom desktop client, mobile app, or Zoom Rooms). 

The next phase of the E2EE rollout — which will include “better identity management and E2EE SSO integration”, per Zoom’s blog — is “tentatively” slated for 2021.

From next week, customers wanting to check out the technical preview must enable E2EE meetings at the account level and opt-in to E2EE on a per-meeting basis.

All meeting participants must have the E2EE setting enabled in order to join an E2EE meeting. Hosts can enable E2EE at the account, group, and user level, and the setting can be locked at the account or group level, Zoom notes in an FAQ.

The AES 256-bit GCM encryption being used is the same as Zoom currently uses, but here it is combined with public-key cryptography, which means the keys are generated locally by the meeting host and then distributed to participants, rather than Zoom’s cloud performing the key-generation role.

“Zoom’s servers become oblivious relays and never see the encryption keys required to decrypt the meeting contents,” it explains of the E2EE implementation.
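The flow described here — a meeting key generated on the host’s machine and handed to participants via public-key cryptography, with the server relaying only ciphertext — can be sketched in a few lines. This is a toy illustration, not Zoom’s actual protocol: the Diffie-Hellman group below is far too small to be secure, and a real implementation would use vetted groups and an authenticated key exchange.

```python
import hashlib
import secrets

# Toy Diffie-Hellman group -- demo only; real systems use large, vetted groups.
P = 2**127 - 1  # a Mersenne prime, far too small for real security
G = 5

def keypair():
    """Generate a DH private/public key pair locally."""
    priv = secrets.randbelow(P - 2) + 2
    return priv, pow(G, priv, P)

def kdf(shared: int) -> bytes:
    """Derive a 32-byte key-wrapping key from a DH shared secret."""
    return hashlib.sha256(shared.to_bytes(16, "big")).digest()

def wrap_key(meeting_key: bytes, host_priv: int, participant_pub: int) -> bytes:
    """Host encrypts the meeting key to one participant's public key."""
    stream = kdf(pow(participant_pub, host_priv, P))
    return bytes(a ^ b for a, b in zip(meeting_key, stream))

def unwrap_key(wrapped: bytes, participant_priv: int, host_pub: int) -> bytes:
    """Participant recovers the meeting key with their own private key."""
    stream = kdf(pow(host_pub, participant_priv, P))
    return bytes(a ^ b for a, b in zip(wrapped, stream))

# The host generates the 256-bit meeting key locally...
host_priv, host_pub = keypair()
meeting_key = secrets.token_bytes(32)

# ...and the server only ever relays `wrapped`, which it cannot decrypt.
alice_priv, alice_pub = keypair()
wrapped = wrap_key(meeting_key, host_priv, alice_pub)
assert unwrap_key(wrapped, alice_priv, host_pub) == meeting_key
```

The point of the sketch is the trust boundary: only public keys and wrapped ciphertext ever transit the relay, so the server never holds material sufficient to decrypt the meeting.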

If you’re wondering how you can be sure you’ve joined an E2EE Zoom meeting, look for a dark padlock displayed atop the green shield icon in the upper left corner of the meeting screen. (Zoom’s standard GCM encryption shows a checkmark there.)

Meeting participants will also see the meeting leader’s security code — which they can use to verify the connection is secure. “The host can read this code out loud, and all participants can check that their clients display the same code,” Zoom notes.
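Deriving such a code is straightforward in principle: hash the meeting’s key material down to a short string that every client renders identically. A minimal sketch (not Zoom’s actual algorithm) using a SHA-256 digest:

```python
import hashlib

def safety_code(key_material: bytes, digits: int = 20) -> str:
    """Reduce shared key material to a short decimal code that can be
    read aloud by the host and compared by every participant."""
    digest = hashlib.sha256(key_material).digest()
    code = str(int.from_bytes(digest, "big") % 10**digits).zfill(digits)
    # Group into 5-digit chunks for easier reading aloud.
    return " ".join(code[i:i + 5] for i in range(0, digits, 5))

# Every client hashing the same material displays the same code.
print(safety_code(b"example shared key material"))
```

If a man-in-the-middle had substituted keys, the participants’ derived codes would differ from the host’s, which is what reading the code aloud is meant to catch.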

Armory nabs $40M Series C as commercial biz on top of open-source Spinnaker project takes off

As companies continue to shift more quickly to the cloud, pushed by the pandemic, startups like Armory that work in the cloud-native space are seeing an uptick in interest. Armory is a company built to be a commercial layer on top of the open-source continuous delivery project Spinnaker. Today, it announced a $40 million Series C.

B Capital led the round, with help from new investors Lead Edge Capital and Marc Benioff along with previous investors Insight Partners, Crosslink Capital, Bain Capital Ventures, Mango Capital, Y Combinator and Javelin Venture Partners. Today’s investment brings the total raised to more than $82 million.

“Spinnaker is an open-source project that came out of Netflix and Google, and it is a very sophisticated multi-cloud and software delivery platform,” company co-founder and CEO Daniel R. Odio told TechCrunch.

Odio points out that this project has the backing of industry leaders, including the three leading public cloud infrastructure vendors Amazon, Microsoft and Google, as well as other cloud players like Cloud Foundry and HashiCorp. “The fact that there is a lot of open-source community support for this project means that it is becoming the new standard for cloud-native software delivery,” he said.

In the days before the notion of continuous delivery, companies moved forward slowly, releasing large updates over months or years. As software moved to the cloud, this approach no longer made sense and companies began delivering updates more incrementally, adding features when they were ready. Adding a continuous delivery layer helped facilitate this move.

As Odio describes it, Armory extends the Spinnaker project to help implement complex use cases at large organizations, including around compliance and governance and security. It is also in the early stages of implementing a SaaS version of the solution, which should be available next year.

While he didn’t want to discuss customer numbers, he mentioned JPMorgan Chase and Autodesk as customers, along with less specific allusions to “a Fortune Five technology company, a Fortune 20 Bank, a Fortune 50 retailer and a Fortune 100 technology company.”

The company currently has 75 employees, but Odio says business has been booming and he plans to double the team in the next year. As he does, he says that he is deeply committed to diversity and inclusion.

“There’s actually a really big difference between diversity and inclusion, and there’s a great Vernā Myers quote that diversity is being asked to the party and inclusion is being asked to dance, and so it’s actually important for us not only to focus on diversity, but also focus on inclusion because that’s how we win. By having a heterogeneous company, we will outperform a homogeneous company,” he said.

While the company has moved to remote work during COVID, Odio says they intend to remain that way, even after the current crisis is over. “Now obviously COVID [has] been a real challenge for the world, including us. We’ve gone to a fully remote-first model, and we are going to stay remote-first even after COVID. And it’s really important for us to be taking care of our people, so there’s a lot of human empathy here,” he said.

But at the same time, he sees COVID opening up businesses to move to the cloud and that represents an opportunity for his business, one that he will focus on with new capital at his disposal. “In terms of the business opportunity, we exist to help power the transformation that these enterprises are undergoing right now, and there’s a lot of urgency for us to execute on our vision and mission because there is a lot of demand for this right now,” he said.

DroneDeploy teams with Boston Dynamics to deliver inside-outside view of job site

DroneDeploy, a cloud software company that uses drone footage to help industries like agriculture, oil and gas, and construction get a bird’s-eye view of a site and build a 3D picture, announced a new initiative today. The initiative, which it is calling a 360 Walkthrough, combines drone photos with cameras on the ground, or even ground robots from a company like Boston Dynamics.

Up until today’s announcement, DroneDeploy could use drone footage from any drone to get a picture of what a site looked like outside, uploading those photos and stitching them together into a 3D model that is accurate within an inch, according to DroneDeploy CEO Mike Winn.

Winn says that while there is great value in getting this type of view of the outside of a job site, customers were hungry for a total picture that included inside and out. The platform, which simply processes photos transmitted from drones, could be adapted fairly easily to accommodate photos coming from cameras on other devices.

“Our customers are also looking to get data from the interiors, and they’re looking for one digital twin, one digital reconstruction of their entire site to understand what’s going on to share across their company with the safety team and with executives that this is the status of the job site today,” Winn explained.

He adds that this is even more important during COVID when access to job sites has been limited, making it even more important to understand the state of the site on a regular basis.

“They want fewer people on those job sites, only the essential workers doing the work. So for anyone who needs information about the site, if they can get that information from a desktop or the 3D model or a kind of street view of the job site, it can really help in this COVID environment, but it also makes it much more efficient,” Winn said.

He said that while companies could combine this capability with fixed cameras on the inside of a site, they don’t give the kind of coverage a ground robot could, and the Boston Dynamics robot is capable of moving around a rough job site with debris scattered around.

DroneDeploy bird's eye view of job site showing path taken through the site.

Image Credits: DroneDeploy

While Winn sees the use of the Boston Dynamics robot as more of an end goal, he says that more likely for the immediate future you will have a human walking through the job site with a camera to capture the footage to complete the inside-outside picture for the DroneDeploy software.

“All customers already want to adopt robots to collect this data, and you can imagine a Boston Dynamics robot [doing this], but that’s the end state of course. Today we’re supporting the human walk-through as well, a person with a 360 camera walking through the job site, probably doing it once a week to document the status of the job sites,” he said.

DroneDeploy launched in 2013 and has raised more than $100 million, according to Winn. He reports his company has over 5,000 customers, with drone flight time increasing by 2.5x YoY this year as more companies adopt drones as a way to cope with COVID.

Twilio’s $3.2B Segment acquisition is about helping developers build data-fueled apps

The pandemic has forced businesses to change the way they interact with customers. Whether it’s how they deliver goods and services or how they communicate, there is one common denominator: everything is being pushed to become digitally driven much faster.

To some extent, that’s what drove Twilio to acquire Segment for $3.2 billion today. (We wrote about the deal over the weekend. Forbes broke the story last Friday night.) When you get down to it, the two companies fit together well, and expand the platform by giving Twilio customers access to valuable customer data. Chee Chew, Twilio’s chief product officer, says while it may feel like the company is pivoting in the direction of customer experience, they don’t necessarily see it that way.

“A lot of people have thought about us as a communications company, but we think of ourselves as a customer engagement company. We really think about how we help businesses communicate more effectively with their customers,” Chew told TechCrunch.

Laurie McCabe, co-founder and partner at SMB Group, sees the move related to the pandemic and the need companies have to serve customers in a more fully digital way. “More customers are realizing that delivering a great customer experience is key to survive through the pandemic, and thriving as the economy recovers — and are willing to spend to do this even in uncertain times,” McCabe said.

Certainly Chew recognized that Segment gives them something they were lacking by providing developers with direct access to customer data, and that could lead to some interesting applications.

“The data capabilities that Segment has are providing a full view of the customer. It really layers across everything we do. I think of it as a horizontal add across the channels and extending beyond. So I think it really helps us advance in a different sort of way […] towards getting the holistic view of the customer and enabling our customers to build intelligence services on top,” he said.

Brent Leary, founder and principal analyst at CRM Essentials, sees Segment helping to provide a powerful data-fueled developer experience. “This move allows Twilio to impact the data-insight-interaction-experience transformation process by removing friction from developers using their platform,” Leary explained. In other words, it gives developers that ability that Chew alluded to, to use data to build more varied applications using Twilio APIs.
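Leary’s point about data-fueled applications can be made concrete with a small sketch: use fields from a unified customer profile to pick a channel and personalize a message, which would then be handed off to a communications API for delivery. The profile shape and helper names here are invented for illustration; they are not Segment’s or Twilio’s actual APIs.

```python
def choose_channel(profile: dict) -> str:
    """Pick a contact channel based on what the unified profile contains."""
    if profile.get("sms_opt_in") and profile.get("phone"):
        return "sms"
    if profile.get("email"):
        return "email"
    return "none"

def compose_message(profile: dict) -> dict:
    """Build a channel-appropriate, personalized message from profile data.
    The result could then be passed to a delivery API."""
    channel = choose_channel(profile)
    return {
        "channel": channel,
        "to": profile.get("phone") if channel == "sms" else profile.get("email"),
        "body": f"Hi {profile.get('first_name', 'there')}, your order has shipped.",
    }

print(compose_message({"first_name": "Ada", "phone": "+15555550100", "sms_opt_in": True}))
```

The friction Leary describes disappears when the profile lookup and the delivery call live behind the same platform, so the developer writes only the logic in between.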

Paul Greenberg, author of CRM at the Speed of Light, and founder and principal analyst at 56 Group, agrees, saying, “Segment gives Twilio the ability to use customer data in what is already a powerful unified communications platform and hub. And since it is, in effect, APIs for both, the flexibility [for developers] is enormous,” he said.

That may be so, but Holger Mueller, an analyst at Constellation Research, says the company has to be seeing that the pure communication parts of the platform like SMS are becoming increasingly commoditized, and this deal, along with the SendGrid acquisition in 2018, gives Twilio a place to expand its platform into a much more lucrative data space.

“Twilio needs more growth path and it looks like its strategy is moving up the stack, at least with the acquisition of Segment. Data movement and data residence compliance is a huge headache for enterprises when they build their next generation applications,” Mueller said.

As Chew said, Twilio’s early focus was on building SMS messaging into applications, because that was the problem developers needed solved at the time. Moving forward, it wants to provide a more unified customer communications experience, and Segment should help advance that capability in a big way.

Twilio is buying customer data startup Segment for between $3B and $4B

Sources have told TechCrunch that Twilio intends to acquire customer data startup Segment for between $3 and $4 billion. Forbes broke the story on Friday night, reporting a price tag of $3.2 billion.

We have heard from a couple of industry sources that the deal is in the works and could be announced as early as Monday.

Twilio and Segment are both API companies. That means they create an easy way for developers to tap into a specific type of functionality without writing a lot of code. As I wrote in a 2017 article on Segment, it provides a set of APIs to pull together customer data from a variety of sources:

Segment has made a name for itself by providing a set of APIs that enable it to gather data about a customer from a variety of sources like your CRM tool, customer service application and website and pull that all together into a single view of the customer, something that is the goal of every company in the customer information business.
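The “single view of the customer” idea boils down to merging per-source records that share an identifier. A toy sketch of that merge, using invented field names rather than Segment’s actual data model:

```python
def unify(records):
    """Merge (source, record) pairs into one profile per email address.
    Earlier sources win on conflicting fields; every source is tracked."""
    profiles = {}
    for source, record in records:
        profile = profiles.setdefault(record["email"], {"sources": []})
        profile["sources"].append(source)
        for field, value in record.items():
            profile.setdefault(field, value)
    return profiles

# Records from a CRM, a customer service tool, and a website, keyed by email.
records = [
    ("crm", {"email": "ada@example.com", "name": "Ada Lovelace"}),
    ("support", {"email": "ada@example.com", "open_tickets": 2}),
    ("web", {"email": "ada@example.com", "last_visit": "2020-10-12"}),
]
print(unify(records)["ada@example.com"])
```

Real identity resolution is far harder (multiple identifiers, conflicting values, anonymous events), but the output shape — one profile aggregating every source — is the product Segment sells.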

While Twilio’s main focus since it launched in 2008 has been on making it easy to embed communications functionality into any app, it signaled a switch in direction when it released the Flex customer service API in March 2018. Later that same year, it bought SendGrid, an email marketing API company, for $2 billion.

Twilio’s market cap as of Friday was an impressive $45 billion. You could see how it can afford to flex its financial muscles to combine Twilio’s core API mission, especially Flex, with the ability to pull customer data with Segment and create customized email or ads with SendGrid.

This could enable Twilio to expand beyond pure core communications capabilities, at a combined cost of around $5 billion for the two companies. That could prove a good deal for what may turn out to be a substantial business, as more and more companies look for ways to understand and communicate with their customers in more relevant ways across multiple channels.

As Semil Shah from early stage VC firm Haystack wrote in the company blog yesterday, Segment saw a different way to gather customer data, and Twilio was wise to swoop in and buy it.

Segment’s belief was that a traditional CRM wasn’t robust enough for the enterprise to properly manage its pipe. Segment entered to provide customer data infrastructure to offer a more unified experience. Now under the Twilio umbrella, Segment can continue to build key integrations (like they have for Twilio data), which is being used globally inside Fortune 500 companies already.

Segment was founded in 2011 and raised over $283 million, according to Crunchbase data. Its most recent raise was $175 million in April on a $1.5 billion valuation.

Twilio stock closed at $306.24 per share on Friday, up 2.39%.

Segment declined to comment on this story. We also sent a request for comment to Twilio, but hadn’t heard back by the time we published. If that changes, we will update the story.

How Roblox completely transformed its tech stack

Picture yourself in the role of CIO at Roblox in 2017.

At that point, the gaming platform and publishing system that launched in 2005 was growing fast, but its underlying technology was aging, consisting of a single data center in Chicago and a bunch of third-party partners, including AWS, all running bare metal (nonvirtualized) servers. At a time when users have precious little patience for outages, your uptime was just two nines, or roughly 99% (five nines, or 99.999%, is considered optimal).
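The gap between those availability targets is easy to quantify:

```python
def downtime_hours_per_year(availability: float) -> float:
    """Expected downtime, in hours per year, at a given availability level."""
    return (1 - availability) * 365 * 24

print(downtime_hours_per_year(0.99))     # two nines: 87.6 hours a year
print(downtime_hours_per_year(0.99999))  # five nines: about 5 minutes a year
```

In other words, two nines allows more than three and a half days of outages a year, which is what made the overhaul urgent.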

Unbelievably, Roblox was popular in spite of this, but the company’s leadership knew it couldn’t continue with performance like that, especially as it was rapidly gaining in popularity. The company needed to call in the technology cavalry, which is essentially what it did when it hired Dan Williams in 2017.

Williams has a history of solving these kinds of intractable infrastructure issues, with a background that includes a gig at Facebook between 2007 and 2011, where he worked on the technology to help the young social network scale to millions of users. Later, he worked at Dropbox, where he helped build a new internal network, leading the company’s move away from AWS, a major undertaking involving moving more than 500 petabytes of data.

When Roblox approached him in mid-2017, he jumped at the chance to take on another major infrastructure challenge. While they are still in the midst of the transition to a new modern tech stack today, we sat down with Williams to learn how he put the company on the road to a cloud-native, microservices-focused system with its own network of worldwide edge data centers.

Scoping the problem

Grid AI raises $18.6M Series A to help AI researchers and engineers bring their models to production

Grid AI, a startup that aims to help machine learning engineers work more efficiently, founded by William Falcon, the inventor of the popular open-source PyTorch Lightning project, today announced that it has raised an $18.6 million Series A funding round, which closed earlier this summer. The round was led by Index Ventures, with participation from Bain Capital Ventures and firstminute.

Falcon co-founded the company with Luis Capelo, who was previously the head of machine learning at Glossier. Unsurprisingly, the idea here is to take PyTorch Lightning, which launched about a year ago, and turn that into the core of Grid’s service. The main idea behind Lightning is to decouple the data science from the engineering.

The team argues that a few years ago, when data scientists tried to get started with deep learning, they didn’t always have the right expertise, and it was hard for them to get everything right.

“Now the industry has an unhealthy aversion to deep learning because of this,” Falcon noted. “Lightning and Grid embed all those tricks into the workflow so you no longer need to be a PhD in AI nor [have] the resources of the major AI companies to get these things to work. This makes the opportunity cost of putting a simple model against a sophisticated neural network a few hours’ worth of effort instead of the months it used to take. When you use Lightning and Grid it’s hard to make mistakes. It’s like if you take a bad photo with your phone but we are the phone and make that photo look super professional AND teach you how to get there on your own.”

As Falcon noted, Grid is meant to help data scientists and other ML professionals “scale to match the workloads required for enterprise use cases.” Lightning itself can get them partially there, but Grid is meant to provide all of the services its users need to scale up their models to solve real-world problems.

What exactly that looks like isn’t quite clear yet, though. “Imagine you can find any GitHub repository out there. You get a local copy on your laptop and without making any code changes you spin up 400 GPUs on AWS — all from your laptop using either a web app or command-line-interface. That’s the Lightning “magic” applied to training and building models at scale,” Falcon said. “It is what we are already known for and has proven to be such a successful paradigm shift that all the other frameworks like Keras or TensorFlow, and companies have taken notice and have started to modify what they do to try to match what we do.”

The service is now in private beta.

With this new funding, Grid, which currently has 25 employees, plans to expand its team and strengthen its corporate offering via both Grid AI and the open-source project. Falcon tells me that he aims to build a diverse team, not least because he himself is an immigrant, born in Venezuela, and a U.S. military veteran.

“I have first-hand knowledge of the [impact] that unethical AI can have,” he said. “As a result, we have approached hiring our current 25 employees across many backgrounds and experiences. We might be the first AI company that is not all the same Silicon Valley prototype tech-bro.”

“Lightning’s open-source traction piqued my interest when I first learned about it a year ago,” Index Ventures’ Sarah Cannon told me. “So intrigued in fact I remember rushing into a closet in Helsinki while at a conference to have the privacy needed to hear exactly what Will and Luis had built. I promptly called my colleague Bryan Offutt who met Will and Luis in SF and was impressed by the ‘elegance’ of their code. We swiftly decided to participate in their seed round, days later. We feel very privileged to be part of Grid’s journey. After investing in seed, we spent a significant amount [of time] with the team, and the more time we spent with them the more conviction we developed. Less than a year later and pre-launch, we knew we wanted to lead their Series A.”

As IBM spins out legacy infrastructure management biz, CEO goes all in on the cloud

When IBM announced this morning that it was spinning out its legacy infrastructure services business, it was a clear signal that new CEO Arvind Krishna, who took the reins in April, was ready to fully commit his company to the cloud.

The move was a continuation of the strategy the company began to put in place when it bought Red Hat in 2018 for the princely sum of $34 billion. That purchase signaled a shift to a hybrid-cloud vision, where some of your infrastructure lives on-premises and some in the cloud — with Red Hat helping to manage it all.

Even as IBM moved deeper into the hybrid cloud strategy, Krishna saw the financial results like everyone else and recognized the need to focus more keenly on that approach. In its most recent earnings report overall IBM revenue was $18.1 billion, down 5.4% compared to the year-ago period. But if you broke out just IBM’s cloud and Red Hat revenue, you saw some more promising results: cloud revenue was up 30% to $6.3 billion, while Red Hat-derived revenue was up 17%.

Even more, cloud revenue for the trailing 12 months was $23.5 billion, up 20%.

You don’t need to be a financial genius to see where the company is headed. Krishna clearly saw that it was time to start moving on from the legacy side of IBM’s business, even if there would be some short-term pain involved in doing so. So the executive put his resources into (as they say) where the puck is going. Today’s news is a continuation of that effort.

The managed infrastructure services segment of IBM is a substantial business in its own right, generating $19 billion annually, according to the company, but Krishna was promoted to CEO to clean house, taking over from Ginni Rometty to make hard decisions like this.

While its cloud business is growing, Synergy Research data has IBM’s public cloud market share mired in the single digits, at perhaps 4% or 5%. In fact, Alibaba has passed it in market share, though both are small compared to the market leaders Amazon, Microsoft and Google.

Like Oracle, another legacy company trying to shift more to the cloud infrastructure business, IBM has a ways to go in its cloud evolution.

As with Oracle, IBM has been chasing the market leaders — Google at 9%, Microsoft 18% and AWS with 33% share of public cloud revenue (according to Synergy) — for years now without much change in its market share. What’s more, IBM competes directly with Microsoft and Google, which are also going after that hybrid cloud business with more success.

While IBM’s cloud revenue is growing, its market share needle is stuck and Krishna understands the need to focus. So, rather than continue to pour resources into the legacy side of IBM’s business, he has decided to spin out that part of the company, allowing more attention for the favored child, the hybrid cloud business.

It’s a sound strategy on paper, but it remains to be seen if it will have a material impact on IBM’s growth profile in the long run. He is betting that it will, but then what choice does he have?

IBM plans to spin off infrastructure services as a separate $19B business

IBM, a company that originally made its name building enterprise hardware (quite literally: its name is an abbreviation of International Business Machines), is taking one more step away from that legacy and deeper into the world of cloud services. The company today announced that it plans to spin off its managed infrastructure services unit as a separate public company, a business with $19 billion in annual revenue, to help it focus more squarely on newer opportunities in hybrid cloud applications and artificial intelligence.

Infrastructure services include a range of managed services based around legacy infrastructure and digital transformation related to it. It includes things like testing and assembly, but also product engineering and lab services, among other things. A spokesperson confirmed to me that the deal will not include the company’s servers business, only infrastructure services.

IBM said it expects to complete the process — a tax-free spin-off for shareholders — by the end of 2021. It has not yet given a name to “NewCo” but it said that out of the gate the spun-off company will have 90,000 employees, 4,600 big enterprise clients in 115 countries, a backlog of $60 billion in business “and more than twice the scale of its nearest competitor” in the area of infrastructure services.

Others that compete against it include the likes of BMC and Microsoft. The remaining IBM business is about three times as big: it currently generates some $59 billion in annual revenues.

At the same time that IBM announced the news, it also gave some updated guidance for Q3, which it plans to report officially later this month. It said it expects revenues of $17.6 billion, with GAAP diluted earnings per share from continuing operations of $1.89, and operating (non-GAAP) earnings per share of $2.58. As a point of comparison, in Q3 2019 it reported revenues of $18 billion. And last quarter IBM reported revenues of $18.1 billion. Tellingly, the division that contains infrastructure services saw declines last quarter.

The market seems to like the news: IBM shares are trading up some 10% ahead of the market opening.

The move is a significant shift for the company and underscores a bigger sea change in how enterprise IT has evolved and looks to continue changing in the future.

IBM is betting that legacy infrastructure and the servicing of it, while continuing to net revenues, will not grow as it has in the past, and as companies continue with their modernization (or “digital transformation,” as consultants like to refer to it today), they will turn increasingly to outsourced infrastructure and using cloud services, both to run their businesses and to build the services that interface with consumers. IBM, meanwhile, is in a race competing against the likes of Microsoft and Google in cloud services, and so doubling down on that part of the business is another way to focus on it for growth.

But IBM, often referred to as “Big Blue”, is also using the announcement as the start of an effort to streamline its business to spur growth (maybe we’ll have to rename it “Medium Blue”).

“IBM is laser-focused on the $1 trillion hybrid cloud opportunity,” said Arvind Krishna, IBM CEO, in a statement. “Client buying needs for application and infrastructure services are diverging, while adoption of our hybrid cloud platform is accelerating. Now is the right time to create two market-leading companies focused on what they do best. IBM will focus on its open hybrid cloud platform and AI capabilities. NewCo will have greater agility to design, run and modernize the infrastructure of the world’s most important organizations. Both companies will be on an improved growth trajectory with greater ability to partner and capture new opportunities – creating value for clients and shareholders.”

IBM’s $34 billion purchase of Red Hat in 2019 is perhaps the most notable recent investment in its own transformation.

“We have positioned IBM for the new era of hybrid cloud,” said Ginni Rometty, IBM Executive Chairman in a statement. “Our multi-year transformation created the foundation for the open hybrid cloud platform, which we then accelerated with the acquisition of Red Hat. At the same time, our managed infrastructure services business has established itself as the industry leader, with unrivaled expertise in complex and mission-critical infrastructure work. As two independent companies, IBM and NewCo will capitalize on their respective strengths. IBM will accelerate clients’ digital transformation journeys, and NewCo will accelerate clients’ infrastructure modernization efforts. This focus will result in greater value, increased innovation, and faster execution for our clients.”

More to come.

Headroom, which uses AI to supercharge videoconferencing, raises $5M

Videoconferencing has become a cornerstone of how many of us work these days — so much so that one leading service, Zoom, has graduated into verb status because of how much it’s getting used.

But does that mean videoconferencing works as well as it should? Today, a new startup called Headroom is coming out of stealth, tapping into a battery of AI tools — computer vision, natural language processing and more — on the belief that the answer to that question is a clear — no bad Wi-Fi interruption here — “no.”

Headroom not only hosts videoconferences but also provides transcripts, summaries with highlights, gesture recognition, optimized video quality and more. Today it’s announcing that it has raised a seed round of $5 million as it gears up to launch its freemium service into the world.

You can sign up to the waitlist to pilot it, and get other updates here.

The funding is coming from Anna Patterson of Gradient Ventures (Google’s AI venture fund); Evan Nisselson of LDV Capital (a specialist VC backing companies building visual technologies); Yahoo founder Jerry Yang, now of AME Cloud Ventures; Ash Patel of Morado Ventures; Anthony Goldbloom, the co-founder and CEO of Kaggle.com; and Serge Belongie, Cornell Tech associate dean and professor of Computer Vision and Machine Learning.

It’s an interesting group of backers, but that might be because the founders themselves have a pretty illustrious background with years of experience using some of the most cutting-edge visual technologies to build other consumer and enterprise services.

Julian Green — a British transplant — was most recently at Google, where he ran the company’s computer vision products, including the Cloud Vision API that was launched under his watch. He came to Google by way of its acquisition of his previous startup Jetpac, which used deep learning and other AI tools to analyze photos to make travel recommendations. In a previous life, he was one of the co-founders of Houzz, another kind of platform that hinges on visual interactivity.

Russian-born Andrew Rabinovich, meanwhile, spent the last five years at Magic Leap, where he was the head of AI, and before that, the director of deep learning and the head of engineering. Before that, he too was at Google, as a software engineer specializing in computer vision and machine learning.

You might think that leaving their jobs to build an improved videoconferencing service was an opportunistic move, given the huge surge of use that the medium has had this year. Green, however, tells me that they came up with the idea and started building it at the end of 2019, when the term “COVID-19” didn’t even exist.

“But it certainly has made this a more interesting area,” he quipped, adding that it did make raising money significantly easier, too. (The round closed in July, he said.)

It’s also curious that the pair decided to strike out on their own to build Headroom rather than pitch the tech to their previous employers. Magic Leap had long been in limbo (AR and VR have proven incredibly tough to build businesses around, especially in the short to medium term, even for a startup with hundreds of millions of dollars in VC backing) and could probably have used some more interesting ideas to pivot to. And Google is Google, with everything in tech having an endpoint in Mountain View.

Green said the reasons were twofold. The first has to do with the efficiency of building something when you are small. “I enjoy moving at startup speed,” he said.

And the second has to do with the challenges of building things on legacy platforms versus fresh, from the ground up.

“Google can do anything it wants,” he replied when I asked why he didn’t think of bringing these ideas to the team working on Meet (or Hangouts if you’re a non-business user). “But to run real-time AI on video conferencing, you need to build for that from the start. We started with that assumption,” he said.

All the same, the reasons Headroom is interesting are also likely to be the ones that pose big challenges for it. The newfound ubiquity of video calling (and our present lives working at home) might make us more open to it, but for better or worse, we’re all also now pretty used to what we already use. And many companies have now paid up as premium users of one service or another, so they may be reluctant to try newer, less-tested platforms.

But as we’ve seen in tech so many times, sometimes it pays to be a late mover, and the early movers are not always the winners.

The first iteration of Headroom will include features that will automatically take transcripts of the whole conversation, with the ability to use the video replay to edit the transcript if something has gone awry; offer a summary of the key points that are made during the call; and identify gestures to help shift the conversation.

And Green tells me they are already working on features to be added in future iterations. When a videoconference uses supplementary presentation materials, those can also be processed by the engine for highlights and transcription.

And another feature will optimize the pixels that you see for much better video quality, which should come in especially handy when you or the person/people you are talking to are on poor connections.

“You can understand where and what the pixels are in a video conference and send the right ones,” he explained. “Most of what you see of me and my background is not changing, so those don’t need to be sent all the time.”
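Headroom hasn’t published how its pixel optimization actually works, but the idea Green describes (re-sending only the parts of the frame that change, and skipping a static background) can be sketched with a naive block-level frame diff. Everything below, function name and thresholds included, is an illustrative assumption, not Headroom’s pipeline:

```python
def changed_blocks(prev_frame, curr_frame, block=4, threshold=10.0):
    """Return (row, col) of pixel blocks whose average change exceeds
    the threshold; unchanging background blocks are skipped entirely."""
    h, w = len(curr_frame), len(curr_frame[0])
    dirty = []
    for y in range(0, h, block):
        for x in range(0, w, block):
            diff = 0
            for dy in range(block):
                for dx in range(block):
                    diff += abs(curr_frame[y + dy][x + dx] - prev_frame[y + dy][x + dx])
            # High mean difference where the speaker moves; near zero elsewhere.
            if diff / (block * block) > threshold:
                dirty.append((y, x))
    return dirty

# Two 8x8 grayscale frames; only the top-left block "moves".
prev = [[0] * 8 for _ in range(8)]
curr = [row[:] for row in prev]
for y in range(4):
    for x in range(4):
        curr[y][x] = 255
print(changed_blocks(prev, curr))  # → [(0, 0)]
```

A real codec would encode and compress those dirty blocks rather than just list them, but the saving is the same in spirit: three of the four blocks here never need to be sent.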

All of this taps into some of the more interesting aspects of sophisticated computer vision and natural language algorithms. Creating a summary, for example, relies on technology that is able to suss out not just what you are saying, but what are the most important parts of what you or someone else is saying.
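Headroom hasn’t detailed its summarization approach, but a classic extractive baseline, scoring each sentence by how frequent its words are across the whole transcript and keeping the top scorers, shows what “sussing out the most important parts” can mean in practice. This is a toy sketch under that assumption, not the company’s method:

```python
import re
from collections import Counter

def extractive_summary(transcript, n_sentences=1):
    """Naive frequency-based extractive summary: sentences that share
    the most words with the rest of the transcript score highest."""
    sentences = [s.strip() for s in re.split(r"[.!?]", transcript) if s.strip()]
    freq = Counter(re.findall(r"[a-z']+", transcript.lower()))

    def score(sentence):
        tokens = re.findall(r"[a-z']+", sentence.lower())
        return sum(freq[t] for t in tokens) / max(len(tokens), 1)

    return sorted(sentences, key=score, reverse=True)[:n_sentences]

transcript = (
    "The launch is on track. The launch budget needs approval. "
    "Someone mentioned lunch plans. The launch date is next month."
)
print(extractive_summary(transcript, n_sentences=1))  # → ['The launch is on track']
```

Sentences about the recurring topic (“launch”) outrank the aside about lunch; production systems use far richer models, but the ranking intuition is the same.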

And if you’ve ever been on a video call and found it hard to make clear that you want to say something without outright interrupting the speaker, you’ll understand why gestures might be very useful.

But they can also come in handy if a speaker wants to know if he or she is losing the attention of the audience: The same tech that Headroom is using to detect gestures for people keen to speak up can also be used to detect when they are getting bored or annoyed and pass that information on to the person doing the talking.

“It’s about helping with EQ,” he said, with what I’m sure was a little bit of his tongue in his cheek, but then again we were on a Google Meet, and I may have misread that.

And that brings us to why Headroom is tapping into an interesting opportunity. At their best, when they work, tools like these not only supercharge videoconferences, but they have the potential to solve some of the problems you may have come up against in face-to-face meetings, too. Building software that actually might be better than the “real thing” is one way of making sure that it can have staying power beyond the demands of our current circumstances (which hopefully won’t be permanent circumstances).