Posts

Etsy’s 2-year migration to the cloud brought flexibility to the online marketplace

Founded in 2005, Etsy was born before cloud infrastructure was even a thing.

As the company expanded, it managed all of its operations in the same way startups did in those days — using private data centers. But a couple of years ago, the online marketplace for crafts and vintage items decided to modernize and began its journey to the cloud.

That decision coincided with the arrival of CTO Mike Fisher in July 2017. He was originally brought in as a consultant to look at the impact of running data centers on Etsy’s ability to innovate. As you might expect, he concluded that it was having an adverse impact and began a process that would lead to him being hired to lead a long-term migration to the cloud.

That process concluded last month. This is the story of how a company born in data centers made the switch to the cloud, and the lessons it offers.

Stuck in a hardware refresh loop

When Fisher walked through the door, Etsy operated out of private data centers. It was not even taking advantage of a virtualization layer to maximize the capacity of each machine. The approach meant IT spent an inordinate amount of time on resource planning.

YC-backed Turing uses AI to help speed up the formulation of new consumer packaged goods

One of the more interesting and useful applications of artificial intelligence has been in biotechnology and medicine, where more than 220 startups (not to mention universities and bigger pharma companies) are now using AI to accelerate drug discovery, playing out the many permutations that result from drug and chemical combinations, DNA and other factors.

Now, a startup called Turing — which is part of the current cohort at Y Combinator due to present in the next Demo Day on March 22 — is taking a similar principle but applying it to the world of building (and “discovering”) new consumer packaged goods products.

Using machine learning to simulate different combinations of ingredients plus desired outcomes and figure out optimal formulations for different goods (hence the name, a reference to Alan Turing’s mathematical model of computation, the Turing machine), Turing is initially addressing the creation of products in home care (e.g. detergents), beauty, and food and beverage.

Turing’s founders claim it can save companies millions of dollars by cutting the time it takes to formulate and test new products from an average of 12 to 24 months down to a matter of weeks.

Specifically, the aim is to reduce all the time it takes to test combinations, giving R&D teams more time to be creative.

“Right now, they are spending more time managing experiments than they are innovating,” Manmit Shrimali, Turing’s co-founder and CEO, said.

Turing is in theory coming out of stealth today, but in fact it has already amassed an impressive customer list. It is already generating revenues by working with eight brands owned by one of the world’s biggest CPG companies, and it is also being trialed by another major CPG behemoth (Turing is not disclosing their names publicly, but suffice it to say, they and their brands are household names).

“Turing aims to become the industry norm for formulation development and we are here to play the long game,” Shrimali said. “This requires creating an ecosystem that can help at each stage of growing and scaling the company, and YC just does this exceptionally well.”

Turing is co-founded by Shrimali and Ajith Govind, two data science specialists who worked together on a previous startup called Dextro Analytics. Dextro set out to help businesses use AI and other kinds of business analytics to identify trends and support decision-making around marketing, business strategy and other operational areas.

While there, they identified a very specific use case for the same principles that was perhaps even more acute: the research and development divisions of CPG companies, which have (ironically, given their focus on the future) often been behind the curve when it comes to the “digital transformation” that has swept up a lot of other corporate departments.

“We were consulting for product companies and realised that they were struggling,” Shrimali said. Add to that the fact that CPG is precisely the kind of legacy industry that is not natively technological but can most definitely benefit from better technology, and that spells out an interesting opportunity for how (and where) to introduce artificial intelligence into the mix.

R&D labs play a specific and critical role in the world of CPG.

This is where products are discovered; tested; tweaked in response to input from customers, marketing, budgetary and manufacturing departments and others; then tested again; then tweaked again; and so on, before eventually being shipped into production. One of the big clients Turing works with spends close to $400 million on testing alone.

But R&D is under a lot of pressure these days. These departments are seeing their budgets cut even as the demands on them pile up. They are still expected to meet timelines for producing new products (or, more often, extensions of existing products) to keep consumers interested. A new host of environmental and health concerns around goods with long lists of unintelligible ingredients means they have to figure out how to simplify and improve the composition of mass-market products. And smaller direct-to-consumer brands are undercutting their larger competitors by getting to market faster with competitive offerings that meet new consumer tastes and preferences.

“In the CPG world, everyone was focused on marketing, and R&D was a blind spot,” Shrimali said, referring to the extensive investments that CPG companies have made into figuring out how to use digital to track and connect with users, and also how better to distribute their products. “To address how to use technology better in R&D, people need strong domain knowledge, and we are the first in the market to do that.”

Turing’s focus is to speed up the formulation and testing stages of product creation, cutting down on some of the extensive overhead involved in bringing new products to market.

Part of the reason it can take years to create a new product is all the permutations that go into building something and making sure it works as consistently as a consumer would expect it to (while still being consistent in production and coming in within budget).

“If just one ingredient is changed in a formulation, it can change everything,” Shrimali noted. And so in the case of something like a laundry detergent, this means running hundreds of tests on hundreds of loads of laundry to make sure that it works as it should.

The Turing platform brings in historical data from across a number of past permutations and tests to essentially virtualise all of this: It suggests optimal mixes and outcomes from them without the need to run the costly physical tests, and in turn this teaches the Turing platform to address future tests and formulations. Shrimali said that the Turing platform has already saved one of the brands some $7 million in testing costs.
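
To make the mechanics a little more concrete, here is a minimal sketch of the general technique the article describes: fit a surrogate model to historical formulation-test results, then screen candidate mixes virtually and send only the most promising ones to the lab. This is an illustration of the approach, not Turing’s actual system; the ingredient names, data and model choice are all assumptions.

```python
# Minimal sketch of surrogate-model formulation screening (illustrative only,
# not Turing's platform). Fit a regressor on historical (formulation -> test
# outcome) records, then rank new candidate mixes without physical tests.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)

# Hypothetical history: columns are (surfactant, enzyme, fragrance) fractions;
# labels are cleaning-performance scores measured in past physical tests.
X_history = rng.uniform(0, 1, size=(500, 3))
y_history = (10 * X_history[:, 0] + 5 * X_history[:, 1]
             - 2 * X_history[:, 2] + rng.normal(0, 0.5, size=500))

model = RandomForestRegressor(n_estimators=200, random_state=0)
model.fit(X_history, y_history)

# Score thousands of virtual candidates; only the top few go to the lab.
candidates = rng.uniform(0, 1, size=(10_000, 3))
scores = model.predict(candidates)
best = candidates[np.argsort(scores)[-5:]]
print("Most promising formulations:\n", best)
```

In a real pipeline the loop would continue: lab results for those few candidates feed back into the training data, which is how the article describes the platform teaching itself to address future tests and formulations.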

Turing’s work with R&D gives the company some interesting insights into shifts the wider industry is undergoing. Currently, Shrimali said, one of the biggest priorities for CPG giants is addressing the demand for more traceable, natural and organic formulations.

While no single DTC brand will ever fully eat into the market share of any CPG brand, collectively their presence and resonance with consumers is clearly causing a shift. Sometimes that will lead to acquisitions of the smaller brands, but more generally it reflects a change in consumer demands that the CPG companies are trying to meet. 

Longer term, the plan is for Turing to apply its platform to other aspects that are touched by R&D beyond the formulations of products. The thinking is that changing consumer preferences will also lead to a demand for better “formulations” for the wider product, including more sustainable production and packaging. And that, in turn, represents two areas into which Turing can expand, introducing potentially other kinds of AI technology (such as computer vision) into the mix to help optimise how companies build their next generation of consumer goods.

Nvidia acquires data storage and management platform SwiftStack

Nvidia today announced that it has acquired SwiftStack, a software-centric data storage and management platform that supports public cloud, on-premises and edge deployments.

SwiftStack’s recent launches focused on improving its support for AI, high-performance computing and accelerated computing workloads, which is surely what Nvidia is most interested in here.

“Building AI supercomputers is exciting to the entire SwiftStack team,” says the company’s co-founder and CPO Joe Arnold in today’s announcement. “We couldn’t be more thrilled to work with the talented folks at NVIDIA and look forward to contributing to its world-leading accelerated computing solutions.”

The two companies did not disclose the price of the acquisition, but SwiftStack had previously raised about $23.6 million in Series A and B rounds led by Mayfield Fund and OpenView Venture Partners. Other investors include Storm Ventures and UMC Capital.

SwiftStack, which was founded in 2011, placed an early bet on OpenStack, the massive open-source project that aimed to give enterprises an AWS-like management experience in their own data centers. The company was one of the largest contributors to OpenStack’s Swift object storage platform and offered a number of services around this, though it seems like in recent years it has downplayed the OpenStack relationship as that platform’s popularity has fizzled in many verticals.

SwiftStack lists the likes of PayPal, Rogers, data center provider DC Blox, Snapfish and Verizon (TechCrunch’s parent company) on its customer page. Nvidia, too, is a customer.

SwiftStack notes that its team will continue to maintain an existing set of open-source tools like Swift, ProxyFS, 1space and Controller.
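
For a sense of what Swift itself provides, here is a minimal sketch of storing and retrieving an object with the open-source python-swiftclient library. The endpoint and credentials are placeholders, and this illustrates the open-source Swift API generally rather than anything specific to SwiftStack’s product.

```python
# Minimal sketch of using OpenStack Swift object storage through the
# python-swiftclient library. Auth URL and credentials are placeholders.
from swiftclient.client import Connection

conn = Connection(
    authurl="http://swift.example.com:8080/auth/v1.0",  # placeholder endpoint
    user="test:tester",
    key="testing",
)

conn.put_container("training-data")  # Swift's equivalent of a bucket
conn.put_object(
    "training-data",
    "sample.txt",
    contents=b"hello swift",
    content_type="text/plain",
)

headers, body = conn.get_object("training-data", "sample.txt")
print(body.decode())  # -> "hello swift"
```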

“SwiftStack’s technology is already a key part of NVIDIA’s GPU-powered AI infrastructure, and this acquisition will strengthen what we do for you,” says Arnold.

Google Cloud announces four new regions as it expands its global footprint

Google Cloud today announced its plans to open four new data center regions. These regions will be in Delhi (India), Doha (Qatar), Melbourne (Australia) and Toronto (Canada), bringing Google Cloud’s total footprint to 26 regions. The company previously announced that it would open regions in Jakarta, Las Vegas, Salt Lake City, Seoul and Warsaw over the course of the next year. The announcement also comes only a few days after Google opened its Salt Lake City data center.

GCP already had a data center presence in India, Australia and Canada before this announcement, but with these newly announced regions, it now offers two geographically separate regions for in-country disaster recovery, for example.

Google notes that the Doha region marks the company’s first strategic collaboration agreement to launch a region in the Middle East, signed with the Qatar Free Zones Authority. One of the launch customers there is Bespin Global, a major managed services provider in Asia.

“We work with some of the largest Korean enterprises, helping to drive their digital transformation initiatives. One of the key requirements that we have is that we need to deliver the same quality of service to all of our customers around the globe,” said John Lee, CEO, Bespin Global. “Google Cloud’s continuous investments in expanding their own infrastructure to areas like the Middle East make it possible for us to meet our customers where they are.”

Datastax acquires The Last Pickle

Data management company Datastax, one of the largest contributors to the Apache Cassandra project, today announced that it has acquired The Last Pickle (and no, I don’t know what’s up with that name either), a New Zealand-based Cassandra consulting and services firm that’s behind a number of popular open-source tools for the distributed NoSQL database.

As Datastax Chief Strategy Officer Sam Ramji, whom you may remember from his recent tenure at Apigee, the Cloud Foundry Foundation, Google and Autodesk, told me, The Last Pickle is one of the premier Apache Cassandra consulting and services companies. The team there has been building Cassandra-based open-source solutions for the likes of Spotify, T-Mobile and AT&T since it was founded back in 2012. And while The Last Pickle is based in New Zealand, the company has engineers all over the world who do the heavy lifting and help these companies successfully implement the Cassandra database technology.

It’s worth mentioning that Last Pickle CEO Aaron Morton first discovered Cassandra when he worked for WETA Digital on the special effects for Avatar, where the team used Cassandra to allow the VFX artists to store their data.

“There’s two parts to what they do,” Ramji explained. “One is the very visible consulting, which has led them to become world experts in the operation of Cassandra. So as we automate Cassandra and as we improve the operability of the project with enterprises, their embodied wisdom about how to operate and scale Apache Cassandra is as good as it gets — the best in the world.” And The Last Pickle’s experience in building systems with tens of thousands of nodes — and the challenges that its customers face — is something Datastax can then offer to its customers as well.

And Datastax, of course, also plans to productize The Last Pickle’s open-source tools like the automated repair tool Reaper and the Medusa backup and restore system.

As both Ramji and Datastax VP of Engineering Josh McKenzie stressed, Cassandra has seen a lot of commercial development in recent years, with the likes of AWS now offering a managed Cassandra service, for example, even though there isn’t all that much hype around the project anymore. They argue that’s a good thing: now that it is over ten years old, Cassandra has been battle-hardened. For the last ten years, Ramji argues, the industry tried to figure out what the de facto standard for scale-out computing should be. By 2019, it became clear that Kubernetes was the answer.

“This next decade is about what is the de facto standard for scale-out data? We think that’s got certain affordances, certain structural needs, and we think that the decades that Cassandra has spent getting hardened put it in a position to be the data for that wave.”

McKenzie also noted that Cassandra’s built-in features, like support for multiple data centers and geo-replication, rolling updates and live scaling, as well as wide support across programming languages, give it a number of advantages over competing databases.

“It’s easy to forget how much Cassandra gives you for free just based on its architecture,” he said. “Losing the power in an entire datacenter, upgrading the version of the database, hardware failing every day? No problem. The cluster is 100 percent always still up and available. The tooling and expertise of The Last Pickle really help bring all this distributed and resilient power into the hands of the masses.”
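
To make those built-in features concrete, here is a minimal sketch using the open-source Python driver for Cassandra. The contact point and data center names are hypothetical; the point is that multi-data-center replication is declared in a single line of schema, and the database handles the rest.

```python
# Minimal sketch of Cassandra's built-in geo-replication (hypothetical
# addresses and data center names, not code from Datastax or The Last Pickle).
from cassandra.cluster import Cluster

cluster = Cluster(["10.0.0.1"])  # placeholder contact point
session = cluster.connect()

# Three replicas in each of two data centers: the keyspace stays readable
# and writable through rolling upgrades or the loss of an entire data center.
session.execute(
    "CREATE KEYSPACE IF NOT EXISTS orders WITH replication = "
    "{'class': 'NetworkTopologyStrategy', 'us_east': 3, 'eu_west': 3}"
)
```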

The two companies did not disclose the price of the acquisition.

Honeywell says it will soon launch the world’s most powerful quantum computer

“The best-kept secret in quantum computing.” That’s what Cambridge Quantum Computing (CQC) CEO Ilyas Khan called Honeywell‘s efforts in building the world’s most powerful quantum computer. In a race where most of the major players are vying for attention, Honeywell has quietly worked on its efforts for the last few years (and under strict NDAs, it seems). But today, the company announced a major breakthrough that it claims will allow it to launch the world’s most powerful quantum computer within the next three months.

Honeywell also announced today that it has made strategic investments in CQC and Zapata Computing, both of which focus on the software side of quantum computing. It has also partnered with JPMorgan Chase to develop quantum algorithms using Honeywell’s quantum computer, and it recently announced a partnership with Microsoft.

Honeywell has long built the kind of complex control systems that power many of the world’s largest industrial sites. It’s that kind of experience that has now allowed it to build an advanced ion trap that is at the core of its efforts.

This ion trap, the company claims in a paper that accompanies today’s announcement, has allowed the team to achieve decoherence times that are significantly longer than those of its competitors.

“It starts really with the heritage that Honeywell had to work from,” Tony Uttley, the president of Honeywell Quantum Solutions, told me. “And we, because of our businesses within aerospace and defense and our business in oil and gas — with solutions that have to do with the integration of complex control systems because of our chemicals and materials businesses — we had all of the underlying pieces for quantum computing, which are just fabulously different from classical computing. You need to have ultra-high vacuum system capabilities. You need to have cryogenic capabilities. You need to have precision control. You need to have lasers and photonic capabilities. You have to have magnetic and vibrational stability capabilities. And for us, we had our own foundry and so we are able to literally design our architecture from the trap up.”

The result of this is a quantum computer that promises to achieve a Quantum Volume of 64. Quantum Volume (QV), it’s worth mentioning, is a metric that takes into account both the number of qubits in a system and how well they perform, including their decoherence times. IBM and others have championed this metric as a way, at least for now, to compare the power of various quantum computers.

So far, IBM’s own machines have achieved QV 32, which would make Honeywell’s machine significantly more powerful.
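
For readers keeping score, the arithmetic behind those two numbers is worth spelling out. Under IBM’s published definition of the metric (assuming Honeywell applies it the same way), Quantum Volume is

\[
\log_2 \mathrm{QV} \;=\; \max_n \,\min\bigl(n,\ d(n)\bigr),
\]

where \(d(n)\) is the deepest random n-qubit “model circuit” the machine can run with acceptable fidelity. A QV of 64 = 2^6 therefore means reliably running circuits six qubits wide and six layers deep, while IBM’s QV 32 = 2^5 stops at five. One extra qubit and one extra layer may sound incremental, but compounding gate errors make each step up the scale disproportionately hard.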

Khan, whose company provides software tools for quantum computing and was one of the first to work with Honeywell on this project, also noted that the focus on the ion trap is giving Honeywell a bit of an advantage. “I think that the choice of the ion trap approach by Honeywell is a reflection of a very deliberate focus on the quality of qubit rather than the number of qubits, which I think is fairly sophisticated,” he said. “Until recently, the headline was always growth, the number of qubits running.”

The Honeywell team noted that many of its current customers are also likely users of its quantum solutions. These customers, after all, are working on exactly the kind of problems in chemistry or material science that quantum computing, at least in its earliest forms, is uniquely suited for.

Currently, Honeywell has about 100 scientists, engineers and developers dedicated to its quantum project.

Stack Overflow expands its Teams service with new integrations

Most developers think of Stack Overflow as a question and answer site for their programming questions. But over the last few years, the company has also built a successful business in its Stack Overflow for Teams product, which essentially offers companies a private version of its Q&A product. Indeed, the Teams product now brings in a significant amount of revenue for the company and the new executive team at Stack Overflow is betting that it can help the company grow rapidly in the years to come.

To make Teams even more attractive to businesses, the company today launched a number of new integrations with Jira (Enterprise and Business), GitHub (Enterprise and Business) and Microsoft Teams (Enterprise). These join existing integrations with Slack, Okta and the Business tier of Microsoft Teams.

“I think the integrations that we have been building are reflective of that developer workflow and all of the tools that someone who is building and leveraging technology has to interact with,” Stack Overflow Chief Product Officer Teresa Dietrich told me. “When we think about integrations, we think about the vertical, right, and I think that ‘developer workflow’ is one of those industry verticals that we’re thinking about. ChatOps is obviously another one, as you can see from our Slack and Teams integration. And the JIRA and GitHub [integrations] that we’re building are really at the core of a developer workflow.”

Current Stack Overflow for Teams customers include the likes of Microsoft, Expensify and Wix. As the company noted, 65 percent of its existing Teams customers use GitHub, so it’s no surprise that it is building out this integration.

Ampere launches new chip built from ground up for cloud workloads

Ampere, the chip startup run by former Intel President Renee James, announced a new chip today that she says is designed specifically to optimize for cloud workloads.

Ampere VP of product Jeff Wittich says the new chip, called the Ampere Altra, has been designed with features that should make it attractive to cloud providers, with three main focuses: high performance, scalability and power efficiency, all elements that are important to cloud vendors operating at scale.

The Altra is an ARM chip with some big features. “It’s 80 64-bit ARM cores, or 160 cores in a two-socket platform; we support both one-socket and two-socket [configurations]. We are running at 3 GHz turbo, and that’s 3 GHz across all of the cores, because of the way that cloud delivers compute, you’re utilizing all the cores as much of the time as possible. So our turbo performance was optimized for all of the cores being able to sustain it all the time,” Wittich explained.

The company sees this chip as a kind of workhorse for the cloud. “We’ve really looked at this as we’re designing a general purpose CPU that is built for the cloud environment, so you can utilize that compute the way the cloud utilizes that type of compute. So it supports the vast array of all of the workloads that run in the cloud,” he said.

Founder and CEO James says the company has been working with its cloud customers to give them the kind of information they need to optimize the chip for their individual workloads at a granular configuration level, something the hyperscalers in particular really require.

“Let’s go do what we can to build the platform that delivers the raw power and performance, the kind of environment that you’re looking for, and then have a design approach that enables them to work with us on what’s important and the kind of control, that kind of feature set that’s unique because each one of them have their own software environment,” James explained.

Companies working with Ampere early on include Oracle (an investor, according to Crunchbase) and Microsoft.

James says one of the unforeseen challenges of delivering this chip is possible disruption to the supply chain due to the COVID-19 coronavirus and its impact in Asia, where many of the parts come from and the chips are assembled.

She says the company has taken that into consideration and has built up a worldwide supply chain that she hopes will help with hiccups that might occur because of supply chain slowdowns.

Thoma Bravo completes $3.9B Sophos acquisition

Thoma Bravo announced today that it has closed its hefty $3.9 billion acquisition of security firm Sophos, marking yet another private equity deal in the books.

The deal was originally announced in October. Stockholders voted to approve the deal in December.

Stockholders were paid $7.40 per share, according to the company, which indicated that, as part of the closing, the stock had ceased trading on the London Stock Exchange. It also pointed out that investors who got in at the IPO price in June 2015 made a 168% premium on that investment.

Sophos hopes its new owner can help the company continue to modernize the platform. “With Thoma Bravo as a partner, we believe we can accelerate our progress and get to the future even faster, with dramatic benefits for our customers, our partners and our company as a whole,” Sophos CEO Kris Hagerman said in a statement. Whether it will enjoy those benefits or not, time will tell.

As for the buyer, it sees a company with a strong set of channel partners that it can access to generate more revenue moving forward under the Thoma Bravo umbrella. Sophos currently partners with 53,000 resellers and managed service providers, and counts more than 420,000 companies as customers. The platform currently helps protect 100 million users, according to the company. The buyer believes it can help build on these numbers.

The company was founded way back in 1985, and raised over $500 million before going public in 2015, according to PitchBook data. Products include Managed Threat Response, XG Firewall and Intercept X Endpoint.

Thought Machine nabs $83M for a cloud-based platform that powers banking services

The world of consumer banking has seen a massive shift in the last ten years. Gone are the days where you could open an account, take out a loan, or discuss changing the terms of your banking only by visiting a physical branch. Now, you can do all this and more with a few quick taps on your phone screen — a shift that has accelerated with customers expecting and demanding even faster and more responsive banking services.

As one mark of that switch, today a startup called Thought Machine, which has built cloud-based technology that powers this new generation of services on behalf of both old and new banks, is announcing some significant funding — $83 million — a Series B that the company plans to use to continue investing in its platform and growing its customer base.

To date, Thought Machine’s customers are primarily in Europe and Asia — they include large, legacy outfits like Standard Chartered, Lloyds Banking Group, and Sweden’s SEB through to “challenger” (AKA neo-) banks like Atom Bank. Some of this financing will go towards boosting the startup’s activities in the US, including opening an office in the country later this year and moving ahead with commercial deals.

The funding is being led by Draper Esprit, with participation also from existing investors Lloyds Banking Group, IQ Capital, Backed and Playfair.

Thought Machine, which started in 2014 and now employs 300, is not disclosing its valuation, but Paul Taylor, the CEO and founder, noted that the market cap is currently “increasing healthily.” In its last round, according to PitchBook estimates, the company was valued at around $143 million, which, at this stage of funding, would put the valuation in this latest round potentially in the range of $220 million to $320 million.

Thought Machine is not yet profitable, mainly because it is in growth mode, said Taylor. Of note, the startup has been through one major bankruptcy restructuring, although it appears that this was mainly for organisational purposes: all assets, employees and customers from one business controlled by Taylor were acquired by another.

Thought Machine’s primary product and technology is called Vault, a platform that contains a range of banking services: checking accounts, savings accounts, loans, credit cards and mortgages. Thought Machine does not sell directly to consumers, but sells by way of a B2B2C model.

The services are provisioned by way of smart contracts, which allow Thought Machine and its banking customers to personalise, vary and segment the terms for each bank — and potentially for each customer of the bank.
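
The article doesn’t detail Vault’s contract API, so the sketch below is hypothetical, but it shows the shape of the idea: a banking product is code, and its terms are parameters that can be varied per bank, per segment or per customer without changing the engine that executes them.

```python
# Hypothetical sketch of the "product as a smart contract" idea described in
# the article -- not Thought Machine's actual Vault API.
from dataclasses import dataclass
from decimal import Decimal

@dataclass
class SavingsContract:
    annual_rate: Decimal     # terms are parameters, personalised per bank,
    withdrawal_fee: Decimal  # per segment or even per customer

    def accrue_daily_interest(self, balance: Decimal) -> Decimal:
        return balance * self.annual_rate / Decimal(365)

# Two banks (or two customer segments) run identical contract code
# with different terms.
standard = SavingsContract(Decimal("0.010"), Decimal("0.00"))
premium = SavingsContract(Decimal("0.015"), Decimal("2.00"))

print(standard.accrue_daily_interest(Decimal("1000")))  # ~0.0274 per day
print(premium.accrue_daily_interest(Decimal("1000")))   # ~0.0411 per day
```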

Food for Thought (Machine)

It’s a little odd to think that there is an active market for banking services that are not built and owned by the banks themselves. After all, aren’t these the core of what banks are supposed to do?

But one way to think about it is in the context of eating out. Restaurants’ kitchens will often make in-house what they sell and serve. But in some cases, when it makes sense, even the best places will buy in (and subsequently sell) food that was crafted elsewhere. For example, a restaurant will re-sell cheese or charcuterie, and the wine is likely to come from somewhere else, too.

The same is the case for banks, whose “Crown Jewels” are in fact not the mechanics of their banking services, but their customer service, their customer lists, and their deposits. Better banking services (which may not have been built “in-house”) are key to growing these other three.

“There are all sorts of banks, and they are all trying to find niches,” said Taylor. Indeed, the startup is not the only one chasing that business. Others include Mambu, Temenos and Italy’s Edera.

In the case of the legacy banks that work with the startup, the idea is that these behemoths can migrate into the next generation of consumer banking services and banking infrastructure by cherry-picking services from the Vault platform.

“Banks have not kept up and are marooned on their own tech, and as each year goes by, it becomes more problematic,” noted Taylor.

In the case of neobanks, Thought Machine’s pitch is that it has already built the rails to run a banking service, so a startup — “new challengers like Monzo and Revolut that are creating quite a lot of disruption in the market” (and are growing very quickly as a result) — can integrate into these to get off the ground more quickly and handle scaling with less complexity (and lower costs).

Money talks

Taylor was new to fintech when he founded Thought Machine, but he has a notable track record in the world of tech that you could argue played a big role in his subsequent foray into banking.

Formerly an academic specialising in linguistics and engineering, his first startup, Rhetorical Systems, commercialised some of his early text-to-speech research and was later sold to Nuance in 2004.

His second entrepreneurial effort, Phonetic Arts, was another speech startup, aimed at tech that could be used in gaming interactions. In 2010, Google approached the startup to see if it wanted to work on a new text-to-speech service it was building. It ended up acquiring Phonetic Arts, and Taylor took on the role of building and launching Google Now, with that voice tech eventually making its way to Google Maps, accessibility services, the Google Assistant and other places where speech-based interaction makes an appearance in Google products.

While he was working for years in the field, the step changes that really accelerated voice recognition and speech technology, Taylor said, were the rapid increases in computing power and data networks that “took us over the edge” in terms of what a machine could do, specifically in the cloud.

And those are the same forces, in fact, that led to consumers being able to run their banking services from smartphone apps, and to want and expect more personalised services overall. Taylor’s move into building and offering a platform-based service to address the need for multiple third-party banking services follows from that, and is also the natural heir to the platform model that, you could argue, Google and other tech companies have perfected over the years.

Draper Esprit has to date built up a strong portfolio of fintech startups that includes Revolut, N26, TransferWise and Freetrade. Thought Machine’s platform approach is an obvious complement to that list. (Taylor did not disclose if any of those companies are already customers of Thought Machine’s, but if they are not, this investment could be a good way of building inroads.)

“We are delighted to be partnering with Thought Machine in this phase of their growth,” said Vinoth Jayakumar, Investment Director, Draper Esprit, in a statement. “Our investments in Revolut and N26 demonstrate how banking is undergoing a once in a generation transformation in the technology it uses and the benefit it confers to the customers of the bank. We continue to invest in our thesis of the technology layer that forms the backbone of banking. Thought Machine stands out by way of the strength of its engineering capability, and is unique in being the only company in the banking technology space that has developed a platform capable of hosting and migrating international Tier 1 banks. This allows innovative banks to expand beyond digital retail propositions to being able to run every function and type of financial transaction in the cloud.”

“We first backed Thought Machine at seed stage in 2016 and have seen it grow from a startup to a 300-person strong global scale-up with a global customer base and potential to become one of the most valuable European fintech companies,” said Max Bautin, Founding Partner of IQ Capital, in a statement. “I am delighted to continue to support Paul and the team on this journey, with an additional £15 million investment from our £100 million Growth Fund, aimed at our venture portfolio outperformers.”