Nvidia acquires data storage and management platform SwiftStack

Nvidia today announced that it has acquired SwiftStack, a software-centric data storage and management platform that supports public cloud, on-premises and edge deployments.

The company’s recent launches focused on improving its support for AI, high-performance computing and accelerated computing workloads, which is surely what Nvidia is most interested in here.

“Building AI supercomputers is exciting to the entire SwiftStack team,” says the company’s co-founder and CPO Joe Arnold in today’s announcement. “We couldn’t be more thrilled to work with the talented folks at NVIDIA and look forward to contributing to its world-leading accelerated computing solutions.”

The two companies did not disclose the price of the acquisition, but SwiftStack had previously raised about $23.6 million in Series A and B rounds led by Mayfield Fund and OpenView Venture Partners. Other investors include Storm Ventures and UMC Capital.

SwiftStack, which was founded in 2011, placed an early bet on OpenStack, the massive open-source project that aimed to give enterprises an AWS-like management experience in their own data centers. The company was one of the largest contributors to OpenStack’s Swift object storage platform and offered a number of services around this, though it seems like in recent years it has downplayed the OpenStack relationship as that platform’s popularity has fizzled in many verticals.

SwiftStack lists the likes of PayPal, Rogers, data center provider DC Blox, Snapfish and Verizon (TechCrunch’s parent company) on its customer page. Nvidia, too, is a customer.

SwiftStack notes that its team will continue to maintain an existing set of open-source tools like Swift, ProxyFS, 1space and Controller.

“SwiftStack’s technology is already a key part of NVIDIA’s GPU-powered AI infrastructure, and this acquisition will strengthen what we do for you,” says Arnold.

Google Cloud announces four new regions as it expands its global footprint

Google Cloud today announced its plans to open four new data center regions. These regions will be in Delhi (India), Doha (Qatar), Melbourne (Australia) and Toronto (Canada) and bring Google Cloud’s total footprint to 26 regions. The company previously announced that it would open regions in Jakarta, Las Vegas, Salt Lake City, Seoul and Warsaw over the course of the next year. The announcement also comes only a few days after Google opened its Salt Lake City data center.

GCP already had a data center presence in India, Australia and Canada before this announcement, but with these newly announced regions, it now offers two geographically separate regions for in-country disaster recovery, for example.

Google notes that the region in Doha marks the company’s first strategic collaboration agreement to launch a region in the Middle East with the Qatar Free Zones Authority. One of the launch customers there is Bespin Global, a major managed services provider in Asia.

“We work with some of the largest Korean enterprises, helping to drive their digital transformation initiatives. One of the key requirements that we have is that we need to deliver the same quality of service to all of our customers around the globe,” said John Lee, CEO, Bespin Global. “Google Cloud’s continuous investments in expanding their own infrastructure to areas like the Middle East make it possible for us to meet our customers where they are.”

Honeywell says it will soon launch the world’s most powerful quantum computer

“The best-kept secret in quantum computing.” That’s what Cambridge Quantum Computing (CQC) CEO Ilyas Khan called Honeywell’s efforts in building the world’s most powerful quantum computer. In a race where most of the major players are vying for attention, Honeywell has quietly worked on its efforts for the last few years (and under strict NDAs, it seems). But today, the company announced a major breakthrough that it claims will allow it to launch the world’s most powerful quantum computer within the next three months.

In addition, Honeywell also today announced that it has made strategic investments in CQC and Zapata Computing, both of which focus on the software side of quantum computing. The company has also partnered with JPMorgan Chase to develop quantum algorithms using Honeywell’s quantum computer. The company also recently announced a partnership with Microsoft.

Honeywell has long built the kind of complex control systems that power many of the world’s largest industrial sites. It’s that kind of experience that has now allowed it to build an advanced ion trap that is at the core of its efforts.

This ion trap, the company claims in a paper that accompanies today’s announcement, has allowed the team to achieve decoherence times that are significantly longer than those of its competitors.

“It starts really with the heritage that Honeywell had to work from,” Tony Uttley, the president of Honeywell Quantum Solutions, told me. “And we, because of our businesses within aerospace and defense and our business in oil and gas — with solutions that have to do with the integration of complex control systems because of our chemicals and materials businesses — we had all of the underlying pieces for quantum computing, which are just fabulously different from classical computing. You need to have ultra-high vacuum system capabilities. You need to have cryogenic capabilities. You need to have precision control. You need to have lasers and photonic capabilities. You have to have magnetic and vibrational stability capabilities. And for us, we had our own foundry and so we are able to literally design our architecture from the trap up.”

The result of this is a quantum computer that promises to achieve a Quantum Volume of 64. Quantum Volume (QV), it’s worth mentioning, is a metric that takes into account both the number of qubits in a system and their decoherence times. IBM and others have championed this metric as a way to, at least for now, compare the power of various quantum computers.

So far, IBM’s own machines have achieved QV 32, which would make Honeywell’s machine significantly more powerful.
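
Because QV is defined as a power of two, the jump from 32 to 64 represents a doubling of measured capability. As a rough sketch (the real benchmark is measured by running randomized model circuits; this only captures the arithmetic), QV = 2^n, where n is the largest “square” circuit of n qubits and n gate layers the machine can run reliably:

```python
def quantum_volume(width: int, depth: int) -> int:
    """Simplified Quantum Volume: QV = 2**n, where n is the largest
    'square' circuit (n qubits by n gate layers) the machine runs
    reliably. The real benchmark determines n by running randomized
    model circuits; this sketch only captures the arithmetic."""
    n = min(width, depth)
    return 2 ** n

# A machine limited to 5-qubit-by-5-layer circuits scores QV 32;
# reliably running 6x6 circuits doubles that to QV 64.
print(quantum_volume(5, 7))  # 32
print(quantum_volume(6, 6))  # 64
```

Note that adding qubits alone doesn’t raise QV: the achievable circuit depth (limited by decoherence and gate errors) has to grow with the qubit count, which is exactly the quality-over-quantity point Khan makes below.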

Khan, whose company provides software tools for quantum computing and was one of the first to work with Honeywell on this project, also noted that the focus on the ion trap is giving Honeywell a bit of an advantage. “I think that the choice of the ion trap approach by Honeywell is a reflection of a very deliberate focus on the quality of qubit rather than the number of qubits, which I think is fairly sophisticated,” he said. “Until recently, the headline was always growth, the number of qubits running.”

The Honeywell team noted that many of its current customers are also likely users of its quantum solutions. These customers, after all, are working on exactly the kind of problems in chemistry or material science that quantum computing, at least in its earliest forms, is uniquely suited for.

Currently, Honeywell has about 100 scientists, engineers and developers dedicated to its quantum project.

Datastax acquires The Last Pickle

Data management company Datastax, one of the largest contributors to the Apache Cassandra project, today announced that it has acquired The Last Pickle (and no, I don’t know what’s up with that name either), a New Zealand-based Cassandra consulting and services firm that’s behind a number of popular open-source tools for the distributed NoSQL database.

As Datastax Chief Strategy Officer Sam Ramji, who you may remember from his recent tenure at Apigee, the Cloud Foundry Foundation, Google and Autodesk, told me, The Last Pickle is one of the premier Apache Cassandra consulting and services companies. The team there has been building Cassandra-based open source solutions for the likes of Spotify, T-Mobile and AT&T since it was founded back in 2012. And while The Last Pickle is based in New Zealand, the company has engineers all over the world who do the heavy lifting and help these companies successfully implement the Cassandra database technology.

It’s worth mentioning that Last Pickle CEO Aaron Morton first discovered Cassandra when he worked for WETA Digital on the special effects for Avatar, where the team used Cassandra to allow the VFX artists to store their data.

“There’s two parts to what they do,” Ramji explained. “One is the very visible consulting, which has led them to become world experts in the operation of Cassandra. So as we automate Cassandra and as we improve the operability of the project with enterprises, their embodied wisdom about how to operate and scale Apache Cassandra is as good as it gets — the best in the world.” And The Last Pickle’s experience in building systems with tens of thousands of nodes — and the challenges that its customers face — is something Datastax can then offer to its customers as well.

And Datastax, of course, also plans to productize The Last Pickle’s open-source tools like the automated repair tool Reaper and the Medusa backup and restore system.

As both Ramji and Datastax VP of Engineering Josh McKenzie stressed, Cassandra has seen a lot of commercial development in recent years, with the likes of AWS now offering a managed Cassandra service, for example, but there wasn’t all that much hype around the project anymore. But they argue that’s a good thing. Now that it is over ten years old, Cassandra has been battle-hardened. For the last ten years, Ramji argues, the industry tried to figure out what the de facto standard for scale-out computing should be. By 2019, it became clear that Kubernetes was the answer to that.

“This next decade is about what is the de facto standard for scale-out data? We think that’s got certain affordances, certain structural needs and we think that the decades that Cassandra has spent getting hardened puts it in a position to be data for that wave.”

McKenzie also noted that Cassandra’s built-in features, like support for multiple data centers and geo-replication, rolling updates and live scaling, as well as wide support across programming languages, give it a number of advantages over competing databases.

“It’s easy to forget how much Cassandra gives you for free just based on its architecture,” he said. “Losing the power in an entire datacenter, upgrading the version of the database, hardware failing every day? No problem. The cluster is 100 percent always still up and available. The tooling and expertise of The Last Pickle really help bring all this distributed and resilient power into the hands of the masses.”
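
The resilience McKenzie describes falls out of Cassandra’s quorum-based replication. As an illustrative sketch (plain arithmetic, not the database’s actual internals), a QUORUM read or write needs a majority of replicas, which is one reason multi-datacenter clusters are often spread across three data centers rather than two:

```python
def quorum(replication_factor: int) -> int:
    # Cassandra's QUORUM is a simple majority of replicas.
    return replication_factor // 2 + 1

def survives_dc_loss(dcs: int, rf_per_dc: int) -> bool:
    """Can a cluster still reach a global QUORUM (counted across all
    replicas in all data centers) after losing one entire datacenter?
    Illustrative arithmetic only, not Cassandra's actual internals."""
    total_replicas = dcs * rf_per_dc
    surviving = (dcs - 1) * rf_per_dc
    return surviving >= quorum(total_replicas)

# Two DCs with RF=3 each: quorum of 6 replicas is 4, but only 3
# survive a DC loss, so a global QUORUM fails...
print(survives_dc_loss(2, 3))  # False
# ...while three DCs leave 6 of 9 replicas, comfortably above quorum.
print(survives_dc_loss(3, 3))  # True
```

(In practice, deployments also lean on LOCAL_QUORUM, which only counts replicas in the local datacenter, to keep latency down during a remote outage.)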

The two companies did not disclose the price of the acquisition.

Ampere launches new chip built from ground up for cloud workloads

Ampere, the chip startup run by former Intel President Renee James, announced a new chip today that she says is designed specifically to optimize for cloud workloads.

Ampere VP of product Jeff Wittich says the new chip is called the Ampere Altra, and it has been designed with features that should make it attractive to cloud providers. The design focuses on three main areas: high performance, scalability and power efficiency, all elements that are important to cloud vendors operating at scale.

The Altra is an ARM chip with some big features. “It’s 80 64-bit ARM cores, or 160 cores in a two-socket platform; we support both one-socket and two-socket [configurations]. We are running at 3 GHz turbo, and that’s 3 GHz across all of the cores. Because of the way that cloud delivers compute, you’re utilizing all the cores as much of the time as possible, so our turbo performance was optimized for all of the cores being able to sustain it all the time,” Wittich explained.

The company sees this chip as a kind of workhorse for the cloud. “We’ve really looked at this as we’re designing a general purpose CPU that is built for the cloud environment, so you can utilize that compute the way the cloud utilizes that type of compute. So it supports the vast array of all of the workloads that run in the cloud,” he said.

Founder and CEO James says the company has been working with its cloud customers to give them the kind of information they need to optimize the chip for their individual workloads at a granular configuration level, something the hyperscalers in particular really require.

“Let’s go do what we can to build the platform that delivers the raw power and performance, the kind of environment that you’re looking for, and then have a design approach that enables them to work with us on what’s important and the kind of control, that kind of feature set that’s unique because each one of them have their own software environment,” James explained.

Among the companies working with Ampere early on have been Oracle (an investor, according to Crunchbase) and Microsoft, among others.

James says one of the unforeseen challenges of delivering this chip is possible disruption to the supply chain due to the COVID-19 coronavirus outbreak and its impact in Asia, where many of the parts come from and the chips are assembled.

She says the company has taken that into consideration and has built up a worldwide supply chain that she hopes will help with hiccups that might occur because of supply-chain slowdowns.

Stack Overflow expands its Teams service with new integrations

Most developers think of Stack Overflow as a question and answer site for their programming questions. But over the last few years, the company has also built a successful business in its Stack Overflow for Teams product, which essentially offers companies a private version of its Q&A product. Indeed, the Teams product now brings in a significant amount of revenue for the company and the new executive team at Stack Overflow is betting that it can help the company grow rapidly in the years to come.

To make Teams even more attractive to businesses, the company today launched a number of new integrations with Jira (Enterprise and Business), GitHub (Enterprise and Business) and Microsoft Teams (Enterprise). These join existing integrations with Slack, Okta and the Business tier of Microsoft Teams.

“I think the integrations that we have been building are reflective of that developer workflow and all of the number of tools that someone who is building and leveraging technology has to interact with,” Stack Overflow Chief Product Officer Teresa Dietrich told me. “When we think about integrations, we think about the vertical right, and I think that ‘developer workflow’ is one of those industry verticals that we’re thinking about. ChatOps is obviously another one, as you can see from our Slack and Teams integration. And the JIRA and GitHub [integrations] that we’re building are really at the core of a developer workflow.”

Current Stack Overflow for Teams customers include the likes of Microsoft, Expensify and Wix. As the company noted, 65 percent of its existing Teams customers use GitHub, so it’s no surprise that it is building out this integration.

The Case for Limiting Your Browser Extensions

Last week, KrebsOnSecurity reported to health insurance provider Blue Shield of California that its Web site was flagged by multiple security products as serving malicious content. Blue Shield quickly removed the unauthorized code. An investigation determined it was injected by a browser extension installed on the computer of a Blue Shield employee who’d edited the Web site in the past month.

The incident is a reminder that browser extensions — however useful or fun they may seem when you install them — typically have a great deal of power and can effectively read and/or write all data in your browsing sessions. And as we’ll see, it’s not uncommon for extension makers to sell or lease their user base to shady advertising firms, or in some cases abandon them to outright cybercriminals.

The health insurance site was compromised after an employee at the company edited content on the site while using a Web browser equipped with a once-benign but now-compromised extension which quietly injected code into the page.

The extension in question was Page Ruler, a Chrome addition with some 400,000 downloads. Page Ruler lets users measure the inch/pixel width of images and other objects on a Web page. But the extension was sold by the original developer a few years back, and for some reason it’s still available from the Google Chrome store despite multiple recent reports from people blaming it for spreading malicious code.

How did a browser extension lead to a malicious link being added to the health insurance company Web site? This compromised extension tries to determine whether the person using it is typing content into specific Web forms, such as the post editor of a content management system like WordPress or Joomla.

In that case, the extension silently adds a request for a javascript link to the end of whatever the user types and saves on the page. When that altered HTML content is saved and published to the Web, the hidden javascript code causes a visitor’s browser to display ads under certain conditions.
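
A site owner can hunt for this kind of injection by scanning published HTML for script tags that load from unexpected hosts. Here is a minimal sketch using Python’s standard library; the allowlist and the flagged domain are hypothetical placeholders, not a real detection rule:

```python
from html.parser import HTMLParser
from urllib.parse import urlparse

# Hosts you actually expect to serve script on your site --
# a hypothetical allowlist; substitute your own domains.
ALLOWED_SCRIPT_HOSTS = {"example.com", "cdn.example.com"}

class InjectedScriptFinder(HTMLParser):
    """Flags <script src=...> tags pointing at hosts outside the
    allowlist -- the kind of tag the compromised extension quietly
    appended to content as it was saved."""
    def __init__(self):
        super().__init__()
        self.suspicious = []

    def handle_starttag(self, tag, attrs):
        if tag != "script":
            return
        src = dict(attrs).get("src")
        if src and urlparse(src).hostname not in ALLOWED_SCRIPT_HOSTS:
            self.suspicious.append(src)

page = '<p>My post</p><script src="https://linkojager.example/212b.js"></script>'
finder = InjectedScriptFinder()
finder.feed(page)
print(finder.suspicious)  # ['https://linkojager.example/212b.js']
```

A check like this, run against pages after they are published, would have caught the rogue tag long before antivirus products started flagging the site.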

Who exactly gets paid when those ads are shown or clicked is not clear, but there are a few clues about who’s facilitating this. The malicious link that set off antivirus alarm bells when people tried to visit Blue Shield California downloaded javascript content from a domain called linkojager[.]org.

The file it attempted to download — 212b3d4039ab5319ec.js — appears to be named after an affiliate identification number designating a specific account that should get credited for serving advertisements. A simple Internet search shows this same javascript code is present on hundreds of other Web sites, no doubt inadvertently published by site owners who happened to be editing their sites with this Page Ruler extension installed.

If we download a copy of that javascript file and view it in a text editor, we can see the following message toward the end of the file:

[NAME OF EXTENSION HERE]’s development is supported by advertisements that are added to some of the websites you visit. During the development of this extension, I’ve put in thousands of hours adding features, fixing bugs and making things better, not mentioning the support of all the users who ask for help.

Ads support most of the internet we all use and love; without them, the internet we have today would simply not exist. Similarly, without revenue, this extension (and the upcoming new ones) would not be possible.

You can disable these ads now or later in the settings page. You can also minimize the ads appearance by clicking on partial support button. Both of these options are available by clicking ’x’ button in the corner of each ad. In both cases, your choice will remain in effect unless you reinstall or reset the extension.

This appears to be boilerplate text used by one or more affiliate programs that pay developers to add a few lines of code to their extensions. The opt-out feature referenced in the text above doesn’t actually work because it points to a domain that no longer resolves — thisadsfor[.]us. But that domain is still useful for getting a better idea of what we’re dealing with here.

Registration records maintained by DomainTools [an advertiser on this site] say it was originally registered to someone using the email address frankomedison1020@gmail.com. A reverse WHOIS search on that unusual name turns up several other interesting domains, including icontent[.]us.

icontent[.]us is currently not resolving either, but a cached version of it at Archive.org shows it once belonged to an advertising network called Metrext, which marketed itself as an analytics platform that let extension makers track users in real time.

An archived copy of the content once served at icontent[.]us promises “plag’n’play” capability.

“Three lines into your product and it’s in live,” iContent enthused. “High revenue per user.”

Another domain tied to Frank Medison is cdnpps[.]us, which currently redirects to the domain “monetizus[.]com.” Like its competitors, Monetizus’ site is full of grammar and spelling errors: “Use Monetizus Solutions to bring an extra value to your toolbars, addons and extensions, without loosing an audience,” the company says in a banner at the top of its site.

Be sure not to “loose” out on sketchy moneymaking activities!

Contacted by KrebsOnSecurity, Page Ruler’s original developer Peter Newnham confirmed he sold his extension to MonetizUs in 2017.

“They didn’t say what they were going to do with it but I assumed they were going to try to monetize it somehow, probably with the scripts their website mentions,” Newnham said.

“I could have probably made a lot more running ad code myself but I didn’t want the hassle of managing all of that and Google seemed to be making noises at the time about cracking down on that kind of behaviour so the one off payment suited me fine,” Newnham said. “Especially as I hadn’t updated the extension for about 3 years and work and family life meant I was unlikely to do anything with it in the future as well.”

Monetizus did not respond to requests for comment.

Newnham declined to say how much he was paid for surrendering his extension. But it’s not difficult to see why developers might sell or lease their creation to a marketing company: Many of these entities offer the promise of a hefty payday for extensions with decent followings. For example, one competing extension monetization platform called AddonJet claims it can offer revenues of up to $2,500 per day for every 100,000 users in the United States (see screenshot below).
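
Taking AddonJet’s advertised ceiling at face value, that rate works out to about 2.5 cents per user per day. A quick back-of-the-envelope sketch (the install count is Page Ruler’s; the assumption that every install is a U.S. user is obviously unrealistic):

```python
DAILY_RATE_USD = 2500       # AddonJet's advertised ceiling
USERS_PER_BLOCK = 100_000   # ...per this many U.S. users

per_user_per_day = DAILY_RATE_USD / USERS_PER_BLOCK
print(per_user_per_day)  # 0.025 -- i.e. 2.5 cents per user per day

# Page Ruler's ~400,000 installs, if (unrealistically) all counted:
print(400_000 / USERS_PER_BLOCK * DAILY_RATE_USD)  # 10000.0 per day
```

Even heavily discounted for non-U.S. users and inactive installs, numbers like these make a one-off sale of an abandoned extension an easy decision for a developer.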

[Screenshot of AddonJet’s pitch: “Read here how its work!”]

I hope it’s obvious by this point, but readers should be extremely cautious about installing extensions — sticking mainly to those that are actively supported and respond to user concerns. Personally, I do not make much use of browser extensions. In almost every case I’ve considered installing one I’ve been sufficiently spooked by the permissions requested that I ultimately decided it wasn’t worth the risk.

If you’re the type of person who uses multiple extensions, it may be wise to adopt a risk-based approach going forward. Given the high stakes that typically come with installing an extension, consider carefully whether having the extension is truly worth it. This applies equally to plug-ins designed for Web site content management systems like WordPress and Joomla.

Do not agree to update an extension if it suddenly requests more permissions than a previous version. This should be a giant red flag that something is not right. If this happens with an extension you trust, you’d be well advised to remove it entirely.

Also, never download and install an extension just because some Web site says you need it to view some type of content. Doing otherwise is almost always a high-risk proposition. Here, Rule #1 from KrebsOnSecurity’s Three Rules of Online Safety comes into play: “If you didn’t go looking for it, don’t install it.” Finally, in the event you do wish to install something, make sure you’re getting it directly from the entity that produced the software.

Google Chrome users can see any extensions they have installed by clicking the three dots to the right of the address bar, selecting “More tools” in the resulting drop-down menu, then “Extensions.” In Firefox, click the three horizontal bars next to the address bar and select “Add-ons,” then click the “Extensions” link on the resulting page to view any installed extensions.

Mind Games | The Evolving Psychology of Ransom Notes

Ransomware is one of cybercrime’s most fearsome weapons. With some estimates claiming it could cause losses of $7.5 billion in the US and $2.3 billion for Canadian companies, there’s no doubt its financial impact is felt throughout the world, even if those figures may not be quite on the mark.

But in addition to the direct and indirect financial damage, ransomware differs from other forms of cybercrime by employing deep psychological mechanisms. In contrast to almost any other type of malware, ransomware doesn’t conceal its activity; it announces itself to the victim and coerces the target toward action (paying the ransom). Precisely how ransomware leverages the victim’s fears, beliefs, motives and values has evolved over time, both in response to attackers learning from their own mistakes – resulting in victims not paying – and the behavior of victims and the security industry, such as getting better at off-site backups and creating decryptors. In this post, we take a look at how the psychological mechanisms used by ransomware operators have evolved over time in an attempt to maximize the chances of a payout.

What Do Ransomware Attackers Need?

The basic functionality of ransomware involves initial infection, rapid encryption and victim notification, where the victim is informed that they have lost access to their data and must make payment for its release. On a psychological level, criminals need to:

A.    Grab the attention of the victim (ransomware that goes unnoticed is useless).

B.    Apply leverage: victims must be persuaded that payment is their only viable option.

C.    Create a sense of urgency: in order to maximize profitability, speed is of the essence (it allows greater throughput and higher returns before the malware is neutralized by security vendors).

Ever since the first ransomware was introduced in 1989, the ransom demand methods have evolved through continual “trial and error” in an ongoing attempt to maximize the yield and reduce the risk involved.

Early Ransomware: Your Check is in the Post!

The first ever ransomware was created in 1989 by the biologist Joseph L. Popp. He distributed the malware in a rather old-fashioned way, by mailing (not emailing) 20,000 infected diskettes labeled “AIDS Information: Introductory Diskettes” to attendees of the World Health Organization’s international AIDS conference in Stockholm. Once executed, the malware hid file directories, encrypted file names and demanded victims send $189 to a PO Box in Panama if they wanted their data decrypted.

As it turned out, Popp made little if any money. Decryptors were soon created that could reverse the damage done, but the malware did strike terror into its victims. There were reports that some medical institutions threw away up to 10 years of research after being hit by the ransomware.

Popp used several kinds of notifications to the victim; like many others that would follow in his footsteps, he indulged in mocking and scaring the victim.

Hurry, Hurry! Buy Now, While Stocks Last!

Many years passed before ransomware hit the big time and started to employ other psychological mechanisms. One of the first to be introduced was a timer counting down to a point in the near future (say, 48 hours from infection), after which the files would be deleted and become totally unrecoverable, regardless of whether the victim subsequently wanted to pay.

This mechanism is well known from the marketing world and is used to create a sense of urgency. However, many ransomware victims were unable to pay within the given time, didn’t understand what they had to do, or wasted the whole countdown period looking for help. As a result, many missed the deadline, forcing attackers to carry out their threat and delete the encryption keys, leaving the files locked forever.

While the countdown method helped enhance the evil, relentless reputation of this kind of attack, it had a downside for the criminals: it wasn’t very effective at maximizing returns. Many victims would have paid eventually, but the countdown made it impossible to collect the ransom. As a result, ransomware perpetrators were eventually forced to adapt the method. Newer forms of ransomware now often try to incentivise the victim by offering a lower price the faster they respond: “The faster you get in contact – the lower price you can expect.”

Shock and Awe: Alarming Sounds and Images

Pushing people to action then shifted to using visual and auditory scare tactics. The ransom note became more sinister and incorporated scary visuals and alarming sounds, intended to make it impossible to ignore. 

The Cerber ransomware used a VBScript to play an audio message that repeatedly announced “Attention! Your documents, photos, databases and other important files have been encrypted!”.

At the bottom of the Cerber ransom note pictured above, note the Latin dictum, “Quod me non necat fortiorem facit”, which translates to the more familiar “That which does not kill me, makes me stronger”. Quite why the attackers chose to include this little adage of fortitude remains a mystery; perhaps they thought it would help encourage some of their more educated victims to see the wisdom of paying!

But in terms of ensuring and expediting payment, the results were inconclusive (for a deeper dive into the psychological mechanisms used in ransomware splash screens, see this study sponsored by SentinelOne).

Waving a Bigger Stick

When threats and scare techniques were not enough to drive people to action (or at least not fast enough), ransomware attackers adapted their psychological ploys. Recent examples of ransomware take a variety of approaches, from showing much greater aggression, aimed at convincing people to pay the full sum instead of negotiating, to ‘soft sales’ with empathic concern. 

A good example of the former is the ransom note of MegaCortex, which warns victims not to try to send partial payment or to negotiate a discount, else they will never see their data again.

[Image: MegaCortex ransom note]

Other ransomware operators understand that different victims may require different psychological entreaties and instead opt for empathy. Snake ransomware for example, tries to come across as reasonable and concerned with a FAQ-style presentation and reassuring language like “don’t worry” and “you can be up and running in no time”.

Notably the “demand” shifts to the more familiar ‘soft’ marketing call to action of “if you are interested in purchasing…”. About all that’s missing from this as compared to a regular sales message is the offer of 0% interest-free financing! 

Playing the Shame Game

From sometime around 2013 or 2014, a number of more organized attackers began quietly exfiltrating victim data prior to encrypting it. Cerber ransomware was perhaps one of the first. But recently, we have seen a growing shift in using the exfiltrated data directly as a method of coercion. The technique was perhaps first used by Maze ransomware, and now is also leveraged by DoppelPaymer, Sodinokibi and Nemty. This new twist involves setting up a publicly accessible website and exposing the data of uncooperative victims. 

The technique of threatening the victims with the publication of their sensitive data if they don’t pay turns the screw ever more tightly by discouraging victims from seeking out decryptors or avoiding making payment by restoring from backups. Now, even if the victim can recover, the threat of their data being exposed may still convince them to pay up.

When Script Kiddies Play with Ransomware

Last but not least, and what to some could be the most sinister of all, one recent variant combines ransomware with sextortion and demands payment in nude photos instead of Bitcoin. The malware shows a message requesting that victims send nude images to a specific email address in order to recover their files.

Most likely the work of children or adolescents and with a public decryptor already available, it’s nevertheless a prime example of how ransomware code is now in the hands of anyone with malicious intent.

Conclusion

The psychological impacts of cybercrime, and ransomware in particular, are profound. For many victims it results in worry, anguish, disbelief, and a sense of helplessness. Typically, those who have experienced this kind of cybercrime report that the effects last months after the actual incident. In this regard, ignorance might actually be bliss for victims, but sadly, this is one facet that ransomware victims will continue to suffer from in the foreseeable future.

As we have seen, with ransomware operators increasingly willing and able to exfiltrate data prior to encryption, organizations can no longer rely on even a good backup policy as an effective means to mitigate the threat of ransomware. The only sure defense is to stop the breach occurring in the first place, and for that you need a trusted, automated behavioral AI solution watching your back. If you’d like to see how SentinelOne can help protect your organization from the ransomware threat, contact us today or request a free demo.



Thought Machine nabs $83M for a cloud-based platform that powers banking services

The world of consumer banking has seen a massive shift in the last ten years. Gone are the days when you could open an account, take out a loan, or discuss changing the terms of your banking only by visiting a physical branch. Now, you can do all this and more with a few quick taps on your phone screen — a shift that has accelerated as customers expect and demand ever faster and more responsive banking services.

As one mark of that switch, today a startup called Thought Machine, which has built cloud-based technology that powers this new generation of services on behalf of both old and new banks, is announcing some significant funding — $83 million — a Series B that the company plans to use to continue investing in its platform and growing its customer base.

To date, Thought Machine’s customers are primarily in Europe and Asia — they include large, legacy outfits like Standard Chartered, Lloyds Banking Group, and Sweden’s SEB through to “challenger” (AKA neo-) banks like Atom Bank. Some of this financing will go towards boosting the startup’s activities in the US, including opening an office in the country later this year and moving ahead with commercial deals.

The funding is being led by Draper Esprit, with participation also from existing investors Lloyds Banking Group, IQ Capital, Backed and Playfair.

Thought Machine, which started in 2014 and now employs 300, is not disclosing its valuation, but Paul Taylor, the CEO and founder, noted that the market cap is currently “increasing healthily.” In its last round, according to PitchBook estimates, the company was valued at around $143 million, which, at this stage of funding, could put the valuation on this latest round somewhere in the range of $220 million to $320 million.

Thought Machine is not yet profitable, mainly because it is in growth mode, said Taylor. Of note, the startup has been through one major bankruptcy restructuring, although it appears that this was mainly for organisational purposes: all assets, employees and customers from one business controlled by Taylor were acquired by another.

Thought Machine’s primary product and technology is called Vault, a platform that contains a range of banking services: checking accounts, savings accounts, loans, credit cards and mortgages. Thought Machine does not sell directly to consumers, but sells by way of a B2B2C model.

The services are provisioned by way of smart contracts, which allows Thought Machine and its banking customers to personalise, vary and segment the terms for each bank — and potentially for each customer of the bank.
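Thought Machine’s actual contract interface isn’t detailed here, but the idea of “terms as parameters, behaviour as code” can be sketched in a few lines of Python. Everything below — the SavingsContract class, daily_interest, the rate figures — is an illustrative assumption, not Vault’s real API:

```python
# Hypothetical sketch of a "smart contract"-style product definition:
# the bank supplies the terms (parameters), and shared contract logic
# computes behaviour (here, daily interest accrual) from them.
from dataclasses import dataclass
from decimal import Decimal


@dataclass
class SavingsContract:
    """Illustrative product contract: terms are data, behaviour is code."""
    annual_rate: Decimal   # per-bank (or per-segment) parameter
    min_balance: Decimal   # below this threshold, no interest accrues

    def daily_interest(self, balance: Decimal) -> Decimal:
        """Accrue one day's interest on an eligible balance."""
        if balance < self.min_balance:
            return Decimal("0.00")
        return (balance * self.annual_rate / Decimal("365")).quantize(Decimal("0.01"))


# Two banks run the same contract code with different terms:
challenger = SavingsContract(annual_rate=Decimal("0.02"), min_balance=Decimal("0"))
legacy = SavingsContract(annual_rate=Decimal("0.01"), min_balance=Decimal("1000"))

print(challenger.daily_interest(Decimal("10000")))  # 0.55
print(legacy.daily_interest(Decimal("500")))        # 0.00
```

The point of the design is that varying or segmenting a product for a new bank, or even a single customer, means changing parameters rather than rewriting banking logic.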

Food for Thought (Machine)

It’s a little odd to think that there is an active market for banking services that are not built and owned by the banks themselves. After all, aren’t these the core of what banks are supposed to do?

But one way to think about it is in the context of eating out. Restaurants’ kitchens will often make in-house what they sell and serve. But in some cases, when it makes sense, even the best places will buy in (and subsequently sell) food that was crafted elsewhere. For example, a restaurant will re-sell cheese or charcuterie, and the wine is likely to come from somewhere else, too.

The same is the case for banks, whose “Crown Jewels” are in fact not the mechanics of their banking services, but their customer service, their customer lists, and their deposits. Better banking services (which may not have been built “in-house”) are key to growing these other three.

“There are all sorts of banks, and they are all trying to find niches,” said Taylor. Indeed, the startup is not the only one chasing that business. Others include Mambu, Temenos and Italy’s Edera.

In the case of the legacy banks that work with the startup, the idea is that these behemoths can migrate into the next generation of consumer banking services and banking infrastructure by cherry-picking services from the Vault platform.

“Banks have not kept up and are marooned on their own tech, and as each year goes by, it becomes more problematic,” noted Taylor.

In the case of neobanks, Thought Machine’s pitch is that it has already built the rails to run a banking service, so a startup — “new challengers like Monzo and Revolut that are creating quite a lot of disruption in the market” (and are growing very quickly as a result) — can integrate into these to get off the ground more quickly and handle scaling with less complexity (and lower costs).

Money talks

Taylor was new to fintech when he founded Thought Machine, but he has a notable track record in the world of tech that you could argue played a big role in his subsequent foray into banking.

Formerly an academic specialising in linguistics and engineering, his first startup, Rhetorical Systems, commercialised some of his early text-to-speech research and was sold to Nuance in 2004.

His second entrepreneurial effort, Phonetic Arts, was another speech startup, aimed at tech that could be used in gaming interactions. In 2010, Google approached the startup to see if it wanted to work on a new text-to-speech service it was building. It ended up acquiring Phonetic Arts, and Taylor took on the role of building and launching Google Now, with that voice tech eventually making its way to Google Maps, accessibility services, the Google Assistant and other places where speech-based interaction makes an appearance in Google products.

While he was working for years in the field, the step changes that really accelerated voice recognition and speech technology, Taylor said, were the rapid increases in computing power and data networks that “took us over the edge” in terms of what a machine could do, specifically in the cloud.

And those are the same forces, in fact, that have let consumers run their banking services from smartphone apps, and led us to want and expect more personalised services overall. Taylor’s move into building and offering a platform-based service to address the need for multiple third-party banking services follows from that, and is also the natural heir to the platform model that, you could argue, Google and other tech companies have perfected over the years.

Draper Esprit has to date built up a strong portfolio of fintech startups that includes Revolut, N26, TransferWise and Freetrade. Thought Machine’s platform approach is an obvious complement to that list. (Taylor did not disclose if any of those companies are already customers of Thought Machine’s, but if they are not, this investment could be a good way of building inroads.)

“We are delighted to be partnering with Thought Machine in this phase of their growth,” said Vinoth Jayakumar, Investment Director, Draper Esprit, in a statement. “Our investments in Revolut and N26 demonstrate how banking is undergoing a once in a generation transformation in the technology it uses and the benefit it confers to the customers of the bank. We continue to invest in our thesis of the technology layer that forms the backbone of banking. Thought Machine stands out by way of the strength of its engineering capability, and is unique in being the only company in the banking technology space that has developed a platform capable of hosting and migrating international Tier 1 banks. This allows innovative banks to expand beyond digital retail propositions to being able to run every function and type of financial transaction in the cloud.”

“We first backed Thought Machine at seed stage in 2016 and have seen it grow from a startup to a 300-person strong global scale-up with a global customer base and potential to become one of the most valuable European fintech companies,” said Max Bautin, Founding Partner of IQ Capital, in a statement. “I am delighted to continue to support Paul and the team on this journey, with an additional £15 million investment from our £100 million Growth Fund, aimed at our venture portfolio outperformers.”

Thoma Bravo completes $3.9B Sophos acquisition

Thoma Bravo announced today that it has closed its hefty $3.9 billion acquisition of security firm Sophos, marking yet another private equity deal in the books.

The deal was originally announced in October. Stockholders voted to approve the deal in December.

Shareholders were paid $7.40 per share, according to the company, which indicated that as part of the closing, the stock had ceased trading on the London Stock Exchange. It also pointed out that the sale price represented a 168% premium for investors who got in at the IPO price in June 2015.

Sophos hopes its new owner can help the company continue to modernize the platform. “With Thoma Bravo as a partner, we believe we can accelerate our progress and get to the future even faster, with dramatic benefits for our customers, our partners and our company as a whole,” Sophos CEO Kris Hagerman said in a statement. Whether it will enjoy those benefits or not, time will tell.

As for the buyer, it sees a company with a strong set of channel partners that it can access to generate more revenue moving forward under the Thoma Bravo umbrella. Sophos currently partners with 53,000 resellers and managed service providers, and counts more than 420,000 companies as customers. The platform currently helps protect 100 million users, according to the company. The buyer believes it can help build on these numbers.

The company was founded way back in 1985, and raised over $500 million before going public in 2015, according to PitchBook data. Products include Managed Threat Response, XG Firewall and Intercept X Endpoint.