What does Red Hat’s sale to IBM tell us about Couchbase’s valuation?

The IPO rush of 2021 continued this week with a fresh filing from NoSQL provider Couchbase. The company raised hundreds of millions while private, making its impending debut an important moment for a number of private investors, including venture capitalists.

According to PitchBook data, Couchbase was last valued at a post-money valuation of $580 million when it raised $105 million in May 2020. The company — despite its expansive fundraising history — is not a unicorn heading into its debut to the best of our knowledge.

We’d like to uncover whether it will be one when it prices and starts to trade, so we dug into Couchbase’s business model and its financial performance, hoping to better understand the company and its market comps.

The Couchbase S-1

The Couchbase S-1 filing details a company that sells database tech. More specifically, Couchbase offers customers database technology that includes what NoSQL can offer (“schema flexibility,” in the company’s phrasing), as well as the ability to ask questions of their data with SQL queries.
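
To make that pitch concrete, here is a minimal sketch of querying schema-flexible JSON documents with a SQL-style (N1QL) statement through the Couchbase Python SDK. The bucket name, fields, and credentials are hypothetical, and import paths vary by SDK version; treat this as an illustration of the model, not production code.

```python
# Minimal sketch: SQL-style (N1QL) query over schema-flexible JSON documents.
# Assumes the Couchbase Python SDK; names and credentials are illustrative.
from couchbase.auth import PasswordAuthenticator
from couchbase.cluster import Cluster
from couchbase.options import ClusterOptions

cluster = Cluster(
    "couchbase://localhost",
    ClusterOptions(PasswordAuthenticator("app_user", "app_password")),
)

# Documents need no fixed schema, yet can still be queried with familiar SQL syntax.
query = """
    SELECT o.customer_id, SUM(o.total) AS lifetime_value
    FROM `orders` AS o
    WHERE o.status = "complete"
    GROUP BY o.customer_id
    ORDER BY lifetime_value DESC
    LIMIT 10
"""

for row in cluster.query(query):
    print(row)
```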

Couchbase’s software can be deployed in public clouds, hybrid environments, and even on-premises setups. The company sells to large enterprises, attracting 541 customers by the end of its fiscal 2021, which together generated $107.8 million in annual recurring revenue, or ARR.

Couchbase breaks its revenue into two main buckets. The first, subscription, includes software license income and what the company calls “support and other” revenues, defined as “post-contract support,” or PCS: a package of offerings that includes “support, bug fixes and the right to receive unspecified software updates and upgrades” for the length of the contract.

The company’s second revenue bucket is services, which is self-explanatory and lower-margin than its subscription products.

How Cyber Safe is Your Drinking Water Supply?

Amid multiple recent reports of hackers breaking into and tampering with drinking water treatment systems comes a new industry survey with some sobering findings: A majority of the 52,000 separate drinking water systems in the United States still haven’t inventoried some or all of their information technology systems — a basic first step in protecting networks from cyberattacks.

The Water Sector Coordinating Council surveyed roughly 600 employees of water and wastewater treatment facilities nationwide, and found 37.9 percent of utilities have identified all IT-networked assets, with an additional 21.7 percent working toward that goal.

The Council found when it comes to IT systems tied to “operational technology” (OT) — systems responsible for monitoring and controlling the industrial operation of these utilities and their safety features — just 30.5 percent had identified all OT-networked assets, with an additional 22.5 percent working to do so.

“Identifying IT and OT assets is a critical first step in improving cybersecurity,” the report concluded. “An organization cannot protect what it cannot see.”

It’s also hard to see threats you’re not looking for: 67.9 percent of water systems reported no IT security incidents in the last 12 months, a somewhat unlikely scenario.

Michael Arceneaux, managing director of the WaterISAC — an industry group that tries to facilitate information sharing and the adoption of best practices among utilities in the water sector — said the survey shows much room for improvement and a need for support and resources.

“Threats are increasing, and the sector, EPA, CISA and USDA need to collaborate to help utilities prevent and recover from compromises,” Arceneaux said on Twitter.

While documenting each device that needs protection is a necessary first step, a number of recent cyberattacks on water treatment systems have been blamed on a failure to properly secure water treatment employee accounts that can be used for remote access.

In April, federal prosecutors unsealed an indictment against a 22-year-old from Kansas who’s accused of hacking into a public water system in 2019. The defendant in that case is a former employee of the water district he allegedly hacked.

In February, we learned that someone hacked into the water treatment plant in Oldsmar, Fla. and briefly increased the amount of sodium hydroxide (a.k.a. lye, which is used to control acidity in the water) to 100 times the normal level. That incident stemmed from stolen or leaked employee credentials for TeamViewer, a popular program that lets users remotely control their computers.

In January, a hacker tried to poison a water treatment plant that served parts of the San Francisco Bay Area, reports Kevin Collier for NBC News. The hacker in that case also had the username and password for a former employee’s TeamViewer account.

Image: WaterISAC.

Andrew Hildick-Smith is a consultant who served more than 15 years managing remote access systems for the Massachusetts Water Resources Authority. He said the percentage of water systems that reported already having inventoried all of their IT systems roughly matches the share of larger water utilities (those serving more than 50,000 residents) that recently had to certify to the Environmental Protection Agency (EPA) that they are compliant with the Water Infrastructure Act of 2018.

The water act gives utilities serving between 3,300 and 50,000 residents until the end of this month to complete a cybersecurity risk and resiliency assessment.

But Hildick-Smith said the vast majority of the nation’s water utilities — tens of thousands of them — serve fewer than 3,300 residents, and those utilities currently do not have to report to the EPA about their cybersecurity practices (or the lack thereof).

“A large number of utilities — probably close to 40,000 of them — are small enough that they haven’t been asked to do anything,” he said. “But some of those utilities are kind of doing cybersecurity based on self motivation rather than any requirement.”

According to the water sector report, a great many of the nation’s water utilities are subject to economic disadvantages typical of rural and urban communities.

“Others do not have access to a cybersecurity workforce,” the report explains. “Operating in the background is that these utilities are struggling to maintain and replace infrastructure, maintain revenues while addressing issues of affordability, and comply with safe and clean water regulations.”

The report makes the case for federal funding of state and local systems to provide cybersecurity training, tools and services for those in charge of maintaining IT systems, noting that 38 percent of water systems allocate less than 1 percent of their annual budgets to cybersecurity.

As the recent hacking incidents above can attest, enabling some form of multi-factor authentication for remote access can blunt many of these attacks.

However, the sharing of remote access credentials among water sector employees may be a contributing factor in these recent incidents, since organizations that let multiple employees use the same account also are less likely to have any form of multi-factor enabled.
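
As a rough illustration of what “some form of multi-factor authentication” can look like in practice, here is a minimal sketch that verifies a time-based one-time password (TOTP) with the pyotp library before granting a remote-access session. The user store and session handling around it are hypothetical placeholders, not a reference to any specific utility’s setup.

```python
# Minimal sketch: require a per-user TOTP code in addition to a password
# before allowing remote access. Uses the pyotp library; the user store
# and session handling are illustrative placeholders.
import pyotp

# In a real deployment each operator gets their own secret, provisioned
# into an authenticator app -- never a shared account or shared secret.
USER_TOTP_SECRETS = {
    "operator_alice": pyotp.random_base32(),
    "operator_bob": pyotp.random_base32(),
}

def allow_remote_session(username: str, password_ok: bool, submitted_code: str) -> bool:
    """Grant a session only if the password check passed AND the TOTP code is valid."""
    secret = USER_TOTP_SECRETS.get(username)
    if not password_ok or secret is None:
        return False
    totp = pyotp.TOTP(secret)
    # valid_window=1 tolerates small clock drift between client and server.
    return totp.verify(submitted_code, valid_window=1)
```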

A copy of the report is available here (PDF).

Update, 6:25 p.m. ET: Clarified that the report was issued by the Water Sector Coordinating Council (not the WaterISAC).

The Good, the Bad and the Ugly in Cybersecurity – Week 25

The Good

This week saw another big victory in the battle against ransomware with the arrest of a number of individuals associated with the Clop ransomware operation. Law enforcement officials from Ukraine, the United States, and the Republic of Korea conducted over 20 searches across the Kyiv area, including of the defendants’ personal property. The individuals in question are reported to be responsible for nearly $500 million in financial damages.

The Clop ransomware team has been in operation in various forms since 2018. Since then, they have targeted a number of high-value entities including Software AG, Qualys, and TAM International. More recent operations involved the exploitation of Accellion’s legacy file transfer appliance (FTA). It is reported that the apprehended individuals were primarily involved in the money-laundering side of operations. Ultimately, this means that the primary actors behind Clop are still at large. However, this is still a significant blow to the group’s infrastructure and finances.

This week’s good news in the battle against cybercrime was not solely about Clop, though. A Russian individual tied to the operations of the Kelihos botnet has also been convicted. Oleg Koshkin maintained a “crypting” service that obfuscated malware payloads to evade detection by legacy AV software. Koshkin was convicted of providing a critical service that enabled other cybercriminals to infect thousands of computers around the world.

The Bad

While most well-known underground forums have banned discussions and commerce around ransomware, a few smaller outliers continue to host them. We recently came across two new families being offered to would-be criminals: M3rcury and Camelvalley. In both cases, these tools appear to be in the very early stages of active development.

M3rcury lists the following as its primary features:

  • Removal of backups (shadow copies and other backups)
  • Hybrid RSA AES-256 encryption
  • UAC bypass
  • Sandbox Detection
  • Evasion of heuristic analysis
  • Heavy obfuscation
  • Scantime packed and crypted
  • Single file decryption to get victim trust

All that and more for 150.00 USD!

Camelvalley is advertised at the same price with similar features, and they go so far as to call out the fact that other forums are backing out of the ransomware world. In their original post, they state:

“Since the Colonial Pipeline cyberattack, there are only few ransomwares out there. All RaaS went private only. And the Locker discussion has been banned nearly everywhere. Except there. So, i’ll propose you a deal you can’t refuse!

The Camelvalley Ransomware comes out with plenty of interesting features, and even a solid Anti-Ransomware protection bypass that makes your cyberattack even more easier. The main goal of this project is to let you use less time on working on the ransomware itself and focus more on the evasion and hacking.”

M3rcury is written in Go and supports both 32- and 64-bit payloads. Camelvalley does not state the same, but it would be a fair assumption. In addition, Camelvalley is said to have robust support for multithreaded encryption, to use a combination of the RSA-2048 and ChaCha20 encryption algorithms, and to make use of the old RIPlace technique.

As stated before, both of these new ransomware offerings appear to be in the very early stages of development. Whether they emerge to become a true threat in-the-wild remains to be seen. The important thing to remember is this corner of the crimeware market will never go away.

The Ugly

This week the source code for the Paradise Ransomware was leaked across hacking and underground forums. The leaked archive contains all that is required to compile Paradise components from scratch. Upon compilation, users have the configuration builder, decryptor tool, and encrypter (the actual malware).

Paradise was never a particularly interesting family of malware. However, it is important to note that the leaked code does work. In the hands of a crafty individual, it is possible to generate new ransomware samples with a handful of configuration options. The ransomware generated from this leak is detected and prevented by most modern security software. It is, however, possible for individuals to modify the source to their needs, so time will tell what becomes of this.

Also in the world of data breaches, it was another busy week. We saw disclosures from Eggfree Cake Box, Carnival Corp, Intuit and more. These attacks are only getting more frequent and more aggressive. We also have ransomware actors sometimes forgoing the encryption step altogether and simply stealing data and holding it hostage. It is imperative that time and resources be spent on a robust, working, and tested disaster recovery plan. When was the last time cold storage was checked and verified? When was the last time you tested a restore of your backups? You should be able to answer questions like these, in addition to ensuring strong user hygiene and powerful modern endpoint security controls.



First American Financial Pays Farcical $500K Fine

In May 2019, KrebsOnSecurity broke the news that the website of mortgage settlement giant First American Financial Corp. [NYSE:FAF] was leaking more than 800 million documents — many containing sensitive financial data — related to real estate transactions dating back 16 years. This week, the U.S. Securities and Exchange Commission settled its investigation into the matter after the Fortune 500 company agreed to pay a paltry penalty of less than $500,000.

First American Financial Corp.

If you bought or sold a property in the last two decades or so, chances are decent that you also gave loads of personal and financial documents to First American. According to data from the American Land Title Association, First American is the second largest mortgage title and settlement company in the United States, handling nearly a quarter of all closings each year.

The SEC says First American derives nearly 92 percent of its revenue from its title insurance segment, earning $7.1 billion last year.

Title insurance protects homebuyers from the prospect of someone contesting their legitimacy as the new homeowner. According to SimpleShowing.com, there are actually two title insurance policies in each transaction — one for the buyer and one for the lender (the latter also needs protection as they’re providing the mortgage to purchase the home).

Title insurance is not mandated by law, but most lenders require it as part of any mortgage transaction. In other words, if you wish to take out a mortgage on a home you will not be able to do so without giving companies like First American gobs of documents about your income, assets and liabilities — including quite a bit of sensitive financial data.

Aside from its core business competency — checking to make sure the property at issue in any real estate transaction is unencumbered by any liens or other legal claims against it — First American basically has one job: Protect the privacy and security of all these documents.

A redacted screenshot of one of many millions of sensitive records exposed by First American’s Web site.

It’s easy to see why companies like First American might not view protecting this data as sacrosanct, as the entire industry’s incentive for safeguarding all those sensitive documents is somewhat misaligned.

That is to say, in the title insurance industry the parties to a real estate transaction aren’t customers, but rather they are the product. The actual customers of the title insurance companies are principally the banks which back these mortgage transactions.

We see a similar dynamic with social media platforms, where the “user” is not the customer at all but the product whose data is being bought and sold by these platforms.

Roughly five months before KrebsOnSecurity notified First American that anyone with a web browser could view sensitive documents in its “Eagle Pro” database online just by changing some characters at the end of a link, an internal security audit at First American flagged the exact same vulnerability.

But the company never acted to fix it until the news media came calling.
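
The exposure as described (documents viewable by anyone who changed characters at the end of a link) is the classic pattern of a missing per-request authorization check. Below is a minimal, hypothetical sketch of that check; Flask is used only for illustration, and the record store and session fields are made up, not First American’s actual systems.

```python
# Minimal sketch: never serve a record just because the ID in the URL exists.
# Flask is used for illustration; the in-memory "database" and session check
# stand in for whatever storage and authentication a real system would use.
from flask import Flask, abort, session

app = Flask(__name__)
app.secret_key = "replace-me"  # illustrative only

# Hypothetical store: each record knows which parties may view it.
DOCUMENTS = {
    "900000001": {"body": "closing statement ...", "allowed_parties": {"buyer-123", "lender-456"}},
}

@app.route("/documents/<doc_id>")
def get_document(doc_id: str):
    doc = DOCUMENTS.get(doc_id)
    if doc is None:
        abort(404)
    party = session.get("party_id")  # set at login, never derived from the URL
    if party not in doc["allowed_parties"]:
        abort(403)  # entitled parties only, not "anyone with the link"
    return doc["body"]

if __name__ == "__main__":
    app.run()
```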

The SEC’s administrative proceeding (PDF) explains how things slipped through the cracks. Under First American’s documented vulnerability remediation policies, the data leak was classified as a security weakness with a “level 3” severity, which placed it in the “medium risk” category and required remediation within 45 days.

But due to a clerical error, the vulnerability was erroneously entered into First American’s automated tracking system as a level 2, or “low risk,” severity rather than level 3. Level 2 issues required remediation within 90 days. Even so, First American missed that mark.

The SEC said that under First American’s remediation policies, if the person responsible for fixing the problem is unable to do so based on the timeframes listed above, that employee must have their management contact the company’s information security department to discuss their remediation plan and proposed time estimate.

“If it is not technically possible to remediate the vulnerability, or if remediation is cost prohibitive, the [employee] and their management must contact Information Security to obtain a waiver or risk acceptance approval from the CISO,” the SEC explained. “The [employee] did not request a waiver or risk acceptance from the CISO.”

So, someone within First American accepted the risk, but that person neglected to ensure the higher-ups within the company also were comfortable with that risk. It’s difficult not to hum a tune whenever the phrase “accepted the risk” comes up if you’ve ever seen this excellent infosec industry parody.

The SEC took aim at First American because a few days after our May 24, 2019 story ran, the company issued an 8-K filing with the agency stating First American had no prior indication of any vulnerability.

“That statement demonstrated that First American’s senior management was not properly informed of the prior report of a vulnerability and a failure to remediate the problem,” wrote Michael Volkov, a 30-year federal prosecutor who now runs The Volkov Law Group in Washington, D.C.

Reporting for Reuters Regulatory Intelligence, Richard Satran says the SEC charged First American with violating Rule 13a-15(a) of the Exchange Act.

“The rule broadly requires firms involved in securities issuance to have a compliance process in place to assure material information follows securities laws,” Satran wrote. “The SEC avoided getting into the specific details of the breach and instead focused on the way its disclosure was handled.”

Mark Rasch, also a former federal prosecutor in Washington, said the SEC is signaling with this action that it intends to take on more cases in which companies flub security governance in some big way.

“It’s a win for the SEC, and for First America, but it’s hardly justice,” Rasch said. “It’s a paltry fine, and it involves no admission of guilt by First American.”

Rasch said First American’s first problem was labeling the weakness as a medium risk.

“This is lots of sensitive data you’re exposing to anyone with a web browser,” Rasch said. “That’s a high-risk vulnerability. It also means you probably don’t know whether or not anyone has accessed that data. There’s no way to tell unless you can go back through all your logs all those years.”

The SEC said the 800 million+ records had been publicly available on First American’s website since 2013. In August 2019, the company said a third-party investigation into the exposure identified just 32 consumers whose non-public personal information likely was accessed without authorization.

When KrebsOnSecurity asked how long it maintained access logs or how far back in time that review went, First American declined to be more specific, saying only that its logs covered a period that was typical for a company of its size and nature.

However, documents from New York financial regulators show First American was unable to determine whether records were accessed prior to June 2018 (one year prior to fixing the weakness).

The records exposed by First American would have been a virtual gold mine for phishers and scammers involved in Business Email Compromise (BEC) scams, which often impersonate real estate agents, closing agencies, title and escrow firms in a bid to trick property buyers into wiring funds to fraudsters. According to the FBI, BEC scams are the most costly form of cybercrime today.

First American is not out of the regulatory woods yet from this enormous data leak. In July 2020, the New York State Department of Financial Services announced the company was the target of its first-ever cybersecurity enforcement action in connection with the incident, charges that could bring steep financial penalties. That inquiry is ongoing.

The DFS considers each instance of exposed personal information a separate violation, and the company faces penalties of up to $1,000 per violation. According to the SEC, First American’s EaglePro database contained tens of millions of document images that included non-public personal information.

Customize Your EDR To Adapt To Your Environment With SentinelOne Storyline Active Response (STAR)

Modern adversaries continually automate their tactics, techniques, and procedures (TTPs) to evade defenses. To keep up, enterprise security teams should also be able to automate their response to the latest threats and identify ongoing campaigns in their environment. Machine-learning and rules-based detections capture unusual behaviors and common threats. However, they often require new agent logic, and updating your entire fleet to the latest agent to stop a new threat may not always be possible. Similarly, with EDR data producing millions or even billions of events a day, security teams need a way to look for the interesting behavioral and static indicators of compromise (IOCs) that might indicate a zero-day attack. While robust EDR data helps investigations, it may prove too noisy for useful alerting or discovering unusual behaviors.

Singularity ActiveEDR provides advanced detection capabilities and best-in-class visibility, and with Storyline Active Response (STAR)™ it allows the end user to write custom detection rules that address new threats or targeted threats specific to their industry or organization.

STAR lets enterprises incorporate custom detection logic and immediately push it out to their entire fleet, or a subset, to either kill any matching process or alert on it for further investigation. STAR can alleviate SOC burden as it can be used as a powerful policy enforcement tool, automatically mitigating threats and quarantining endpoints.

STAR can also add a new layer between threats and EDR data that alerts on a subset of interesting events instead of the entire dataset. This data can easily be fed into a SIEM, bringing down the cost of using EDR data in a SIEM while making sure that no interesting events slip by.

How STAR Works

ActiveEDR comes with a default set of behavioral detection rules created by high-level research teams and provides endpoint protection from day one. SentinelOne enables customers to leverage these insights with STAR. With STAR custom detection rules, SOC teams can turn queries from Deep Visibility, SentinelOne’s EDR data collection and querying mechanism, into automated hunting rules that trigger alerts and responses when rules detect matches. STAR also allows users an automated way to look at every endpoint event collected across their entire fleet and evaluate each of those events against a list of rules.

Create a STAR Rule In Four Steps

  1. Write a query in Deep Visibility or create a new custom rule.
  2. Add an event condition.
  3. Designate response actions.
  4. Save the Rule.

STAR evaluates every endpoint event collected against every STAR rule. For large enterprises, STAR evaluates each event, in a stream of a billion daily events, against up to 1,000 STAR rules. It does this by working with Deep Visibility, which collects billions of events a day, so many that it detected every step of the 176-step attack in the latest MITRE test. STAR leverages that industry-leading technology and query language to write criteria that determine, in near real-time, if a collected event is part of a threat or is suspicious.

What makes STAR invaluable is the set of response tools it puts in the users’ hands when an event matches its criteria. The engine not only integrates with Deep Visibility but also with the agent. By checking a box when creating a rule, the analyst can enable STAR to kill any process that matches a STAR rule. By checking a different box, the user can enable STAR to automatically quarantine any device that sees a matching event. Rules can also be written to detect suspicious events and alert on them, allowing the users to then consume those alerts in the UI or via Syslog for further analysis in a SIEM.
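
To make the evaluate-every-event-against-every-rule model concrete, here is a small, generic sketch of a streaming rule engine in Python. It is not SentinelOne code and does not use Deep Visibility’s actual query syntax; the event fields, rules, and response actions are illustrative stand-ins for the kill, quarantine, and alert options described above.

```python
# Generic sketch of a STAR-like flow: every incoming endpoint event is checked
# against a list of rules, and each matching rule triggers a response action.
# Field names, rules, and actions are illustrative, not SentinelOne's schema.
from dataclasses import dataclass
from typing import Callable, Dict, Iterable, List

Event = Dict[str, str]

@dataclass
class Rule:
    name: str
    matches: Callable[[Event], bool]
    action: str  # "alert", "kill_process", or "quarantine_device"

RULES: List[Rule] = [
    Rule(
        name="encoded-powershell",
        matches=lambda e: e.get("process_name") == "powershell.exe"
        and "-enc" in e.get("command_line", "").lower(),
        action="kill_process",
    ),
    Rule(
        name="ftp-from-server",
        matches=lambda e: e.get("process_name") == "ftp.exe"
        and e.get("device_role") == "server",
        action="alert",
    ),
]

def evaluate(stream: Iterable[Event]) -> None:
    """Check each event against every rule and dispatch the configured response."""
    for event in stream:
        for rule in RULES:
            if rule.matches(event):
                print(f"[{rule.action}] rule '{rule.name}' matched event {event}")

if __name__ == "__main__":
    evaluate([
        {"process_name": "powershell.exe", "command_line": "powershell -enc SQBFAFgA", "device_role": "laptop"},
        {"process_name": "ftp.exe", "command_line": "ftp 203.0.113.5", "device_role": "server"},
    ])
```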

Key STAR Use Cases

STAR has two main functions within a SOC, and most customers find value in both.

  1. Mitigate new and emerging zero-day threats
    No SOC Analyst wants to depend entirely on a vendor to protect from bleeding-edge attacks or novel threats emerging in niche locations or industries. As soon as they see a new threat emerge, analysts want the ability to write a rule that will detect and prevent that threat. Teams deeply value having the power to write their policy when they need to. STAR allows users to write rules that look for highly specific threats to their environment and automatically kill those threats.

    The screenshot below shows an example of a STAR rule to detect Hafnium Exchange zero-day threat.

  2. Augment SIEM data with low volume, high-value telemetry
    STAR allows users to generate new data points, highlighting suspicious behavior in their environment for automated cross-correlation in a SIEM or manual investigation. Security teams also find this data to be invaluable. SentinelOne has quickly become known for its industry-leading EDR visibility and longer default retention. STAR builds on that story with the ability to generate alerts on almost anything. Customers leverage that data via UI, API, and Syslog to stitch together complicated attacks and shut them down.

    The following screenshot shows an example of a STAR rule to find a compromised computer using FTP to exfiltrate data.


Conclusion

You can stay ahead of adversaries by customizing and automating detection rules that fit your business and environment with STAR.

With SentinelOne Storyline Active Response, you can proactively monitor and respond to incoming threat intelligence by turning queries into automated hunting rules. STAR is easy to use, powerful, and flexible thanks to Deep Visibility’s intuitive query language with regular expression support for complex queries.

We built STAR to enable your SOC team to react faster and more effectively. Whether you need to mitigate new and emerging zero-day threats with custom detection rules, augment SIEM and Data lake data with low volume, high-value telemetry, trigger automated workflows, or automate your threat hunting queries, SentinelOne Storyline Active Response has you covered.

If you would like to learn more about STAR and the SentinelOne XDR platform, contact us for more information or request a free demo.



Beamery raises $138M at an $800M valuation for its ‘operating system for recruitment’

Online job listings were one of the first things to catch on in the first generation of the internet. But that has, ironically, also meant that some of the most-used digital recruitment services around today are also some of the least evolved in terms of tapping into all of the developments that tech has to offer, leaving the door open for some disruption. Today, one of the startups doing just that is announcing a big round of funding to double down on its growth so far.

Beamery, which has built what it describes as a “talent operating system” — a way to manage sourcing, hiring and retaining of people, plus analyzing the bigger talent picture for an organization, a “talent graph” as Beamery calls it, in an all-in-one, end-to-end service — has raised $138 million, money that it plans to use to continue building out more technology, as well as growing its business, which has been expanding quickly and saw 337% revenue growth year over year in Q4.

The Ontario Teachers’ Pension Plan Board (Ontario Teachers’), a prolific tech investor, is leading the round by way of its Teachers’ Innovation Platform (TIP). Other participants in this Series C include several strategic backers who are also using Beamery: Accenture Ventures, EQT Ventures, Index Ventures, M12 (Microsoft’s venture arm) and Workday Ventures (the venture arm of the HR software giant).

Abakar Saidov, co-founder and CEO at London-based Beamery, told TechCrunch in an interview that it is not disclosing valuation, but sources in the know say it’s in the region of $800 million.

The round is coming on the heels of a very strong year for the company.

The “normal” way of doing things in the working world was massively upended with the rise of COVID-19 in early 2020, and within that, recruitment was among one of the most impacted areas. Not only were people applying and interviewing for jobs completely remotely, but in many cases they were getting hired, onboarded and engaged into new jobs without a single face-to-face interaction with a recruiter, manager or colleague.

And that’s before you consider the new set of constraints that HR teams were under in many places: variously, we saw hiring freezes, furloughs, layoffs and budget cuts (often more than one of these per business), and yet work still needed to get done.

All that really paved the way for platforms like Beamery’s — designed not only to be remote-friendly software-as-a-service running in the cloud, but to handle the whole recruiting and talent management process from a single place — to pick up new customers and prove their role as an updated, more user-friendly approach to the task of sourcing and placing talent.

“Traditional HR is very admin-heavy, and when you add in payroll and benefits, the systems that exist are very siloed,” said Saidov in the interview. “The innovation for us has been to move out of that construct and into something that is human, and has a human touch. From a data perspective, we’re creating the underlying system of record for all of the people touching a business. So when you build on top of that, everything looks like a consumer application.”

In the last 12 months, the company said that customers — which are in the area of large enterprises and include COVID vaccine maker AstraZeneca, Autodesk, Nasdaq, several major tech giants and strategic investor Workday — filled 1 million roles through its platform, a figure that includes not just sourcing and placing candidates from outside of an organization’s walls, but also filling roles internally.

The work that Beamery is doing is definitely helping the business not just pull its weight — its last round was a much more modest $28 million, which was raised way back in 2018 — but grow and invest in new services.

The company said it had a year-on-year increase of 462% in jobs posted across its customer base. A year before that (which would have extended into pre-pandemic 2019), the number of candidates pipelined increased by a mere 46%, pointing to acceleration.

Beamery today already offers a pretty wide range of different services.

They include tools to source candidates. This can be done organically by creating your own job boards to be found by anyone curious enough to look, and by leveraging other job boards on other platforms like LinkedIn, the Microsoft-owned professional networking platform that counts “Talent Solutions” — i.e. recruitment — as one of its primary business lines. (Recall Microsoft is one of Beamery’s backers.) It also provides tools to create and manage online recruitment events.

Beamery also offers tools to help people get the word out about a role, with a service akin to programmatic advertising (similar to ZipRecruiter) to populate other job boards, or run more targeted executive recruitment searches. It also provides a way for HR teams to create internal recruitment processes, and also run surveys with existing teams to get a better picture of the state of play.

And it has some analytics tools in place to measure how well recruitment drives, retention and other metrics are evolving to help plan what to do in the future.

The big question for me now is how and if Beamery will bring more into that universe. There have been some interesting startups emerging in the wider world of talent IT (if we could call it that) that could be interesting complements to what Beamery already has, or provide a roadmap for what it might try to build itself.

It includes much more extensive work on internal job boards (such as what Gloat has built); digging much deeper into building accurate pictures of who is at the company and what they do (see: ChartHop); or the many services that are building ways of sourcing and connecting with contractors, which are a huge, and growing, part of the talent equation for companies (see: Turing, Remote, Deel, Papaya Global, Lattice, Factorial and many others).

Beamery already includes contractors alongside full- and part-time roles that can be filled using its platform, but when it comes to managing those contractors, that’s something that Beamery does not do itself, so that could be one area where it might grow, too.

“The key reason enterprises work with us is to consolidate a bunch of workflows,” Saidov said. “HR hates having different systems and everything becomes easier when things interoperate well.” Employing contractors typically involves three elements: sourcing, management and scheduling, so Beamery will likely approach how it grows in that area by determining which piece might be “super core” to the centralization of more data, he added.

Two other areas he hinted are on Beamery’s roadmap are assessments — that is, providing tools to recruiters who want to measure the skills of applicants for jobs (another startup-heavy area today) — and tools to help recruiters do their jobs better, whether that involves more native communications tools in video and messaging, or Gong-like coaching to help them measure and improve screening and interviewing.

It might also consider developing a version for smaller businesses to use.

Questions investors are happy to see considered, it seems, as they invest in what looks like a winner in the bigger race. TIP’s other investments have included ComplyAdvantage, Epic Games, Graphcore, KRY and SpaceX, a long run in a wide field.

“Leading companies worldwide are prioritising recruitment and retention. They are turning to Beamery for a best-in-class talent solution that can be seamlessly integrated with their business,” said Maggie Fanari, MD for TIP in EMEA. “Beamery’s best-in-class approach is already recognized by top-tier companies. I’m excited by the company’s vision to use technology to support long-term talent growth and build better businesses. Beamery is the first company to bring predictive marketing and data science into recruitment. They are a truly innovative company, building a vision that can shape the future of work — the company fits all the criteria we look for in a TIP investment and more.”

Nylas, maker of APIs to integrate email and other productivity tools, raises $120M, passes 80K developers

Companies like Stripe and Twilio have put APIs front and center as an effective way to integrate complex functionality that may not be core to your own technology stack but is a necessary part of your wider business. Today, a company that has taken that model to create an effective way to integrate email, calendars and other tools into other apps using APIs is announcing a big round of funding to expand its business.

Nylas, which describes itself as a communications API platform — enabling more automation particularly in business apps by integrating productivity tools through a few lines of code — has raised $120 million in funding, money that it will be using to continue expanding the kinds of APIs that it offers, with a focus in particular not just on productivity apps, but AI and related tools to bring more automation into workflows.

Nylas is not disclosing its valuation, but this is a very significant step up for the company at a time when it is seeing strong traction.

This is more than double what Nylas had raised up to now ($55 million since being founded in 2015), and when it last raised — a $16 million Series B in 2018 — it said it had “thousands of developers” among its users. Now, that number has ballooned to 80,000, with Nylas processing some 1.2 billion API requests each day, working out to 20 terabytes of data, daily. It also said that revenue growth tripled in the last 12 months.

The Series C is bringing a number of interesting names to Nylas’s cap table. New investor Tiger Global Management is leading the round, with previous backers Citi Ventures, Slack Fund, 8VC and Round13 Capital also participating. Other new backers in this round include Owl Rock Capital, a division of Blue Owl; Stripe co-founders Patrick Collison and John Collison; Klarna CEO Sebastian Siemiatkowski; and Tony Fadell.

As with other companies in the so-called API economy, the gap and opportunity that Nylas has identified is that a lot of productivity tools largely exist in their own silos — meaning that when a person wants to use them while working in an application, they have to open a separate application to do so. At the same time, building new tools, or building a bridge to integrate an existing application, can be time-consuming and complex.

Nylas first identified this issue with email. Its integration for working with email and the data housed in it — compatible with messages from major providers like Microsoft and Google, as well as other services built on the IMAP protocol — picked up a lot of followers, leading the company to expand into other areas. Today those include scheduling and calendaring; a neural API for building in tools like sentiment analysis and productivity or workflow automation; and security integrations to streamline the Google OAuth security review process (used, for example, in apps geared at developers).
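
As a rough illustration of the pattern (not Nylas’s documented interface), the sketch below shows how an application typically consumes a hosted email API: a few lines of authenticated HTTP against the provider’s endpoint instead of speaking IMAP or provider-specific protocols directly. The URL, parameters, and token handling are hypothetical.

```python
# Rough sketch of the integration pattern a communications API enables:
# the application makes an authenticated HTTP call and gets normalized
# message data back, regardless of whether the mailbox lives on Microsoft,
# Google, or a generic IMAP server. Endpoint and fields are hypothetical.
import os
import requests

API_BASE = "https://api.email-provider.example/v1"   # placeholder, not a real endpoint
TOKEN = os.environ["EMAIL_API_TOKEN"]                 # per-mailbox access token

def recent_unread_subjects(limit: int = 10) -> list[str]:
    """Return the subjects of the most recent unread messages."""
    resp = requests.get(
        f"{API_BASE}/messages",
        headers={"Authorization": f"Bearer {TOKEN}"},
        params={"unread": "true", "limit": limit},
        timeout=10,
    )
    resp.raise_for_status()
    return [message["subject"] for message in resp.json()["data"]]

if __name__ == "__main__":
    for subject in recent_unread_subjects():
        print(subject)
```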

“The fundamental shift towards digital communications and connectivity has resulted in companies across all industries increasingly leaning on developers to solve critical business challenges and build unique and engaging products and experiences. As a result, APIs have become core to modern software development and digital transformation,” Gleb Polyakov, co-founder and CEO of Nylas, said in statement.

“Through our suite of powerful APIs, we’re arming developers with the tools and applications needed to meet customer and market needs faster, create competitive differentiation through powerful and customized user experiences, and generate operational ROI through more productive and intelligently automated processes and development cycles. We’re thrilled to continue advancing our mission to make the world more productive and are honored to have the backing of distinguished investors and entrepreneurs.”

Indeed, the rise of Nylas and the function it fulfills is part of a bigger shift we’ve seen in businesses overall: as organizations become more digitized and use more cloud-based apps to get work done, developers have emerged as key mechanics to help that machine run. A bigger emphasis on APIs to integrate services together is part of their much-used toolkit, one of the defining reasons for investors backing Nylas today.

“Companies are rapidly adopting APIs as a way to automate productivity and find new and innovative ways to support modern work and collaboration,” said John Curtius, a partner at Tiger, in a statement. “This trend has become critical to creating frictionless and meaningful data-driven communications that power digital transformation. We believe Nylas is uniquely positioned to lead the future of the API economy.” Curtius is joining the board with this round.

Corrected to note that Blue Owl is not connected to State Farm.

Google announces EPYC-based Tau virtual machines for Cloud

Google this morning announced the launch of Tau, a new family of virtual machines built on AMD’s third-gen EPYC processor. According to the company, the new x86-compatible system offers a 42% price-performance boost over standard VMs. Google notably first started utilizing AMD EPYC processors for Cloud back in 2017, while Amazon Cloud’s offerings date back to 2018.

Google claims the Tau family “leapfrogs” existing cloud VMs. The systems come in a variety of configurations, ranging up to 60 vCPUs per VM and 4 GB of memory per vCPU. Networking bandwidth goes up to 32 Gbps, and the VMs can be coupled with a variety of network-attached storage options.

“Customers across every industry are dealing with more demanding and data-intensive workloads and looking for strategic ways to speed up performance and reduce costs,” Google Cloud CEO Thomas Kurian said in a press release. “Our work with key strategic partners like AMD has allowed us to broaden our offerings and deliver customers the best price performance for compute-heavy, business-critical applications – all on the cleanest cloud in the industry.”

Image Credits: Google

Google has already signed up some high-profile customers for an early trial, including Twitter, Snap and DoIT.

“High performance at the right price point is a critical consideration as we work to serve the global public conversation,” Twitter Platform Lead Nick Tornow said in a blog post. “We are excited by initial tests that show potential for double digit performance improvement. We are collaborating with Google Cloud to more deeply evaluate benefits on price and performance for specific compute workloads that we can realize through use of the new Tau VM family.”

Image Credits: Google

The Tau VMs will be arriving for Google Cloud in Q3 of this year. The company has already opened the system up to clients for pre-registration. Pricing is dependent on the configuration. For example, a 32vCPU VM sporting 128GB RAM will run around $1.35 an hour.

Neo4j raises Neo$325M as graph-based data analysis takes hold in enterprise

Databases run the world, but database products are often some of the most mature and venerable software in the modern tech stack. Designers will pixel push, frontend engineers will add clicks to make it more difficult to drop out of a soporific Zoom call, but few companies are ever willing to rip out their database storage engine. Too much risk, and almost no return.

So it’s exceptional when a new database offering breaks through the barriers and redefines the enterprise.

Neo4j, which offers a graph-centric database and related products, announced today that it raised $325 million at a more than $2 billion valuation in a Series F deal led by Eurazeo, with additional capital from Alphabet’s venture wing GV. Eurazeo managing director Nathalie Kornhoff-Brüls will join the company’s board of directors.

That funding makes Neo4j among the most well-funded database companies in history, with a collective fundraise haul of more than half a billion dollars. For comparison, MongoDB, which trades on Nasdaq, raised $311 million in total (according to Crunchbase) before its IPO. Meanwhile, Cockroach Labs of CockroachDB fame has now raised $355 million in funding, including a $160 million round earlier this year at a similar $2 billion valuation.

The past decade has seen a whole new crop of next-generation database models, from scale-out SQL to document to key-value stores to time series and on and on. What makes graph databases like Neo4j unique is their focus on the connections between individual data entities. Graph-based data models have become central to modern machine learning and artificial intelligence applications, and are now widely used by data analysts in areas ranging from marketing to fraud detection.

CEO and co-founder Emil Eifrem said that Neo4j, which was founded back in 2007, has hit its growth stride in recent years given the rising popularity of graph-based analysis. “We have a deep developer community of hundreds of thousands of developers actively building applications with Neo4j in any given month, but we also have a really deep data science community,” he said.

In the past, most business analysis was built on relational databases. Yet interconnected complexity is creeping in everywhere, and that’s where Eifrem believes Neo4j has a durable edge. As an example, “any company that ships stuff is tapping into this global fine-grain mesh spanning continent to continent,” he suggested. “All of a sudden the ship captain in the Suez Canal … falls asleep, and then they block the Suez Canal for a week, and then you’ve got to figure out how will this affect my enterprise, how does that cascade across my entire supply chain.” With a graph model, that analysis is a cinch.
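
A hedged sketch of what that kind of cascade analysis looks like in practice: a multi-hop Cypher query run through the official Neo4j Python driver. The node labels, relationship types, and property names are made up for illustration; they are not a schema Neo4j ships.

```python
# Sketch: find every product whose supply chain passes through a blocked route,
# however many shipping legs away it is. Labels and properties are illustrative.
from neo4j import GraphDatabase

driver = GraphDatabase.driver("bolt://localhost:7687", auth=("neo4j", "password"))

CASCADE_QUERY = """
MATCH (r:Route {name: $route})<-[:USES]-(leg:Shipment)
MATCH (leg)-[:DELIVERS*1..6]->(p:Product)
RETURN DISTINCT p.sku AS sku, p.name AS name
"""

with driver.session() as session:
    # Variable-length traversal follows the chain across any number of hops (up to 6 here).
    for record in session.run(CASCADE_QUERY, route="Suez Canal"):
        print(record["sku"], record["name"])

driver.close()
```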

Neo4j says that 800 enterprises are customers and 75% of the Fortune 100 are users of the company’s products.

We last checked in with the company in 2020 when it launched 4.0, which offered unlimited scaling. Today, Neo4j comes in a couple of different flavors. It’s a database that can be either self-hosted or purchased as a cloud service, which it dubs Aura. That’s for the data storage folks. For the data scientists, the company offers the Neo4j Graph Data Science Library, a comprehensive set of tools for analyzing graph data. The company offers free (or “community”) tiers, affordable starting tiers and full-scale enterprise pricing options depending on needs.

Development continues on the database. This morning at its developers conference, Neo4j demonstrated what it dubbed its “super-scaling technology” on a 200 billion node graph with more than a trillion relationships between them, showing how its tools could offer “real-time” queries on such a large scale.

Unsurprisingly, Eifrem said that the new venture funding will be used to continue doubling down on “product, product, product” but emphasized a few major strategic initiatives as critical for the company. First, he wants to continue to deepen the company’s partnerships with public cloud providers. It already has a deep relationship with Google Cloud (GV was an investor in this round after all), and hopes to continue building relationships with other providers.

It’s also seeing a major uptick in interest from the APAC region. Eifrem said that the company recently opened up an office in Singapore to accelerate its sales in the broader IT market there.

Overall, “We think that graphs can be a significant part of the modern data landscape. In fact, we believe it can be the biggest part of the modern data landscape. And this round, I think, sends a clear signal [that] we’re going for it,” he said.

Erik Nordlander and Tom Hulme of GV were the leads for that firm. In addition, DTCP and Lightrock newly invested and previous investors One Peak, Creandum and Greenbridge Partners joined the round.

Gusto makes first acquisition, buying Ardius to expand into R&D tax credits

Free money from the government sounds like winning the lottery, but the reality is that most tech startups and even local retail businesses and restaurants can potentially qualify for tax credits related to research and development in the United States. Those credits, which help tech giants keep their tax rates near zero, are hard for smaller companies to receive because of extensive documentation requirements and potential audit costs.

So a number of startups have been launched to solve that gap, and now, larger companies are entering the fray as well.

Gusto, which started off with payroll for SMBs and has since expanded into employee onboarding, insurance, benefits and other HR offerings, today announced that it is acquiring Ardius, a startup designed to automate tax compliance, particularly around R&D tax credits.

The Los Angeles-based company was founded in 2018 by Joshua Lee, who previously worked for more than a decade at accounting firm EY. Terms of the deal were not disclosed, and Ardius will run as an independent business with the entire team transitioning to Gusto.

The strategy here is simple: Most R&D credits require payroll documentation, data that is already stored in Gusto’s system of record. Ardius in its current incarnation was designed to tap into a number of payroll data providers and extract that data and turn it into verifiable tax documents. With this tie-up, the companies can simply do that automatically for Gusto’s extensive number of customers.
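
A simplified sketch of the kind of calculation payroll data makes possible: summing the wages of employees doing qualified research, weighted by the share of their time spent on it. Real R&D credit rules (Section 41) are far more involved; the records, fields, and flat rate below are purely illustrative, not how Ardius or Gusto compute anything.

```python
# Simplified sketch: estimate qualified research wages from payroll records.
# The fields, numbers, and flat credit rate are illustrative only; actual
# R&D credit rules involve base amounts, elections, and documentation.
from dataclasses import dataclass

@dataclass
class PayrollRecord:
    employee: str
    annual_wages: float
    research_time_fraction: float  # share of time on qualified R&D, 0.0 to 1.0

def qualified_research_wages(records: list[PayrollRecord]) -> float:
    return sum(r.annual_wages * r.research_time_fraction for r in records)

payroll = [
    PayrollRecord("backend engineer", 150_000, 0.8),
    PayrollRecord("data scientist", 160_000, 0.9),
    PayrollRecord("office manager", 70_000, 0.0),
]

qre = qualified_research_wages(payroll)
print(f"Qualified research wages: ${qre:,.0f}")
print(f"Rough credit at an illustrative 10% rate: ${qre * 0.10:,.0f}")
```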

Joshua Reeves, co-founder and CEO of Gusto, said that the acquisition falls in line with the company’s long-term focus on customers and simplicity. “We want to bring together technology, great service, [and] make government simpler,” he said. “In some ways, a lot of stuff we’re doing — make payroll simpler, make healthcare simpler, make PPP [loans] and tax credits simpler — just make these things work the way they’re intended to work.” The company presumably could have built out such functionality, but he noted that “time to market” was a crucial point in making Ardius the company’s first acquisition.

Tomer London, co-founder and chief product officer, said: “We’ve been looking at this space for a long time because it kind of connects to one of our original product principles of building a product that is opinionated.” In a space as complicated as HR, “we want to be out there and be an advisor, not just a tool. And this is just such a great example of where you can take the payroll data that we already have and in just a few clicks and in a matter of a few days, get access to really important cash flow for a business.” He noted that tax credits are “something that’s been on our roadmap for a long time.”

Gusto works with more than 100 third-party services that integrate on top of its platform. Reeves emphasized that while Ardius is part of Gusto, all companies — even those that might compete directly with the product — will continue to have equal access to the platform’s data. In its release, the company pointed out that Boast.ai, Clarus, Neo.Tax and TaxTaker are just some of the other tax products that integrate with Gusto today.

Of course, Ardius is just one of a number of competitors that have popped up in the R&D and economic development tax credit space. MainStreet, which I last profiled in 2020 for its seed round, just raised $60 million in funding in March led by SignalFire. Meanwhile, Neo.tax, which I also profiled last year, has raised a total of $5.5 million.

Reeves was sanguine about the attention the space is garnering and the potential competition for Ardius. When it comes to R&D tax credits, “whatever creates more accessibility, we’re a fan of,” he said. “It’s great that there’s more awareness because it’s still under-utilized frankly.” He emphasized that Gusto would be able to offer a more vertically-integrated solution given its data and software than other competitors in the space.

While the pandemic particularly hit SMBs, who often lacked the financial wherewithal of larger companies to survive the crisis, Gusto actually expanded its business as new companies sprouted up. Reeves said the company grew its customer base 50% in its last fiscal year, which ended in April. It “turns out in a health pandemic and in an economic crisis, things like payroll and accessing health care are quite important,” he said. Gusto launched a program to help SMBs collect the government’s stimulus PPP loans.

The company’s main bases of operation are in San Francisco, Denver and New York City, and the company has a growing contingent of remote workers, including the Ardius crew, who will remain based in LA. While Reeves demurred on future acquisitions, Gusto’s focus on expanding to a comprehensive financial wellness platform for both employees and businesses would likely suggest that additional acquisitions may well be in the offing in the future.