4 enterprise developer trends that will shape 2021

Technology has dramatically changed over the last decade, and so has how we build and deliver enterprise software.

Ten years ago, “modern computing” meant teams of network admins managing data centers, running one application per server, and deploying monolithic services through waterfall development, with manual releases managed by QA and release managers.

Today, we have multi-cloud and hybrid-cloud deployments, serverless services, continuous integration and infrastructure-as-code.

SaaS has grown from a nascent 2% of the $450B enterprise software market in 2009 to 23% in 2020, crossing $100B in revenue. PaaS and IaaS represent another $50B in revenue and are expected to double to $100B by 2022.

With 77% of the enterprise software market — over $350B in annual revenue — still on legacy and on-premise systems, modern SaaS, PaaS and IaaS can grow the market 3x-4x over the next decade simply by eating into that legacy base.

As the shift to cloud accelerates across the platform and infrastructure layers, here are four trends starting to emerge that will change how we develop and deliver enterprise software for the next decade.

1. The move to “everything as code”

Companies are building more dynamic, multiplatform and complex infrastructures than ever. We are seeing the “-aaS”-ification of the application, data, runtime and virtualization layers. Modern architectures must be extensible enough to mix and match any number of services.
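
To make the idea concrete, here is a minimal sketch of infrastructure-as-code using Pulumi's TypeScript SDK; the resource names are hypothetical, and any declarative IaC tool (Terraform, CloudFormation) follows the same pattern:

```typescript
import * as aws from "@pulumi/aws";

// Infrastructure declared as ordinary TypeScript: a versioned S3 bucket.
// Running `pulumi up` diffs this declaration against the live cloud state
// and creates, updates, or deletes resources until the two match.
const artifacts = new aws.s3.Bucket("app-artifacts", {
  versioning: { enabled: true },
  tags: { team: "platform", managedBy: "pulumi" },
});

// Exported outputs can be consumed by other stacks or CI pipelines.
export const bucketName = artifacts.id;
```

Because the definition lives in version control next to the application code, infrastructure changes get the same review, test and rollback workflow as any other change.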

Salesforce introduces several new developer tools, including serverless functions

Salesforce has a bunch of announcements coming out of the virtual TrailheaDX conference taking place later this week, starting today with some new developer tools. The goal of these tools is to give developers a more modern way of creating applications on top of the Salesforce platform.

Perhaps the most interesting of the three tools being announced today is Salesforce Functions, which enables developers to build serverless applications on top of Salesforce. With a serverless approach, the developer writes a series of functions, each of which runs in response to a triggering event. The cloud provider then delivers exactly the infrastructure resources required to run that operation and nothing more.
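
As a rough illustration of the model, here is what such a function might look like in Node.js/TypeScript; the handler signature and event fields are assumptions made for the sketch, not the actual Salesforce Functions API:

```typescript
// Hypothetical serverless function: one self-contained unit of work.
// The platform invokes it on demand, provisions just enough compute to
// run it, and scales back to zero when there is nothing to process.
interface InvocationEvent {
  recordId: string;
  payload: Record<string, unknown>;
}

export default async function execute(event: InvocationEvent): Promise<object> {
  // Placeholder business logic: derive a result from the triggering record.
  const fieldsProcessed = Object.keys(event.payload).length;
  return { recordId: event.recordId, fieldsProcessed };
}
```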

Wade Wegner, SVP of product for Salesforce and Salesforce DX, says the Salesforce offering gives developers a lot of flexibility around development languages such as Node.js or Java, and cloud platforms such as AWS or Azure. “I can just write my code, deploy it and let Salesforce operate it for me,” he said.

Wegner explained that the new approach lets developers build serverless applications with data that lives in Salesforce, and then run it on elastic infrastructure. This gives them the benefits of vertical and horizontal scale without having to be responsible for managing all aspects of how their application will run on the cloud infrastructure.

In addition to Functions, the company is also announcing Code Builder, a web-based IDE based on Microsoft’s Visual Studio Codespaces. “By leveraging Visual Studio Codespaces we can bring the same capabilities to developers right in the browser,” Wegner said.

He adds that this enables them to be more productive with support for many languages and frameworks in a browser in the context of the environment that they’re doing their work, while giving them a consistent and familiar experience.

Finally, the company is announcing the DevOps Center, which is a place to manage the growing complexity of delivering applications built on top of Salesforce in a modern continuous way. “It is really meant to provide new ways with which teams of developers can collaborate around the work that they’re doing, and to manage the complexities of continuously delivering applications…,” he said.

As is typical for Salesforce, the company is announcing these tools today, but they will not be generally available for some time. Functions and Code Builder are both in pilot, while DevOps Center will be available as a developer preview later this year.

Ampere announces latest chip with a 128-core processor

In the chip game, more is usually better, and to that end, Ampere announced the next chip on its product roadmap today, the Altra Max, a 128-core processor the company says is designed specifically to handle cloud-native, containerized workloads.

What’s more, the company has designed the chip to fit the same socket as its 80-core product announced last year (and in production now). That means engineers can keep the same socket in their designs for the new chip, which saves engineering time and eases production, says Jeff Wittich, VP of products at the company.

Wittich says that his company is working with manufacturers today to make sure they can build for all of the requirements for the more powerful chip. “The reason we’re talking about it now, versus waiting until Q4 when we’ve got samples going out the door is because it’s socket compatible, so the same platforms that the Altra 80 core go into, this 128-core product can go into,” he said.

He says that containerized workloads, video encoding, large scale out databases and machine learning inference will all benefit from having these additional cores.

While he wouldn’t comment on any additional funding, the company has raised $40 million, according to Crunchbase data, and Wittich says they have enough funding to go into high-volume production on their existing products later this year.

Like everyone, the company has faced challenges keeping its supply chain consistent throughout the pandemic, but when the virus started to hit Asia at the beginning of this year, the company set a plan in motion to find backup suppliers for the parts it would need should it run into pandemic-related shortages. Wittich says it took a lot of work, planning and coordination, but the company now feels confident it can deliver its products in spite of the uncertainty that exists.

“Back in January we actually already went through [our list of suppliers], and we diversified our supply chain and made sure that we had options for everything. So we were able to get in front of that before it ever became a problem,” he said.

“We’ve had normal kinds of hiccups here and there that everyone’s had in the supply chain, where things get stuck in shipping and they end up a little bit late, but we’re right on schedule with where we were.”

The company is already planning ahead for its 2022 release, which is in development. “We’ve got a test chip running through five nanometer right now that has the key IP and some of the key features of that product, so that we can start testing those out in silicon pretty soon,” he said.

Finally, the company announced that it’s working with some new partners, including Cloudflare, Packet (which was acquired by Equinix in January), Scaleway and Phoenics Electronics, a division of Avnet. These partnerships provide another way for Ampere to expand its market as it continues to develop.

The company was founded in 2017 by former Intel president Renee James.

Hasura launches managed cloud service for its open-source GraphQL API platform

Hasura is an open-source engine that can connect to PostgreSQL databases and microservices across hybrid- and multi-cloud environments and then automatically build a GraphQL API backend for them, making it easier for developers to then build their own data-driven applications on top of this unified API. For a while now, the San Francisco-based startup has offered a paid version (Hasura Pro) with enterprise-ready reliability and security tools, in addition to its free open-source version. Today, the company launched Hasura Cloud, which takes the existing Pro version, adds a number of cloud-specific features like dynamic caching, auto-scaling and consumption-based pricing, and brings those together in a fully managed service.
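
As a sketch of what consuming such an auto-generated API looks like, the snippet below posts a GraphQL query to a Hasura endpoint from TypeScript; the table and field names are invented for illustration, though the `order_by`/`limit` arguments follow Hasura's query conventions:

```typescript
// Hypothetical query against an auto-generated Hasura GraphQL API.
// Hasura derives the `orders` root field and its filter/sort arguments
// directly from a PostgreSQL table of the same name.
const query = `
  query RecentOrders {
    orders(order_by: { created_at: desc }, limit: 10) {
      id
      total
      customer { name }
    }
  }
`;

async function fetchRecentOrders(): Promise<void> {
  const res = await fetch("https://my-app.hasura.app/v1/graphql", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ query }),
  });
  const { data } = await res.json();
  console.log(data.orders); // rows served straight from Postgres via GraphQL
}

fetchRecentOrders();
```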


At its core, Hasura’s service promises businesses the ability to bring together data from their various siloed databases and allow their developers to extract value from them through its GraphQL APIs. While GraphQL is still relatively new, the Facebook-incubated technology has quickly become extremely popular among many development teams.

Before founding the company and launching it in 2018, Hasura CEO and co-founder Tanmai Gopal worked for a consulting firm — and like with so many founders, that’s where he got the inspiration for the service.

“One of the key things that we noticed was that in the entire landscape, computing is becoming better, there are better frameworks, it is easier to deploy code, databases are becoming better and they kind of work everywhere,” he said. “But this kind of piece in the middle that is still a bottleneck and that there isn’t really a good solution for is this data access piece.” Almost by default, most companies host data in various SaaS services and databases — and now they were trying to figure out how to develop apps based on this for both internal and external consumers, noted Gopal. “This data distribution problem was this bottleneck where everybody would just spend massive amounts of time and money. And we invented a way of kind of automating that,” he explained.

The choice of GraphQL was also pretty straightforward, especially because GraphQL services are an easy way for developers to consume data (even though, as Gopal noted, it’s not always fun to build the GraphQL service itself). One thing that’s unusual and worth noting about the core Hasura engine itself is that it is written in Haskell, which is a rather unusual choice.


The team tells me that Hasura is now nearing 50 million downloads for its free version and the company is seeing large and small users from across various industries relying on its products, which is probably no surprise, given that the company is trying to solve a pretty universal problem around data access and consumption.

Over the last few quarters, the team worked on launching its cloud service. “We’ve been thinking of the cloud in a very different way,” Gopal said. “It’s not your usual, take the open-source solution and host it, like a MongoDB Atlas or Confluent. What we’ve done is we’ve said, we’re going to re-engineer the open-source solution to be entirely multi-tenant and have completely pay-per-use pricing.”

Given this philosophy, it’s no surprise that Hasura’s pricing is purely based on how much data a user moves through the service. “It’s much closer to our value proposition,” Hasura co-founder and COO Rajoshi Ghosh said. “The value proposition is about data access. The big part of it is the fact that you’re getting this data from your databases. But the very interesting part is that this data can actually come from anywhere. This data could be in your third-party services, part of your data could be living in Stripe and it could be living in Salesforce, and it could be living in other services. […] We’re the data access infrastructure in that sense. And this pricing also — from a mental model perspective — makes it much clearer that that’s the value that we’re adding.”

Now, there are obviously plenty of other data-centric API services on the market, but Gopal argues that Hasura has an advantage because of its advanced caching for dynamic data, for example.

‘BlueLeaks’ Exposes Files from Hundreds of Police Departments

Hundreds of thousands of potentially sensitive files from police departments across the United States were leaked online last week. The collection, dubbed “BlueLeaks” and made searchable online, stems from a security breach at a Texas web design and hosting company that maintains a number of state law enforcement data-sharing portals.

The collection — nearly 270 gigabytes in total — is the latest release from Distributed Denial of Secrets (DDoSecrets), an alternative to WikiLeaks that publishes caches of previously secret data.

A partial screenshot of the BlueLeaks data cache.

In a post on Twitter, DDoSecrets said the BlueLeaks archive indexes “ten years of data from over 200 police departments, fusion centers and other law enforcement training and support resources,” and that “among the hundreds of thousands of documents are police and FBI reports, bulletins, guides and more.”

Fusion centers are state-owned and operated entities that gather and disseminate law enforcement and public safety information between state, local, tribal and territorial, federal and private sector partners.

KrebsOnSecurity obtained an internal June 20 analysis by the National Fusion Center Association (NFCA), which confirmed the validity of the leaked data. The NFCA alert noted that the dates of the files in the leak actually span nearly 24 years — from August 1996 through June 19, 2020 — and that the documents include names, email addresses, phone numbers, PDF documents, images, and a large number of text, video, CSV and ZIP files.

“Additionally, the data dump contains emails and associated attachments,” the alert reads. “Our initial analysis revealed that some of these files contain highly sensitive information such as ACH routing numbers, international bank account numbers (IBANs), and other financial data as well as personally identifiable information (PII) and images of suspects listed in Requests for Information (RFIs) and other law enforcement and government agency reports.”

The NFCA said it appears the data published by BlueLeaks was taken after a security breach at Netsential, a Houston-based web development firm.

“Preliminary analysis of the data contained in this leak suggests that Netsential, a web services company used by multiple fusion centers, law enforcement, and other government agencies across the United States, was the source of the compromise,” the NFCA wrote. “Netsential confirmed that this compromise was likely the result of a threat actor who leveraged a compromised Netsential customer user account and the web platform’s upload feature to introduce malicious content, allowing for the exfiltration of other Netsential customer data.”

Reached via phone Sunday evening, Netsential Director Stephen Gartrell declined to comment for this story.

The NFCA said a variety of cyber threat actors, including nation-states, hacktivists, and financially-motivated cybercriminals, might seek to exploit the data exposed in this breach to target fusion centers and associated agencies and their personnel in various cyber attacks and campaigns.

The BlueLeaks data set was released June 19, also known as “Juneteenth,” the oldest nationally celebrated commemoration of the ending of slavery in the United States. This year’s observance of the date has generated renewed public interest in the wake of widespread protests against police brutality and the filmed killing of George Floyd at the hands of Minneapolis police.

Stewart Baker, an attorney at the Washington, D.C. office of Steptoe & Johnson LLP and a former assistant secretary of policy at the U.S. Department of Homeland Security, said the BlueLeaks data is unlikely to shed much light on police misconduct, but could expose sensitive law enforcement investigations and even endanger lives.

“With this volume of material, there are bound to be compromises of sensitive operations and maybe even human sources or undercover police, so I fear it will put lives at risk,” Baker said. “Every organized crime operation in the country will likely have searched for their own names before law enforcement knows what’s in the files, so the damage could be done quickly. I’d also be surprised if the files produce much scandal or evidence of police misconduct. That’s not the kind of work the fusion centers do.”

Turn on MFA Before Crooks Do It For You

Hundreds of popular websites now offer some form of multi-factor authentication (MFA), which can help users safeguard access to accounts when their password is breached or stolen. But people who don’t take advantage of these added safeguards may find it far more difficult to regain access when their account gets hacked, because increasingly thieves will enable multi-factor options and tie the account to a device they control. Here’s the story of one such incident.

As a career chief privacy officer for different organizations, Dennis Dayman has tried to instill in his twin boys the importance of securing their online identities against account takeovers. Both are avid gamers on Microsoft’s Xbox platform, and for years their father managed their accounts via his own Microsoft account. But when the boys turned 18, they converted their child accounts to adult, effectively taking themselves out from under their dad’s control.

On a recent morning, one of Dayman’s sons found he could no longer access his Xbox account. The younger Dayman admitted to his dad that he’d reused his Xbox profile password elsewhere, and that he hadn’t enabled multi-factor authentication for the account.

When the two of them sat down to reset his password, the screen displayed a notice saying there was a new Gmail address tied to his Xbox account. When they went to turn on multi-factor authentication for his son’s Xbox profile — which was tied to a non-Microsoft email address — the Xbox service said it would send a notification of the change to the unauthorized Gmail account in his profile.

Wary of alerting the hackers that they were wise to their intrusion, Dennis tried contacting Microsoft Xbox support, but found he couldn’t open a support ticket from a non-Microsoft account. Using his other son’s Outlook account, he filed a ticket about the incident with Microsoft.

Dennis soon learned that the unauthorized Gmail address added to his son’s hacked Xbox account also had MFA enabled, meaning his son would be unable to reset the account’s password without approval from the person in control of the Gmail account.

Luckily for Dayman’s son, he hadn’t re-used the same password for the email address tied to his Xbox profile. Nevertheless, the thieves began abusing their access to purchase games on Xbox and third-party sites.

“During this period, we started realizing that his bank account was being drawn down through purchases of games from Xbox and [Electronic Arts],” Dayman the elder recalled. “I pulled the recovery codes for his Xbox account out of the safe, but because the hacker came in and turned on multi-factor, those codes were useless to us.”

Microsoft support sent Dayman and his son a list of 20 questions to answer about their account, such as the serial number on the Xbox console originally tied to the account when it was created. But despite answering all of those questions successfully, Microsoft refused to let them reset the password, Dayman said.

“They said their policy was not to turn over accounts to someone who couldn’t provide the second factor,” he said.

Dayman’s case was eventually escalated to Tier 3 Support at Microsoft, which was able to walk him through creating a new Microsoft account, enabling MFA on it, and then migrating his son’s Xbox profile over to the new account.

Microsoft told KrebsOnSecurity that while users currently are not prompted to enable two-step verification upon sign-up, they always have the option to enable the feature.

“Users are also prompted shortly after account creation to add additional security information if they have not yet done so, which enables the customer to receive security alerts and security promotions when they login to their account,” the company said in a written statement. “When we notice an unusual sign-in attempt from a new location or device, we help protect the account by challenging the login and send the user a notification. If a customer’s account is ever compromised, we will take the necessary steps to help them recover the account.”

Certainly, not enabling MFA when it is offered is far more of a risk for people in the habit of reusing or recycling passwords across multiple sites. But any service to which you entrust sensitive information can get hacked, and enabling multi-factor authentication is a good hedge against having leaked or stolen credentials used to plunder your account.

What’s more, a great many online sites and services that do support multi-factor authentication are completely automated and extremely difficult to reach for help when account takeovers occur. This is doubly so if the attackers also can modify and/or remove the original email address associated with the account.

KrebsOnSecurity has long steered readers to the site twofactorauth.org, which details the various MFA options offered by popular websites. Currently, twofactorauth.org lists nearly 900 sites that have some form of MFA available. These range from authentication options like one-time codes sent via email, phone calls, SMS or mobile app, to more robust, true “2-factor authentication” or 2FA options (something you have and something you know), such as security keys or push-based 2FA such as Duo Security (an advertiser on this site and a service I have used for years).

Email, SMS and app-based one-time codes are considered less robust from a security perspective because they can be undermined by a variety of well-established attack scenarios, from SIM-swapping to mobile-based malware. So it makes sense to secure your accounts with the strongest form of MFA available. But please bear in mind that if the only added authentication options offered by a site you frequent are SMS and/or phone calls, this is still better than simply relying on a password to secure your account.
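
For readers curious what an app-based one-time code actually is under the hood, here is a minimal sketch of TOTP (RFC 6238) in TypeScript on Node.js; real authenticator apps add base32 secret handling and clock-drift windows that this sketch omits:

```typescript
import { createHmac } from "crypto";

// Minimal TOTP (RFC 6238): HMAC the current 30-second time step with a
// shared secret, then dynamically truncate the digest (RFC 4226) to a
// short numeric code. Server and app compute the same code independently,
// so the code never travels over SMS or email.
function totp(secret: Buffer, stepSeconds = 30, digits = 6): string {
  const counter = Buffer.alloc(8);
  counter.writeBigUInt64BE(BigInt(Math.floor(Date.now() / 1000 / stepSeconds)));

  const digest = createHmac("sha1", secret).update(counter).digest();
  const offset = digest[digest.length - 1] & 0x0f; // dynamic truncation
  const binary =
    ((digest[offset] & 0x7f) << 24) |
    (digest[offset + 1] << 16) |
    (digest[offset + 2] << 8) |
    digest[offset + 3];

  return (binary % 10 ** digits).toString().padStart(digits, "0");
}

// Prints a 6-digit code that changes every 30 seconds.
console.log(totp(Buffer.from("12345678901234567890")));
```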

The Good, the Bad and the Ugly in Cybersecurity – Week 25

The Good

A 40-year-old man named Andrew Rakhshan has been given the maximum possible sentence for his involvement in DDoS attacks against Leagle.com. The legal news aggregation site had posted publicly available data regarding Rakhshan’s past criminal convictions in Canada. The events in question occurred in January 2015, when Rakhshan coordinated multiple DDoS attacks against the site, which was hosted by a provider in the Dallas/Ft. Worth area.

Rakhshan (born Kamyar Jahanrakhshan) received a sentence of five years in prison and was ordered to pay over $520,000 in fees and restitution. This was not the first trial of the case, however. The original trial took place in March 2018; a new trial was granted after the defense attorneys claimed that the original defense was ineffective. A conspiracy charge was added in the subsequent trial, adding to the findings of the original case.

Any time the law can be used as an effective tool against cybercrime is cause for celebration. This is not always easy: cases often drag on for years, or are tried ineffectively due to a lack of technical prowess among the parties involved. That said, cheers to all involved in this case, and let it serve as a lesson: even “simple” DDoS attacks can result in steep penalties.

The Bad

This week, the Israeli security consulting company JSOF disclosed 19 unique vulnerabilities in a widely shared TCP/IP software library developed by Treck. The library, developed in the late 1990s, is a lightweight TCP/IP stack estimated to be used in “hundreds of millions” of network devices. Affected vendors range from individual developers to well-established Fortune 100 enterprises (e.g., Intel, Schneider Electric, and HP), and vulnerable devices include almost everything from home “smart” devices to power grid infrastructure, transportation systems, healthcare systems and even devices used in commercial aircraft.

Four of the vulnerabilities are considered critical. JSOF said they plan to release updated information along with exploitation details at Black Hat USA 2020. Here’s a quick summary of each CVE:

  • CVE-2020-11896 (Critical RCE): IPv4 tunneling flaw in Treck TCP/IP Stack
  • CVE-2020-11897 (Critical OOB Write): OOB Write via malformed IPv6 packets in Treck TCP/IP stack
  • CVE-2020-11901 (Critical RCE): Remote code execution via invalid DNS response in Treck TCP/IP stack
  • CVE-2020-11898 (Critical ID): Information Disclosure through improper handling of IPv4 or ICMPv4 Length Parameter Inconsistency
  • CVE-2020-11900 (UAF): Double Free / Use-After-Free via IPv4 tunneling in Treck TCP/IP stack
  • CVE-2020-11902 (OOB Read): Out-of-Bounds read via IPv6OverIPv4 tunneling in Treck TCP/IP stack
  • CVE-2020-11904 (OOB Write): Integer Overflow due to improper memory allocation in Treck TCP/IP stack
  • CVE-2020-11899 (OOB Read): Out-of-Bounds read via IPv6 malformed transmission in Treck TCP/IP stack
  • CVE-2020-11903 (ID): Out-of-Bounds read via DHCP control request in Treck TCP/IP stack
  • CVE-2020-11905 (ID): Out-of-Bounds read via DHCP over IPv6 in Treck TCP/IP stack
  • CVE-2020-11906 (IU): Integer Underflow via Ethernet Link Layer in Treck TCP/IP stack
  • CVE-2020-11907 (IU): Integer Underflow via Length Parameter Inconsistency in Treck TCP/IP stack
  • CVE-2020-11909 (IU): Integer Underflow via malformed IPv4 data in Treck TCP/IP stack
  • CVE-2020-11910 (OOB Read): Out-of-Bounds read via malformed IPv4 transmission data in Treck TCP/IP stack
  • CVE-2020-11911 (MC): Improper ICMPv4 Access Control behavior in Treck TCP/IP stack
  • CVE-2020-11912 (OOB Read): Out-of-Bounds Read in Treck TCP/IP stack
  • CVE-2020-11913 (OOB Read): Out-of-Bounds read via IPv6 in Treck TCP/IP stack
  • CVE-2020-11914 (OOB Read): Out-of-Bounds read via malformed ARP data in Treck TCP/IP stack
  • CVE-2020-11908 (ID): Information disclosure via improper handling of ‘\0’ termination markers in DHCP.

As of this writing, CERT and vendor advisories, along with other remediation resources, have been made available.

We strongly recommend that IT and security teams review the applicable CERT advisories and vendor advisories for the latest updates and remediation options. Identifying vulnerable devices, gauging exposure, and preventing post-exploitation activities are key with these types of flaws. SentinelOne’s Ranger provides a robust and streamlined interface for asset discovery, risk management and threat prevention.

The Ugly

It is no secret that the bad guys are well aware of many of the tools the good guys use and rely on every day in our ongoing battle. Online multi-scanners and sandboxes are leveraged by both sides, and when the good guys publish details on some fancy new tool or process, you can bet the bad guys will find a way to use it too if it benefits them. One recent case pertains to the Thanos ransomware family and its implementation of the RIPlace evasion technique, publicized by Nyotron.

The RIPlace technique can be used to evade certain AV products, allowing the malware to run uninhibited. Nyotron released its findings on RIPlace in November 2019 in an effort to educate the public on this newly observed evasion technique. In addition, researchers from Recorded Future indicate that the actors behind Thanos have been repeatedly modifying new variants of the ransomware over the last several months, using RIPlace specifically to evade Malwarebytes Anti-Malware and Windows Defender. There is a high likelihood that variants tuned to other products are present in the wild as well.

SentinelOne’s Endpoint Protection platform is fully capable of detection and prevention of Thanos ransomware, as well as threats incorporating the RIPlace evasion technique.



FEMA IT Specialist Charged in ID Theft, Tax Refund Fraud Conspiracy

An information technology specialist at the Federal Emergency Management Agency (FEMA) was arrested this week on suspicion of hacking into the human resource databases of University of Pittsburgh Medical Center (UPMC) in 2014, stealing personal data on more than 65,000 UPMC employees, and selling the data on the dark web.

On June 16, authorities in Michigan arrested 29-year-old Justin Sean Johnson in connection with a 43-count indictment on charges of conspiracy, wire fraud and aggravated identity theft.

Federal prosecutors in Pittsburgh allege that in 2013 and 2014 Johnson hacked into the Oracle PeopleSoft databases for UPMC, a $21 billion nonprofit health enterprise that includes more than 40 hospitals.

According to the indictment, Johnson stole employee information on all 65,000 then-current and former employees, including their names, dates of birth, Social Security numbers, and salaries.

The stolen data also included federal form W-2 data that contained income tax and withholding information, records that prosecutors say Johnson sold on dark web marketplaces to identity thieves engaged in tax refund fraud and other financial crimes. The fraudulent tax refund claims made in the names of UPMC identity theft victims caused the IRS to issue $1.7 million in phony refunds in 2014.

“The information was sold by Johnson on dark web forums for use by conspirators, who promptly filed hundreds of false form 1040 tax returns in 2014 using UPMC employee PII,” reads a statement from U.S. Attorney Scott Brady. “These false 1040 filings claimed hundreds of thousands of dollars of false tax refunds, which they converted into Amazon.com gift cards, which were then used to purchase Amazon merchandise which was shipped to Venezuela.”

Johnson could not be reached for comment. At a court hearing in Pittsburgh this week, a judge ordered the defendant to be detained pending trial. Johnson’s attorney declined to comment on the charges.

Prosecutors allege Johnson’s intrusion into UPMC was not an isolated occurrence, and that for several years after the UPMC hack he sold personally identifiable information (PII) to buyers on dark web forums.

The indictment says Johnson used the hacker aliases “DS” and “TDS” to market the stolen records to identity thieves on the Evolution and AlphaBay dark web marketplaces. However, archived copies of the now-defunct dark web forums indicate those aliases are merely abbreviations of “DearthStar” and “TheDearthStar,” respectively.

“You can expect good things come tax time as I will have lots of profiles with verified prior year AGIs to make your refund filing 10x easier,” TheDearthStar advertised in an August 2015 message to AlphaBay members.

In some cases, it appears these DearthStar identities were actively involved in not just selling PII and tax refund fraud, but also stealing directly from corporate payrolls.

In an Aug. 2015 post to AlphaBay titled “I’d like to stage a heist but…,” TheDearthStar solicited people to help him cash out access he had to the payroll systems of several different companies:

“… I have nowhere to send the money. I’d like to leverage the access I have to payroll systems of a few companies and swipe a chunk of their payroll. Ideally, I’d like to find somebody who has a network of trusted individuals who can receive ACH deposits.”

When another AlphaBay member asks how much he can get, TheDearthStar responds, “Depends on how many people end up having their payroll records ‘adjusted.’ Could be $1,000 could be $100,000.”

2014 and 2015 were particularly bad years for tax refund fraud, a form of identity theft which cost taxpayers and the U.S. Treasury billions of dollars. In April 2014, KrebsOnSecurity wrote about a spike in tax refund fraud perpetrated against medical professionals that caused many to speculate that one or more major healthcare providers had been hacked.

A follow-up story that same month examined the work of a cybercrime gang that was hacking into HR departments at healthcare organizations across the country and filing fraudulent tax refund requests with the IRS on employees of those victim firms.

The Justice Department’s indictment quotes from Johnson’s online resume as stating that he is proficient at installing and administering Oracle PeopleSoft systems. A LinkedIn resume for a Justin Johnson from Detroit says the same, and that for the past five months he has served as an information technology specialist at FEMA. A Facebook profile with the same photo belongs to a Justin S. Johnson from Detroit.

Johnson’s resume also says he was self-employed for seven years as a “cyber security researcher / bug bounty hunter” who was ranked in the top 1,000 by reputation on Hacker One, a program that rewards security researchers who find and report vulnerabilities in software and web applications.

The CISO Side: A Certifiable Journey

Two CISO insights about obtaining certifications in the information security industry – a guest post by Rachel Arnold

Obtaining certifications in information security is one of the many ways professionals are choosing to use their time wisely these days. SecureNation asked some of our favorite Tulsa, Oklahoma CISOs (chief information security officers) about their experiences as information technology and security professionals. Jonathan Kimmitt serves as CISO at the University of Tulsa and as a member of the local ISSA board, where Pedro Serrano serves as ISSA Oklahoma Chapter President; Serrano is also the CISO at the Grand River Dam Authority. Both are very passionate about their roles, not only professionally but also as leaders.

 

“It matters! [That] may sound canned or out of a self-development book, but the real meaning is you are making an impact on the day-to-day activities of the company.” - Pedro Serrano, Grand River Dam Authority CISO

We had the pleasure of meeting these gentlemen last year and find their insights invaluable here at SecureNation. They mold who we are and what we do within the Information Security Community.

How did you connect with one another professionally?

Pedro: After being in Oklahoma City for 15 years, we moved to Tulsa, and I met Jonathan when I was the IT chair for one of the local universities. It’s been 10 years since then, and Jonathan and I have been involved in the local Information Systems Security Association (ISSA) chapter, where I serve as the president and he is the communications officer.

Jonathan: The first time I met Pedro was at the very first BSides Oklahoma many years ago. I remember I stopped by his presentation and one of my work colleagues was there, so I sat down to talk. This colleague knew Pedro because they taught together at a college here in Tulsa. So, stories were shared, and my respect for Pedro grew.

Pedro and I crossed paths again at the local ISSA meetings. After a few months of meetings, he asked me to join the ISSA Board. Since then we have done presentations together, hosted events together, and generally became close colleagues in IT Security. He is a great sounding board and can always make me feel better when it has been a rough day at work.

Tell us a little about your journey, how did you each come to be passionate about security and privacy?

Pedro: My background is military communications, 20 years serving in the Air Force installing, upgrading and managing infrastructure as well as ground network systems. It [Information security] matters because you matter. There is great personal fulfillment in truly moving knowledge forward for all things cyber.

Jonathan: In the early 2000s, after college, I was offered a temporary position as Help Desk Supervisor at the University of Tulsa. The previous supervisor had quit, and they needed someone to cover until a search committee could find a replacement. After a few months, they offered me the position full time. After a year or two, IT Security was becoming more popular in higher education. The university did not have a CISO, and probably didn’t even know what a CISO was at the time, but they knew they needed volunteers to be part of a security team.

How did you gain the knowledge you would need for that new role?

Jonathan: There was a system administrator from one of the colleges who was leading the team, so we handled a lot of investigations, training, and incidents together. He was the one that started me down the CISSP (Certified Information Systems Security Professional) path originally.

Jonathan is currently one of the first in the industry to obtain the Certified Data Privacy Solutions Engineer certification, launched earlier this year by ISACA. He holds more than 10 certifications from IAPP, (ISC)², GIAC, and others.

After a few years, he [the system administrator] decided to take a position outside the university, so I was asked to take leadership of the CSRT. I ran the team for many years, dealing with all kinds of investigations and incidents on campus. Around 2013-2014, I was the only member of the CSRT, as everyone else had moved on or quit the team. This was the same time we got a new CIO in IT, and one of his objectives was to create a formal IT Security department. Since I had been part of or managing the CSRT for 12+ years, they offered me the additional title of Chief Information Security Officer and a new position for a security analyst. Unfortunately, this was alongside my role as Chief Services Officer. Within two years, the needs of IT Security had grown significantly; investigations and incidents had exploded. So the university transitioned the Help Desk Services responsibilities over to another officer, and my role became solely CISO.

There must have been challenges, what were they and what resources did you rely on?

Jonathan: As a newly formed CISO, one of my first duties was to determine what we needed in terms of security, and start building resources. Before I came to the university, I was in law enforcement for a short time, so the idea of protecting people was always at the forefront of my mind. Not having a specific starting point for security at the university, I started looking at the safety of our people and working out from there. While I did not know it at the time, this was effectively building privacy concepts into the security foundation. To this day, my primary goal in security is about protecting people.

Not only is Jonathan viewed as an expert among his colleagues, but he also has experience providing IT Security expert testimony and evidence in criminal and civil proceedings.

Additionally, but quite separately, PCI DSS (Payment Card Industry Data Security Standard) compliance had become an issue on campus, and I was volunteered to be the PCI coordinator, mostly because nobody else wanted to do it. I am quite grateful, because learning the PCI-DSS allowed me to use it as the framework for IT Security on campus. To this day, I use the 12 requirements of PCI-DSS for anybody that needs to start an IT Security Program in their own organization.

After attaining PCI compliance that first year, and enjoying the process of working with the auditors, the university leadership added GLBA (Gramm-Leach-Bliley Act), GDPR (General Data Protection Regulation), some FERPA (Family Educational Rights and Privacy Act), some HIPAA (Health Insurance Portability and Accountability Act), and a few others to my compliance list. I quickly found that a strong foundation of security concepts would meet many of the compliance requirements. Each time a new compliance regime was added, I was able to strengthen our security stance a little bit more, overall helping protect more people on campus.

I like formal frameworks for learning new things or skillsets. While I agree that at times it is appropriate for many of us to ‘google it’ and figure stuff out, I think that process can limit people in what they learn. A formal framework may let people learn about things that may not be needed right now, but in the future, they will remember and know where to look for more information.

What would you say to those that do not see the value of obtaining certifications?

Jonathan: I think that everyone has a different way of doing things. This process has worked for me. It has also worked for many people that I have helped along in their careers.

 Training and certification are my preferred method. I enjoy the time and effort that I put in, and the value it has given me over the years. My plan is to continue with that process.

I recommend that everyone find the method that best meets their requirements. I also recommend that people at least try (and be successful at) different methods before they decide which one is NOT for them. I equate it to my daughter not liking mustard when she has never tried it. How do you know there is no value when you have never been successful at it?

Pedro: For me, it’s the ability to show that you are teachable. In my mind, it takes effort to pass a certification and it means that you have to study and apply yourself. In information security, if you are not learning constantly you WILL be behind!

What advice would you give to incoming security professionals and current security professionals about which certifications to pursue as a part of continuing education and building skillsets?

Jonathan: My personal belief is that in the beginning, you should start with a generalized training and certification such as Security+ or SANS SEC401. This will give you a wide view of different aspects of IT Security. People say it is a mile wide and an inch deep.

Then, based on your job or your interest, you should begin deepening your knowledge and skillsets in the areas that make sense. If you are interested in the pentesting/vulnerability assessment side, then CEH (Certified Ethical Hacker), GPEN (GIAC Certified Penetration Tester), and OSCP (Offensive Security Certified Professional) may be your path. If you are interested in engineering secure systems, then working on your Microsoft or Cisco engineering certs may be more appropriate. If you are working in compliance, then HIPAA and PCI certifications might be a good idea.

I am a huge supporter of constantly learning. I personally spend upwards of 8 hours a week on training, podcasts, webinars, etc. I feel that if you are not learning in security or privacy, you are falling behind. When trying to absorb that much information, it’s important for me to have goals and frameworks to keep things organized.

Pedro: Start where you are today. Here is my thinking: I would pursue the CompTIA Security+ certification. It’s very generic, and it exposes you to all the domains in security. You want to be comfortable and happy with what you do; there are so many branches of security that you can specialize in and be very successful.

Together, we are exploring community voices through meaningful conversations about all things information security. We look forward to following Pedro, Jonathan, and other security professionals on their journeys. At SecureNation, people make the process, and technology helps make it possible.



Outreach nabs $50M at a $1.33B valuation for software that helps with sales engagement

CRM software has become a critical piece of IT when it comes to getting business done, and today a startup focusing on one specific aspect of that stack — sales automation — is announcing a growth round of funding underscoring its own momentum. Outreach, which has built a popular suite of tools used by salespeople to help identify and reach out to prospects and improve their relationships en route to closing deals, has raised $50 million in a Series F round of funding that values the company at $1.33 billion. 

The funding will be used to continue expanding geographically — headquartered in Seattle, Outreach also has an office in London and wants to do more in Europe and eventually Asia — as well as to invest in product development.

The platform today essentially integrates with a company’s existing CRM, be it Salesforce, or Microsoft’s, or Kustomer, or something else — and provides a SaaS-based set of tools for sourcing and tracking meetings, keeping to-hand information on sales targets, and managing outreach calls and other communications in real time. It will be investing in more AI around the product, such as its newest product, Kaia (an acronym for “knowledge AI assistant”), and it has also hired a new CFO, Melissa Fisher, from Qualys, possibly a sign of where it hopes to go next as a business.

Sands Capital — an investor out of Virginia that also backs the likes of UiPath and DoorDash — is leading the round, Outreach noted, with “strong participation” also from strategic backer Salesforce Ventures. Other investors include Operator Collective (a new backer that launched last year and focuses on B2B) and previous backers Lone Pine Capital, Spark Capital, Meritech Capital Partners, Trinity Ventures, Mayfield and Sapphire Ventures.

Outreach has raised $289 million to date, and for some more context, this is definitely an up round: the startup was last valued at $1.1 billion when it raised a Series E in April 2019.

The funding comes on the heels of strong growth for the company: More than 4,000 businesses now use its tools, including Adobe, Tableau, DoorDash, Splunk, DocuSign and SAP, making Outreach the biggest player in a field that also includes Salesloft (which also raised a significant round last year on the heels of Outreach’s), Clari, Chorus.ai, Gong, Conversica and Afiniti. Its sweet spot has been working with technology-led businesses, and that sector continues to expand its sales operations, even as much of the economy has contracted in recent months.

“You are seeing a Cambrian explosion of B2B startups happening everywhere,” Manny Medina, CEO and co-founder of Outreach, said in a phone interview this week. “It means that sales roles are being created as we speak.” And that translates to a growing pool of potential customers for Outreach.

It wasn’t always this way.

When Outreach was first founded in 2011 in Seattle, it wasn’t a sales automation company. It was a recruitment startup called GroupTalent working on software to help source and hire talent, aimed at tech companies. That business was rolling along, until it wasn’t: In 2015, the startup found itself with only two months of runway left, with little hope of raising more. 

“We were not hitting our stride, and growth was hard. We didn’t make the numbers in 2014 and then had two months of cash left and no prospects of raising more,” Medina recalled. “So I sat down with my co-founders,” — Gordon Hempton, Andrew Kinzer and Wes Hather, none of whom are at the company anymore — “and we decided to sell our way out of it. We thought that if we generated more meetings we could gain more opportunities to try to sell our recruitment software.

“So we built the engine to do that, and we saw that we were getting 40% reply rates to our own outreaching emails. It was so successful we had a 10x increase in productivity. But we ran out of sales capacity, so we started selling the meetings we had managed to secure with potential talent directly to the tech companies themselves,” in other words, the other side of its marketplace, those looking to fill vacancies.

That quickly tipped over into a business opportunity of its own. “Companies were saying to us, ‘I don’t want to buy the recruitment software. I need that sales engine!’” The company never looked back, and changed its name to match the pivot.

Fast-forward to 2020, and times are challenging in a completely different way, defined as we are by a global health pandemic that affects what we do every day, where we go, how we work, how we interact with people and much more. 

Medina says the impact of the novel coronavirus has been a significant one for the company and its customers, in part because it fits well with two main types of usage cases that have emerged in the world of sales in the time of COVID-19.

“Older sellers now working from home are accomplished and don’t need to be babysat,” he said, but added they can’t rely on their traditional touchpoints “like meetings, dinners and bar mitzvahs” anymore to seal deals. “They don’t have the tools to get over the line. So our product is being called in to help them.”

Another group at the other end of the spectrum, he said, are “younger and less experienced salespeople who don’t have the physical environment [many live in smaller places with roommates] nor experience to sell well alone. For them it’s been challenging not to come into an office because especially in smaller companies, they rely on each other to train, to listen to others on calls to learn how to sell.”

That’s the other scenario where Outreach is finding some traction: They’re using Outreach’s tools as a proxy for physically sitting alongside and learning from more experienced colleagues, and using it as a supplement to learning the ropes in the old way.

“Outreach’s leadership position in the market, clear mission, and value-added approach make the company a natural investment choice for us,” said Michael Clarke, partner at Sands Capital’s Global Innovation Fund, in a statement. “Now more than ever, companies need an AI-powered sales engagement platform like Outreach. Enterprise sales teams are rapidly adopting sales engagement platforms and Outreach’s rapid growth reflects this.”

Like a lot of sales tools that are powered by AI, Outreach in part is taking on some of the more mundane jobs of salespeople.

But Medina doesn’t believe that this will play out in the “man versus machine” scenario we often ponder when we think about human obsolescence in the face of technological efficiency. In other words, he doesn’t think we’re close to replacing the humans in the mix, even at a time when we’re seeing so many layoffs.

“We are at the early innings,” he said. “There are 6.8 million sales people and we only have north of 100,000 users, not even 2% of the market. There may be a redefinition of the role, but not a reduction.”