Data storage company Cloudian launches a new edge analytics subsidiary called Edgematrix

Cloudian, a company that enables businesses to store and manage massive amounts of data, announced today the launch of Edgematrix, a new unit focused on edge analytics for large data sets. Edgematrix, a majority-owned subsidiary of Cloudian, will first be available in Japan, where both companies are based. It has raised a $9 million Series A from strategic investors NTT Docomo, Shimizu Corporation and Japan Post Capital, as well as Cloudian co-founder and CEO Michael Tso and board director Jonathan Epstein. The funding will be used for product development, deployment, and sales and marketing.

Cloudian itself has raised a total of $174 million, including a $94 million Series E round announced last year. Its products include the Hyperstore platform, which allows businesses to store hundreds of petabytes of data on-premises, and software for data analytics and machine learning. Edgematrix uses Hyperstore for storing large-scale data sets and its own AI software and hardware for data processing at the “edge” of networks, closer to where data is collected from IoT devices like sensors.

The company’s solutions were created for situations where real-time analytics is necessary. For example, it can be used to detect the make, model and year of cars on highways so targeted billboard ads can be displayed to their drivers.

Tso told TechCrunch in an email that Edgematrix was launched after Cloudian co-founder and president Hiroshi Ohta and a team spent two years working on technology to help Cloudian customers process and analyze their data more efficiently.

“With more and more data being created at the edge, including IoT data, there’s a growing need for being able to apply real-time data analysis and decision-making at or near the edge, minimizing the transmission costs and latencies involved in moving the data elsewhere,” said Tso. “Based on the initial success of a small Cloudian team developing AI software solutions and attracting a number of top-tier customers, we decided that the best way to build on this success was establishing a subsidiary with strategic investors.”

Edgematrix is launching in Japan first because spending on AI systems there is expected to grow faster than in any other market, at a compound annual growth rate of 45.3% from 2018 to 2023, according to IDC.

“Japan has been ahead of the curve as an early adopter of AI technology, with both the government and private sector viewing it as essential to boosting productivity,” said Tso. “Edgematrix will focus on the Japanese market for at least the next year, and assuming that all goes well, it would then expand to North America and Europe.”

GitLab hauls in $268M Series E on $2.75B valuation

GitLab is a company that doesn’t pull any punches or try to be coy. It actually has had a page on its website for some time stating it intends to go public on November 18, 2020. You don’t see that level of transparency from late-stage startups all that often. Today, the company announced a huge $268 million Series E on a tidy $2.75 billion valuation.

Investors include Adage Capital Management, Alkeon Capital, Altimeter Capital, Capital Group, Coatue Management, D1 Capital Partners, Franklin Templeton, Light Street Capital, Tiger Management Corp. and Two Sigma Investments.

The company seems to be primed and ready for that eventual IPO. Last year, GitLab co-founder and CEO Sid Sijbrandij said that his CFO Paul Machle told him he wanted to begin planning to go public, and he would need two years in advance to prepare the company. As Sijbrandij tells it, he told him to pick a date.

“He said, I’ll pick the 16th of November because that’s the birthday of my twins. It’s also the last week before Thanksgiving, and after Thanksgiving, the stock market is less active, so that’s a good time to go out,” Sijbrandij told TechCrunch.

He said that he considered it a done deal and put the date on the GitLab Strategy page, a page that outlines the company’s plans for everything it intends to do. It turned out that he was a bit too quick on the draw. Machle had checked the date in the interim and realized that it was a Monday, which is not traditionally a great day to go out, so they decided to do it two days later. Now the target date is officially November 18, 2020.


GitLab has the date it’s planning to go public listed on its Strategy page.

As for that $268 million, it gives the company considerable runway ahead of that planned event, but Sijbrandij says it also gives him flexibility in how to take the company public. “One other consideration is that there are two options to go public. You can do an IPO or direct listing. We wanted to preserve the optionality of doing a direct listing next year. So if we do a direct listing, we’re not going to raise any additional money, and we wanted to make sure that this is enough in that case,” he explained.

Sijbrandij says that the company made a deliberate decision to be transparent early on. Being based on an open-source project, it’s sometimes tricky to make that transition to a commercial company, and sometimes that has a negative impact on the community and the number of contributions. Transparency was a way to combat that, and it seems to be working.

He reports that the community contributes 200 improvements to the GitLab open-source product every month, and that’s double the amount of just a year ago, so the community is still highly active in spite of the parent company’s commercial success.

It did not escape his notice that Microsoft acquired GitHub last year for $7.5 billion. It’s worth noting that GitLab is a similar kind of company that helps developers manage and distribute code in a DevOps environment. He claims that in spite of that eye-popping number, his goal is to remain an independent company and take this through to the next phase.

“Our ambition is to stay an independent company. And that’s why we put out the ambition early to become a listed company. That’s not totally in our control as the majority of the company is owned by investors, but as long as we’re more positive about the future than the people around us, I think we have a shot at not getting acquired,” he said.

The company was founded in 2014 and was a member of Y Combinator in 2015. It has been on a steady growth trajectory ever since, hauling in more than $426 million. The last round before today’s announcement was a $100 million Series D last September.

LinkedIn launches skills assessments, tests that let you beef up your credentials for job hunting

LinkedIn, the social networking service for the working world, is today taking the wraps off its latest effort to provide its users with better tools for presenting their professional selves, and to make the process of recruitment on the platform more effective. It will now offer a new feature called Skills Assessments: short, multiple-choice tests that users can take to verify their knowledge in areas like computer languages, software packages and other work-related skills.

The feature is being rolled out globally today. During an earlier, limited beta, LinkedIn tells us that 2 million tests were taken and applied across the platform. That’s a sign that the full service may well prove a very popular, and needed, feature.

First up are English-language tests covering some 75 different skills, all free to take, but the plan, according to Emrecan Dogan, the group product manager in its talent solutions division, is to “ramp that up aggressively” in the near future, adding both more languages and more test areas.

(Side note: Dogan joined LinkedIn when his company ScoreBeyond was quietly acquired by LinkedIn last year. ScoreBeyond was an online testing service to help students prep for college entrance exams. Given LinkedIn’s efforts to get closer to younger users — again, in part because of competitive pressure — I suspect that is one area where LinkedIn will likely want to expand this assessment tool longer term, if it takes off.)

The skills assessment tool is coming at an important moment for LinkedIn.

The Microsoft-owned company now has nearly 650 million people around the world using its social networking tools to connect with each other for professional purposes, most often to network, talk about work, or find work.

That makes for a fascinating and lucrative economy of scale when it comes to rolling out its products. But it comes with a major drawback, too: the bigger the platform gets, the harder it is to track and verify details about each and every individual on it. The skills assessment becomes one way of at least being able to verify certain people’s skills in specific areas, and for that information to start feeding into other channels and products on the platform.

It’s also a critical competitive move. The company is by far the biggest platform of its kind on the internet today, but smaller rivals are building interesting products to chip away at that lead in specific areas. Triplebyte, for example, has created a platform for those looking to hire engineers, and engineers looking for new roles, to connect by way of the engineers — yes — taking online tests to measure their skills and match them up with compatible job opportunities. Triplebyte is focused on just one field — software engineering — but the template is a disruptive one that, if replicated in other verticals, could slowly start to chip away at LinkedIn’s hegemony.

Other larger platforms also continue to look at ways they might leverage their own social graphs to provide work-related networking services. Facebook, for example, has incorporated e-learning into its own efforts in professional development, laying the groundwork for other kinds of interactive training and assessment.

This is not the first time that LinkedIn has tinkered with the idea of offering tests to ascertain the level of users’ skills on its platform, although the information was used for different ends: in India, several years ago, the company started to incorporate tests to help suggest jobs to users. Nor is it the first time that the company has worked on ways to make its skills and endorsements features more useful.

Testing on actual skills is just one area where verification has fallen short on LinkedIn. Another big trend in recruitment is the push for more diverse workforces. The thinking is that traditionally too many of the parameters that have been used up to now to assess people — what college was attended, or where people have worked already — have been essentially cutting many already-disenfranchised groups out of the process.

Given that LinkedIn currently has no way of ascertaining when people on its platform are from minority backgrounds, a skills assessment — and especially a good result on one — might potentially help tip the balance in favor of meritocracy (if not proactive diversity-focused hiring as such).

For regular users, the option to take skills assessments and add them to a profile appears as a button in the skills and endorsements area of the profile page.

Users take short tests — currently only multiple choice — which Dogan says are created by subject-area experts who already work with LinkedIn, for example writing content for LinkedIn Learning.

Indeed, in November last year, the company expanded LinkedIn Learning to include content from third-party providers and Q&A interactivity so there is a trove of work already there that might be repurposed as part of this new effort.

These tests measure your knowledge in specific areas, and if you pass, you are given a badge that you can apply to your profile page, and potentially broadcast out to those who are looking for people with the skills you’ve just verified you have. (This is presuming that you are not cheating and having someone else take the test for you, or taking it while looking up answers elsewhere.) You can opt out of sharing the information anywhere else, if you choose.

If you fail, you have three months to wait before taking it again, and in the meantime LinkedIn will use the moment to upsell you on its other content: you get offered LinkedIn Learning tests to improve your skills.

For those who pass, they will need to retake tests every year to keep their badges and credentials.

Recruiters, for their part, can use the data amassed through the tests to better filter users when sourcing candidate pools for job openings. This is a huge issue on a platform like LinkedIn: while having a large group of people on there is a boost for finding matches, in practice there can be too many, making it a challenge and a time suck to figure out who is genuinely suitable for a particular role.

There is another angle where the skills are being used to help LinkedIn monetise: those who are putting in ads for jobs can now buy ads that are targeted specifically to people with certain skills that have been verified through assessments.

There are still some shortfalls in the skills assessment tool as it exists now. For example, coding tests are all multiple choice, but that’s not how many coding environments work these days. (Triplebyte, for example, offers collaborative assessments.) And of course, skills are just one aspect of how people might fit into a particular working environment. (Currently there are no plans to bring in psychometric or similar assessments, Dogan said.) This is an interesting start, however, and worth testing the waters as more interesting variations in recruitment and connecting professionals online continue to proliferate.

 

IEX’s Katsuyama is no flash in the pan

When you watch a commercial for one of the major stock exchanges, you are welcomed into a world of fast-moving, slick images full of glistening buildings, lush crops and happy people. They are typically interspersed with shots of intrepid executives veering out over the horizon as if to say, “I’ve got a long-term vision, and the exchange where my stock is listed is a valuable partner in achieving my goals.” It’s all very reassuring and stylish. But there’s another side to the story.

I have been educated about the realities of today’s stock exchange universe through recent visits with Brad Katsuyama, co-founder and CEO of IEX (a.k.a. The Investors Exchange). If Katsuyama’s name rings a bell, and you don’t work on Wall Street, it’s likely because you remember him as the protagonist of Michael Lewis’s 2014 best-seller, Flash Boys: A Wall Street Revolt, which explored high-frequency trading (HFT) and made the case that the stock market was rigged, really badly.

Five years later, some of the worst practices Lewis highlighted are things of the past, and there are several attributes of the American equity markets that are widely admired around the world. In many ways, though, the realities of stock trading have gotten more unseemly, thanks to sophisticated trading technologies (e.g., microwave radio transmissions that can carry information at almost the speed of light), and pitched battles among the exchanges, investors and regulators over issues including the rebates stock exchanges pay to attract investors’ orders and the price of market data charged by the exchanges.

I don’t claim to be an expert on the inner workings of the stock market, but I do know this: Likening the life cycle of a trade to sausage-making is an insult to kielbasa. More than ever, trading is an arcane, highly technical and bewildering part of our broader economic infrastructure, which is just the way many industry participants like it: Nothing to see here, folks.

Meanwhile, Katsuyama, company president Ronan Ryan and the IEX team have turned IEX into the eighth largest stock exchange company, globally, by notional value traded, and have transformed the concept of a “speed bump” into a mainstream exchange feature.


Brad Katsuyama. Image by Joshua Blackburn via IEX Trading

Despite these and other accomplishments, IEX finds itself in the middle of a vicious battle with powerful incumbents that seem increasingly emboldened to use their muscle in Washington, D.C. What’s more, new entrants, such as The Long-Term Stock Exchange and Members Exchange, are gearing up to enter the fray in US equities, while global exchanges such as the Hong Kong Stock Exchange seek to bulk up by making audacious moves like attempting to acquire the venerable London Stock Exchange.

But when you sell such distinct advantages to one group that alone can benefit from them, it leads to the question of why anyone else would want to trade on that market. It’s like walking onto a playing field knowing the game is rigged against you.

As my discussion with Katsuyama reveals, IEX may have taken some punches in carving out a position for itself in this high-stakes war characterized by cutting-edge technology and size. However, the IEX team remains girded for battle and confident that it can continue to make headway in offering a fair and transparent option for market participants over the long term.

Gregg Schoenberg: Given Flash Boys and the attention it generated for you on Main Street, I’d like to establish something upfront. Does IEX exist for the asset manager, the individual, or both?

Brad Katsuyama: We exist primarily for the asset manager, and helping them helps the individual. We’re one step removed from the individual, and part of that is due to regulation. Only brokers can connect to exchanges, and the asset manager connects to the broker.

Schoenberg: To put a finer point on it, you believe in fairness and being the good guy. But you are not Robinhood. You are a capitalist.

Katsuyama: Yes, but we want to make money fairly. Actually, we thought initially about starting the business as a nonprofit. But once we laid out all the people we would need to convince to work for us, we realized it would’ve been hard for us to attract the skill sets needed as a nonprofit.

Schoenberg: Do you believe that the US equity market today primarily serves investors or traders?

Boston-based DataRobot raises $206M Series E to bring AI to enterprise

Artificial intelligence is playing an increasingly large role in enterprise software, and Boston’s DataRobot has been helping companies build, manage and deploy machine learning models for some time now. Today, the company announced a $206 million Series E investment led by Sapphire Ventures.

Other participants in this round included new investors Tiger Global Management, World Innovation Lab, Alliance Bernstein PCI, and EDBI along with existing investors DFJ Growth, Geodesic Capital, Intel Capital, Sands Capital, NEA and Meritech.

Today’s investment brings the total raised to $431 million, according to the company. It has a pre-money valuation of $1 billion, according to PitchBook. DataRobot would not confirm this number.

The company has been catching the attention of these investors by offering a machine learning platform aimed at analysts, developers and data scientists to help build predictive models much more quickly than it typically takes using traditional methodologies. Once built, the company provides a way to deliver the model in the form of an API, simplifying deployment.
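The deliver-the-model-as-an-API pattern described above can be sketched in a few lines. Everything below is illustrative rather than DataRobot’s actual interface: the field names, the toy model and the handler are stand-ins for the common shape of a prediction endpoint that accepts rows of features and returns scores.

```python
import json

def toy_model(row):
    # Stand-in for a trained model: scores a row from one numeric feature.
    return 2 * row["feature"] + 1

def handle_prediction_request(body):
    """Decode a JSON request of feature rows, score each row, return JSON.

    Mirrors the typical request/response shape of a model-serving API;
    the "rows"/"predictions" keys are illustrative only.
    """
    rows = json.loads(body)["rows"]
    return json.dumps({"predictions": [toy_model(r) for r in rows]})
```

Wrapping deployment this way means consumers of the model only ever see the JSON contract, not the training pipeline behind it.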

The late-stage startup plans to use the money to continue building out its product line, while looking for acquisition opportunities where it makes sense. The company also announced the availability of a new product today, DataRobot MLOps, a tool to manage, monitor and deploy machine learning models across a large organization.

The company, which was founded in 2012, claims it has had triple-digit recurring revenue growth dating back to 2015, as well as one billion models built on the platform to date. Customers contributing to that number include a broad range of companies such as Humana, United Airlines, Harvard Business School and Deloitte.

Man Who Hired Deadly Swatting Gets 15 Months

An Ohio teen who recruited a convicted serial “swatter” to fake a distress call that ended in the police shooting an innocent Kansas man in 2017 has been sentenced to 15 months in prison.

Image: FBI.gov

“Swatting” is a dangerous hoax that involves making false claims to emergency responders about phony hostage situations or bomb threats, with the intention of prompting a heavily-armed police response to the location of the claimed incident.

The tragic swatting hoax that unfolded on the night of Dec. 28, 2017 began with a dispute over a $1.50 wager in the online game “Call of Duty” between Shane M. Gaskill, a 19-year-old Wichita, Kansas resident, and Casey S. Viner, 18, from the Cincinnati, OH area.

Viner wanted to get back at Gaskill over the Call of Duty grudge, and so enlisted the help of another man — Tyler R. Barriss — a serial swatter in California known by the alias “SWAuTistic” who’d bragged of swatting hundreds of schools and dozens of private residences.

Chat transcripts presented by prosecutors showed Viner and Barriss both saying that if Gaskill wasn’t scared of getting swatted, he should give up his home address. But the address that Gaskill gave Viner to pass on to Barriss no longer belonged to him and was occupied by a new tenant.

Barriss’s fatal call to 911 emergency operators in Wichita was relayed from a local, non-emergency line. Barriss falsely claimed he was at the address provided by Viner, that he’d just shot his father in the head, was holding his mom and sister at gunpoint, and was thinking about burning down the home with everyone inside.

Wichita police quickly responded to the fake hostage report and surrounded the address given by Gaskill. Seconds later, 28-year-old Andrew Finch exited his mom’s home and was killed by a single shot from a Wichita police officer. Finch, a father of two, was no party to the gamers’ dispute and was simply in the wrong place at the wrong time.

“Swatting is not a prank, and it is no way to resolve disputes among gamers,” U.S. Attorney Stephen McAllister said in a press statement. “Once again, I call upon gamers to self-police their community to ensure that the practice of swatting is ended once and for all.”

In chat records presented by prosecutors, Viner admitted to his role in the deadly swatting attack:

Defendant VINER: I literally said you’re gonna be swatted, and the guy who swatted him can easily say I convinced him or something when I said hey can you swat this guy and then gave him the address and he said yes and then said he’d do it for free because I said he doesn’t think anything will happen
Defendant VINER: How can I not worry when I googled what happens when you’re involved and it said a eu [sic] kid and a US person got 20 years in prison min
Defendant VINER: And he didn’t even give his address he gave a false address apparently
J.D.: You didn’t call the hoax in…
Defendant VINER: Does t [sic] even matter ?????? I was involved I asked him to do it in the first place
Defendant VINER: I gave him the address to do it, but then again so did the other guy he gave him the address to do it as well and said do it pull up etc

Barriss was sentenced earlier this year to 20 years in federal prison for his role in the fatal swatting attack.

Barriss also pleaded guilty to making hoax bomb threats in phone calls to the headquarters of the FBI and the Federal Communications Commission in Washington, D.C. In addition, he made bomb threat and swatting calls from Los Angeles to emergency numbers in Ohio, New Hampshire, Nevada, Massachusetts, Illinois, Utah, Virginia, Texas, Arizona, Missouri, Maine, Pennsylvania, New Mexico, New York, Michigan, Florida and Canada.

Prosecutors for the county that encompasses Wichita decided in April 2018 that the officer who fired the shot that killed Andrew Finch would not face charges, and would not be named because he wasn’t being charged with a crime.

Viner was sentenced after pleading guilty to one count each of conspiracy and obstructing justice, the US attorney’s office for Kansas said. CNN reports that Gaskill has been placed on deferred prosecution.

Viner’s obstruction charge stems from attempts to erase records of his communications with Barriss and the Wichita gamer, McAllister’s office said. In addition to his prison sentence, Viner was ordered to pay $2,500 in restitution and serve two years of supervised release.

Keeping your Business Protected from CVE-2019-0708 (aka Bluekeep)

Following the recent community release of a Metasploit exploit module, we’ve been busy this weekend in the lab testing the exploit against our vulnerable sandpit. You can see from the screenshot below that we were able to load the new module and successfully gain shell access on a vulnerable host in our test environment.

Following successful exploitation of the vulnerable machine (unpatched, with RDP enabled through Windows Firewall), we then proceeded to deploy SentinelOne.

In-the-Wild Exploitation

We have observed in-the-wild attempts to both identify and exploit vulnerable hosts. When you couple this with the various ‘commercial’ options available (MSF, Immunity CANVAS), it becomes that much more critical that organizations continue to take action to protect themselves against this attack vector.

CVE-2019-0708 Background

https://portal.msrc.microsoft.com/en-US/security-guidance/advisory/CVE-2019-0708

“A remote code execution vulnerability exists in Remote Desktop Services – formerly known as Terminal Services – when an unauthenticated attacker connects to the target system using RDP and sends specially crafted requests. This vulnerability is pre-authentication and requires no user interaction. An attacker who successfully exploited this vulnerability could execute arbitrary code on the target system. An attacker could then install programs; view, change, or delete data; or create new accounts with full user rights.”

In essence, this is a remote code execution vulnerability affecting legacy Windows operating systems that, in certain configurations, requires no authentication. This makes the vulnerability incredibly dangerous.

On top of that, Rapid7 just announced that a community-developed exploit has been made public as part of a Metasploit framework (a security testing framework) pull request. This is in addition to Immunity’s commercial option in CANVAS, released in July 2019.

This exploit is currently not merged into the main branch of Metasploit; however, we expect it will be in the near future.

Vulnerable Hosts

Using BinaryEdge we can see over 1 million potentially vulnerable hosts still on the Internet, despite a patch being released in May 2019.


Note: The quantity of exposed and vulnerable hosts can vary depending on the type of scan and the services used for the query, but by all accounts the number is between ~500k and ~1.2 million.

MS Recommended Mitigations and Workarounds

This may come as no surprise: enable Network Level Authentication (NLA), which will prevent the exploit from working from an unauthenticated perspective.
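As a minimal illustration of this workaround (assuming the default RDP-Tcp listener), NLA can be enforced from an elevated command prompt on the affected Windows host:

```shell
:: Enforce Network Level Authentication for the default RDP listener.
:: Run from an elevated command prompt on the Windows host.
reg add "HKLM\SYSTEM\CurrentControlSet\Control\Terminal Server\WinStations\RDP-Tcp" /v UserAuthentication /t REG_DWORD /d 1 /f
```

The same setting can also be toggled through the Remote tab of System Properties.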

Keeping your business protected

It is important to understand a few things about this vulnerability: the attack pattern and the mitigations. For machines to be vulnerable in this instance, they must have a fairly relaxed security posture/configuration:

  • We must have RDP enabled without NLA (never recommended but we often see this in the wild)
  • The service must be enabled and exposed from a network perspective (we can see over 1 million vulnerable servers on the Internet at time of writing)
  • The vulnerability is known and a patch is available, so there must be a failing in the patch management and vulnerability management process surrounding this service/asset
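The network-exposure precondition above is easy to check in bulk. A minimal sketch (the helper name and hosts are our own, not from any particular tool): a plain TCP connect to 3389 shows whether RDP is reachable at all. It says nothing about patch level or NLA, so treat a hit as a prompt for further investigation, not proof of vulnerability.

```python
import socket

def is_rdp_reachable(host, port=3389, timeout=2.0):
    """Return True if a TCP connection to the RDP port succeeds.

    A successful connect only proves the service is network-exposed;
    it does not confirm the host is unpatched or that NLA is disabled.
    """
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# Example: screen a list of internal hosts for exposed RDP.
# exposed = [h for h in ("10.0.0.5", "10.0.0.6") if is_rdp_reachable(h)]
```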

With this in mind, however, we can show how additional protective controls can greatly enhance even a “weak” security configuration.

In our test environment we deployed SentinelOne (EDR) to the device, leaving the machine unpatched and without NLA enabled. We once again attempted to shell the device using the MSF module:

We can see here that no connection was established and that SentinelOne blocked the threat.

Watch the Demo

Here we can see the alerts in the SentinelOne console. We drill down further into the alert:

In a real scenario, we would now take further action: patching the vulnerable system, hardening the configuration, attempting to identify the threat actor, and advising accordingly. For the lab, we aren’t going to go into that level of detail (I already know who launched the attack!)

Summary

Whilst the exploit release isn’t highly mature (it can BSOD a box, and gaining a shell requires an understanding of the target’s architecture and environment, e.g. the hypervisor), it is still highly effective when targeted against a Windows 7 SP1 machine with a weak security configuration. You can take a range of measures to improve the posture of the machine, which may include:

  • Disable RDP if it is not required
  • Deploy a more secure configuration (e.g. enable NLA)
  • Note that if a low privileged user account is compromised it may still be possible to use this exploit to gain SYSTEM level access even if NLA is enabled
  • Only allow RDP from whitelisted admin subnets
  • Ensure systems are patched
  • Run regular vulnerability scans of your network and endpoints
  • Ensure systems have adequate protection and response capabilities (such as SentinelOne)

If you need assistance ensuring your business is protected, please don’t hesitate to get in touch. Our team of highly trained security engineers are on hand to help you protect, detect and respond to emerging threats!

A guest post by Daniel Card, Head of Cyber Security Services PSTG.




Salesforce doubles down on verticals, launches Manufacturing and Consumer Goods Clouds

As legacy industries make the migration to cloud-based digital solutions to run and grow their businesses, Salesforce is hoping that it will get a cut of the action when it comes to their IT investments. The CRM giant has been doubling down on building specialised solutions for individual industry verticals, and today, it is unveiling new business units dedicated to not one but two of them: manufacturing and consumer goods.

The Manufacturing Cloud and Consumer Goods Cloud, as the two new products are called, are the latest in a list of other vertical-specific products the company has created. Other verticals targeted to date include finance, healthcare, media, nonprofits and retail.

The idea behind Salesforce’s strategy to build industry-specific solutions is that while the CRM and sales processes that go into manufacturing and consumer goods do have some aspects in common with other industries, both also have relatively specific requirements, too, around how sales are agreed and clients are managed.

In the case of manufacturing and consumer goods, both are capital-intensive businesses where those working on the physical products might be far removed from those working on sales (not just in job function, but in the software used to manage each operation), or from those in the field helping to distribute those goods to the people ultimately selling them.

“In the manufacturing industry, changing customer and market demands can have a devastating effect on the bottom line, so being able to understand what is happening on the ground is imperative for success,” said Cindy Bolt, SVP and GM, Salesforce Manufacturing, in a statement. “Manufacturing Cloud bridges the gap between sales and operations teams while ensuring more predictive and transparent business, so they can build deeper and more trusted relationships with their customers.”

In both the cases of manufacturing and consumer goods, Salesforce is not creating these services out of thin air: the company had already been touting solutions for both sectors as part of its bigger push into specific industries. Past acquisitions of companies like Steelbrick — a specialist in quote-to-cash solutions, a cornerstone of how manufacturing sales are made — are likely to have played a contributing role in how the new clouds were built.

With the Manufacturing Cloud, Salesforce says that it has included a feature for sales agreements that links up with a company’s ERP and forecasting software to better predict demand from individual customers as well as the wider market. The service also comes with more analytical insights by way of Einstein Analytics, and more functionality to work with channel partners. Third parties working with Salesforce on joint solutions using Manufacturing Cloud include Acumen Solutions, Deloitte and Rootstock.

The Consumer Goods Cloud has some parallels with the Manufacturing Cloud, in that both are targeting businesses that are by their nature and by legacy very rooted in physical goods and are therefore not easily “disrupted” by digital innovation. Indeed, despite all that we hear about the might of Amazon and e-commerce, a full 95% of products are still sold in physical stores. That system has a lot of drawbacks, not least of them being the challenge consumer goods brands face in maintaining accurate control over how products are distributed and ultimately sold.

“Retail execution remains one of the most important pieces of a consumer goods brand’s strategy, but so much opportunity is wasted if the field rep doesn’t have the data and technology needed to make smart decisions,” said John Strain, GM and SVP, Retail and Consumer Goods at Salesforce, in a statement. “Consumer Goods Cloud provides these field reps with the tools they need to be successful on the ground while helping build both business opportunities and stronger relationships with their retail partners.”

The company, citing research from PwC, claims that of the $200 billion that’s spent in the U.S. by consumer goods companies each year on merchandising, marketing and sales efforts for in-store sales, some $100 billion of that spend is never used in the way it was originally intended. (That’s one reason so many consumer goods companies have jumped into social media: it’s a way of connecting better and more directly, at least with the customers.)

That represents a huge area to tackle for a company like Salesforce, and the Consumer Goods Cloud is the start of that effort. The product covers software that addresses areas such as optimising visits to stores, improving relationships with retailers, using Einstein insights for analytics, and ordering. Partners in the effort include Accenture and PwC.

Another important thing to note here is that Salesforce’s move into the area comes as a competitive strike: Not only are there companies out there that have built products specifically for these markets — Sysco for consumer goods, and Atlatl Software for manufacturing, for example — but Salesforce has to contend with general rivals such as Microsoft and SAP also targeting the same potential customers.

As of last quarter, Sales Cloud now accounts for more than one-quarter of Salesforce’s revenues, but today’s news underscores how “sales” is becoming a more complex and nuanced topic for the company as its business continues to grow, and as cloud-based digital processes become ever more ubiquitous across all sectors beyond simply knowledge workers. As Salesforce builds out more solutions to meet every kind of enterprise’s needs, it’s likely there will be more vertical-specific tools making their way to the platform.

FOSSA scores $8.5 million Series A to help enterprise manage open-source licenses

As more enterprise developers make use of open source, it becomes increasingly important for companies to make sure that they are complying with licensing requirements. They also need to ensure the open-source bits are being updated over time for security purposes. That’s where FOSSA comes in, and today the company announced an $8.5 million Series A.

The round was led by Bain Capital Ventures, with help from Costanoa Ventures and Norwest Venture Partners. Today’s round brings the total raised to $11 million, according to the company.

Company founder and CEO Kevin Wang says that over the last 18 months, the startup has concentrated on building tools to help enterprises comply with their growing use of open source in a safe and legal way. He says that overall this increasing use of open source is great news for developers, and for these bigger companies in general. While it enables them to take advantage of all the innovation going on in the open-source community, they need to make sure they are in compliance.

“The enterprise is really early on this journey, and that’s where we come in. We provide a platform to help the enterprise manage open-source usage at scale,” Wang explained. That involves three main pieces. First it tracks all of the open-source and third-party code being used inside a company. Next, it enforces licensing and security policy, and, finally, it has a reporting component. “We automate the mass reporting and compliance for all of the housekeeping that comes from using open source at scale,” he said.
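Those three pieces can be illustrated with a small sketch. This is purely a toy example, not FOSSA's actual data model or API: the dependency list, license identifiers and allowlist policy below are all hypothetical, but they show the shape of the enforce-and-report loop Wang describes.

```python
# Toy license-policy check: scan a list of dependencies, enforce an
# allowlist policy, and produce a compliance report. Illustrative only.

ALLOWED = {"MIT", "Apache-2.0", "BSD-3-Clause"}  # licenses this policy permits

def check_policy(dependencies):
    """Return (compliant, violations) for a list of (name, license) pairs."""
    violations = [(name, lic) for name, lic in dependencies if lic not in ALLOWED]
    return (len(violations) == 0, violations)

def report(dependencies):
    """Produce a simple compliance report -- the reporting piece."""
    ok, violations = check_policy(dependencies)
    lines = [f"{len(dependencies)} dependencies scanned"]
    for name, lic in violations:
        lines.append(f"VIOLATION: {name} uses disallowed license {lic}")
    lines.append("status: " + ("compliant" if ok else "non-compliant"))
    return "\n".join(lines)

deps = [("left-pad", "MIT"), ("somelib", "GPL-3.0-only")]
print(report(deps))
```

A real system differs mainly in the first piece, discovery: it must walk actual manifests and vendored code across many build systems to assemble that dependency list, which is where the enterprise-scale difficulty lies.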

The enterprise focus is relatively new for the company. It originally launched in 2017 as a tool for developers to track individual use of open source inside their programs. Wang saw a huge opportunity inside the enterprise to apply this same kind of capability inside larger organizations, which were hungry for tools to help them comply with the myriad open-source licenses out there.

“We found that there was no tooling out there that can manage the scale and breadth across all the different enterprise use cases and all the really complex mission-critical code bases,” he said. What’s more, he found that where there were existing tools, they were vastly underutilized or didn’t provide broad enough coverage.

The company announced a $2.2 million seed round in 2017, and since then has grown from 10 to 40 employees. With today’s funding, that should increase as the company is expanding quickly. Wang reports that the startup has been tripling its revenue numbers and customer accounts year over year. The new money should help accelerate that growth and expand the product and markets it can sell into.

Walt Disney Studios partners with Microsoft Azure on cloud innovation lab

Seems like everything is going to the cloud these days, so why should movie making be left out? Today, Walt Disney Studios announced a five-year partnership with Microsoft around an innovation lab to find ways to shift content production to the Azure cloud.

The project involves the Walt Disney StudioLAB, an innovation work space where Disney personnel can experiment with moving different workflows to the cloud. Movie production software company Avid is also involved.

The hope is that by working together, the three parties can come up with creative, cloud-based workflows that can accelerate the innovation cycle at the prestigious movie maker. Every big company is looking for ways to innovate, regardless of their core business, and Disney is no different.

As movie making involves ever greater amounts of computing resources, the cloud is a natural model for it, allowing studios to scale resources up and down as needed, whether rendering scenes or adding special effects. As Disney’s CTO Jamie Voris sees it, this could make these processes more efficient, which could help lower cost and time to production.

“Through this innovation partnership with Microsoft, we’re able to streamline many of our processes so our talented filmmakers can focus on what they do best,” Voris said in a statement. It’s the same kind of cloud value proposition that many large organizations are seeking. They want to speed time to market, while letting technology handle some of the more mundane tasks.

The partnership builds on an existing one that Microsoft already had with Avid, where the two companies have been working together to build cloud-based workflows for the film industry using Avid software solutions on Azure. Disney will add its unique requirements to the mix, and over the five years of the partnership, hopes to streamline some of its workflows in a more modern cloud context.