Salesforce is building an app to gauge a company’s sustainability progress

Salesforce has always tried to be a socially responsible company, encouraging employees to work in the community, giving 1% of its profits to different causes, and building and productizing the 1-1-1 philanthropic model. The company now wants to help other organizations become more sustainable and reduce their carbon footprints, and today it announced it is working on a product to help.

Patrick Flynn, VP of sustainability at Salesforce, says the company sees sustainability as a key issue, and one that requires action right now. The question was how Salesforce could help. As a highly successful software company, it decided to put that particular set of skills to work on the problem.

“We’ve been thinking about how can Salesforce really take action in the face of climate change. Climate change is the biggest, most important and most complex challenge humans have ever faced, and we know right now, every individual, every company needs to step forward and do everything it can,” Flynn told TechCrunch.

And to that end, the company is developing the Salesforce Sustainability Cloud, to help track a company’s sustainability efforts. The tool should look familiar to Salesforce customers, but instead of tracking customers or sales, this tool tracks carbon emissions, renewable energy usage and how well a company is meeting its sustainability goals.

Sustainability Cloud dashboards. Image: Salesforce

The tool works with internal data and third-party data as needed, and is subject to both an internal audit by the Sustainability team and third-party organizations to be sure that Salesforce (and Sustainability Cloud customers) are meeting their goals.

Salesforce has been using this product internally to measure its own sustainability efforts, which Flynn leads. “We use the product to measure our footprint across all sorts of different aspects of our operations from data centers, public cloud, real estate — and we work with third-party providers everywhere we can to have them make their operations cleaner, and more powered by renewable energy and less carbon intensive,” he said. When there is carbon generated, the company uses carbon offsets to finance sustainability projects such as clean cookstoves or helping preserve the Amazon rainforest.

Flynn says the investor community is increasingly looking for proof that companies are building real, verifiable sustainability programs, and the Sustainability Cloud is an effort to provide that information both for Salesforce and for other companies in a similar position.

The product is in beta now and is expected to be ready next year. Flynn could not say how much the company plans to charge for the service, but he said the goal of the product is positive social impact.


Aliro comes out of stealth with $2.7M to ‘democratize’ quantum computing with developer tools

It’s still early days for quantum computing, but we’re nonetheless seeing an interesting group of startups emerging that are helping the world take advantage of the new technology now. Aliro Technologies, a Harvard startup that has built a platform for developers to code more easily for quantum environments — “write once, run anywhere” is one of the startup’s mottos — is today coming out of stealth and announcing its first funding of $2.7 million to get it off the ground.

The seed round is being led by Flybridge Capital Partners, with participation from Crosslink Ventures and Samsung NEXT’s Q Fund, a fund the corporate investor launched last year dedicated specifically to emerging areas like quantum computing and AI.

Aliro is wading into the market at a key moment in the development of quantum computing.

Vendors continue to build new quantum hardware to tackle the kinds of complex calculations that current binary-based machines cannot handle, for example in drug discovery or multi-variable forecasting; just today IBM announced plans for a 53-qubit device. Even so, it’s widely acknowledged that the computers built so far face a number of critical problems that will hamper wide adoption.

The interesting development of recent times is the emergence of startups that are tackling these specific critical problems, dovetailing that progress with that of building the hardware itself. Take the fact that quantum machines so far have been too prone to error when used for extended amounts of time: last week, I wrote about a startup called Q-CTRL that has built firmware that sits on top of the machines to identify when errors are creeping in and provide fixes to stave off crashes.

The specific area that Aliro is addressing is the fact that quantum hardware is still very fragmented: each machine has its own proprietary language and operating techniques and sometimes even purpose for which it’s been optimised. It’s a landscape that is challenging for specialists to engage in, let alone the wider world of developers.

“We’re at the early stage of the hardware, where quantum computers have no standardisation, even those based on the same technology have different qubits (the basic building block of quantum activity) and connectivity. It’s like digital computing in the 1940s,” said CEO and chairman Jim Ricotta. (The company is co-founded by Harvard computational materials science professor Prineha Narang along with Michael Cubeddu and Will Finegan, who are actually still undergraduate students at the university.)

“Because it’s a different style of computing, software developers are not used to quantum circuits,” said Ricotta, and engaging with them is “not the same as using procedural languages. There is a steep on-ramp from high-performance classical computing to quantum computing.”

While Aliro is coming out of stealth, the company is not yet sharing details about how its platform actually works. But the basic idea is that Aliro’s platform will essentially be an engine that lets developers work in the languages they know and identify the problems they would like to solve; it will then assess the code, optimise it, translate it into quantum-ready form, and suggest the best machine to process the task.
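Aliro isn’t sharing its actual API, but the general “write once, run anywhere” idea it describes can be sketched with the open-source Qiskit library, used here purely for illustration: the same circuit is written once and then retargeted by a compiler layer to the different native gate sets real machines support.

```python
# Illustrative only: Aliro has not published its API, so this sketch uses the
# open-source Qiskit library to show the general "write once, retarget to
# different hardware" idea the company describes.
from qiskit import QuantumCircuit, transpile

# Write the algorithm once, in a hardware-agnostic way: a simple Bell state.
circuit = QuantumCircuit(2, 2)
circuit.h(0)
circuit.cx(0, 1)
circuit.measure([0, 1], [0, 1])

# Different machines expose different native gate sets; a compiler layer
# rewrites the same circuit for each target.
hardware_targets = {
    "machine_a": ["cx", "rz", "sx", "x"],
    "machine_b": ["cz", "rz", "sx", "x"],
}

for name, basis_gates in hardware_targets.items():
    compiled = transpile(circuit, basis_gates=basis_gates, optimization_level=3)
    print(f"{name}: depth {compiled.depth()}, ops {dict(compiled.count_ops())}")
```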

The development points to an interesting way that we may well see quantum computing develop, at least in its early stages. Today, a handful of companies are building and working on quantum computers, but there is still a question mark over whether these kinds of machines will ever be widely deployed, or whether — like cloud computing — they will sit with a smaller number of providers that offer access to them on demand, SaaS-style. Such a model would seem to fit with how much computing is sold today in the form of instances, and would open the door to large cloud names like Amazon, Google and Microsoft playing a big role in how this would be disseminated.

Such questions are still theoretical, of course, given some of the underlying problems that have yet to be fixed, but the march of progress seems inevitable, with forecasts predicting that quantum computing is likely to be a $2.2 billion industry by 2025. If that is the route taken, middlemen like Aliro could play an important role.

“I have been working with the Aliro team for the past year and could not be more excited about the opportunity to help them build a foundational company in quantum computing software,” said David Aronoff, general partner at Flybridge, in a statement. “Their innovative approach and unique combination of leading quantum researchers and a world-class proven executive team make Aliro a formidable player in this exciting new sector.”

“At Samsung NEXT we are focused on what the world will look like in the future, helping to make that a reality,” said Ajay Singh of Samsung NEXT’s Q Fund, in a statement. “We were drawn to Prineha and her team by their impressive backgrounds and extent of research into quantum computing. We believe that Aliro’s unique software products will revolutionize the entire category, by speeding up the inflection point where quantum becomes as accessible as classical computing. This could have implications on anything from drug discovery, materials development or chemistry. Aliro’s ability to map quantum circuits to heterogeneous hardware in an efficient way will be truly transformative and we’re thrilled to be on this journey with them.”

Salesforce brings AI power to its search tool

Enterprise search tools have always suffered from the success of Google. Users wanted to find the content they needed internally in the same way they found it on the web. Enterprise search has never been able to meet those lofty expectations, but today Salesforce announced Einstein Search, an AI-powered search tool for Salesforce users that is designed to point them to the exact information for which they are looking.

Will Breetz, VP of product management at Salesforce, says that enterprise search has suffered over the years for a variety of reasons. “Enterprise search has gotten a bad rap, but deservedly so. Part of that is because in many ways it is more difficult than consumer search, and there’s a lot of headwinds,” Breetz explained.

To solve these issues, the company decided to bring the power of its Einstein artificial intelligence engine to bear on the problem. For starters, it might not know the popularity of a given topic the way Google does, but it can learn the behaviors of an individual and deliver a more meaningful answer based on that person’s profile, including geography and past activity.

Einstein Search. Image: Salesforce

Next, it allows you to enter natural language search phrasing to find the exact information you need, and the search tool understands the request and delivers the results. For instance, you could enter “my open opportunities in Boston” and, using natural language understanding, the tool can translate that into the exact set of results you are looking for — your open opportunities in Boston. You could get there with conventional search by clicking a series of check boxes to narrow the list of results to only Boston, but this is faster and more efficient.
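Salesforce hasn’t detailed how Einstein Search parses these phrases, but a minimal sketch of the general idea, turning a natural-language query into structured filters scoped to the current user, might look something like this (the field names and parsing rules are hypothetical, not Salesforce’s):

```python
# Hypothetical sketch of mapping a natural-language query to structured
# filters; Salesforce has not published how Einstein Search parses queries,
# and all field names here are illustrative.
import re

def parse_query(query, current_user):
    """Translate a phrase like 'my open opportunities in Boston' into filters."""
    filters = {"object": "Opportunity"}
    q = query.lower()
    if "my" in q.split():
        filters["owner"] = current_user          # personalize to the searcher
    if "open" in q.split():
        filters["stage"] = "open"                # exclude closed records
    match = re.search(r"\bin ([a-z ]+)$", q)
    if match:
        filters["city"] = match.group(1).title() # simple location extraction
    return filters

print(parse_query("my open opportunities in Boston", current_user="asmith"))
# {'object': 'Opportunity', 'owner': 'asmith', 'stage': 'open', 'city': 'Boston'}
```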

Finally, based on what the intelligence engine knows about you, and on your search parameters, it can predict the most likely actions you want to take and provide quick action buttons in the results to help you do that, reducing the time to action. It may not seem like much, but each reduced workflow adds up throughout a day, and the idea is to anticipate your requirements and help you get your work done more quickly.

Salesforce appears to have flipped the enterprise search problem. Instead of letting a limited set of data be a handicap for enterprise search, it is taking advantage of that constraint and applying AI to help deliver more meaningful results. The tool covers a limited set of record types for now, such as accounts, contacts and opportunities, but the company plans to add more options over time.

Tableau update uses AI to increase speed to insight

Tableau was acquired by Salesforce earlier this year for $15.7 billion, but long before that, the company had been working on its fall update, and today it announced several new tools, including a new feature called “Explain Data” that uses AI to get to insight quickly.

“What Explain Data does is it moves users from understanding what happened to why it might have happened by automatically uncovering and explaining what’s going on in your data. So what we’ve done is we’ve embedded a sophisticated statistical engine in Tableau, that when launched automatically analyzes all the data on behalf of the user, and brings up possible explanations of the most relevant factors that are driving a particular data point,” Tableau chief product officer, Francois Ajenstat explained.

He added that what this really means is that it saves users time by automatically doing the analysis for them, and it should help them do better analysis by removing biases and helping them dive deep into the data in an automated fashion.

Explain Data surfacing an extreme value. Image: Tableau

Ajenstat says this is a major improvement, in that previously users would have had to do all of this work manually. “So a human would have to go through every possible combination, and people would find incredible insights, but it was manually driven. Now with this engine, they are able to essentially drive automation to find those insights automatically for the users,” he said.

He says this has two major advantages. First of all, because it’s AI-driven it can deliver meaningful insight much faster, but also it gives a more rigorous perspective of the data.
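Tableau hasn’t published the internals of that statistical engine, but the pattern Ajenstat describes, automatically scanning every dimension to see which factor best explains an extreme value, can be sketched in a few lines of pandas (the data and the scoring method here are illustrative, not Tableau’s):

```python
# Rough illustration only: Tableau has not published how Explain Data works.
# This sketch shows the general idea of automatically scanning dimensions to
# find which factor most sets an extreme data point apart from the rest.
import pandas as pd

def explain_extreme_value(df, measure, dimensions, selected):
    """Rank (dimension, value) pairs by how far their mean deviates from the overall mean."""
    overall_mean = df[measure].mean()
    subset = df
    for col, val in selected.items():          # restrict to the selected mark
        subset = subset[subset[col] == val]
    explanations = []
    for dim in dimensions:
        for value, group in subset.groupby(dim):
            deviation = group[measure].mean() - overall_mean
            explanations.append((dim, value, deviation))
    # The biggest absolute deviations are offered as candidate explanations.
    return sorted(explanations, key=lambda e: abs(e[2]), reverse=True)[:3]

sales = pd.DataFrame({
    "region":   ["East", "East", "West", "West", "West"],
    "category": ["Tables", "Chairs", "Tables", "Chairs", "Tables"],
    "profit":   [-450.0, 120.0, -600.0, 80.0, -520.0],
})
print(explain_extreme_value(sales, "profit", ["region", "category"],
                            selected={"category": "Tables"}))
```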

In addition, the company announced a new Catalog feature, which provides data breadcrumbs showing the source of the data, so users can know where the data came from and whether it’s relevant or trustworthy.

Finally, the company announced a new server management tool that helps companies with broad Tableau deployment across a large organization to manage those deployments in a more centralized way.

All of these features are available starting today for Tableau customers.

Before He Spammed You, this Sly Prince Stalked Your Mailbox

A reader forwarded what he briefly imagined might be a bold, if potentially costly, innovation on the old Nigerian prince scam that asks for help squirreling away millions in unclaimed fortune: It was sent via the U.S. Postal Service, with a postmarked stamp and everything.

In truth, these old-fashioned “advance fee” or “419” scams predate email and have circulated via postal mail in various forms and countries over the years.

The recent one pictured below asks for help in laundering some $11.6 million purportedly left behind by an important dead person with access to a secret stash of cash. Any suckers who bite are strung along for weeks while imaginary extortionists or crooked employees at various bureaucratic institutions demand licenses, bribes or other payments before disbursing any funds. Those funds never arrive, no matter how much money the sucker gives up.

This type of “advance fee” or “419” scam letter is common in spam, probably less so via USPS.

It’s easy to laugh at this letter, because it’s sometimes funny when scammers try so hard. But then again, maybe the joke’s on us because sending these scams via USPS makes them even more appealing to the people most vulnerable: Older individuals with access to cash but maybe not all their marbles. 

Sure, the lure costs $0.55 up front. But a handful of successful responses to thousands of mailers could net fortunes for these guys phishing it old school.

The losses from these types of scams are sometimes hard to track because so many go unreported. But they are often perpetrated by the same people involved in romance scams online and in so-called “business email compromise” or BEC fraud, wherein the scammers try to spoof the boss at a major company in a bid to get wire payment for an “urgent” (read: fraudulent) invoice.

These scam letters are sometimes called 419 scams, in reference to the section of the Nigerian penal code that deals with such crimes; Nigeria is a perennial source of 419 letter schemes. A recent FBI bust of a Nigerian gang gives some perspective on the money-making abilities of these operations: the $10 million ring was running such scams all day long.

Reportedly, in the first seven months of 2019 alone the FBI received nearly 14,000 complaints reporting BEC scams with a total loss of around $1.1 billion—a figure that nearly matches losses reported for all of 2018.

Data storage company Cloudian launches a new edge analytics subsidiary called Edgematrix

Cloudian, a company that enables businesses to store and manage massive amounts of data, announced today the launch of Edgematrix, a new unit focused on edge analytics for large data sets. Edgematrix, a majority-owned subsidiary of Cloudian, will first be available in Japan, where both companies are based. It has raised a $9 million Series A from strategic investors NTT Docomo, Shimizu Corporation and Japan Post Capital, as well as Cloudian co-founder and CEO Michael Tso and board director Jonathan Epstein. The funding will be used for product development, deployment, and sales and marketing.

Cloudian itself has raised a total of $174 million, including a $94 million Series E round announced last year. Its products include the Hyperstore platform, which allows businesses to store hundreds of petabytes of data on premise, and software for data analytics and machine learning. Edgematrix uses Hyperstore for storing large-scale data sets and its own AI software and hardware for data processing at the “edge” of networks, closer to where data is collected from IoT devices like sensors.

The company’s solutions were created for situations where real-time analytics is necessary. For example, it can be used to detect the make, model and year of cars on highways so targeted billboard ads can be displayed to their drivers.

Tso told TechCrunch in an email that Edgematrix was launched after Cloudian co-founder and president Hiroshi Ohta and a team spent two years working on technology to help Cloudian customers process and analyze their data more efficiently.

“With more and more data being created at the edge, including IoT data, there’s a growing need for being able to apply real-time data analysis and decision-making at or near the edge, minimizing the transmission costs and latencies involved in moving the data elsewhere,” said Tso. “Based on the initial success of a small Cloudian team developing AI software solutions and attracting a number of top-tier customers, we decided that the best way to build on this success was establishing a subsidiary with strategic investors.”
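Edgematrix hasn’t published its software stack, but the architecture Tso describes, doing the heavy analysis at the edge and shipping only compact results to central storage, can be sketched roughly as follows (the detector and upload client are placeholders, not Edgematrix’s actual components):

```python
# Hypothetical sketch of the edge-analytics pattern Tso describes: analyze
# sensor data locally and ship only small results upstream, rather than raw
# footage. The detector and upload client here are placeholders, not
# Edgematrix's actual software.
import json
import time

def detect_vehicles(frame):
    """Placeholder for an on-device model that returns make/model/year per car."""
    return [{"make": "Toyota", "model": "Prius", "year": 2018}]

def upload_summary(summary):
    """Placeholder for sending a few bytes of results to central (S3-style) storage."""
    print("uploading:", json.dumps(summary))

def process_stream(camera_frames):
    for frame in camera_frames:
        detections = detect_vehicles(frame)       # heavy compute stays at the edge
        summary = {
            "timestamp": time.time(),
            "vehicle_count": len(detections),
            "vehicles": detections,
        }
        upload_summary(summary)                    # only compact metadata leaves the site

process_stream(camera_frames=[b"frame-1", b"frame-2"])
```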

Edgematrix is launching in Japan first because spending on AI systems there is expected to grow faster than in any other market, at a compound annual growth rate of 45.3% from 2018 to 2023, according to IDC.

“Japan has been ahead of the curve as an early adopter of AI technology, with both the government and private sector viewing it as essential to boosting productivity,” said Tso. “Edgematrix will focus on the Japanese market for at least the next year, and assuming that all goes well, it would then expand to North America and Europe.”

LinkedIn launches skills assessments, tests that let you beef up your credentials for job hunting

LinkedIn, the social networking service for the working world, is today taking the wraps off its latest effort to provide its users with better tools for presenting their professional selves, and to make the process of recruitment on the platform more effective. It will now offer a new feature called Skills Assessments: short, multiple-choice tests that users can take to verify their knowledge in areas like computer languages, software packages and other work-related skills.

The feature is being rolled out globally today. LinkedIn tells us that during an earlier, limited beta, 2 million tests were taken and applied across the platform. That’s a sign of how popular, and needed, the full service might well be.

First up are English-language tests covering some 75 different skills, all free to take, but the plan, according to Emrecan Dogan, the group product manager in its talent solutions division, is to “ramp that up aggressively” in the near future, both by adding different languages and by covering more test areas.

(Side note: Dogan joined LinkedIn when his company ScoreBeyond was quietly acquired by LinkedIn last year. ScoreBeyond was an online testing service to help students prep for college entrance exams. Given LinkedIn’s efforts to get closer to younger users — again, in part because of competitive pressure — I suspect that is one area where LinkedIn will likely want to expand this assessment tool longer term, if it takes off.)

The skills assessment tool is coming at an important moment for LinkedIn.

The Microsoft-owned company now has nearly 650 million people around the world using its social networking tools to connect with each other for professional purposes, most often to network, talk about work, or find work.

That makes for a fascinating and lucrative economy of scale when it comes to rolling out its products. But it comes with a major drawback, too: the bigger the platform gets, the harder it is to track and verify details about each and every individual on it. The skills assessment becomes one way of at least being able to verify certain people’s skills in specific areas, and for that information to start feeding into other channels and products on the platform.

It’s also a critical competitive move. The company is by far the biggest platform of its kind on the internet today, but smaller rivals are building interesting products to chip away at that lead in specific areas. Triplebyte, for example, has created a platform for those looking to hire engineers, and engineers looking for new roles, to connect by way of the engineers — yes — taking online tests to measure their skills and match them up with compatible job opportunities. Triplebyte is focused on just one field — software engineering — but the template is a disruptive one that, if replicated in other verticals, could slowly start to chip away at LinkedIn’s hegemony.

Other larger platforms also continue to look at ways they might leverage their own social graphs to provide work-related networking services. Facebook, for example, has incorporated e-learning into its own efforts in professional development, laying the groundwork for other kinds of interactive training and assessment.

This is not the first time that LinkedIn has tinkered with the idea of offering tests to help ascertain the level of users’ skills on its platform, although the information was used for different ends. In India, several years ago, the company started to incorporate tests on its platform to help suggest jobs to users. Nor is it the first time that the company has worked on ways to improve its skills and endorsements features to make them more useful.

Testing on actual skills is just one area where verification has fallen short on LinkedIn. Another big trend in recruitment is the push for more diverse workforces. The thinking is that traditionally too many of the parameters that have been used up to now to assess people — what college was attended, or where people have worked already — have been essentially cutting many already-disenfranchised groups out of the process.

Given that LinkedIn currently has no way of ascertaining when people on its platform are from minority backgrounds, a skills assessment — and especially a good result on one — might potentially help tip the balance in favor of meritocracy (if not proactive, diversity-focused hiring as such).

For regular users, the option to take skills assessments and add them to a profile will appear as a button in the skills and endorsements area of the profile.

Users take short tests — currently only multiple choice — which Dogan says are created by subject-matter experts who already work with LinkedIn, for example writing content for LinkedIn Learning.

Indeed, in November last year, the company expanded LinkedIn Learning to include content from third-party providers and Q&A interactivity so there is a trove of work already there that might be repurposed as part of this new effort.

These tests measure your knowledge in specific areas, and if you pass, you are given a badge that you can apply to your profile page, and potentially broadcast out to those who are looking for people with the skills you’ve just verified you have. (This is presuming that you are not cheating and having someone else take the test for you, or taking it while looking up answers elsewhere.) You can opt out of sharing the information anywhere else, if you choose.

If you fail, you have three months to wait before taking it again, and in the meantime LinkedIn will use the moment to upsell you on its other content: you get offered LinkedIn Learning tests to improve your skills.

Those who pass will need to retake tests every year to keep their badges and credentials.

Recruiters, meanwhile, can use the data amassed through the tests to better filter users when sourcing candidate pools for job openings. This is a huge issue on a platform like LinkedIn: while having a large group of people on the platform is a boost for finding matches, there can in fact be too many, making it a challenge and a time suck to figure out who is genuinely suitable for a particular role.

The skills data gives LinkedIn another way to monetise: those posting job ads can now buy ads targeted specifically at people whose relevant skills have been verified through assessments.

There are still some shortfalls in the skills assessment tool as it exists now. For example, coding tests are all multiple choice, but that’s not how many coding environments work these days. (Triplebyte, for example, offers collaborative assessments.) And of course, skills are just one aspect of how people might fit into a particular working environment. (Currently there are no plans to bring in psychometric or similar assessments, Dogan said.) This is an interesting start, however, and worth testing the waters as more interesting variations in recruitment and connecting professionals online continue to proliferate.

 

GitLab hauls in $268M Series E on $2.75B valuation

GitLab is a company that doesn’t pull any punches or try to be coy. It actually has had a page on its website for some time stating it intends to go public on November 18, 2020. You don’t see that level of transparency from late-stage startups all that often. Today, the company announced a huge $268 million Series E on a tidy $2.75 billion valuation.

Investors include Adage Capital Management, Alkeon Capital, Altimeter Capital, Capital Group, Coatue Management, D1 Capital Partners, Franklin Templeton, Light Street Capital, Tiger Management Corp. and Two Sigma Investments.

The company seems to be primed and ready for that eventual IPO. Last year, GitLab co-founder and CEO Sid Sijbrandij said that his CFO, Paul Machle, told him he wanted to begin planning to go public, and that he would need two years to prepare the company. As Sijbrandij tells it, he told Machle to pick a date.

“He said, I’ll pick the 16th of November because that’s the birthday of my twins. It’s also the last week before Thanksgiving, and after Thanksgiving, the stock market is less active, so that’s a good time to go out,” Sijbrandij told TechCrunch.

He said that he considered it a done deal and put the date on the GitLab Strategy page, a page that outlines the company’s plans for everything it intends to do. It turned out that he was a bit too quick on the draw. Machle had checked the date in the interim and realized that it was a Monday, which is not traditionally a great day to go out, so they decided to do it two days later. Now the target date is officially November 18, 2020.

GitLab has the date it’s planning to go public listed on its Strategy page.

As for that $268 million, it gives the company considerable runway ahead of that planned event, but Sijbrandij says it also gives him flexibility in how to take the company public. “One other consideration is that there are two options to go public. You can do an IPO or direct listing. We wanted to preserve the optionality of doing a direct listing next year. So if we do a direct listing, we’re not going to raise any additional money, and we wanted to make sure that this is enough in that case,” he explained.

Sijbrandij says that the company made a deliberate decision to be transparent early on. For a company built on an open-source project, it’s sometimes tricky to make the transition to a commercial business, and sometimes that has a negative impact on the community and the number of contributions. Transparency was a way to combat that, and it seems to be working.

He reports that the community contributes 200 improvements to the GitLab open-source product every month, and that’s double the amount of just a year ago, so the community is still highly active in spite of the parent company’s commercial success.

It did not escape his notice that Microsoft acquired GitHub last year for $7.5 billion; GitLab is a similar kind of company that helps developers manage and distribute code in a DevOps environment. But he says that in spite of that eye-popping number, his goal is to remain an independent company and take GitLab through to the next phase.

“Our ambition is to stay an independent company. And that’s why we put out the ambition early to become a listed company. That’s not totally in our control as the majority of the company is owned by investors, but as long as we’re more positive about the future than the people around us, I think we have a shot at not getting acquired,” he said.

The company was founded in 2014 and was a member of Y Combinator in 2015. It has been on a steady growth trajectory ever since, hauling in more than $426 million. The last round before today’s announcement was a $100 million Series D last September.

IEX’s Katsuyama is no flash in the pan

When you watch a commercial for one of the major stock exchanges, you are welcomed into a world of fast-moving, slick images full of glistening buildings, lush crops and happy people. They are typically interspersed with shots of intrepid executives veering out over the horizon as if to say, “I’ve got a long-term vision, and the exchange where my stock is listed is a valuable partner in achieving my goals.” It’s all very reassuring and stylish. But there’s another side to the story.

I have been educated about the realities of today’s stock exchange universe through recent visits with Brad Katsuyama, co-founder and CEO of IEX (a.k.a. The Investors Exchange). If Katsuyama’s name rings a bell, and you don’t work on Wall Street, it’s likely because you remember him as the protagonist of Michael Lewis’s 2014 best-seller, Flash Boys: A Wall Street Revolt, which explored high-frequency trading (HFT) and made the case that the stock market was rigged, really badly.

Five years later, some of the worst practices Lewis highlighted are things of the past, and there are several attributes of the American equity markets that are widely admired around the world. In many ways, though, the realities of stock trading have gotten more unseemly, thanks to sophisticated trading technologies (e.g., microwave radio transmissions that can carry information at almost the speed of light), and pitched battles among the exchanges, investors and regulators over issues including the rebates stock exchanges pay to attract investors’ orders and the price of market data charged by the exchanges.

I don’t claim to be an expert on the inner workings of the stock market, but I do know this: Likening the life cycle of a trade to sausage-making is an insult to kielbasa. More than ever, trading is an arcane, highly technical and bewildering part of our broader economic infrastructure, which is just the way many industry participants like it: Nothing to see here, folks.

Meanwhile, Katsuyama, company president Ronan Ryan and the IEX team have turned IEX into the eighth largest stock exchange company, globally, by notional value traded, and have transformed the concept of a “speed bump” into a mainstream exchange feature.

Brad Katsuyama. Image by Joshua Blackburn via IEX Trading

Despite these and other accomplishments, IEX finds itself in the middle of a vicious battle with powerful incumbents that seem increasingly emboldened to use their muscle in Washington, D.C. What’s more, new entrants, such as The Long-Term Stock Exchange and Members Exchange, are gearing up to enter the fray in US equities, while global exchanges such as the Hong Kong Stock Exchange seek to bulk up by making audacious moves like attempting to acquire the venerable London Stock Exchange.

But when you sell such distinct advantages to one group, and that group is really the only one that can benefit from them, it raises the question of why anyone else would want to trade on that market. It’s like walking onto a playing field where you know the deck is stacked against you.

As my discussion with Katsuyama reveals, IEX may have taken some punches in carving out a position for itself in this high-stakes war characterized by cutting-edge technology and size. However, the IEX team remains girded for battle and confident that it can continue to make headway in offering a fair and transparent option for market participants over the long term.

Gregg Schoenberg: Given Flash Boys and the attention it generated for you on Main Street, I’d like to establish something upfront. Does IEX exist for the asset manager, the individual, or both?

Brad Katsuyama: We exist primarily for the asset manager, and helping them helps the individual. We’re one step removed from the individual, and part of that is due to regulation. Only brokers can connect to exchanges, and the asset manager connects to the broker.

Schoenberg: To put a finer point on it, you believe in fairness and being the good guy. But you are not Robinhood. You are a capitalist.

Katsuyama: Yes, but we want to make money fairly. Actually, we thought initially about starting the business as a nonprofit. But once we laid out all the people we would need to convince to work for us, we realized it would’ve been hard for us to attract the skill sets needed as a nonprofit.

Schoenberg: Do you believe that the US equity market today primarily serves investors or traders?

Boston-based DataRobot raises $206M Series E to bring AI to enterprise

Artificial intelligence is playing an increasingly large role in enterprise software, and Boston’s DataRobot has been helping companies build, manage and deploy machine learning models for some time now. Today, the company announced a $206 million Series E investment led by Sapphire Ventures.

Other participants in this round included new investors Tiger Global Management, World Innovation Lab, Alliance Bernstein PCI, and EDBI along with existing investors DFJ Growth, Geodesic Capital, Intel Capital, Sands Capital, NEA and Meritech.

Today’s investment brings the total raised to $431 million, according to the company. It has a pre-money valuation of $1 billion, according to PitchBook. DataRobot would not confirm this number.

The company has been catching the attention of these investors by offering a machine learning platform aimed at analysts, developers and data scientists, helping them build predictive models much more quickly than with traditional methodologies. Once a model is built, the company provides a way to deliver it in the form of an API, simplifying deployment.
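DataRobot’s own tooling isn’t shown in the announcement, but the general pattern the company sells, train a predictive model and then expose it behind an API, can be sketched with scikit-learn and FastAPI (the data, feature names and endpoint here are purely illustrative, not DataRobot’s product):

```python
# Generic sketch of the "train a model, then serve it as an API" pattern the
# article describes; this uses scikit-learn and FastAPI rather than DataRobot's
# actual platform, and the feature names are made up.
from fastapi import FastAPI
from pydantic import BaseModel
from sklearn.ensemble import RandomForestClassifier

# Train a predictive model (in practice, AutoML-style tooling would search
# over many candidate models and pick the best one).
X_train = [[34, 52000.0], [61, 87000.0], [25, 31000.0], [47, 64000.0]]
y_train = [0, 1, 0, 1]   # e.g. whether a customer churned
model = RandomForestClassifier(n_estimators=50, random_state=0).fit(X_train, y_train)

app = FastAPI()

class Features(BaseModel):
    age: int
    income: float

@app.post("/predict")
def predict(features: Features):
    """Expose the trained model behind a simple JSON endpoint."""
    score = model.predict_proba([[features.age, features.income]])[0][1]
    return {"churn_probability": round(float(score), 3)}

# Run with: uvicorn this_module:app --reload
```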

The late-stage startup plans to use the money to continue building out its product line, while looking for acquisition opportunities where it makes sense. The company also announced the availability of a new product today, DataRobot MLOps, a tool to manage, monitor and deploy machine learning models across a large organization.

The company, which was founded in 2012, claims it has had triple-digit recurring revenue growth dating back to 2015, as well as one billion models built on the platform to date. Customers contributing to that number include a broad range of companies, such as Humana, United Airlines, Harvard Business School and Deloitte.