What is XDR (and Why Do Enterprises Need It)?

In the world of cybersecurity, acronyms abound. From AV to EPP to EDR and now XDR, these changing technologies reflect an ever-present truth: cyber threat actors keep evolving, and defenders need to stay one or more steps ahead. Coupled with the shifting threat landscape are innovations in business and business operations themselves. We’ve moved from an on-prem world bounded by a manageable network perimeter to a distributed, cloud-powered infrastructure, with remote working and 5 billion monthly teleconferences adding to the complexity of ensuring business and operational security. On top of all that, as any CISO will tell you, the number of cyber attacks, the number of attackers, and the range of offensive tooling are all increasing.

The security technologies of the past were not built to cope with today’s complex, fast-moving threatscape. The evidence for that is compelling: rising ransomware attacks coupled with data breaches and IP theft, strained SOC teams dealing with alert fatigue and staffing shortages, and the proliferation of attacks that succeed despite the presence of traditional security tools.

Fortunately, these are just some of the problems XDR was designed to solve. In this post, we’ll explain what XDR is and how it changes the game to empower enterprise security teams and put threat actors on the back foot.

What is XDR?

XDR, Extended Detection and Response, is the evolution of EDR, Endpoint Detection and Response. EDR, particularly ActiveEDR, brought visibility and automated response to endpoints like laptops and workstations, but today’s network has many other points that attackers may traverse on the road to a successful compromise, from mobile phones and IoT devices to containers and cloud-native applications.

Sometimes referred to as “Cross-Layered” or “Any Data Source” detection and response, XDR supersedes EDR by delivering visibility into all data that crosses the network, rather than just data from the endpoint layer. XDR platforms like SentinelOne’s Singularity Platform collect data from all assets across the enterprise environment, collating it into a single data lake, and apply security analytics and AI across multiple security layers to provide enhanced automated detection and response.
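To make the cross-layer idea concrete, here is a deliberately minimal sketch (not SentinelOne’s actual implementation; the event fields, sources, and thresholds are hypothetical) of how an XDR pipeline might correlate telemetry from different layers by a shared key such as host:

```python
from collections import defaultdict

# Hypothetical events from three layers; real XDR telemetry is far richer.
events = [
    {"layer": "endpoint", "host": "laptop-7", "ts": 100, "signal": "suspicious process"},
    {"layer": "identity", "host": "laptop-7", "ts": 130, "signal": "impossible-travel login"},
    {"layer": "cloud",    "host": "vm-2",     "ts": 150, "signal": "bucket policy change"},
    {"layer": "cloud",    "host": "laptop-7", "ts": 160, "signal": "unusual API calls"},
]

def correlate(events, window=120):
    """Group events by host; flag hosts with signals from 2+ layers inside the window."""
    by_host = defaultdict(list)
    for e in events:
        by_host[e["host"]].append(e)
    incidents = {}
    for host, evs in by_host.items():
        evs.sort(key=lambda e: e["ts"])
        layers = {e["layer"] for e in evs if e["ts"] - evs[0]["ts"] <= window}
        if len(layers) >= 2:
            incidents[host] = sorted(layers)
    return incidents

print(correlate(events))  # only laptop-7 spans multiple layers
```

A single-layer tool would see each of laptop-7’s signals in isolation; it is the cross-layer join that surfaces the incident.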

What Are the Benefits of XDR over EDR?

With a single pool of raw data comprising information from across the entire ecosystem, XDR allows faster, deeper and more effective threat detection and response than EDR, collecting and collating data from a wider range of sources.

Cyber attacks typically travel across many enterprise assets, and only the visibility provided by XDR is capable of creating a full narrative of what happened, when, where and how. With XDR, the same contextualized Storylines that ActiveEDR provides across the Endpoint layer can be constructed across multiple layers: cloud, containers, virtual machines, IoT, endpoints, servers, and so on.

This comprehensive visibility leads to several benefits, including:

  • increased ability to detect stealthy attacks
  • reduced dwell time
  • increased speed of mitigation

Moreover, thanks to AI and automation, XDR reduces the burden of manual work on security analysts. An XDR platform like Singularity can proactively and rapidly identify sophisticated threats, increasing the productivity of the security or SOC team and returning a massive boost in ROI for the organization.

How is XDR Different from SIEM?

Although both XDR and SIEM tools collect data from multiple sources, they have almost nothing else in common. Unlike an XDR platform, SIEMs (like passive EDR tools) have no ability to identify meaningful trends, nor do they provide any automated detection or response abilities. Further, to be useful, SIEMs require a great deal of manual investigation and analysis.

Fortunately, if you have invested in SIEM tools, these need not be made redundant by your XDR platform, as they can directly feed into your XDR platform’s data lake, exposing all that raw data to the XDR’s AI and machine learning capabilities.

What Should I Look For in a Good XDR Solution?

The first key to an effective XDR solution is integration. It needs to work seamlessly across your security stack and provide native tools with rich APIs. Beware immature or rushed solutions that may be nothing more than old tools bolted together. Your XDR should offer a single platform that allows you to easily and rapidly build a comprehensive view of the entire enterprise.

Secondly, automation backed by advanced AI and proven machine learning algorithms is essential. Does your vendor have a rich history of developing state-of-the-art AI models, or are they primarily known for legacy technologies and now trying to change their spots?

Thirdly, how easy is your XDR solution to learn, maintain, configure and update? One of the main advantages that a strong XDR solution brings is increased productivity for your staff through automated detection and response. However, you want to be sure you’re not simply redirecting your staff’s work into managing or navigating a complicated solution.

What Are the Benefits of SentinelOne’s Singularity, AI-Powered XDR Platform?

SentinelOne’s AI-Powered XDR Platform brings all the benefits you’d expect from a complete solution: deep visibility, automated detection and response, rich integration and operational simplicity. With a single codebase and deployment model, Singularity is the first XDR platform to incorporate both IoT and CWPP.

All IoT data is seamlessly integrated into Singularity for ease of threat hunting and never-seen-before context. Using AI to monitor and control access to every IoT device, Singularity XDR allows machines to solve a problem that previously was impossible to address at scale.

Singularity’s container workload protection is supported on all major Linux platforms, physical and virtual, cloud native workloads, and Kubernetes containers. It provides prevention, detection, response, and hunting for known and unknown cyber threats. This includes malicious files and live attacks across cloud-native and containerized environments, offering advanced response options and autonomous remediation in real time.

Forrester TEI Study: SentinelOne Singularity XDR Platform Can Deliver ROI of 353%

Conclusion

Cyber security is often likened to an arms race between attackers and defenders, and that race is now extending beyond the single layer of the endpoint. As businesses embrace remote working and cloud infrastructure, introducing an increasing attack surface, only an integrated platform can provide the visibility and automated defences required across all assets. By combining endpoint, network, and application telemetry, XDR can provide the security analytics to win that race through enhanced detection, triage and response. If you’d like to know more about SentinelOne’s Singularity Platform, contact us or request a demo.


Like this article? Follow us on LinkedIn, Twitter, YouTube or Facebook to see the content we post.

Read more about Cyber Security

Materialize scores $40 million investment for SQL streaming database

Materialize, the SQL streaming database startup built on top of the open-source Timely Dataflow project, announced a $32 million Series B investment led by Kleiner Perkins, with participation from Lightspeed Ventures.

While it was at it, the company also announced a previously unannounced $8 million Series A from last year, led by Lightspeed, bringing the total raised to $40 million.

These firms see a solid founding team that includes CEO Arjun Narayan, formerly of Cockroach Labs, and chief scientist Frank McSherry, who created the Timely Dataflow project on which the company is based.

Narayan says that the company believes fundamentally that every company needs to be a real-time company, and it will take a streaming database to make that happen. Further, he says the company is built using SQL because of its ubiquity, and the founders wanted to make sure that customers could access and make use of that data quickly without learning a new query language.

“Our goal is really to help any business to understand streaming data and build intelligent applications without using or needing any specialized skills. Fundamentally what that means is that you’re going to have to go to businesses using the technologies and tools that they understand, which is standard SQL,” Narayan explained.

Bucky Moore, the partner at Kleiner Perkins leading the B round, sees this standard querying ability as a key part of the technology. “As more businesses integrate streaming data into their decision-making pipelines, the inability to ask questions of this data with ease is becoming a non-starter. Materialize’s unique ability to provide SQL over streaming data solves this problem, laying the foundation for them to build the industry’s next great data platform,” he said.

They would naturally get compared to Confluent, a streaming platform built on top of the open-source Apache Kafka project, but Narayan says his company uses straight SQL for querying, while Confluent uses its own flavor.

The company still is working out the commercial side of the house and currently provides a typical service offering for paying customers with support and a service agreement (SLA). The startup is working on a SaaS version of the product, which it expects to release some time next year.

They currently have 20 employees with plans to double that number by the end of next year as they continue to build out the product. As they grow, Narayan says the company is definitely thinking about how to build a diverse organization.

He says he’s found that hiring in general has been challenging during the pandemic, and he hopes that changes in 2021, but he says that he and his co-founders are looking at the top of the hiring funnel because otherwise, as he points out, it’s easy to get complacent and rely on the same network of people you have been working with before, which tends to be less diverse.

“The KPIs and the metrics we really want to use to ensure that we really are putting in the extra effort to ensure a diverse sourcing in your hiring pipeline and then following that through all the way through the funnel. That’s I think the most important way to ensure that you have a diverse [employee base], and I think this is true for every company,” he said.

While he is working remotely now, he sees having multiple offices with a headquarters in NYC when the pandemic finally ends. Some employees will continue to work remotely, with the majority coming into one of the offices.

As Slack acquisition rumors swirl, a look at Salesforce’s six biggest deals

The rumors ignited last Thursday that Salesforce had interest in Slack. This morning, CNBC is reporting the deal is all but done and will be announced tomorrow. Chances are this is going to be a big number, but it won’t be Salesforce’s first big acquisition. We thought it would be useful, in light of these rumors, to look back at the company’s biggest deals.

Salesforce has already surpassed $20 billion in annual revenue, and the company has a history of making a lot of deals to fill in the road map and give it more market lift as it searches for ever more revenue.

The biggest deal so far was the $15.7 billion Tableau acquisition last year. The deal gave Salesforce a missing data visualization component and a company with a huge existing market to feed the revenue beast. In an interview with TechCrunch in August, Salesforce president and chief operating officer Bret Taylor (who came to the company in the $750 million Quip deal in 2016) described Tableau as a key part of the company’s growing success:

“Tableau is so strategic, both from a revenue and also from a technology strategy perspective,” he said. That’s because as companies make the shift to digital, it becomes more important than ever to help them visualize and understand that data in order to understand their customers’ requirements better.

Next on the Salesforce acquisition hit parade was the $6.5 billion MuleSoft acquisition in 2018. MuleSoft gave Salesforce access to something it didn’t have as an enterprise SaaS company — data locked in silos across the company, even in on-prem applications. The CRM giant could leverage MuleSoft to access data wherever it lived, and when you put the two mega deals together, you could see how you could visualize that data and also give more fuel to its Einstein intelligence layer.

In 2016, the company spent $2.8 billion on Demandware to make a big splash in e-commerce, a component of the platform that has grown in importance during the pandemic when companies large and small have been forced to move their businesses online. The company was incorporated into the Salesforce behemoth and became known as Commerce Cloud.

In 2013, the company made its first billion-dollar acquisition when it bought ExactTarget for $2.5 billion. This represented the first foray into what would become the Marketing Cloud. The purchase gave the company entrée into the targeted email marketing business, which again would grow increasingly in importance in 2020 when communicating with customers became crucial during the pandemic.

Last year, just days after closing the Tableau acquisition, Salesforce opened its wallet one more time and paid $1.35 billion for ClickSoftware. This one was a nod to the company’s Service Cloud, which encompasses both customer service and field service. This acquisition was about the latter, giving the company access to a bigger body of field service customers.

The final billion-dollar deal (until we hear about Slack perhaps) is the $1.33 billion Vlocity acquisition earlier this year. This one was a gift for the core CRM product. Vlocity gave Salesforce several vertical businesses built on the Salesforce platform and was a natural fit for the company. Using Vlocity’s platform, Salesforce could (and did) continue to build on these vertical markets giving it more ammo to sell into specialized markets.

While we can’t know for sure if the Slack deal will happen, it sure feels like it will, and chances are this deal will be even larger than Tableau as the Salesforce acquisition machine keeps chugging along.

C3.ai’s initial IPO pricing guidance spotlights the public market’s tech appetite

On the heels of news that DoorDash is targeting an initial IPO valuation up to $27 billion, C3.ai also dropped a new S-1 filing detailing a first-draft guess of what the richly valued company might be worth after its debut.

C3.ai posted an initial IPO price range of $31 to $34 per share, with the company anticipating a sale of 15.5 million shares at that price. The enterprise-focused artificial intelligence company is also selling $100 million of stock at its IPO price to Spring Creek Capital, and another $50 million to Microsoft at the same terms. And there are 2.325 million shares reserved for its underwriters as well.

The total tally of shares that C3.ai will have outstanding after its IPO block is sold, Spring Creek and Microsoft buy in, and its underwriters take up their option, is 99,216,958. At the extremes of its initial IPO price range, the company would be worth between $3.08 billion and $3.37 billion using that share count.

Those numbers decline by around $70 and $80 million, respectively, if the underwriters do not purchase their option.
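The back-of-the-envelope math checks out:

```python
shares = 99_216_958  # fully diluted post-IPO share count from the filing
low, high = 31, 34   # initial IPO price range, dollars per share

# Implied valuation at each end of the range, in billions of dollars.
print(round(shares * low / 1e9, 2), round(shares * high / 1e9, 2))  # 3.08 3.37

# If the underwriters skip their 2.325M-share option, the share count
# (and thus the valuation) drops by roughly $72M to $79M.
option = 2_325_000
print(round(option * low / 1e6), round(option * high / 1e6))
```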

So is the IPO a win for the company at those prices? And is it a win for all C3.ai investors? Amazingly enough, it feels like the answers are yes and no. Let’s explore why.

Slowing growth, rising valuation

If we just look at C3.ai’s revenue history in chunks, we can argue a growth story for the company: it grew from $73.8 million in the two quarters of 2019 ending July 31, to $81.8 million in revenue during the same portion of 2020. That’s growth of just under 11% on a year-over-year basis. Not great, but positive.
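The year-over-year figure follows directly from those two numbers:

```python
prior, current = 73.8, 81.8  # revenue, in $M, for the comparable two-quarter periods

# Year-over-year growth rate for the period.
growth = (current - prior) / prior
print(f"{growth:.1%}")  # 10.8%, i.e. just under 11%
```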

The Good, the Bad and the Ugly in Cybersecurity – Week 48

The Good

This week it was announced that INTERPOL and Group-IB successfully joined forces for what was dubbed “Operation Falcon”, arresting three individuals in Lagos, Nigeria. These individuals are charged with connections to larger organized crime groups involved in mid-level Business Email Compromise (BEC) operations as well as associated phishing and malware distribution operations.

According to the release from INTERPOL, some 50,000 victims of their crimes have been identified so far. If history is any guide, there are many more that are unknown or yet to be discovered. “Operation Falcon” ran for roughly a year and was instrumental in successfully tracking these criminals and facilitating the exchange of threat and actor data among the participating parties.

The criminal group was involved in the distribution of multiple commodity malware families including Nanocore, AgentTesla, LokiBot, Azorult and many others. Malicious emails were used to either link to or distribute the malware to their targets. All the standard social engineering lures were in play, including the typical ‘purchase order’ style phish. The criminals even used COVID-19-based lures in some of their operations (making their actions that much more unsavory).

We applaud the efforts of INTERPOL and Group-IB, and encourage everyone to continue to be vigilant against these attacks and continue to cooperate where possible to keep bringing these criminals to justice.

The Bad

This week, the FBI updated an earlier flash alert, FLASH MU-000136-MW, regarding cyber actors targeting misconfigured SonarQube instances and accessing proprietary source code of US government agencies and businesses.

Unknown actors have been targeting exposed and vulnerable SonarQube instances since at least April of 2020. These are considered high-value targets given they tend to contain source code repositories of both private entities and US Government agencies. Such sensitive data can be used by cybercriminals in a variety of ways, with the most common being exfiltration for the purpose of extortion. That is, similar to what we see with modern ransomware campaigns, malicious actors can threaten to publicly release the data should the victim not comply with the demands of the attacker.

The FLASH alert notes that there have already been multiple examples of leaked data from these repositories being distributed in the public domain. SonarQube’s recommendations for mitigation include:

  • Changing the SonarQube default settings, including changing default administrator username, password, and port (9000).
  • Placing SonarQube instances behind a login screen, and checking if unauthorized users have accessed the instance.
  • Revoking access to any application programming interface keys or other credentials that were exposed in a SonarQube instance, if feasible.
  • Configuring SonarQube instances to sit behind your organization’s firewall and other perimeter defenses to prevent unauthenticated access.
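As a rough sketch, an administrator could audit an instance’s settings against these recommendations before exposing it. The property names below (sonar.web.port, sonar.forceAuthentication) match commonly used SonarQube settings, but verify them against your version’s documentation; the audit logic itself is illustrative, not an official tool:

```python
def audit(config):
    """Return a list of findings for a SonarQube-style config dict."""
    findings = []
    if config.get("sonar.web.port", 9000) == 9000:
        findings.append("still listening on default port 9000")
    if not config.get("sonar.forceAuthentication", False):
        findings.append("anonymous access allowed; force authentication")
    if config.get("admin.password") == "admin":
        findings.append("default administrator password unchanged")
    return findings

# An instance left on its defaults trips all three checks.
risky = {"sonar.web.port": 9000, "admin.password": "admin"}
print(audit(risky))
```

A hardened configuration (non-default port, forced authentication, rotated credentials) would return an empty list.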

We encourage all SonarQube users to review the FLASH releases and take the recommended steps to stay safe and secure!

The Ugly

School districts and related entities have always been targets of malware attacks; however, there seems to be an uptick in recent ransomware attacks against schools and district administration infrastructure.

This week, news broke of the Baltimore County Public School district being “crippled” by a ransomware attack (early reports suggest Ryuk may be responsible), which effectively shut down schools for all 115,000 students on Wednesday. And Baltimore is not alone. Many schools and school districts appear in the lists of victims on ransomware attackers’ public blog sites. Egregor operators posted Spring ISD in Houston just this past week as well.

If we go back a little bit in history, we can also see other concerning examples:

  • Way House School – Maze
  • Fairfax County Public Schools – Maze
  • Clark County School District – Maze
  • Toledo Public Schools – Maze
  • Spring ISD – Egregor
  • Horry County Schools – CryptoLocker
  • Monroe County School District – GandCrab
  • Ouachita School District, Louisiana – Ryuk
  • Morehouse Parish School District – Ryuk
  • Rockville Centre School District – Ryuk
  • Houston County Schools – Ryuk
  • Cherry Hill School District – Ryuk
  • Las Cruces Public School District – Ryuk
  • Maine School Administrative District #6 – Ryuk
  • Crystal Lake Community High School District 155 – Ryuk
  • Mountain View Los Altos Union High School District – REvil
  • Havre Public Schools – Ryuk
  • Haywood County School District – SunCrypt

That is just a small subset. Unfortunately, schools and related infrastructure can often be ‘low hanging fruit’ for the bad guys, as it is not uncommon for exposed school networks to be out-of-date (patch-wise) or otherwise unprotected.

Ironically, one way to improve this situation is education. We encourage those in our community and industry to reach out to the people who run education-related infrastructure and advise them on where the risks are, how to reduce exposure, where to concentrate detective and preventive controls for the greatest effect, and beyond. The bad guys are not going to stop, so it is up to us to stay ahead of them.



Wall Street needs to relax, as startups show remote work is here to stay

We are hearing that a COVID-19 vaccine could be on the way sooner rather than later, and that means we could be returning to normal life some time in 2021. That’s the good news. The perplexing news, however, is that each time some positive news emerges about a vaccine — and believe me, I’m not complaining — Wall Street punishes stocks it thinks benefit from us being stuck at home. That would be companies like Zoom and Peloton.

While I’m not here to give investment advice, I’m confident that these companies are going to be fine even after we return to the office. While we surely pine for human contact, office brainstorming, going out to lunch with colleagues and just meeting and collaborating in the same space, it doesn’t mean we will simply return to life as it was before the pandemic and spend five days a week in the office.

One thing is clear in my discussions with startups born or growing up during the pandemic: They have learned to operate, hire and sell remotely, and many say they will continue to be remote-first when the pandemic is over. Established larger public companies like Dropbox, Facebook, Twitter, Shopify and others have announced they will continue to offer a remote-work option going forward. There are many other such examples.

It’s fair to say that we learned many lessons about working from home over this year, and we will carry them with us whenever we return to school and the office — and some percentage of us will continue to work from home at least some of the time, while a fair number of businesses could become remote-first.

Wall Street reactions

On November 9, news that the Pfizer vaccine was at least 90% effective threw the markets for a loop. The summer trade, in which investors moved capital from traditional, non-tech industries and pushed it into software shares, flipped; suddenly the stocks that had been riding a pandemic wave were losing ground while old-fashioned, even stodgy, companies shot higher.

As IBM shifts to hybrid cloud, reports have them laying off 10,000 in EU

As IBM makes a broad shift in strategy, Bloomberg reported this morning that the company would be cutting around 10,000 jobs in Europe. This comes on the heels of last month’s announcement that the organization will be spinning out its infrastructure services business next year. While IBM wouldn’t confirm the layoffs, a spokesperson suggested there were broad structural changes ahead for the company as it concentrates fully on a hybrid cloud approach.

IBM had this to say in response to a request for comment on the Bloomberg report: “Our staffing decisions are made to provide the best support to our customers in adopting an open hybrid cloud platform and AI capabilities. We also continue to make significant investments in training and skills development for IBMers to best meet the needs of our customers.”

Unfortunately, that basically means that if you don’t have the currently required skill set, chances are you might not fit with the new version of IBM. IBM CEO Arvind Krishna alluded to the changing environment in an interview with Jon Fortt at the CNBC Evolve Summit earlier this month when he said:

The Red Hat acquisition gave us the technology base on which to build a hybrid cloud technology platform based on open-source, and based on giving choice to our clients as they embark on this journey. With the success of that acquisition now giving us the fuel, we can then take the next step, and the larger step, of taking the managed infrastructure services out. So the rest of the company can be absolutely focused on hybrid cloud and artificial intelligence.

The story has always been the same around IBM layoffs, that as they make the transition to a new model, it requires eliminating positions that don’t fit into the new vision, and today’s report is apparently no different, says Holger Mueller, an analyst at Constellation Research.

“IBM is in the biggest transformation of the company’s history as it moves from services to software and specialized hardware with Quantum. That requires a different mix of skills in its employee base and the repercussions of that manifest itself in the layoffs that IBM has been doing, mostly quietly, for the last 5+ years,” he said.

None of this is easy for the people involved. It’s never a good time to lose your job, but the timing of this one feels worse. In the middle of a recession brought on by COVID, and as a second wave of the virus sweeps over Europe, it’s particularly difficult.

We have reported on a number of IBM layoffs over the last five years. In May, it confirmed layoffs, but wouldn’t confirm numbers. In 2015, we reported on a 12,000 employee layoff.

Cast.ai nabs $7.7M seed to remove barriers between public clouds

When you launch an application in the public cloud, you usually put everything on one provider, but what if you could choose the components based on cost and technology and have your database one place and your storage another?

That’s what Cast.ai says that it can provide, and today it announced a healthy $7.7 million seed round from TA Ventures, DNX, Florida Funders and other unnamed angels to keep building on that idea. The round closed in June.

Company CEO and co-founder Yuri Frayman says that they started the company with the idea that developers should be able to get the best of each of the public clouds without being locked in. They do this by creating Kubernetes clusters that are able to span multiple clouds.

“Cast does not require you to do anything except for launching your application. You don’t need to know […] what cloud you are using [at any given time]. You don’t need to know anything except to identify the application, identify which [public] cloud providers you would like to use, the percentage of each [cloud provider’s] use and launch the application,” Frayman explained.

This means that you could use Amazon’s RDS database and Google’s ML engine, and the solution decides how to make that work based on your requirements and price. You set the policies when you are ready to launch, and Cast will take care of distributing it for you across the locations and providers that you desire, or that make the most sense for your application.

The company takes advantage of cloud-native technologies, containerization and Kubernetes to break the proprietary barriers that exist between clouds, says company co-founder Laurent Gil. “We break these barriers of cloud providers so that an application does not need to sit in one place anymore. It can sit in several [providers] at the same time. And this is great for the Kubernetes application because they’re kind of designed with this [flexibility] in mind,” Gil said.

Developers use the policy engine to decide how much they want to control this process. They can simply set location and let Cast optimize the application across clouds automatically, or they can select at a granular level exactly the resources they want to use on which cloud. Regardless of how they do it, Cast will continually monitor the installation and optimize based on cost to give them the cheapest options available for their configuration.
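The article does not show Cast.ai’s actual policy API, but as a purely hypothetical sketch of the choices it describes (which clouds, in what proportion, optimized for what), a placement policy might be structured along these lines:

```python
# Hypothetical policy structure, for illustration only; it does not reflect
# Cast.ai's real API. It captures the knobs described above: which clouds
# to use, the desired split between them, and what the engine optimizes.
policy = {
    "application": "web-frontend",
    "providers": {"aws": 0.5, "gcp": 0.3, "azure": 0.2},  # desired split
    "optimize_for": "cost",     # let the engine rebalance toward the cheapest mix
    "region_constraint": "eu",  # keep workloads within one geography
}

def validate(policy):
    """Sanity-check that the provider shares sum to 100%."""
    total = sum(policy["providers"].values())
    assert abs(total - 1.0) < 1e-9, "provider shares must sum to 100%"
    return policy

validate(policy)
```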

The company currently has 25 employees with four new hires in the pipeline, and plans to double to 50 by the end of 2021. As they grow, the company is trying to keep diversity and inclusion front and center in its hiring approach; they currently have women in charge of HR, marketing and sales at the company.

“We have very robust processes on the continuous education inside of our organization on diversity training. And a lot of us came from organizations where this was very visible and we took a lot of those processes [and lessons] and brought them here,” Frayman said.

Frayman has been involved with multiple startups, including Cujo.ai, a consumer firewall startup that participated in TechCrunch Disrupt Battlefield in New York in 2016.

Industrial drone maker Percepto raises $45M and integrates with Boston Dynamics’ Spot

Consumer drones have over the years struggled with an image of being no more than expensive and delicate toys. But applications in industrial, military and enterprise scenarios have shown that there is indeed a market for unmanned aerial vehicles, and today, a startup that makes drones for some of those latter purposes is announcing a large round of funding and a partnership that provides a picture of how the drone industry will look in years to come.

Percepto, which makes drones — both the hardware and software — to monitor and analyze industrial sites and other physical work areas largely unattended by people, has raised $45 million in a Series B round of funding.

Alongside this, it is now working with Boston Dynamics and has integrated its Spot robots with Percepto’s Sparrow drones, with the aim being better infrastructure assessments, and potentially more as Spot’s agility improves.

The funding is being led by a strategic backer, Koch Disruptive Technologies, the investment arm of industrial giant Koch Industries (which has interests in energy, minerals, chemicals and related areas), with participation also from new investors State of Mind Ventures, Atento Capital, Summit Peak Investments and Delek-US. Previous investors U.S. Venture Partners, Spider Capital and Arkin Holdings also participated. (It appears that Boston Dynamics and SoftBank are not part of this investment.)

Israel-based Percepto has now raised $72.5 million since it was founded in 2014. It’s not disclosing its valuation, but CEO and founder Dor Abuhasira described it as “a very good round.”

“It gives us the ability to create a category leader,” Abuhasira said in an interview. It has customers in around 10 countries, with the list including ENEL, Florida Power and Light and Verizon.

While some drone makers have focused on building hardware, and others are working specifically on the analytics, computer vision and other critical technology that needs to be in place on the software side for drones to work correctly and safely, Percepto has taken what I referred to, and Abuhasira confirmed, as the “Apple approach”: vertical integration as far as Percepto can take it on its own.

That has included hiring teams with specializations in AI, computer vision, navigation and analytics as well as those strong in industrial hardware — all strong areas in the Israel tech landscape, by virtue of it being so closely tied with its military investments. (Note: Percepto does not make its own chips: these are currently acquired from Nvidia, he confirmed to me.)

“The Apple approach is the only one that works in drones,” he said. “That’s because it is all still too complicated. For those offering an Android-style approach, there are cracks in the complete flow.”

It presents the product as a “drone-in-a-box”, which means in part that those buying it have little work to do to set it up to work, but also refers to how it works: its drones leave the box to make a flight to collect data, and then return to the box to recharge and transfer more information, alongside the data that is picked up in real time.

The drones themselves operate on an on-demand basis: they fly in part for regular monitoring, to detect changes that could point to issues; and they can also be launched to collect data as a result of engineers requesting information. The product is marketed by Percepto as “AIM”, short for autonomous site inspection and monitoring.

News broke last week that Amazon has been reorganising its Prime Air efforts, one sign that, despite many developments, more consumer-facing drone applications may still face some turbulence before they are commercially viable. Businesses like Percepto's stand in contrast to that, focused specifically on flying over and collecting data in areas where no people are present.

That focus has dovetailed with a bigger push by industry for the efficiencies (and cost savings) of automation, which in turn has become the centerpiece of investment in the buzz phrase of the moment, "digital transformation."

“We believe Percepto AIM addresses a multi-billion-dollar issue for numerous industries and will change the way manufacturing sites are managed in the IoT, Industry 4.0 era,” said Chase Koch, president of Koch Disruptive Technologies, in a statement. “Percepto’s track record in autonomous technology and data analytics is impressive, and we believe it is uniquely positioned to deliver the remote operations center of the future. We look forward to partnering with the Percepto team to make this happen.”

The partnership with Boston Dynamics is notable for a couple of reasons: it speaks to how various robotics hardware will work in tandem in an automated, unmanned world, and it speaks to how Boston Dynamics is pulling up its socks.

On the latter front, the company has been making waves in the world of robotics for years, specifically with its agile and strong dog-like robots (with names like "Spot" and "Big Dog") that can cover rugged terrain and handle tussles without falling apart.

That led it into the arms of Google, which acquired it in 2013 as part of its own secretive moonshot efforts. That never panned out into a business, and probably gave Google more complicated optics at a time when it was already being seen as too powerful. Then, SoftBank stepped in to pick it up, along with other robotics assets, in 2017. That hasn't really gone anywhere either, it seems, and just this month it was reported that Boston Dynamics was facing yet another suitor, Hyundai.

All of this is to say that partnerships with third parties that are going places (quite literally) become strong signs of how Boston Dynamics' extensive R&D investments might finally pay off in enterprise dividends.

Indeed, while Percepto has focused on its own vertical integration, longer term and more generally there is an argument to be made for more interoperability and collaboration between the various companies building “connected” and smart hardware for industrial, physical applications.

It means that specific industries can focus on the special equipment and expertise they require, while at the same time complementing that with hardware and software that are recognised as best-in-class. Abuhasira said that he expects the Boston Dynamics partnership to be the first of many.

That makes this first one an interesting template. The partnership will see Spot carrying Percepto's payloads for high-resolution imaging and thermal vision "to detect issues including hot spots on machines or electrical conductors, water and steam leaks around plants and equipment with degraded performance, with the data relayed via AIM." It will also mean a more thorough picture, beyond what you get from the air. And, potentially, you might imagine a time in the future when the data that the combined devices source even results in Spot (or perhaps a third piece of autonomous hardware) carrying out repairs or other assistance.

“Combining Percepto’s Sparrow drone with Spot creates a unique solution for remote inspection,” said Michael Perry, VP of Business Development at Boston Dynamics, in a statement. “This partnership demonstrates the value of harnessing robotic collaborations and the insurmountable benefits to worker safety and cost savings that robotics can bring to industries that involve hazardous or remote work.”

Adobe expands customer data platform to include B2B sales

The concept of the customer data platform (CDP) is a relatively new one. Up until now, it has focused primarily on pulling data about an individual consumer from a variety of channels into a super record, where in theory you can serve more meaningful content and deliver more customized experiences based on all this detailed knowledge. Adobe announced its intention today to create such a product for business to business (B2B) customers, a key market where this kind of data consolidation had been missing.

Indeed, Brian Glover, Adobe's director of product marketing for Marketo Engage, who has been put in charge of this product, says that these kinds of sales are much more complex, and that B2B sales and marketing teams are clamoring for a CDP.

“We have spent the last couple of years integrating Marketo Engage across Adobe Experience Cloud, and now what we’re doing is building out the next generation of new and complementary B2B offerings on the Experience platform, the first of which is the B2B CDP offering,” Glover told me.

He says that they face unique challenges adapting CDP for B2B sales because they typically involve buying groups, meaning you need to customize your messages for different people depending on their role in the process.

An individual consumer usually knows what they want and you can prod them to make a decision and complete the purchase, but a B2B sale is usually longer and more complex involving different levels of procurement. For example, in a technology sale, it may involve the CIO, a group, division or department who will be using the tech, the finance department, legal and others. There may be an RFP and the sales cycle may span months or even years.

Adobe believes this kind of sale should still be able to use the same customized messaging approach you use in an individual sale, perhaps even more so because of the inherent complexity in the process. Yet B2B marketers face the same issues as their B2C counterparts when it comes to having data spread across an organization.

“In B2B that complexity of buying groups and accounts just adds another level to the data management problem because ultimately you need to be able to connect to your customer people data, but you also need to be able to connect the account data too and be able to [bring] the two together,” Glover explained.

By building a more complete picture of each individual in the buying cycle, you can, as Glover puts it, begin to put the bread crumbs together for the entire account. He believes that a CRM isn’t built for this kind of complexity and it requires a specialty tool like a CDP built to support B2B sales and marketing.
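The core data problem Glover describes can be sketched in a few lines. This is purely illustrative and not Adobe's API: the contact records, account name and field names below are all hypothetical, but the rollup shows the basic idea of a B2B CDP, stitching person-level "breadcrumbs" from different channels into a single account-level profile that captures the buying group.

```python
from collections import defaultdict

# Hypothetical contact-level events collected from different channels.
contacts = [
    {"email": "cio@acme.example", "account": "Acme",
     "role": "CIO", "touch": "webinar"},
    {"email": "eng@acme.example", "account": "Acme",
     "role": "Engineering", "touch": "docs"},
    {"email": "legal@acme.example", "account": "Acme",
     "role": "Legal", "touch": "contract review"},
]

def build_account_profiles(contacts):
    """Roll individual contact records up into one profile per account."""
    profiles = defaultdict(lambda: {"buying_group": [], "touches": []})
    for c in contacts:
        profile = profiles[c["account"]]
        profile["buying_group"].append({"email": c["email"], "role": c["role"]})
        profile["touches"].append(c["touch"])
    return dict(profiles)

profiles = build_account_profiles(contacts)
```

A real CDP adds identity resolution, deduplication and CRM connectors on top, but the output is the same shape: one account record that knows who is in the buying group and what each member has touched, so messaging can be tailored per role.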

Adobe is working with early customers on the product and expects to go into beta before the end of next month, with GA sometime in the first half of next year.