Zach Walker, Defense Innovation Unit (DIU)

Few maxims hold truer in war than that speed is decisive. However, cyber warfare today is a mostly manual process. Humans scour code to find vulnerabilities and fix problems with patches. Humans evaluate whether a patch will maintain overall system functionality and whether it is performant. Human attackers exploit unpatched systems or vulnerabilities that, in some cases, have lain latent in systems for over a decade.

For example, the 2017 WannaCry attack was based on a vulnerability latent in every version of Microsoft Windows since 2001. It took 16 years for a latent vulnerability to become weaponized and wreak havoc across the world. In a sense, modern cyber warfare revolves around attackers taking advantage of low-hanging fruit and defenders hoping that Microsoft will release a patch to fix their systems before it’s too late.

In the future, we won’t have the luxury of waiting 16 years to patch a bug that leads to a zero-day exploit. Humans will augment attack and defense with machine scale and artificial intelligence – as DARPA has said, to take advantage of “zero-second” vulnerabilities. The first to master autonomous cyber warfare will be able to sow disruption, gain access to communications, persist, disrupt, and alter the course of battle. Those left behind will be at a tremendous disadvantage.

Artificially intelligent cyber warfare is already here. DARPA’s Cyber Grand Challenge (CGC) had the audacious goal of building autonomous systems capable of identifying, exploiting, and mitigating previously unknown vulnerabilities.

DARPA held the CGC in August 2016 with a machine-only Capture the Flag-style tournament at DEFCON 24. But is the tech ready for prime time? Congress seemed to think so. In the 2017 Senate Appropriations Committee Department of Defense Appropriations Bill, the Senate suggested that DoD explore “automated exploit generation and vulnerability identification… such as those exemplified in the Cyber Grand Challenge.” Last week, the 2019 NDAA Conference Report articulated the need for a Cyberspace Solarium Commission to give the nation a cyber warfare strategy in which zero-second attack and defense will be the norm.

Another remarkable aspect of the CGC was that it demonstrated the use of artificial intelligence for finding and remediating vulnerabilities. In its Perspectives on AI, DARPA describes three waves of AI. The first, Handcrafted Knowledge, entails reasoning over narrowly defined problems where the structure of the problem is defined by humans but the specifics are explored by machines. This is how the CGC played out; with virtually limitless ways to find and exploit vulnerabilities in the game, machines had to figure out which actions would be the most lucrative. It was truly artificially intelligent cyber warfare.
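That first-wave pattern, in which humans define the search structure and machines explore the specifics, is recognizable even in the simplest automated vulnerability hunting. As a hedged illustration (the target function, the "coverage" signal, and the hill-climbing loop below are all invented for this sketch, not how any CGC system actually worked), a toy coverage-guided fuzzer looks like this:

```python
import random

def target(data: bytes) -> None:
    """Toy stand-in for a binary under test: 'crashes' only on b'BAD!'."""
    if data[:4] == b"BAD!":
        raise RuntimeError("simulated crash")

def coverage(data: bytes) -> int:
    """Crude coverage signal: how many bytes of the hidden pattern match."""
    return sum(1 for a, b in zip(data, b"BAD!") if a == b)

def fuzz(seed: bytes = b"AAAA", rounds: int = 100_000):
    """Humans define the structure (mutate, score, keep the best input);
    the machine explores the specifics. Returns a crashing input or None."""
    random.seed(0)  # deterministic for the example
    best, best_cov = seed, coverage(seed)
    for _ in range(rounds):
        data = bytearray(best)
        data[random.randrange(len(data))] = random.randrange(256)
        data = bytes(data)
        try:
            target(data)
        except RuntimeError:
            return data  # found an input that crashes the target
        cov = coverage(data)
        if cov > best_cov:  # hill-climb toward higher coverage
            best, best_cov = data, cov
    return None

crash = fuzz()
```

Real cyber reasoning systems replace the toy pieces with symbolic execution, binary instrumentation, and automatic patch synthesis, but the division of labor between human-defined structure and machine exploration is the same.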

ForAllSecure, the winner of the CGC, came out of the competition with $2 million in prize money and a long line of companies and nation-states interested in their tech. What didn’t they leave CGC with? A contract to bring their tech into the Department of Defense (DoD). DARPA’s job is to prove the possible with their challenges, and that’s exactly what they did in the CGC. But the DoD wasn’t yet ready to accept this technology. Fortunately, the Defense Innovation Unit Experimental (DIUx) was. Leveraging Other Transaction Authority as defined in 10 U.S.C. 2371(b), DIUx launched a project called VOLTRON to find out if commercial “cyber reasoning” could be used to find and remediate previously unknown vulnerabilities in DoD weapon systems. Companies had until June 20, 2017 to respond to a single-sentence solicitation: “The Department of Defense is interested in systems to automatically find previously unreported vulnerabilities in software without source code and automatically generate patches to remediate vulnerabilities with minimal false positives.”

Sixteen companies responded to the solicitation, and twenty-six business days later, DIUx awarded a $5 million contract to prototype cyber reasoning in the DoD. One year later, DIUx has contracts with three more companies, and their tools are being prototyped in every military service. This effort has brought together some of the best vulnerability researchers in the nation, for the first time, to work from a unified platform and to share best practices.

DIUx has been charged to move at the speed of commercial innovation, and by prototyping commercialized DARPA tech back into the DoD less than one year after the conclusion of a Grand Challenge, we’re doing just that. In the sense of how Clay Christensen describes disruptive innovation, VOLTRON is disrupting DoD cybersecurity.

Coalesce Research Group invites participants from across the globe to attend the ‘International Conference on Big Data Analytics and Data Science’, scheduled for Nov 11-12, 2019 in Las Vegas, Nevada, USA.

Data Science 2019 offers an excellent opportunity to meet and make new contacts in the field of Big Data, providing collaboration spaces and break-out rooms, with tea and lunch for delegates between sessions and invaluable networking time. It allows delegates to have their questions on Big Data and Data Science addressed by recognized global experts who are up to date with the latest developments and can provide information on new techniques and technologies. The conference will feature world-renowned speakers, keynote and plenary speeches, a young researchers’ forum, poster presentations, technical workshops, and career guidance. It covers all aspects of Big Data, Data Science and Data Mining, including algorithms, software and systems, and applications.

Facebook is preparing to launch a cryptocurrency it expects will “change the worldwide economy.” The currency, named Libra, is being created by Facebook, but the company intends to share control with a consortium of organizations, including venture capital firms, credit card companies, and other tech giants.

At launch, you’ll be able to send Libra within Facebook Messenger and WhatsApp, with the currency mostly intended as an intermediary for transferring traditional currencies. Eventually, Facebook hopes Libra will be accepted as a form of payment, and that other financial services will be built on top of its blockchain-based network.

Facebook is also leading a subsidiary company, Calibra, which will develop products and services based around Libra. It’s where Facebook intends to make money from the cryptocurrency, starting with the launch of its digital wallet. Calibra will also handle Libra integrations for Facebook’s other products.

In 2003, Tableau set out to pioneer self-service analytics with an intuitive analytics platform that would empower people of any skill level to work with data. Our customers grew with us to form the strongest analytics community in the world. And today, that mission to help people see and understand data grows stronger.

I’m excited to announce that Tableau has entered into an agreement to be acquired by Salesforce in an acquisition that combines the #1 CRM with the #1 analytics platform. By joining forces we will accelerate our ability to accomplish our mission. Together, Salesforce and Tableau share a deep commitment to empowering their respective communities and enabling people of every skill level to transform their businesses, their careers, and their lives through technology.


By: Adam Selipsky – CEO, Tableau Software

Google brings a new game to town with the recent announcement of its Anthos product hitting general availability. Anthos was conceived to help developers and IT administrators navigate the complex waters of distributed applications. While Microsoft was the first hyperscale cloud platform operator to make it possible to run its cloud environment in customers’ own data centers with Azure Stack, both Amazon and Google have now introduced products and services to do the same thing.

All three recognize the need to help customers modernize existing applications by taking advantage of the latest innovations like containers and Kubernetes. Making all these different applications work together across different platforms both on-premises and in the cloud is challenging. Google says it has a viable solution, or solutions, for this.

Aparna Sinha, Kubernetes group product manager at Google, describes the company’s take on modernizing legacy applications via three different approaches:

  1. GKE On-Prem to bring Google’s cloud services to a customer’s data center
  2. Service Mesh to move applications to a microservices architecture
  3. Containerization of legacy applications to make them portable and accessible

“We have seen a lot of customer interest in both hybrid and multi-cloud approaches to providing services that deliver consistent performance and the right levels of control,” Sinha told me.

Each of these offers a structured approach to moving legacy apps to a cloud-based architecture. While this doesn’t rule out keeping some portions of the application in-house, it does necessitate the use of containers and Kubernetes as the foundational pieces of a new application paradigm.

Google Kubernetes Engine On-Prem

As the cornerstone of Google’s hybrid cloud offering, GKE On-Prem offers customers multiple options for modernizing legacy applications stuck on old hardware. Workload portability, or enabling applications to run anywhere, is the ultimate goal. GKE On-Prem makes it possible to build and run applications when you need to keep the data in-house or you don’t want to move large amounts of data to the cloud.

Google’s approach here is different from Amazon’s or Microsoft’s in that GKE On-Prem runs on top of VMware vSphere. Everything runs on customer hardware with support for all the mainstream VMware OEMs, including Cisco, Dell/EMC, HPE, and Lenovo. This approach caters to the large number of existing VMware customers and keeps the familiar management and operating environment already in place.

Service Mesh

Google sees the future of application integration built upon a wide range of microservices all orchestrated and managed in the cloud. Google Cloud Service Mesh (GCSM) is the product offering that handles everything from communication and networking to monitoring and security. GCSM utilizes Google’s Istio product to handle the heavy lifting required to make these new microservices reliable and secure.

Serverless computing is the concept of providing specific services on demand running in a platform-independent manner. The bottom line here is the ability to deliver some piece of functionality without being tied to any physical system. Google’s approach to the problem is to use Kubernetes and a new project called Knative on top of Istio to make it all work.


Most corporations have monolithic applications that they will never rewrite. These might be packaged applications like a database or another application purchased in the past. Google’s approach here is to move these applications into a container-based platform to enable them to run and, more importantly, be managed and integrated with the Google Cloud Platform environment.

To make this process easier, Google has a migration center offering specific services both internally and through partners. A variety of approaches to the problem, from lift-and-shift to migrate-and-modernize, can be taken depending on the complexity and flexibility of the customer requirements. Google realizes that one size doesn’t fit all in this approach, and it’s enlisted a wide range of partners to make it happen.

Bottom Line

Google’s whole strategy in tackling the problem of complexity is to simplify. While that might seem trite, it really does work when you take its products out for a test drive. Developers can spin up a test system with just a few clicks and then develop against essentially the same core infrastructure as they would have in production.

Microsoft’s answer to the integrated on-premises and cloud development story is to go with an Azure Stack system. Similarly, the folks at Amazon want you to buy their hardware and run a clone of AWS in your data center. Google thinks you can get what you need by running on top of VMware vSphere on existing hardware at significantly lower cost than either AWS or Microsoft.


Source: Data Center Knowledge

XRay for MQ – Nastel Technologies is a premier global provider of AIOps solutions covering end-to-end Transaction Tracking and Application Performance Management (APM) for mission-critical applications in mixed cloud environments. Nastel is entering its 25th year in business and has recently introduced its latest offering, Nastel XRay® for MQ.

Nastel XRay® for MQ, a Big Data analytics platform for AIOps, now joins the Nastel portfolio of products. Nastel XRay® is a highly scalable Business Transaction Monitoring (BTM) solution. The product tracks transactions end-to-end from mobile devices throughout Java applications, ESBs (Enterprise Service Bus) and legacy systems, applying machine learning and big data analytics.

Nastel XRay® for MQ delivers the following functionality:
o Profile your MQ and Broker middleware deployments
o Know where your messages are instantly
o Find applications which abuse middleware resources
o Search messages by content, header and more
o Visualize message flows across apps, queues, and channels
o Optimize performance of your MQ apps and messaging topologies
o Web dashboards for developers and operations
o Ad-hoc web reports for advanced users
o Cloud (SaaS) and Docker deployments
o Expand message tracking to other middleware technologies such as Kafka, JMS, Java and more
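To make the tracking idea concrete, here is a minimal sketch (this is not Nastel’s actual API; the event shape, stage names, and correlation ids are invented for illustration) of how transaction tracking stitches per-hop middleware events into end-to-end flows:

```python
from collections import defaultdict

# Hypothetical per-hop events emitted by instrumented middleware:
# (correlation_id, timestamp_ms, stage). All names invented for the sketch.
events = [
    ("txn-42", 100, "mobile-app"),
    ("txn-42", 130, "MQ:ORDERS.IN"),
    ("txn-42", 180, "ESB:route"),
    ("txn-42", 260, "MQ:ORDERS.OUT"),
    ("txn-7", 90, "mobile-app"),
    ("txn-7", 400, "MQ:ORDERS.IN"),  # suspiciously slow hop
]

def stitch(events):
    """Group events by correlation id and order each flow by timestamp."""
    flows = defaultdict(list)
    for cid, ts, stage in events:
        flows[cid].append((ts, stage))
    return {cid: sorted(hops) for cid, hops in flows.items()}

def latency_ms(flow):
    """End-to-end latency of one stitched flow."""
    return flow[-1][0] - flow[0][0]

flows = stitch(events)
slowest = max(flows, key=lambda cid: latency_ms(flows[cid]))
```

Knowing “where your messages are instantly” is essentially this grouping done continuously at scale, with the machine learning layered on top of the stitched flows.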

“We are thrilled to be adding XRay to Nastel’s ever-growing suite of products. As we start to take leadership of the AIOps market, we are focused on continuing innovation as well as the startup spirit that originally helped the company excel. With the addition of XRay, we are perfectly positioned to do what no other company is capable of. Our unique ability to exploit middleware messages to abstract business understanding from machine data places us at the right place, at the right time, with the right technology to dominate the AIOps space,” said Nastel’s CEO, David Mavashev.

SAP and Apple will help customers build their own mobile business applications, Reuters reported on Tuesday. Apple will contribute its machine learning technology to the development effort. The new tooling, combined with augmented reality, will help customers use iPhones or iPads to carry out business tasks, for example, fixing machinery and stocking shelves precisely.

Two years ago, SAP partnered with the Cupertino giant to produce custom mobile applications for its existing range of products, including expense management and HR, to run natively on iOS. Native applications are built specifically for a particular platform or device, and consequently run more smoothly than cloud- or web-based applications designed to work across multiple platforms.

According to both companies, the partnership is now being extended across a wider range of SAP’s application portfolio, including segments such as procurement. The applications can also be built for Apple’s Mac computer range.

People familiar with the matter said at Sapphire, SAP’s annual U.S. customer conference, that Apple’s core machine learning technology would enable businesses to develop applications on SAP’s Leonardo platform.

The German software company is shifting from running companies’ internal operations toward a more end-user-focused approach, backed by its $8 billion acquisition of Qualtrics, an American company.

The new initiative will benefit the customers of both companies.

                                               Top 10 Global IoT Solution Providers 2019

Back in 2014, the size of the IoT retail market in the US was estimated at around $2.15 billion. It has been growing steadily ever since, with estimates that it will exceed $11 billion by 2025. With its growth, new trends are introduced, and new technologies that utilize IoT are developed.

One major trend is infonomics, with IoT fueling its growth through its vast ability to collect and analyze data. It’s expected that by 2023 the buying and selling of IoT data will become an essential part of IoT systems. This won’t be done only for profit or extra revenue; it will also be done as part of ad campaigns, drawing on the collected data to prove that a product delivers what it promises.

IoT will also play a vital role in supporting CRM systems. For instance, an IoT device can notify a company that a customer is having a problem with a service or product before the customer reports it, or even realizes it. From there, it is easy to imagine how IoT will revolutionize preventive maintenance.

Fleet management systems that use IoT devices have already seen the impact of these devices on their bottom line, and IoT is bound to play an even bigger role in this field in particular. The US government also sees the benefits that can be reaped from IoT solutions: agencies like the General Services Administration are already using IoT to cut costs by collecting data from sensors in government-owned buildings, in hopes that this data can help save up to $15 million per year in heating and cooling costs.
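A minimal sketch of the kind of analysis behind those heating-and-cooling savings (the readings, window size, and threshold rule below are invented for illustration, not any agency’s actual method) might flag anomalous energy use like this:

```python
# Hypothetical hourly HVAC energy readings (kWh) from one building's sensors.
readings = [12.0, 11.5, 12.3, 11.8, 12.1, 25.4, 12.0, 11.9, 24.8, 12.2]

def flag_anomalies(values, window=4, threshold=1.5):
    """Flag readings above threshold x the trailing-window average,
    a simple stand-in for the analytics run over building sensor data."""
    flagged = []
    for i in range(window, len(values)):
        baseline = sum(values[i - window:i]) / window
        if values[i] > threshold * baseline:
            flagged.append(i)  # index of the suspicious reading
    return flagged

anomalies = flag_anomalies(readings)
```

Each flagged index corresponds to an hour where consumption jumped well above the recent baseline, which is exactly the moment when a maintenance ticket or setpoint adjustment can prevent waste.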

Andy Dé, Senior Global Industry Director, Healthcare and Life Sciences

1.  AI and Analytics integrated with electronic health records (EHRs) will deliver actionable insights for superior care delivery and personalized care at a lower cost for Healthcare Providers.

Thanks to the meaningful use stipulations of the Affordable Care Act, a majority of the healthcare systems in North America have made significant investments in EHRs. These deployments have often been multi-year initiatives running over time and over budget, not unlike enterprise resource planning (ERP) deployments in other industries. While EHRs have helped hospitals create a central repository of their clinical and financial data, extracting accurate, reliable and actionable analytics for right-time decision-making is a daunting challenge.

Leading healthcare organizations have deployed next-generation analytics platforms to deliver actionable insights integrated into their EHRs’ clinical workflows to empower their executives, clinicians and nurses at the point of care (POC). In 2019 and beyond, they will bolster this by integrating machine learning and NLP algorithms to enable predictive-analytics-powered patient risk stratification, proactively identifying the patients most likely to exceed their expected length of stay and most at risk of 30-day readmissions.
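As a hedged sketch of what risk stratification means in practice (the feature names, weights, and tier thresholds below are invented for this illustration and are not clinical guidance), a toy readmission-risk model might look like this:

```python
# Illustrative only: feature names, weights and thresholds are invented
# for this sketch and are not clinical guidance.
WEIGHTS = {
    "prior_admissions": 0.30,    # per admission in the last 12 months
    "chronic_conditions": 0.25,  # per active chronic condition
    "age_over_65": 0.20,         # 1 if the patient is over 65
    "lives_alone": 0.15,         # 1 if there is no caregiver at home
}

def risk_score(patient):
    """Weighted sum over whatever features are present."""
    return sum(w * patient.get(f, 0) for f, w in WEIGHTS.items())

def stratify(patient, high=0.9, medium=0.5):
    """Map a raw score onto the tiers a care team acts on."""
    score = risk_score(patient)
    if score >= high:
        return "high"
    return "medium" if score >= medium else "low"

patient = {"prior_admissions": 2, "chronic_conditions": 1,
           "age_over_65": 1, "lives_alone": 0}
tier = stratify(patient)
```

A production system learns the weights from historical outcomes with machine learning rather than hand-coding them, but the output is the same: a tier that tells a care team where to intervene first.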

Machine learning and NLP integrated with next-generation analytics will also enable researchers and clinicians to identify variances in the genomic profiles of newborn patients and map them to personalized treatment protocols. This will help drive the appropriate clinical care protocols and care coordination to ensure higher quality at lower risk and cost.

2.  Healthcare Payers will integrate AI with their ERP and Analytics to improve health outcomes while saving millions from fraud, waste and abuse.

Integrating AI and analytics will enable care managers to proactively identify patient conditions and improve the quality of their engagement with patients, as well as the relevant information to collect from them. Virtual AI-powered agents, or bots, can rapidly respond to customer questions about coverage and services and potentially help improve the overall customer experience. Machine learning and natural language processing (NLP) can help proactively identify fraud, waste and abuse in claims processing, saving healthcare payers millions.
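A minimal illustration of the simplest waste check in a claims feed (the claim shape and values are invented for the sketch; real payer systems layer machine learning over far richer features than exact matching):

```python
from collections import Counter

# Hypothetical claims feed: (provider, patient, procedure_code, date).
claims = [
    ("prov-1", "pat-9", "99213", "2019-03-01"),
    ("prov-1", "pat-9", "99213", "2019-03-01"),  # exact duplicate
    ("prov-2", "pat-4", "27447", "2019-03-02"),
]

def duplicate_claims(claims):
    """Flag exact duplicates, the simplest waste signal in a claims feed."""
    counts = Counter(claims)
    return [claim for claim, n in counts.items() if n > 1]

flagged = duplicate_claims(claims)
```

Exact-duplicate detection catches only the crudest errors; the ML systems described above generalize this idea to near-duplicates, implausible procedure combinations, and outlier billing patterns.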

3.  Pharma, Medical Devices and Diagnostics companies will leverage AI and Analytics to deliver new drugs and improve diagnostics and minimally invasive surgeries.

AI algorithms, including machine learning and deep learning, can help pharma companies identify new indications for previously failed formulations, potentially delivering drugs for diseases like Parkinson’s, Alzheimer’s and dementia. The onerous trial-and-error task of matching drug interventions with individual patients can be automated with machine learning. Neural networks currently being leveraged for autonomous vehicles will find potential application in structure-based drug design, combining atoms virtually to come up with candidate molecules that can treat diseases aligned with their biological origins.

Medical diagnostics and devices companies will be among the largest adopters of AI and analytics for automated medical diagnoses of cancer and cardiovascular diseases. From machine-learning-enabled scanning of radiology images to autonomous and remote surgery using robots endowed with superior machine vision, these technologies can make minimally invasive surgeries safer and more cost-effective, while assuring faster recuperation for patients.

4.  AI and Analytics will help policy makers tackle the Opioid Epidemic in North America through actionable intelligence.

The United States is in the throes of an unprecedented opioid epidemic with more than 2 million Americans addicted to prescription pain pills and similar drugs. During 2016, there were more than 63,600 overdose deaths in the United States, including 42,249 that involved an opioid (66.4 percent). That’s an average of 115 opioid overdose deaths each day.


Figure 2. The opioid epidemic in the U.S. by the numbers. Source: True and [2]

The time has come to tackle this epidemic with the tools and platforms of the 21st century. Crafting a national opioid disease registry enabled by self-service analytics and providing visibility to policy makers, healthcare providers and citizens at large, would be a step in the right direction. This would enable states, communities and health organizations with a reliable picture of the extent of the opioid epidemic in their communities, as a first step. Monitoring the number of opioid cases in emergency departments mapped to the associated opioid, based on actionable analytics, would help these communities address the challenge head on. Proactively distributing take home overdose reversal kits to patients and their families to restore normal respiration to a person whose breathing has stopped, would go a long way to reduce the large number of fatalities resulting from opioid overdose across the nation.

Challenges to the Convergence of AI and Analytics in Enabling Innovation

The most significant challenge to delivering on the innovations above is the somewhat siloed approach to innovation deployment and delivery. The notion of AI partnering with analytics to deliver actionable insights that enable prediction and prescription for rapid decision-making, as well as automation of manually repetitive tasks, is still a work in progress and not necessarily intuitive.

My prediction for 2019 is that next-generation analytics platforms will embed simple machine learning, natural language generation (NLG) and NLP capabilities that will enable executives, line of business leaders, clinicians, nurses and analysts to automate repetitive tasks. It will also empower them with predictive, and perhaps even prescriptive, actionable insights from the board room to the point-of-care (POC), enabling rapid decision-making with life-saving impact – from drug innovation and delivery, rapid diagnostics and minimally invasive surgeries, to precision medicine, value based care, chronic disease management, as well as fraud, waste and abuse detection.