Facebook is preparing to launch a cryptocurrency that it expects will “change the global economy.” The currency, named Libra, is being created by Facebook, but the company intends to share control with a consortium of organizations, including venture capital firms, credit card companies, and other tech giants.

At launch, you’ll be able to send Libra within Facebook Messenger and WhatsApp, with it mostly serving as an intermediary for transferring traditional currencies. Eventually, Facebook hopes Libra will be accepted as a form of payment, and that other financial services will be built on top of its blockchain-based network.

Facebook is also leading a subsidiary company, Calibra, which will develop products and services around Libra. Calibra is where Facebook intends to make money from the cryptocurrency, and it will start with the launch of a digital wallet. Calibra will also handle Libra integrations for Facebook’s other products.

In 2003, Tableau set out to pioneer self-service analytics with an intuitive analytics platform that would empower people of any skill level to work with data. Our customers grew with us to form the strongest analytics community in the world. And today, that mission to help people see and understand data grows stronger.

I’m excited to announce that Tableau has entered into an agreement to be acquired by Salesforce in an acquisition that combines the #1 CRM with the #1 analytics platform. By joining forces we will accelerate our ability to accomplish our mission. Together, Salesforce and Tableau share a deep commitment to empowering their respective communities and enabling people of every skill level to transform their businesses, their careers, and their lives through technology.


By: Adam Selipsky – CEO, Tableau Software

Google brings a new game to town with the recent announcement of its Anthos product hitting general availability. Anthos was conceived to help developers and IT administrators navigate the complex waters of distributed applications. While Microsoft was the first hyperscale cloud platform operator to make it possible to run its cloud environment in customers’ own data centers with Azure Stack, both Amazon and Google have now introduced products and services to do the same thing.

All three recognize the need to help customers modernize existing applications by taking advantage of the latest innovations like containers and Kubernetes. Making all these different applications work together across different platforms both on-premises and in the cloud is challenging. Google says it has a viable solution, or solutions, for this.

Aparna Sinha, Kubernetes group product manager at Google, describes the company’s take on modernizing legacy applications through three different approaches:

  1. GKE On-Prem to bring Google’s cloud services to a customer’s data center
  2. Service Mesh for moving applications to a microservices architecture
  3. Containerization of legacy applications to make them portable and accessible

“We have seen a lot of customer interest in both hybrid and multi-cloud approaches to providing services that deliver consistent performance and the right levels of control,” Sinha told me.

Each of these offers a structured approach to moving legacy apps to a cloud-based architecture. While this doesn’t rule out keeping some portions of an application in-house, it does necessitate the use of containers and Kubernetes as the foundational pieces of a new application paradigm.

Google Kubernetes Engine On-Prem

As the cornerstone of Google’s hybrid cloud offering, GKE On-Prem offers customers multiple options for modernizing legacy applications stuck on old hardware. Workload portability, or enabling applications to run anywhere, is the ultimate goal. GKE On-Prem makes it possible to build and run applications when you need to keep the data in-house or you don’t want to move large amounts of data to the cloud.

Google’s approach here is different from Amazon’s or Microsoft’s in that GKE On-Prem runs on top of VMware vSphere. Everything runs on customer hardware, with support for all the mainstream VMware OEMs, including Cisco, Dell EMC, HPE, and Lenovo. This approach caters to the large number of existing VMware customers and keeps the familiar management and operating environment already in place.

Service Mesh

Google sees the future of application integration built upon a wide range of microservices, all orchestrated and managed in the cloud. Google Cloud Service Mesh (GCSM) is the product offering that handles everything from communication and networking to monitoring and security. GCSM builds on Istio, the open-source service mesh project Google helped create, to do the heavy lifting required to make these new microservices reliable and secure.
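
To make this concrete, here is a minimal Python sketch of one obligation a mesh like Istio places on application code: forwarding the trace headers the mesh generates, so calls across services can be stitched into a single distributed trace. The service name and URL are hypothetical placeholders.

```python
import requests

# Headers Istio/Envoy uses for distributed tracing; the mesh generates
# them on ingress, but application code must forward them on outbound
# calls so spans from different services join into one trace.
TRACE_HEADERS = [
    "x-request-id",
    "x-b3-traceid",
    "x-b3-spanid",
    "x-b3-parentspanid",
    "x-b3-sampled",
    "x-b3-flags",
]

def call_downstream(incoming_headers, payload):
    """Call a downstream service, forwarding the mesh's trace headers.

    'inventory-service' is a hypothetical in-mesh hostname; Istio's
    sidecar proxy intercepts the call and applies routing, retries,
    mTLS and metrics around it.
    """
    forwarded = {
        name: incoming_headers[name]
        for name in TRACE_HEADERS
        if name in incoming_headers
    }
    return requests.post(
        "http://inventory-service/reserve",
        json=payload,
        headers=forwarded,
        timeout=2.0,  # keep app timeouts tight; the mesh handles retries
    )
```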

Serverless computing is the concept of providing specific services on demand, in a platform-independent manner. The bottom line is the ability to deliver a piece of functionality without tying it to any physical system. Google’s approach to the problem is to use Kubernetes and a newer project called Knative, on top of Istio, to make it all work.
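
Knative’s runtime contract is, roughly, any container that serves HTTP on the port passed in via the PORT environment variable. A minimal sketch of a service satisfying that contract, assuming nothing beyond the Python standard library:

```python
import os
from http.server import BaseHTTPRequestHandler, HTTPServer

class Handler(BaseHTTPRequestHandler):
    def do_GET(self):
        # A trivial function body; Knative scales instances of this
        # container up and down (including to zero) based on traffic.
        self.send_response(200)
        self.send_header("Content-Type", "text/plain")
        self.end_headers()
        self.wfile.write(b"Hello from a Knative-style service\n")

if __name__ == "__main__":
    # Knative injects the port to listen on via the PORT env var.
    port = int(os.environ.get("PORT", "8080"))
    HTTPServer(("", port), Handler).serve_forever()
```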

Containerize

Most corporations have monolithic applications that they will never rewrite. These might be packaged applications like a database or another application purchased in the past. Google’s approach here is to move these applications into a container-based platform to enable them to run and, more importantly, be managed and integrated with the Google Cloud Platform environment.
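
As an illustration of one common containerization pattern (not Google’s specific tooling), the sketch below wraps a hypothetical legacy command-line binary behind a small HTTP shim, giving Kubernetes a health-check endpoint and an HTTP interface to an otherwise unmodified application:

```python
import subprocess
from http.server import BaseHTTPRequestHandler, HTTPServer

LEGACY_BINARY = "/opt/legacy/report-gen"  # hypothetical legacy executable

class ShimHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path == "/healthz":
            # Kubernetes liveness/readiness probes hit this endpoint.
            self.send_response(200)
            self.end_headers()
            self.wfile.write(b"ok")
            return
        # Any other path: invoke the legacy binary and return its output.
        result = subprocess.run(
            [LEGACY_BINARY, "--format", "text"],
            capture_output=True, timeout=30,
        )
        self.send_response(200 if result.returncode == 0 else 500)
        self.end_headers()
        self.wfile.write(result.stdout or result.stderr)

if __name__ == "__main__":
    HTTPServer(("", 8080), ShimHandler).serve_forever()
```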

To make this process easier, Google has a migration center offering specific services both internally and through partners. A variety of approaches to the problem, from lift-and-shift to migrate-and-modernize, can be taken depending on the complexity and flexibility of the customer requirements. Google realizes that one size doesn’t fit all in this approach, and it’s enlisted a wide range of partners to make it happen.

Bottom Line

Google’s whole strategy in tackling the problem of complexity is to simplify. While that might seem trite, it really does work when you take their products out for a test drive. Developers can spin up a test system with just a few clicks and then develop against essentially the same core infrastructure as they would have in production.

Microsoft’s answer to the integrated on-premises and cloud development story is to go with an Azure Stack system. Similarly, the folks at Amazon want you to buy their hardware and run a clone of AWS in your data center. Google thinks you can get what you need by running on top of VMware vSphere on existing hardware at significantly lower cost than either AWS or Microsoft.


Source: Data Center Knowledge

XRay for MQ – Nastel Technologies is a premier global provider of AIOps solutions covering end-to-end Transaction Tracking and Application Performance Management (APM) for mission-critical applications in mixed cloud environments. Nastel is entering its 25th year in business and has recently introduced its latest offering, Nastel XRay® for MQ.

Nastel XRay® for MQ, a Big Data analytics platform for AIOps, now joins the Nastel portfolio of products. Nastel XRay® is a highly scalable Business Transaction Monitoring (BTM) solution. The product tracks transactions end-to-end, from mobile devices through Java applications, ESBs (Enterprise Service Buses) and legacy systems, applying machine learning and big data analytics.
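
Nastel doesn’t publish XRay’s internals here, but the core BTM idea, stitching per-hop middleware events into one end-to-end transaction via a correlation identifier, can be sketched generically. The event fields below are illustrative, not Nastel’s actual schema:

```python
from collections import defaultdict

# Sample per-hop events as a tracker might capture them from MQ
# (field names and values are made up for illustration).
events = [
    {"corr_id": "TX-1001", "hop": "mobile-gw", "queue": "REQ.IN",  "ts": 1},
    {"corr_id": "TX-1001", "hop": "order-app", "queue": "REQ.IN",  "ts": 2},
    {"corr_id": "TX-1001", "hop": "order-app", "queue": "RSP.OUT", "ts": 5},
    {"corr_id": "TX-2002", "hop": "mobile-gw", "queue": "REQ.IN",  "ts": 3},
]

def stitch(events):
    """Group events by correlation ID and order each transaction by time."""
    txns = defaultdict(list)
    for e in events:
        txns[e["corr_id"]].append(e)
    return {
        corr: sorted(hops, key=lambda e: e["ts"])
        for corr, hops in txns.items()
    }

for corr, hops in stitch(events).items():
    path = " -> ".join(f'{e["hop"]}:{e["queue"]}' for e in hops)
    latency = hops[-1]["ts"] - hops[0]["ts"]
    print(f"{corr}: {path} (elapsed {latency})")
```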

Nastel XRay® for MQ delivers the following functionality:
- Profile your MQ and Broker middleware deployments
- Know where your messages are instantly
- Find applications that abuse middleware resources
- Search messages by content, header and more
- Visualize message flows across apps, queues and channels
- Optimize performance of your MQ apps and messaging topologies
- Web dashboards for developers and operations
- Ad hoc web reports for advanced users
- Cloud (SaaS) and Docker deployments
- Expand message tracking to other middleware technologies such as Kafka, JMS, Java and more

“We are thrilled to be adding XRay to Nastel’s ever-growing suite of products. As we start to shift into taking leadership of the AIOps market, we are focused on continuing innovation as well as the startup spirit that originally helped the company to excel. With the addition of XRay, we are perfectly positioned to do what no other company is capable of. Our unique ability to exploit middleware messages to abstract business understanding from machine data places us at the right place, at the right time, with the right technology to dominate the AIOps space,” said Nastel CEO David Mavashev.

SAP and Apple will help customers build their own native mobile business applications, Reuters reported on Tuesday. Apple will contribute its machine learning technology to the development effort, which, combined with augmented reality, will let customers use iPhones or iPads to carry out business tasks such as repairing machinery and stocking shelves with precision.

Two years ago, SAP partnered with the Cupertino giant to deliver custom mobile applications for its current range of products, including expense management and HR, to run natively on iOS. Native applications are built specifically for a particular platform or device; consequently, they run more smoothly than cloud- or web-based applications designed to work across multiple platforms.

According to both companies, the partnership is now being extended across SAP’s broader application portfolio, including segments such as procurement. The applications can also be built for Apple’s Mac computers.

People familiar with the matter said at Sapphire, SAP’s annual U.S. customer conference, that Apple’s core machine learning technology would enable companies to develop applications on SAP’s Leonardo platform.

The German software company is shifting from running companies’ internal operations toward a more end-user-focused approach, backed by its $8 billion acquisition of Qualtrics, an American company.

The new initiative will benefit the customers of both companies.

Top 10 Global IoT Solution Providers 2019

Back in 2014, the size of the IoT retail market in the US was estimated to be around $2.15 billion. It has been growing steadily ever since, with estimates that it will reach over $11 billion by 2025. With its growth, new trends are introduced, and new technologies that utilize IoT are developed.
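
For context, those two figures imply a compound annual growth rate of roughly 16%, as this quick calculation shows:

```python
# Implied compound annual growth rate (CAGR) from the figures above:
# $2.15B in 2014 growing to roughly $11B by 2025 (11 years).
start, end, years = 2.15, 11.0, 2025 - 2014
cagr = (end / start) ** (1 / years) - 1
print(f"Implied CAGR: {cagr:.1%}")  # roughly 16% per year
```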

One major trend is infonomics, with IoT fueling its growth through its vast ability to collect and analyze data. It’s expected that by 2023 the buying and selling of IoT data will become an essential part of IoT systems. This won’t be done only for profit or extra revenue; it will also be done as part of ad campaigns, drawing on the collected data to prove that a certain product delivers what it promises.
IoT will also play a vital role in supporting CRM systems. For instance, an IoT device can notify a company that a customer is having a problem with a service or product before the customer reports it, or even realizes it. From there, it is easy to imagine how IoT will revolutionize preventive maintenance.
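
A minimal sketch of that notification pattern follows; the CRM endpoint, threshold and payload fields are hypothetical placeholders, not any vendor’s actual API:

```python
import json
import urllib.request

VIBRATION_LIMIT = 7.0  # hypothetical threshold for "needs service"

def handle_telemetry(reading):
    """Open a CRM case when a device reports an out-of-range reading.

    The CRM endpoint and payload fields are hypothetical; a real
    integration would use the CRM vendor's API and authentication.
    """
    if reading["vibration"] <= VIBRATION_LIMIT:
        return None  # reading is normal; nothing to do
    case = {
        "customer_id": reading["customer_id"],
        "device_id": reading["device_id"],
        "summary": "Abnormal vibration detected; schedule preventive service",
    }
    req = urllib.request.Request(
        "https://crm.example.com/api/cases",
        data=json.dumps(case).encode(),
        headers={"Content-Type": "application/json"},
    )
    return urllib.request.urlopen(req, timeout=5)

# Example (an out-of-range reading would POST to the hypothetical CRM):
# handle_telemetry({"customer_id": "C42", "device_id": "pump-7", "vibration": 9.3})
```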

Fleet management systems that use IoT devices have already seen the impact of these devices on their bottom line, and IoT is bound to play an even bigger role in this field in particular. The US government also sees the benefits that can be reaped from IoT solutions: agencies like the General Services Administration are already using IoT to cut costs by collecting data from sensors in government-owned buildings, in hopes that this data can help save up to $15 million per year in heating and cooling costs.

Andy Dé, Senior Global Industry Director, Healthcare and Life Sciences

1.  AI and Analytics integrated with electronic health records (EHRs) will deliver actionable insights for superior care delivery and personalized care at a lower cost for Healthcare Providers.

Thanks to the meaningful use stipulations of the HITECH Act, a majority of the healthcare systems in North America have made significant investments in EHRs. These deployments have often been multi-year initiatives, frequently running over time and over budget, not unlike enterprise resource planning (ERP) deployments in other industries. While EHRs have helped hospitals create a central repository of their clinical and financial data, extracting accurate, reliable and actionable analytics for right-time decision-making remains a daunting challenge.

Leading healthcare organizations have deployed next-generation analytics platforms to deliver actionable insights integrated into their EHRs’ clinical workflows, empowering their executives, clinicians and nurses at the point of care (POC). In 2019 and beyond, they will bolster this by integrating machine learning and NLP algorithms to enable predictive, analytics-powered patient risk stratification, proactively identifying the patients most likely to exceed their expected length of stay and those at risk of 30-day readmissions.
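
As a toy illustration of predictive risk stratification (synthetic features and labels; real models draw on far richer EHR data and require rigorous clinical validation), a scikit-learn sketch might look like this:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Synthetic stand-ins for EHR features: age, prior admissions,
# number of chronic conditions, length of current stay (days).
X = np.column_stack([
    rng.integers(20, 90, 1000),
    rng.poisson(1.0, 1000),
    rng.poisson(2.0, 1000),
    rng.integers(1, 15, 1000),
])
# Synthetic label: 30-day readmission, loosely tied to the features.
risk = 0.03 * X[:, 1] + 0.04 * X[:, 2] + 0.01 * X[:, 3]
y = rng.random(1000) < risk

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = LogisticRegression(max_iter=1000).fit(X_train, y_train)

# Stratify: flag the highest-predicted-risk patients for follow-up.
scores = model.predict_proba(X_test)[:, 1]
top_decile = scores >= np.quantile(scores, 0.9)
print(f"Flagged {top_decile.sum()} of {len(scores)} patients for outreach")
```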

Machine learning and NLP integrated with next-generation analytics will also enable researchers and clinicians to identify variances in the genomic profiles of newborn patients and match them to personalized treatment protocols. This will help drive the appropriate clinical care protocols and care coordination to ensure higher quality at lower risk and cost.

2.  Healthcare Payers will integrate AI with their ERP and Analytics to improve health outcomes while saving millions from fraud, waste and abuse.

Integrating AI and analytics will enable care managers to proactively identify patient conditions and improve the quality of their engagement with patients, as well as the relevance of the information they collect from them. AI-powered virtual agents, or bots, can rapidly respond to customer questions about coverage and services and potentially help improve the overall customer experience. Machine learning and natural language processing (NLP) can help proactively identify fraud, waste and abuse in claims processing and save healthcare payers millions.
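
One hedged sketch of the fraud, waste and abuse idea: an unsupervised anomaly detector (here scikit-learn’s IsolationForest, run on synthetic claims) flags unusual claims for human review, with no labeled fraud examples required:

```python
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(1)

# Synthetic claims: billed amount, number of procedures, patient age.
normal = np.column_stack([
    rng.normal(200, 50, 5000),   # typical billed amounts
    rng.poisson(2, 5000),        # typical procedure counts
    rng.integers(18, 90, 5000),
])
suspicious = np.array([[5000, 40, 35], [4200, 35, 50]])  # inflated claims
claims = np.vstack([normal, suspicious])

# IsolationForest isolates outliers without labeled fraud examples.
detector = IsolationForest(contamination=0.001, random_state=1).fit(claims)
flags = detector.predict(claims)  # -1 marks anomalies

print(f"Flagged {np.sum(flags == -1)} of {len(claims)} claims for review")
```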

3.  Pharma, Medical Devices and Diagnostics companies will leverage AI and Analytics to deliver new drugs and improve diagnostics and minimally invasive surgeries.

AI algorithms, including machine learning and deep learning, can help pharma companies identify new indications for previously failed formulations, potentially delivering drugs for diseases like Parkinson’s, Alzheimer’s and dementia. The onerous trial-and-error task of matching drug interventions with individual patients can be automated with machine learning. Neural networks currently being leveraged for autonomous vehicles will find potential application in structure-based drug design, combining atoms virtually to come up with candidate molecules that can treat diseases aligned with their biological origins.

Medical diagnostics and devices companies will be among the largest adopters of AI and analytics, applying them to automated medical diagnoses for cancer and cardiovascular diseases. From machine-learning-enabled scanning of radiology images to autonomous or remote surgery performed by robots endowed with superior machine vision, these technologies can make minimally invasive surgeries safer and more cost-effective, while assuring faster recuperation for patients.
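
For a sense of the shape of such systems, here is a minimal convolutional network for flagging a finding in an image, built with TensorFlow/Keras on random stand-in data; production diagnostic models require large curated imaging studies and regulatory-grade validation:

```python
import numpy as np
import tensorflow as tf

# Random stand-ins for grayscale scans; real work uses curated,
# labeled imaging studies rather than noise like this.
images = np.random.rand(64, 128, 128, 1).astype("float32")
labels = np.random.randint(0, 2, 64)  # 1 = finding present (synthetic)

model = tf.keras.Sequential([
    tf.keras.layers.Conv2D(16, 3, activation="relu", input_shape=(128, 128, 1)),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Conv2D(32, 3, activation="relu"),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(1, activation="sigmoid"),  # probability of a finding
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
model.fit(images, labels, epochs=1, verbose=0)
```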

4.  AI and Analytics will help policy makers tackle the Opioid Epidemic in North America through actionable intelligence.

The United States is in the throes of an unprecedented opioid epidemic with more than 2 million Americans addicted to prescription pain pills and similar drugs. During 2016, there were more than 63,600 overdose deaths in the United States, including 42,249 that involved an opioid (66.4 percent). That’s an average of 115 opioid overdose deaths each day.


Figure 2. The opioid epidemic in the U.S. by the numbers. Source: True Recovery.com and https://www.hhs.gov/opioids/about-the-epidemic/index.html [2]

The time has come to tackle this epidemic with the tools and platforms of the 21st century. Crafting a national opioid disease registry, enabled by self-service analytics and providing visibility to policy makers, healthcare providers and citizens at large, would be a step in the right direction. As a first step, this would give states, communities and health organizations a reliable picture of the extent of the opioid epidemic in their communities. Monitoring the number of opioid cases in emergency departments, mapped to the associated opioid and backed by actionable analytics, would help these communities address the challenge head on. Proactively distributing take-home overdose reversal kits to patients and their families, to restore normal respiration to a person whose breathing has stopped, would go a long way toward reducing the large number of fatalities resulting from opioid overdose across the nation.
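
As a small illustration of the kind of view such a registry could surface (made-up data, with pandas doing the aggregation), counting emergency department cases by county and substance:

```python
import pandas as pd

# Made-up emergency department records; a real registry would draw on
# standardized case reports from hospitals and public health agencies.
cases = pd.DataFrame({
    "county": ["Cook", "Cook", "Harris", "Harris", "Cook", "Maricopa"],
    "opioid": ["fentanyl", "heroin", "fentanyl", "oxycodone",
               "fentanyl", "heroin"],
    "month":  ["2019-01", "2019-01", "2019-01", "2019-02",
               "2019-02", "2019-02"],
})

# Counts by county and substance: the basic view policy makers would
# monitor to see where the epidemic is concentrating.
summary = (cases.groupby(["county", "opioid"])
                .size()
                .rename("ed_cases")
                .reset_index()
                .sort_values("ed_cases", ascending=False))
print(summary.to_string(index=False))
```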

Challenges to the Convergence of AI and Analytics in Enabling Innovation

The most significant challenge to delivering on the innovations above is the somewhat siloed approach to innovation deployment and delivery. The notion of AI partnering with analytics to deliver actionable insights that will enable prediction and prescription for rapid decision-making, as well as automation of manually repetitive tasks, is still a work in progress and not necessarily intuitive.

My prediction for 2019 is that next-generation analytics platforms will embed simple machine learning, natural language generation (NLG) and NLP capabilities that will enable executives, line of business leaders, clinicians, nurses and analysts to automate repetitive tasks. It will also empower them with predictive, and perhaps even prescriptive, actionable insights from the board room to the point-of-care (POC), enabling rapid decision-making with life-saving impact – from drug innovation and delivery, rapid diagnostics and minimally invasive surgeries, to precision medicine, value based care, chronic disease management, as well as fraud, waste and abuse detection.

Healthcare organizations have mastered the art of data collection. But in order to be forward-looking, they need to focus on how to turn their data into valuable information.

Sheila Talton, President & CEO, Gray Matter Analytics

For what it’s worth, most hospitals and health systems have realized the value in investing in big data. They’re eagerly raking in a plethora of data, focusing solely on the collection aspect.

But that’s insufficient.

“The real story is: How do you have data that’s accessible that can actually become information? Because data is not information,” says Talton.

Most healthcare organizations have gotten the data gathering process down pat, and they’ve become experts at utilizing data to report what happened. However, the industry needs to keep moving forward so that data can be used to get descriptive insights, predictive insights and prescriptive recommendations.

Systems are clearly impeded from making good use of the data they have. Part of the problem is that much of the data is siloed.

For other organizations, it’s simply a budget issue. The majority of health systems can hardly obtain the financial resources to maintain their existing tools, let alone implement new processes and programs.

When a hospital does hop on the bandwagon and begin to manage data, it’s often lured into believing in a one-size-fits-all model. Every tool and solution that another hospital uses will work for me, the hospital thinks.

Each hospital has a different patient population and budget, meaning it requires its own unique solution.

The need for better data integration is especially applicable in this day and age as the healthcare sector pushes for “value-based care”.

Since the industry is moving toward shared-risk models, figuring out how to use data is more important than ever. Health system leaders need to ask: How do we manipulate and manage our data? How can we use it to make the lives of our clinicians easier?

On top of asking these questions, everyone from payers to providers needs to be finding ways to share data with each other. Only by doing so can the healthcare field glean meaningful insights and information.

Becoming a data-centric entity also boils down to planning for the long term. Instead of focusing solely on installing one-off solutions and collecting data, systems must consider their ultimate goals.