4 DevOps definitions managers must understand

From a software engineering perspective, most DevOps techniques happen at the practitioner level, but true DevOps adoption goes beyond development and operations alone.
DevOps requires fundamental changes to an organization's culture. To succeed, all stakeholders need to understand the fundamentals. To help explain, here are four DevOps definitions that managers should understand:
1. Continuous improvement
Adopting best practices should be more than a one-time event. Organizations should have built-in processes that identify areas that can be made better. Some businesses accomplish this through dedicated process-improvement teams. Others allow the teams that adopt the processes to self-assess and determine their own process-improvement paths. Regardless of the method, the goal is continuous improvement.
2. Continuous integration
Continuous integration adds tremendous value in DevOps by allowing large teams of developers working on cross-technology components in multiple locations to deliver software in an agile manner. It also ensures that each team’s work is continuously integrated with that of other development teams, then validated. Continuous integration reduces risk and identifies issues earlier in the software development life cycle.
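The core mechanism can be sketched as a trivial pipeline: every integration of a team's work triggers the same build-and-validate steps, so problems surface immediately rather than at release time. The step names and checks below are illustrative assumptions, not tied to any particular CI product.

```python
# Minimal sketch of a continuous integration check: every integrated change
# runs the same build and validation steps, so failures surface early.
# The steps and their pass conditions are toy stand-ins for illustration.

def run_pipeline(change, steps):
    """Run each validation step against a change; stop at the first failure."""
    for name, step in steps:
        if not step(change):
            return f"FAILED at {name}"
    return "PASSED"

# Two toy validation steps: the change must "build" (here: contain files)
# and its tests must pass (here: a flag on the change dict).
steps = [
    ("build", lambda c: bool(c["files"])),
    ("test",  lambda c: c["tests_pass"]),
]

print(run_pipeline({"files": ["app.py"], "tests_pass": True}, steps))   # PASSED
print(run_pipeline({"files": ["app.py"], "tests_pass": False}, steps))  # FAILED at test
```

Because every change goes through the identical gate, a broken integration is caught at the step that failed, not weeks later.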
3. Continuous delivery
Continuous delivery is the process of automating the deployment of software to testing, system testing, staging, and production environments. Though some organizations stop short of production, those that adopt DevOps generally use the same automated process in all environments to improve efficiency and reduce the risk introduced by inconsistent processes. Continuous delivery is the most critical component of DevOps adoption.
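The point about using one automated process everywhere can be sketched as promoting the same release artifact through an ordered list of environments with a single deploy routine. The environment names and the deploy stand-in are illustrative assumptions.

```python
# Sketch of continuous delivery: the same artifact is promoted through each
# environment by the same automated routine, so no environment ends up with
# a hand-rolled, inconsistent process. Environment names are illustrative.

ENVIRONMENTS = ["test", "system-test", "staging", "production"]

def deploy(artifact, environment):
    # Stand-in for the real deployment mechanism (copying a build, applying
    # infrastructure templates, etc.); here it just records the action.
    return f"deployed {artifact} to {environment}"

def deliver(artifact, environments=ENVIRONMENTS):
    """Promote one artifact through every environment with one process."""
    return [deploy(artifact, env) for env in environments]

for line in deliver("myapp-1.4.2"):
    print(line)
```

An organization that stops short of production would simply truncate the environment list; the deploy routine itself stays identical.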
4. Continuous testing
Organizations should adopt processes in three key areas to enable continuous testing: test environment provisioning, test data management and test integration.
Each organization should determine what processes to adopt for each area. These processes may even vary by project, based on individual testing needs and the requirements of service agreements. Customer-facing applications may need more security testing than internal applications, for example.
Test environment provisioning and test data management are greater challenges for projects that use agile methodologies and practice continuous integration than for projects that use a waterfall methodology. Likewise, functional and performance test requirements for complex applications with components on different delivery cycles differ from those for simple, monolithic web apps.
The key point is to establish processes to continuously test code as it is created. This helps make practices such as continuous delivery possible.
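The idea that the test mix varies per project can be sketched as simple selection logic: every project gets a baseline suite, and attributes such as being customer-facing or having a service agreement add further suites. The suite names and selection rules are assumptions for illustration.

```python
# Sketch: the continuous-testing mix varies per project. A customer-facing
# application picks up security testing on top of the baseline suites, and a
# service agreement implies performance testing. Names are illustrative.

def select_suites(project):
    suites = ["unit", "integration"]      # baseline for every project
    if project.get("customer_facing"):
        suites.append("security")         # extra scrutiny for external apps
    if project.get("has_sla"):
        suites.append("performance")      # service agreements imply perf tests
    return suites

print(select_suites({"customer_facing": True, "has_sla": True}))
```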
A basic understanding of each of these techniques is essential knowledge for managers looking to adopt a DevOps approach.
To learn more about DevOps, download your free copy of DevOps for Dummies.
The post 4 DevOps definitions managers must understand appeared first on Cloud computing news.
Source: Thoughts on Cloud

Radeon VII review: The graphics card for video-memory lovers

Higher price, similar performance, and twice as much video memory as the GeForce RTX 2080: AMD's Radeon VII is a graphics card that is interesting primarily for technical reasons. In terms of energy efficiency and noise it stands no chance, and the 16 GB of video memory is more of a niche bonus. A review by Marc Sauter and Sebastian Grüner (AMD Vega, AMD)
Source: Golem

From The Enterprisers Project: 9 Kubernetes Jobs Facts and Figures

Over at the Enterprisers Project, Kevin Casey has written up a great piece about the state of the Kubernetes job market. Even if you thought this market was growing, you may be surprised to find out just how big it’s become. A few of the stats listed out and explained in this post: There are […]
The post From The Enterprisers Project: 9 Kubernetes Jobs Facts and Figures appeared first on Red Hat OpenShift Blog.
Source: OpenShift

Microsoft Healthcare Bot brings conversational AI to healthcare

Today we announced the general availability of the Microsoft Healthcare Bot in the Azure Marketplace. The Microsoft Healthcare Bot is a cloud service that powers conversational AI for healthcare. It’s designed to empower healthcare organizations to build and deploy compliant, AI-powered virtual health assistants and chatbots that help them put more information in the hands of their users, enable self-service, drive better outcomes, and reduce costs.

The Healthcare Bot service has several unique aspects:

Out-of-the-box healthcare intelligence including language models to understand healthcare intents and medical terminology, as well as content from credible providers with information about conditions, symptoms, doctors, medications, and even a symptom checker.
Customization and extensibility, which allows partners to introduce their own business flows, and securely connect to their own backend systems over HL7 FHIR or REST APIs. The service model allows our partners to focus on the important things like their key business needs and their own flows.
Security and compliance with industry standards, such as ISO 27001, ISO 27018, HIPAA, Cloud Security Alliance (CSA) Gold, and GDPR which we consider as table stakes in this industry. We also provide tools and out-of-the-box functionality that help our partners create secure and compliant solutions.

The close collaboration with our preview partners, including Premera Blue Cross, Quest Diagnostics, and Advocate Aurora Health, helped identify diverse use cases that address the needs and expectations of healthcare organizations. We now have a better understanding of what’s important to our partners, and how to evolve the product by focusing on key differentiating features. For example, we realized the importance of enabling a visual design environment that allows review of the flows by clinical personnel and domain experts who are non-developers. We also evolved our scenario templates catalog and provided a gallery of example use cases to start from, which allows our partners to develop their bots quickly and inexpensively.
It has been exciting for us to see our partners go live with their chatbots, enhance their chatbots over time, and meet their business goals. And in the upcoming months, we will develop the service further.

It’s my opinion that virtual health assistants and chatbot technology will never replace medical personnel. But technology can help make better use of medical personnel's time and relieve some of the burden from the healthcare system.

Technology is here to enable that. It’s the responsibility of our generation to leverage technology to help solve important problems for humankind.

For more information:

Microsoft Healthcare Bot service on Azure Marketplace

Microsoft Healthcare Bot project page
Source: Azure

Lighting up healthcare data with FHIR®: Announcing the Azure API for FHIR

In the last several years we’ve seen fundamental transformation in healthcare data management, but the biggest, and perhaps most important, shift has been in how healthcare organizations think about cloud technology and their most sensitive health data. Healthcare leaders have shifted from asking “Why should I manage healthcare data in the cloud?” to asking “How?”

The change in the question may seem subtle, but the rigor required to ensure the highest level of privacy, security, and management of Protected Health Information (PHI) in the cloud has been a barrier to entry for much of the healthcare ecosystem. Compounding the difficulty is the state of data: multiple datasets, fragmented sources of truth, inconsistent formats, and exponential growth of data types.

We are now seeing, almost daily, new breakthroughs with applied machine learning on health data. But to truly apply machine learning at scale in the healthcare industry, we must ensure a secure and trusted pathway to manage that data in the cloud. Moving data into the cloud in its current state can reduce cost, but cost isn’t the only measure. Healthcare leaders are thinking about how they bring their data into the cloud while increasing opportunities to use and learn from that data: How do we ensure the privacy of patient data? How do we retain control and access management for our data at scale? How do we bring data into the cloud in a way that will accelerate machine learning for the future?  

And today I am thrilled to announce Azure technology that begins to answer the question of “how”: Azure API for FHIR®.

Azure API for FHIR®: Your health data. Unlocked with FHIR.

Managing data in the open source FHIR (Fast Healthcare Interoperability Resources) standard is becoming the turnkey route to interoperability and machine learning on healthcare data. There is a growing need for healthcare partners to build and maintain FHIR services that exchange and manage data in the FHIR format.

Azure API for FHIR offers exchange of data via a FHIR API, delivered as a managed Platform as a Service (PaaS) offering in Azure and designed for management and persistence of PHI data in the native FHIR format. The FHIR API and data store enable you to securely connect and interact with any system that uses FHIR APIs, while Microsoft takes on the operations, maintenance, updates, and compliance requirements of the PaaS offering, freeing up your own operational and development resources.
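Exchanging data over a FHIR API boils down to standard HTTP requests against FHIR resource endpoints. As a minimal sketch, the snippet below builds (but does not send) a FHIR Patient search request; the base URL and bearer token are hypothetical placeholders, since a real Azure API for FHIR instance would have its own endpoint and Azure AD-issued token.

```python
from urllib.parse import urlencode
from urllib.request import Request

# Sketch of talking to a FHIR endpoint. The base URL and token below are
# hypothetical placeholders; the request is constructed but never sent.

BASE_URL = "https://example.azurehealthcareapis.com"  # hypothetical instance
TOKEN = "<access-token>"                              # obtained via Azure AD in practice

def search_patients(family_name):
    """Build a standard FHIR search request: GET [base]/Patient?family=..."""
    query = urlencode({"family": family_name})
    return Request(
        f"{BASE_URL}/Patient?{query}",
        headers={
            "Authorization": f"Bearer {TOKEN}",
            "Accept": "application/fhir+json",  # FHIR's JSON media type
        },
    )

req = search_patients("Garcia")
print(req.full_url)
```

Because the search syntax (`/Patient?family=...`) is part of the FHIR specification rather than any one vendor's API, the same request shape works against any conformant FHIR server.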

Key features of the Azure API for FHIR will include:

Provision and start running in just a few minutes
High performance, low latency
Enterprise grade, managed FHIR service
Role Based Access Control (RBAC) – allowing you to manage access to your data at scale
Audit log tracking for access, creation, modification, and reads within each data store
SMART on FHIR functionality
Secure compliance in the cloud: ISO 27001:2013 certified, supports HIPAA and GDPR, and built on the HITRUST certified Azure platform
Data is isolated to a unique database per API instance
Protection of your data with multi-region failover

The cost-effective way to start in the cloud

Because we believe it's important to invest in the FHIR standard, you pay only for underlying database usage and data transfer when using the Azure API for FHIR.

The cloud environment you choose for healthcare applications is critical. You want elastic scale so you pay only for the throughput and storage you need. The Azure services that power Azure API for FHIR are designed for rapid performance no matter what size datasets you’re managing. The data persistence layer in the Azure API for FHIR leverages Azure Cosmos DB, which guarantees latencies at the 99th percentile and guarantees high availability with multi-homing capabilities.

Those with experience in healthcare data management may wonder: we already have HL7 standards in the industry, so why do we need FHIR to bring data into the cloud? HL7 has served the industry well since its first implementations in the 1980s. But as it has evolved, customizations of HL7 can translate to a heavy lift for the future of healthcare learning: data science. FHIR is gaining traction because it provides a consistent, open source, extensible data standard that can scale as we learn. To accelerate machine learning on healthcare data, organizations are shifting data to the FHIR format as they transition into the cloud, saving both time and money.

Where can I apply the Azure API for FHIR?

Azure API for FHIR is intended for customers developing solutions that integrate healthcare data from one or more systems of record. The API supports ingesting, managing, and persisting that data as native FHIR resources. Leveraging an open standard (FHIR) enables interoperability for data sharing both within and outside of your ecosystem and helps accelerate machine learning on data that is normalized in FHIR.
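Normalizing a system-of-record entry into a FHIR resource is mostly a field-mapping exercise. The sketch below maps an invented legacy record into a dictionary shaped like a FHIR Patient resource; the legacy field names are assumptions for illustration, while the output keys (`resourceType`, `name`, `birthDate`, `gender`) follow the standard FHIR Patient structure.

```python
# Sketch of normalizing a legacy system-of-record entry into a FHIR Patient
# resource. The legacy field names are invented for illustration; the output
# follows the standard FHIR Patient structure.

def to_fhir_patient(record):
    return {
        "resourceType": "Patient",
        "name": [{
            "family": record["last_name"],
            "given": [record["first_name"]],
        }],
        "birthDate": record["dob"],          # FHIR uses ISO-8601 dates
        "gender": record.get("sex", "unknown"),
    }

legacy = {"first_name": "Ana", "last_name": "Silva",
          "dob": "1984-07-02", "sex": "female"}
patient = to_fhir_patient(legacy)
print(patient["resourceType"], patient["name"][0]["family"])
```

Once every source system maps into the same resource shapes, downstream consumers (analytics, ML pipelines, partner applications) can read one format instead of one per system.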

Our customers are already seeing powerful scenarios for FHIR applications:

Startup/IoMT:
Fred Hutchinson Cancer Research in Seattle, WA is developing innovative IoMT and patient applications to remotely monitor patients undergoing chemotherapy. While in development, they needed a secure, fully managed backend service to handle patient data across multiple participating hospitals. To ensure they could design once and integrate quickly into a broad number of hospital EHR systems, they are using Azure API for FHIR and a SMART on FHIR implementation.

Provider Ecosystems:
University of Pittsburgh Medical Center has been working with Microsoft FHIR offerings in their hospital systems: “The ability to one-click deploy a FHIR server as a managed service allows us to think more about our applications and customer needs, and less about the plumbing required to store and represent clinical data.” – Brian Kolowitz, director of product management, UPMC Enterprises.

Research:
Associate Dean of Research Information Technology at University of Michigan, Dr. Sachin Kheterpal, is leading efforts to streamline data ingestion and management for Michigan Medicine’s research teams. To drive faster research innovation and ML development, University of Michigan will be piloting the management of data through the Azure API instead of their on-premises systems. “We’re expecting to reduce operational workloads, increase data control, improve data de-identification, and enable our data scientists to move faster with data normalized in the FHIR standard that benefits from a community of developers based upon FHIR resources.”

If you want additional support as you integrate FHIR, we’ve also been working with over 25 partners in our Early Access Program. ISV and SI partners in the Early Access Program understand the technical details and applications for Azure API for FHIR and can help get your data into FHIR and the cloud even more easily.

Investing in FHIR to accelerate AI in healthcare

The Azure ecosystem already has robust components for Microsoft partners to build secure and compliant health solutions in the cloud on their own, but we’re going to continue making it easier. We’re focused on delivering turnkey cloud solutions so our healthcare partners can focus their attention on innovation. Check out Azure API for FHIR and do more with your health data.

FHIR® is the registered trademark of HL7 and is used with the permission of HL7 
Source: Azure

Moto G7: Lenovo launches four new Motorola smartphones

Four instead of three: while the Moto G6 came in three models, Lenovo has introduced four variants of the Moto G7. All four Moto G7 models differ considerably in their technical specifications. The new Power category adds a device with particularly long battery life. (Motorola, Smartphone)
Source: Golem