Azure Storage support for Azure Active Directory based access control generally available

We are pleased to share the general availability of Azure Active Directory (AD) based access control for Azure Storage Blobs and Queues. Enterprises can now grant specific data access permissions to users and service identities from their Azure AD tenant using Azure's role-based access control (RBAC). Administrators can then track individual user and service access to data using Storage Analytics logs. Storage accounts can also be made more secure by removing the need for most users to have access to the powerful storage account access keys.

By leveraging Azure AD to authenticate users and services, enterprises gain access to the full array of capabilities that Azure AD provides, including features like two-factor authentication, conditional access, identity protection, and more. Azure AD Privileged Identity Management (PIM) can also be used to assign roles “just-in-time” and reduce the security risk of standing administrative access.

In addition, developers can use Managed identities for Azure resources to deploy secure Azure Storage applications without having to manage application secrets.

When Azure AD authentication is combined with the new Azure Data Lake Storage Gen2 capabilities, users can also take advantage of granular file and folder access control using POSIX-style access permissions and access control lists (ACLs).

RBAC for Azure resources can be used to grant access to broad sets of resources across a subscription or a resource group, or to individual resources such as a storage account or blob container. Role assignments can be made through the Azure portal or through tools like Azure PowerShell, the Azure CLI, or Azure Resource Manager templates.

Azure AD authentication is available from the standard Azure Storage tools including the Azure portal, Azure CLI, Azure PowerShell, Azure Storage Explorer, and AzCopy.

$ az login
Note, we have launched a browser for you to login. For old experience with device code, use "az login --use-device-code"
You have logged in. Now let us find all the subscriptions to which you have access…
[
  {
    "cloudName": "AzureCloud",
    "id": "XXXXXXXX-YYYY-ZZZZ-AAAA-BBBBBBBBBBBB",
    "isDefault": true,
    "name": "My Subscription",
    "state": "Enabled",
    "tenantId": "00000000-0000-0000-0000-000000000000",
    "user": {
      "name": "cbrooks@microsoft.com",
      "type": "user"
    }
  }
]
$ export AZURE_STORAGE_AUTH_MODE="login"
$ az storage blob list --account-name mysalesdata --container-name mycontainer --query [].name
[
  "salesdata.csv"
]

We encourage you to use Azure AD to grant users access to data, and to limit user access to the storage account access keys. A typical pattern is to grant users the "Reader" role, to make the storage account visible to them in the portal, along with the "Storage Blob Data Reader" role, to grant read access to blob data. Users who need to create or modify blobs can be granted the "Storage Blob Data Contributor" role instead.
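For example, these role assignments might look like the following with the Azure CLI (a minimal sketch; the user, resource group, and subscription ID are placeholders, and "mysalesdata" matches the listing example above):

$ # Management plane: make the storage account visible in the portal.
$ az role assignment create \
    --role "Reader" \
    --assignee mary@contoso.com \
    --scope "/subscriptions/<subscription-id>/resourceGroups/myRG/providers/Microsoft.Storage/storageAccounts/mysalesdata"
$ # Data plane: grant read access to blob data in the account.
$ az role assignment create \
    --role "Storage Blob Data Reader" \
    --assignee mary@contoso.com \
    --scope "/subscriptions/<subscription-id>/resourceGroups/myRG/providers/Microsoft.Storage/storageAccounts/mysalesdata"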

Developers are encouraged to evaluate Managed Identities for Azure resources to authenticate applications in Azure or Azure AD service principals for apps running outside Azure.
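As a sketch of the managed identity flow with the Azure CLI (the VM name and resource group are hypothetical), you enable an identity on a VM and grant it a data role; the application running on the VM can then obtain tokens without any stored secret:

$ # Enable a system-assigned managed identity and capture its principal ID.
$ principalId=$(az vm identity assign --resource-group myRG --name myAppVM \
    --query systemAssignedIdentity --output tsv)
$ # Grant the identity write access to blob data in the account.
$ az role assignment create \
    --assignee-object-id $principalId \
    --role "Storage Blob Data Contributor" \
    --scope "/subscriptions/<subscription-id>/resourceGroups/myRG/providers/Microsoft.Storage/storageAccounts/mysalesdata"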

Azure AD access control for Azure Storage is available now for production use in all Azure cloud environments.
Source: Azure

Azure Premium Block Blob Storage is now generally available

As enterprises accelerate cloud adoption and increasingly deploy performance-sensitive cloud-native applications, we are excited to announce the general availability of Azure Premium Blob Storage. Premium Blob Storage is a new performance tier in Azure Blob Storage for block blobs and append blobs, complementing the existing Hot, Cool, and Archive access tiers. Premium Blob Storage is ideal for workloads that require very fast response times and/or high transaction rates, such as IoT, telemetry, AI, and scenarios with humans in the loop such as interactive video editing, web content, online transactions, and more.

Premium Blob Storage provides low and consistent storage latency for both read and write operations across a range of object sizes, and it is especially good at handling smaller blob sizes. Your application should be deployed to compute instances in the same Azure region as the storage account to realize low latency end-to-end. For more details on performance, see "Premium Block Blob Storage – a new level of performance."

Figure 1 – Latency comparison of Premium and Standard Blob Storage

Premium Blob Storage is available with locally redundant storage (LRS) and comes with High-Throughput Block Blobs (HTBB), which provides very high and instantaneous write throughput when ingesting block blobs larger than 256 KB.

Pricing and region availability

Premium Blob Storage has a higher data storage cost but a lower transaction cost compared to data stored in the regular Hot tier. This makes it cost effective, and it can even be less expensive, for workloads with high transaction rates. Check out the pricing page for more details.

Premium Blob Storage is initially available in US East, US East 2, US Central, US West, US West 2, North Europe, West Europe, Japan East, Australia East, Korea Central, and Southeast Asia regions with more regions to come. Stay up to date on region availability through the Azure global infrastructure page.

Platform interoperability

At present, data stored in Premium Blob Storage cannot be tiered to the Hot, Cool, or Archive access tiers; we are working on supporting object tiering in the future. To move data, you can synchronously copy blobs using the new PutBlockFromURL API (sample code) or AzCopy v10, which supports this API. PutBlockFromURL copies data server side, which means all data movement happens inside Azure Storage.
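For example, a server-side copy from a Premium account to a standard (Hot tier) account with AzCopy v10 might look like this (a sketch; the account names, container, and SAS tokens are placeholders):

$ # Service-to-service copy; no data flows through the machine running AzCopy.
$ azcopy copy \
    "https://mypremiumaccount.blob.core.windows.net/mycontainer/mydata.csv?<source-SAS>" \
    "https://mystandardaccount.blob.core.windows.net/mycontainer/mydata.csv?<destination-SAS>"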

In addition, Storage Analytics logging, static website hosting, and lifecycle management (preview) are not currently available with Premium Blob Storage.

Next steps

To get started with Premium Blob Storage, provision a block blob storage account in your subscription and start creating containers and blobs using the existing Blob service REST API and/or any existing tools such as AzCopy or Azure Storage Explorer.
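For example, provisioning such an account with the Azure CLI might look like this (the account name, resource group, and region are placeholders):

$ # The BlockBlobStorage kind plus the Premium_LRS SKU selects the premium tier.
$ az storage account create \
    --name mypremiumaccount \
    --resource-group myRG \
    --location eastus2 \
    --kind BlockBlobStorage \
    --sku Premium_LRS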

We are very excited to deliver Azure Premium Blob Storage with low and consistent latency, and we look forward to hearing your feedback at premiumblobfeedback@microsoft.com. To learn more about Blob Storage, please visit our product page.
Source: Azure

YouTube: Google no longer wants to produce expensive series

With its own productions, Google tried to compete with Netflix and Amazon – that strategy has apparently failed: according to insiders, the company no longer wants to produce expensive series. How things will proceed with the existing YouTube Originals is still unknown. (YouTube, video community)
Source: Golem

Incrementally copy new files by LastModifiedDate with Azure Data Factory

Azure Data Factory (ADF) is the fully managed data integration service for analytics workloads in Azure. Using ADF, users can load the lake from more than 80 data sources, on-premises and in the cloud, and use a rich set of transform activities to prep, cleanse, and process the data with Azure analytics engines, while also landing the curated data in a data warehouse to derive analytics and insights.

When you start to build an end-to-end data integration flow, the first challenge is extracting data from different data stores, where incrementally loading data (or delta loading) after an initial full load is a widely used approach. ADF now provides a new capability to incrementally copy only new or changed files, selected by LastModifiedDate, from a file-based store. With this new feature, you do not need to partition the data into time-based folders or file names; new or changed files are automatically selected by their LastModifiedDate metadata and copied to the destination store.

The feature is available when loading data from Azure Blob Storage, Azure Data Lake Storage Gen1, Azure Data Lake Storage Gen2, Amazon S3, File System, SFTP, and HDFS.

The resources for this feature are as follows:

1. You can visit our tutorial, "Incrementally copy new and changed files based on LastModifiedDate by using the Copy Data tool," to build your first pipeline that incrementally copies only new and changed files, based on their LastModifiedDate, from Azure Blob storage to Azure Blob storage by using the Copy Data tool.

2. You can also leverage our template from the template gallery, "Copy new and changed files by LastModifiedDate with Azure Data Factory," to reduce your time to solution while retaining enough flexibility to build a pipeline that incrementally copies only new and changed files based on their LastModifiedDate.

3. You can also build this behavior from scratch in the ADF UI.

4. You can write code against the ADF SDK to use this feature.

You are encouraged to give these additions a try and provide us with feedback. We hope you find them helpful in your scenarios. Please post your questions on the Azure Data Factory forum or share your thoughts with us on the Data Factory feedback site.
Source: Azure

Azure.Source – Volume 75

Now in preview

Windows Virtual Desktop now in public preview on Azure

The public preview of the Windows Virtual Desktop service is now available on Azure. Customers can now access the only service that delivers simplified management, multi-session Windows 10, optimizations for Office 365 ProPlus, and support for Windows Server Remote Desktop Services (RDS) desktops and apps. With Windows Virtual Desktop, you can deploy and scale your Windows desktops and apps on Azure in minutes, while enjoying built-in security and compliance. Access to Windows Virtual Desktop is available through applicable RDS and Windows Enterprise licenses.

Azure Data Studio: An Open Source GUI Editor for Postgres

Support for PostgreSQL in Azure Data Studio is now available in preview. Azure Data Studio is a cross-platform modern editor focused on data development. It's available for Linux, macOS, and Windows. We're also introducing a corresponding preview PostgreSQL extension in Visual Studio Code (VS Code). Both Azure Data Studio and Visual Studio Code are open source and extensible – two things that PostgreSQL itself is based on. If your primary use case is data, choose Azure Data Studio to manage multiple database connections, explore database object hierarchy, set up dashboards, and more.

Azure Container Registry virtual network and Firewall rules preview support

Announcing that Azure Container Registry (ACR) now supports limiting public endpoint access. Customers can now limit registry access to within an Azure Virtual Network (VNet), as well as whitelist IP addresses and ranges for on-premises services. VNet and firewall rules are supported with virtual machines (VMs) and Azure Kubernetes Service (AKS). VNet and firewall rules are available for public preview in all 25 public cloud regions. General availability (GA) will be based on a curve of usage and feedback.

Also available in preview

Public preview: Azure Log Analytics in France Central, Korea Central, North Europe
Azure Kubernetes Service (AKS) cluster autoscaler is in preview
Azure Kubernetes service (AKS) control plane audit logging is now in preview

Now generally available

Azure Backup for SQL Server in Azure Virtual Machines now generally available

Azure Backup for SQL Server Virtual Machines (VMs) is now generally available: an enterprise-scale, zero-infrastructure solution that eliminates the need to deploy and manage backup infrastructure while providing a simple and consistent experience for centrally managing and monitoring backups on standalone SQL instances and Always On Availability Groups. Built into Azure, the solution combines the core cloud promises of simplicity, scalability, security, and cost effectiveness with inherent SQL backup capabilities that are leveraged by using native APIs, to yield high-fidelity backups and restores.

Also generally available

General availability: Azure Kubernetes Service in India Central
Azure premium blob storage is now generally available

News and updates

Microsoft’s Azure Cosmos DB is named a leader in the Forrester Wave: Big Data NoSQL

Forrester has named Microsoft a Leader in The Forrester Wave™: Big Data NoSQL, Q1 2019, based on its evaluation of Azure Cosmos DB, validating the product's exceptional market momentum and customer satisfaction. According to Forrester, "half of global data and analytics technology decision makers have either implemented or are implementing NoSQL platforms, taking advantage of the benefits of a flexible database that serves a broad range of use cases." We are committed to making Azure Cosmos DB the best globally distributed database for all businesses and modern applications. With Azure Cosmos DB, we believe that you will be able to write amazingly powerful, intelligent, modern apps and transform the world.

March 2019 changes to Azure Monitor Availability Testing

Azure Monitor Availability Testing allows you to monitor the availability and responsiveness of any HTTP or HTTPS endpoint that is accessible from the public internet. At the end of this month we are deploying some major changes to improve performance and reliability, as well as to allow us to make more improvements for the future. This post highlights and describes some of the changes needed to ensure that your tests continue running without any interruption.

Data integration with ADLS Gen2 and Azure Data Explorer using Data Factory

Introducing the latest integration in Azure Data Factory. Azure Data Lake Storage Gen2 is a data lake platform that combines advanced data lake capabilities with the economy, global scale, and enterprise-grade security of Azure Blob Storage. Azure Data Explorer is a fast, fully managed data analytics service for real-time analysis of large volumes of data. Azure Data Factory, the fully managed data integration service that lets you operationalize and manage ETL/ELT flows with flexible control flow, rich monitoring, and continuous integration and continuous delivery (CI/CD) capabilities, now integrates with both. You can now meet the advanced needs of your analytics workloads with unmatched price performance and the security of one of the best clouds for analytics.

Additional news and updates

The PowerShell Extension is now in the Azure Data Studio Marketplace
Azure Boards and Azure Pipelines GitHub Integration Improvements – Sprint 149 Update

News from NVIDIA GPU Technology Conference

Over the years, Microsoft and NVIDIA have helped customers run demanding applications on GPUs in the cloud. Last week at NVIDIA GPU Technology Conference 2019 (GTC 2019) in San Jose, Microsoft made several announcements on our collaboration with NVIDIA to help developers and data scientists deliver innovation faster.

Microsoft and NVIDIA extend video analytics to the intelligent edge

Microsoft and NVIDIA are partnering on a new approach for intelligent video analytics at the edge to transform raw, high-bandwidth videos into lightweight telemetry. This delivers real-time performance and reduces compute costs for users. With this latest collaboration, NVIDIA DeepStream and Azure IoT Edge extend the AI-enhanced video analytics pipeline to where footage is captured, securely and at scale. Now, our customers can get the best of both worlds—accelerated video analytics at the edge with NVIDIA GPUs and secure connectivity and powerful device management with Azure IoT Edge and Azure IoT Hub.

Microsoft and NVIDIA bring GPU-accelerated machine learning to more developers

GPUs have become an indispensable tool for doing machine learning (ML) at scale. Our collaboration with NVIDIA marks another milestone in our venture to help developers and data scientists deliver innovation faster. We are committed to accelerating the productivity of all machine learning practitioners regardless of their choice of framework, tool, and application. Two of the integrations that Microsoft and NVIDIA have built together to unlock industry-leading GPU acceleration for more developers and data scientists are covered in the next two posts.

Azure Machine Learning service now supports NVIDIA’s RAPIDS

Azure Machine Learning service is the first major cloud ML service to support NVIDIA’s RAPIDS, a suite of software libraries for accelerating traditional machine learning pipelines with NVIDIA GPUs. With RAPIDS on Azure Machine Learning service, users can accelerate the entire machine learning pipeline, including data processing, training and inferencing, with GPUs from the NC_v3, NC_v2, ND or ND_v2 families. Azure Machine Learning service users are able to use RAPIDS in the same way they currently use other machine learning frameworks, and can use RAPIDS in conjunction with Pandas, Scikit-learn, PyTorch, TensorFlow, etc.

ONNX Runtime integration with NVIDIA TensorRT in preview

Announcing the open-source preview of the NVIDIA TensorRT execution provider in ONNX Runtime. Taking another step towards open and interoperable AI, this integration lets developers easily leverage industry-leading GPU acceleration regardless of their choice of framework: developers can now tap into the power of TensorRT through ONNX Runtime to accelerate inferencing of ONNX models, which can be exported or converted from PyTorch, TensorFlow, and many other popular frameworks.

Technical content

Reducing security alert fatigue using machine learning in Azure Sentinel

Alert fatigue is real. Security analysts face a huge burden of triage as they not only have to sift through a sea of alerts, but also correlate alerts from different products manually. Machine learning (ML) in Azure Sentinel is built in right from the beginning and focuses on reducing alert fatigue while offering ML toolkits tailored to the security community, including ML innovations aimed at making security analysts, security data scientists, and engineers productive.

Breaking the wall between data scientists and app developers with Azure DevOps

Data scientists are used to developing and training machine learning models in their favorite Python notebook or integrated development environment (IDE). The app developer is focused on the application lifecycle – building, maintaining, and continuously updating the larger business application. As AI is infused into more business-critical applications, it is increasingly clear that data scientists and app developers need to collaborate closely to build and deploy AI-powered applications more efficiently. Together, Azure Machine Learning and Azure DevOps enable data scientists and app developers to collaborate more efficiently while continuing to use the tools and languages that are already familiar and comfortable.

Step-By-Step: Getting Started with Azure Machine Learning

In this comprehensive guide, Anthony Bartolo explains how to set up a prediction model using Azure Machine Learning Studio. The example comes from a real-life hackfest with Toyota Canada and predicts the pricing of vehicles.

Intro to Azure Container Instances

In this handy introductory guide, Aaron Powell covers using Azure Container Instances to run containers in a really simple way. It walks through a hello-world demo and then some advanced scenarios on using ACR and connecting to Azure resources (such as a SQL server).

Getting started with Azure Monitor Dynamic Thresholds

This overview from Sonia Cuff discusses the new Dynamic Thresholds capability in Azure Monitor, where machine learning sets the alert threshold levels when you are monitoring metrics (e.g., CPU percentage use in a VM or HTTP request time in an application).

How to Deploy a Static Website Into Azure Blob Storage with Azure DevOps Pipeline

In this third video and blog post of Frank Boucher's CI/CD series, he creates a release pipeline and explains how to deploy an ARM template to create or update Azure resources and deploy a static website into blob storage. The series explains how to build a continuous integration and continuous deployment system using Azure DevOps Pipelines to deploy a static website into Azure Blob Storage.

Fixing Azure Functions and Azure Storage Emulator 5.8 issue

If you happen to run into an error after updating Azure Functions to the latest version, Maxime Rouille's hotfix is for you. He not only explains what to do, but also why this might happen and what his hotfix does.

Using Key Vault with Your Mobile App the Right Way

You know you need to keep your app’s secrets safe and follow best practices – but how? In this post, Matt Soucoup uses a Xamarin app to walk through how and why to use Azure Key Vault, Active Directory, and Azure Functions to keep application secrets off of your mobile app and in Key Vault – without tons of extra work for you.

How do You Structure Your Code When Moving Your API from Express to Serverless Functions?

There are a lot of articles showing how to use serverless functions for a variety of purposes. A lot of them cover how to get started, and they are very useful. But what do you do when you want to organize them a bit more as you do for your Node.js Express APIs? There's a lot to talk about on this topic, but in this post, John focuses specifically on one way you can organize your code.

Securely monitoring your Azure Database for PostgreSQL Query Store

Long-running queries may interfere with overall database performance and are likely stuck on some background process, which means that from time to time you need to investigate whether any queries are running indefinitely on your databases. See how you can set up alerting on query performance-related metrics using Azure Functions and Azure Key Vault.

Expanded Jobs functionality in Azure IoT Central

We have improved the device management workflow with additional jobs functionality that makes managing your devices at scale much easier. In this brief post, learn how to copy an existing job, save a job to continue working on later, stop or resume a running job, and download a job details report once your job has completed running.

Azure Stack IaaS – part five

Self-service is core to Infrastructure-as-a-Service (IaaS). Azure's IaaS gives the owner of the subscription everything they need to create virtual machines (VMs) and other resources on their own, without involving an administrator. This post shows a few examples of Azure and Azure Stack self-service management of VMs.

Additional technical content

Lesson Learned #78: DataSync – Cannot enumerate changes at the RelationalSyncProvider for table ‘customertable’ – Execution Timeout Expired
Lesson Learned #80: Monitoring Login-Logout events in Azure SQL Database using Extended Events
Lesson Learned #81: How to create a linked server from Azure SQL Managed Instance to SQL Server OnPremise or Azure VM
Lesson Learned #82: Azure SQL Database Managed Instance supports only COPY_ONLY restoring a database backup
Deep dive # 2: How to configure Exchange on-premise Server hybrid integration with Office 365/Azure Infrastructure and test REST API calls?

Azure shows

Episode 271 – Azure Stack – Tales from the field | The Azure Podcast

Azure Stack experts from Microsoft Services, Heyko Oelrichs and Rathish Ravikumar, give us an update on Azure Stack and some valuable tips and tricks based on their real-world experiences deploying it for customers.

One Dev Question: What new HoloLens and Azure products were released in Barcelona? | One Dev Minute

In this episode of One Dev Question, Alex Kipman discusses Microsoft Mixed Reality, featuring announcements from Mobile World Congress.

Data Driven Styling with Azure Maps | Internet of Things Show

Ricky Brundritt, PM on the Azure Maps team, walks us through data-driven styling with Azure Maps. Data-driven styling allows you to dynamically style layers at render time on the GPU using properties on your data. This provides huge performance benefits and allows large datasets to be rendered on the map. Data-driven style expressions can greatly reduce the amount of code you would otherwise need to write to define this type of business logic using if-statements and map event monitoring.

New Microsoft Azure HPC Goodness | Tuesdays with Corey

Corey Sanders and Evan Burness (Principal Program Manager on the Azure Compute team) sat down to talk about new things in the High Performance Computing space in Azure.

Get ready for Global Azure Bootcamp 2019 | Azure Friday

Global Azure Bootcamp is a free, one-day, local event that takes place globally. It's an annual event run by the Azure community. This year, Global Azure Bootcamp is on Saturday, April 27, 2019. Event locations range from New Zealand to Hawaii and chances are good that you can find a location near you. You may even organize your own local event and receive sponsorship so long as you register by Friday, March 29, 2019. Join in to receive sponsorship, Azure passes, Azure for Students, lunch, and a set of content to use for the day.

How to use Azure Monitor Application Insights to record custom events | Azure Tips and Tricks

Learn how to use Azure Monitor Application Insights to make your application logging smarter with Custom Event Tracking.

How to create an Azure Kubernetes Service cluster in the Azure Portal | Azure Portal Series

The Azure Portal enables you to get started quickly with Kubernetes and containers. In this video, learn how to easily create an Azure Kubernetes Service cluster.

Phil Haack on DevOps at GitHub | Azure DevOps Podcast

Jeffrey Palermo and Phil Haack take a deep dive into DevOps at GitHub. They talk about Phil's role as Director of Engineering; how GitHub, as a company, grew while Phil worked there; the inner workings of how the GitHub website ran; and details about how various protocols, continuous integration, automated testing, and deployment worked at GitHub.

Episode 3 – ExpressRoute, Networking & Hybridity, Oh My! | AzureABILITY

AzureABILITY host Louis Berman discusses ExpressRoute, networking and hybridity with Microsoft's Bryan Woodworth, who specializes in networking, connectivity, high availability, and routing for hybrid workloads in Azure.

Events

Microsoft Azure for the Gaming Industry

Cloud computing is increasingly important for today’s global gaming ecosystem, empowering developers of any size to reach gamers in any part of the world. In this wrap-up post from GDC 2019, learn how Azure and PlayFab are a powerful combination for game developers. Azure brings reliability, global scale, and enterprise-level security, while PlayFab provides Game Stack with managed game services, real-time analytics, and comprehensive LiveOps capabilities.

The Value of IoT-Enabled Intelligent Manufacturing

Learn how you can apply insights from real-world use cases of IoT-enabled intelligent manufacturing when you attend the Manufacturing IoT webinar on March 28th: Go from Reaction to Prediction – IoT in Manufacturing. In addition, you'll learn how you can use IoT solutions to move from a reactive to predictive model. For additional hands-on, actionable insights around intelligent edge and intelligent cloud IoT solutions, join us on April 19th for the Houston Solution Builder Conference.

Customers, partners, and industries

Power IoT and time-series workloads with TimescaleDB for Azure Database for PostgreSQL

Announcing a partnership with Timescale that introduces support for TimescaleDB on Azure Database for PostgreSQL for customers building IoT and time-series workloads. TimescaleDB allows you to scale for fast ingest and complex queries while natively supporting full SQL and has a proven track record of being deployed in production in a variety of industries including oil & gas, financial services, and manufacturing. The partnership reinforces our commitment to supporting the open-source community to provide our users with the most innovative technologies PostgreSQL has to offer.

This Week in Azure – 22 March 2019 | A Cloud Guru – Azure This Week

Lars is away this week and so Alex Mackey covers Azure Portal improvements, the Azure mobile app, Azure SQL Data Warehouse Workload Importance and Microsoft Game Stack.

Source: Azure

Larger, more powerful Managed Disks for Azure Virtual Machines

Today, we are excited to announce the general availability of larger and more powerful Azure Managed Disk sizes of up to 32 TiB on Premium SSD, Standard SSD, and Standard HDD disk offerings. In addition, we support disk sizes up to 64 TiB on Ultra Disks in preview.

We are also increasing the performance scale targets for Premium SSD to 20,000 IOPS and 900 MB/sec. With the general availability (GA) of larger disk sizes, Azure now offers a broad range of disk sizes for your production workload needs, with unmatched scale and performance.

Some of the benefits of using larger disk sizes are:

Grow your virtual machine (VM) disks' capacity independently of the VM size you are using. You can attach more disk capacity per VM without having to upgrade to larger VM sizes or use multiple VMs. For instance, you can now attach two 32 TiB data disks to the smallest B1 Azure VM to achieve a total disk capacity of 64 TiB (see the sketch after this list). Our largest VMs can now support up to 2 PiB of disk storage on a single VM. This provides a cost-effective solution for migrating data hosted on on-premises disks, NAS devices, or backups to Azure.
Lift and shift a new range of workloads to Azure with the higher performance scale limits. With Premium SSDs offering up to 20,000 IOPS and 900 MB/sec, you can accelerate a wide variety of transactional workloads and exceed the on-premises runtime performance of data analytics applications. Moreover, the higher scalability limits on Standard SSDs make it the perfect fit for hosting big data workloads and scale-out file servers that are extremely throughput intensive.
Simplify your deployments and service management, and reduce VM maintenance costs, by avoiding the need to stripe multiple disks to achieve larger capacity or higher performance. You can still stripe multiple large disks to achieve even higher capacity and performance.
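As a sketch of the attach scenario with the Azure CLI (the VM and disk names are placeholders), you can create and attach a new 32 TiB (32,767 GiB) data disk in a single step:

$ # Create a new data disk at the maximum size and attach it to an existing VM.
$ az vm disk attach \
    --resource-group myRG \
    --vm-name myB1VM \
    --name myLargeDataDisk \
    --new \
    --size-gb 32767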

For more information on the new disk SKUs and the scalability targets, please see the section below. To achieve the expected disk IOPS and bandwidth, we recommend that you review our guidance on how to optimize your disk performance.

Premium SSD disks

Premium SSDs are ideal for enterprise applications like Dynamics AX, Dynamics CRM, and database workloads like SQL Server, Cassandra, and MongoDB that require consistent high performance and low latency.

Standard SSD disks

Standard SSDs are suitable for web servers, low IOPS application servers, big data, and enterprise applications that need consistent performance at lower IOPS levels.

Standard HDD disks

Standard HDDs based on magnetic drives offer the most cost-effective solution for Dev/Test and backup scenarios.

Note:

To achieve the target performance of the new Standard SSD and Standard HDD disk SKUs, you should use these recommended VM series.
P60/70/80 and E60/70/80 disks now have higher performance targets than the scale limits offered in preview. If you deployed these disk SKUs in preview, follow the guidelines to make sure your existing disks are updated for the higher GA performance.

Getting started

You can create new Managed Disks or extend your existing disks to larger sizes using the Azure portal, PowerShell, or the CLI today! The newly introduced sizes are generally available in all regions in the Azure public cloud, and support for national clouds, including Azure US Government and China 21Vianet, will be available in the coming weeks. Larger disk sizes are only supported on Managed Disks; if you are not using Managed Disks yet, convert from unmanaged to Managed Disks now and take advantage of these unique capabilities.
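For example, with the Azure CLI (resource names are placeholders; note that a disk must be unattached, or its VM deallocated, before it can be resized):

$ # Create a new Premium SSD managed disk at the new maximum size.
$ az disk create --resource-group myRG --name myLargeDisk \
    --sku Premium_LRS --size-gb 32767
$ # Grow an existing managed disk to the new maximum size.
$ az disk update --resource-group myRG --name myExistingDisk --size-gb 32767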

We have also added support for the following scenarios:

Azure portal experience for new disk sizes up to 32 TiB
Resize your existing Managed Disks up to 32 TiB using PowerShell and CLI

Our next step is to enable the preview of Azure Backup for larger disk sizes by the end of May 2019, providing you full coverage for enterprise backup scenarios. Similarly, Azure Site Recovery support for on-premises-to-Azure and Azure-to-Azure disaster recovery will be extended to all disk sizes soon.

Visit our service website to explore the Azure Disks portfolio. To learn about pricing, you can visit the Managed Disks pricing page.

General feedback

We look forward to hearing your feedback on the new disk sizes. Please email us at AzureDisks@microsoft.com.
Source: Azure

Building serverless microservices in Azure – sample architecture

Distributed applications take full advantage of living in the cloud to run globally, avoid bottlenecks, and always be available for users worldwide. This requires not only the right infrastructure to deploy into, but also support for the decoupled architecture that an application with these characteristics demands, versus the traditional monolithic approach. This is why most cloud-native applications use a microservices architecture that helps achieve this at global scale.

The benefits of using a microservices architecture are maximized when those applications are built in the cloud, with a wide range of managed services that make it easier to deliver on the microservices promise. With those services managing infrastructure and scaling for you, and improving critical processes like deployment and monitoring, you can maximize the amount of value delivered per cycle.

There are different patterns you might want to explore, and each of them fits a specific scenario. Today we're focusing on how building serverless microservices is a great fit for event-driven scenarios, and how you can use the Azure Serverless platform.

Building serverless, event-driven microservices

Taking an event-driven approach to building microservices-based applications, when it fits the scenario and the problem to solve, can help mitigate some problems of a more traditional approach:

Scaling compute resources: With the automated and flexible scaling based on actual demand that's provided by a serverless platform, you don't need to worry about how the scaling happens or how to handle it in the code of your application.
Operations dependency: When deploying a microservices-based solution, there is usually a strong dependency on the operations teams for allocating infrastructure resources for deployment and execution, both initially and with each incremental change. Taking a serverless approach by using fully managed services removes that necessity, since all the underlying infrastructure is managed for you by the platform.
Costs for hosting: With a traditional deployment, the cost is determined by how much you have to pay for each hosting node, and usually implies an over allocation of resources, resulting in increased hosting expenditure. With an event-driven approach, using services with consumption-based pricing models means the price is determined by the number of requests or operations, and the costs for hosting are better adjusted to the real usage of the solution (and are usually lower).
Services discovery: Managing service integration, communication, and interactions is a common problem in distributed applications. Since each service performs a very specific action according to the single-responsibility principle, more often than not a service will need to communicate with others to achieve its goal. The real challenge is keeping these connections as simple as possible while keeping the services totally decoupled. With an event-driven approach, you can take advantage of both of the following:

A centralized, unified way for services to communicate via events using a pub-sub model, fully managed with Azure Event Grid (see the sketch after this list).
An integrated programming model based on triggers to automatically respond to those events and bindings to connect and integrate different services seamlessly, such as the experience offered by Azure Functions and Logic Apps for event-driven compute.
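As a minimal sketch of that pub-sub wiring with the Azure CLI (the topic, subscription, and endpoint names are hypothetical), a service publishes to a custom Event Grid topic and a webhook-style handler subscribes to it:

$ # Create a custom Event Grid topic that services publish events to.
$ az eventgrid topic create \
    --resource-group myRG \
    --name rides-events \
    --location eastus
$ topicId=$(az eventgrid topic show --resource-group myRG \
    --name rides-events --query id --output tsv)
$ # Subscribe a webhook-style handler to events published on the topic.
$ az eventgrid event-subscription create \
    --name trip-processor \
    --source-resource-id $topicId \
    --endpoint https://myhandler.example.com/api/events

In an Azure Functions implementation, the handler would instead be wired up with an Event Grid trigger binding, and the subscription endpoint would point at the function.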

Sample architecture for serverless microservices

In the sample architecture for a rideshare application for a fictitious company named Relecloud, you can learn more about the architectural design of a microservices-based application. The sample uses fully managed services from the Azure Serverless platform to build the main building blocks of microservices solutions such as:

API Gateway: Using API Management to expose the endpoints of the backend services, so the client application can consume them securely. This also helps decouple the client side from the backend, since you can easily manage changes to where the services are actually hosted from the gateway, without affecting the client application.
Entry points: The public facing APIs that the client application will be using, powered by Azure Functions responding to HTTP requests.
Workflow orchestrator: Middle-tier service to interconnect the public facing APIs with the actual backend services that are tied to the data stores and other critical components, orchestrating the work of these services based on actions on the client side.
Async queue: Messaging service to handle services intercommunication and pass along information and data between the different services, represented by Azure Event Grid. By using an event-driven approach, we're also favoring service decoupling, since the information exchange follows a fire-and-forget approach, with services pushing events and handlers subscribing to those events for processing.
Backend services: The services that are directly operating with the data layer and other components of the solution, isolated from the rest and easily replaceable if needed (e.g. changing the type of database used to store data) without affecting the rest of the application and interactions.

Next steps

– Register for this webinar to learn how to develop microservices-based applications with a serverless architecture using fully managed Azure services.

– Browse the Relecloud Rideshare sample architecture to get step-by-step guidance on how to build the sample application and detailed information about the solution design.

– Sign up for an Azure free account if you don’t have one already, and start building serverless applications today.
Source: Azure