Controlling costs in Azure Data Explorer using down-sampling and aggregation

Azure Data Explorer (ADX) is an outstanding service for continuous ingestion and storage of high-velocity telemetry data from cloud services and IoT devices. Leveraging its first-rate performance for querying billions of records, the telemetry data can be further analyzed for various insights such as monitoring service health, production processes, and usage trends. Depending on data velocity and retention policy, data size can rapidly scale to petabytes and drive up storage costs. A common solution for storing large datasets over a long period of time is to store the data at varying resolutions: the most recent data is kept at maximum resolution, meaning all events are stored in raw format, while historic data is kept at reduced resolution, filtered and/or aggregated. This approach is often used in time series databases to control hot storage costs.

In this blog, I’ll use the GitHub events public dataset as the playground. To learn how to stream GitHub events into your own ADX cluster, see the blog “Exploring GitHub events with Azure Data Explorer.” I’ll describe how ADX users can take advantage of stored functions, the “.set-or-append” command, and the Microsoft Flow Azure Kusto connector to create and update tables with filtered, down-sampled, and aggregated data for controlling storage costs. The following are the steps I performed.

Create a function for down-sampling and aggregation

The ADX demo11 cluster contains a database named GitHub. Since 2016, all events from GHArchive have been ingested into the GithubEvent table and now total more than 1 billion records. Each GitHub event is represented as a single record with event-related information on the repository, author, comments, and more.

Initially, I created the stored function AggregateReposWeeklyActivity which counts the total number of events in every repository for a given week.

.create-or-alter function with (folder = "TimeSeries", docstring = "Aggregate Weekly Repos Activity")
AggregateReposWeeklyActivity(StartTime:datetime)
{
let PeriodStart = startofweek(StartTime);
let Period = 7d;
GithubEvent
| where CreatedAt between(PeriodStart .. Period)
| summarize EventCount=count() by RepoName = tostring(Repo.name), StartDate=startofweek(CreatedAt)
| extend EndDate=endofweek(StartDate)
| project StartDate, EndDate, RepoName, EventCount
}

I can now use this function to generate a down-sampled dataset of the weekly repository activity. For example, using the AggregateReposWeeklyActivity function for the first week of 2017 results in a dataset of 867,115 records.
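The call itself is a one-liner. The date literal below is my choice; any datetime inside that week works, since the function snaps it to startofweek:

AggregateReposWeeklyActivity(datetime(2017-01-01))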

Create a table with historic data using a Kusto query

Since the original dataset starts in 2016, I wrote a program that creates a table named ReposWeeklyActivity and backfills it with weekly aggregated data from the GithubEvent table. The program ingests the weekly aggregated datasets in parallel using the “.set-or-append” command. The first ingestion operation also creates the table that holds the aggregated data.
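Each iteration issues a command of this shape (shown here with the first weekly start date of the backfill, as used in the program below):

.set-or-append ReposWeeklyActivity <|
    AggregateReposWeeklyActivity(datetime(2016-01-03))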

Code sample:
using Kusto.Data.Common;
using Kusto.Data.Net.Client;
using System;
using System.Threading.Tasks;

namespace GitHubProcessing
{
    class Program
    {
        static void Main(string[] args)
        {
            var clusterUrl = "https://demo11.westus.kusto.windows.net:443;Initial Catalog=GitHub;Fed=True";
            using (var queryProvider = KustoClientFactory.CreateCslAdminProvider(clusterUrl))
            {
                // Backfill one week per iteration: 137 weeks starting Sunday 2016-01-03,
                // with up to 8 ingestion commands running in parallel.
                Parallel.For(
                    0,
                    137,
                    new ParallelOptions() { MaxDegreeOfParallelism = 8 },
                    (i) =>
                    {
                        var startDate = new DateTime(2016, 01, 03, 0, 0, 0, 0, DateTimeKind.Utc) + TimeSpan.FromDays(7 * i);
                        var startDateAsCsl = CslDateTimeLiteral.AsCslString(startDate);

                        // The first ".set-or-append" creates the target table;
                        // subsequent commands append that week's aggregates to it.
                        var command = $@"
.set-or-append ReposWeeklyActivity <|
AggregateReposWeeklyActivity({startDateAsCsl})";
                        queryProvider.ExecuteControlCommand(command);

                        Console.WriteLine($"Finished: start={startDate.ToUniversalTime()}");
                    });
            }
        }
    }
}

Once the backfill is complete, the ReposWeeklyActivity table contains approximately 153 million records.
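To sanity-check the backfill, a quick record count over the new table:

ReposWeeklyActivity
| count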

Configure weekly aggregation jobs using Microsoft Flow and Azure Kusto connector

Once the ReposWeeklyActivity table is created and filled with the historic data, we want to make sure it stays updated with new data appended every week. For that purpose, I created a flow in Microsoft Flow that leverages the Azure Kusto connector to ingest aggregated data on a weekly basis. The flow consists of two simple steps:

A weekly trigger in Microsoft Flow.
Use of “.set-or-append” to ingest the aggregated data from the past week, as sketched below.
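The control command such a flow runs each week looks roughly like the following; this is a sketch, and the exact schedule and time window are whatever you configure in the flow:

.set-or-append ReposWeeklyActivity <|
    AggregateReposWeeklyActivity(startofweek(ago(7d)))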

For additional information on using Microsoft Flow with Azure Data Explorer see the Azure Kusto Flow connector.

Start saving

To illustrate the cost-saving potential of down-sampling, I used the “.show table <table name> details” command to compare the size of the original GithubEvent table and the down-sampled ReposWeeklyActivity table.

.show table GithubEvent details
| project TableName, SizeOnDiskGB=TotalExtentSize/pow(1024,3), TotalRowCount

.show table ReposWeeklyActivity details
| project TableName, SizeOnDiskGB=TotalExtentSize/pow(1024,3), TotalRowCount

The results, summarized in the table below, show that for the same time frame the down-sampled data is approximately 7 times smaller in record count and approximately 165 times smaller in storage size.

                                              Original data             Down-sampled/aggregated data
Time span                                     2016-01-01 … 2018-09-26   2016-01-01 … 2018-09-26
Record count                                  1,048,961,967             153,234,107
Total size on disk (indexed and compressed)   725.2 GB                  4.38 GB

Converting this cost-saving potential into real savings can be done in various ways; a combination of the different methods is usually most effective in controlling costs.

Control cluster size and hot storage costs: Set different caching policies for the original data table and the down-sampled table. For example, 30 days of caching for the original data and two years for the down-sampled table. This configuration lets you enjoy ADX's first-rate performance for interactive exploration of raw data and analyze activity trends over years, all while controlling cluster size and hot storage costs.
Control cold storage costs: Set different retention policies for the original data table and down-sampled table. For example, 30 days retention for the original data and two years for the down-sampled table. This configuration allows you to explore the raw data and analyze activity trends over years while controlling cold storage costs. On a different note, this configuration is also common for meeting privacy requirements as the raw data might contain user-identifiable information and the aggregated data is usually anonymous.
Use the down-sampled table for analysis: Running queries on the down-sampled table for time series trend analysis consumes less CPU and memory. In the example below, I compare the resource consumption of a typical query that calculates the total weekly activity across all repositories. The query statistics show that analyzing weekly activity trends on the down-sampled dataset is approximately 17 times more efficient in CPU consumption and approximately 8 times more efficient in memory consumption.
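The original post doesn't show the exact query, so the following pair is my approximation of such a weekly rollup on each table:

// Weekly totals from the down-sampled table:
ReposWeeklyActivity
| summarize TotalEvents = sum(EventCount) by StartDate

// The equivalent rollup over the raw events:
GithubEvent
| summarize TotalEvents = count() by StartDate = startofweek(CreatedAt)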

Running this query on the original GithubEvent table consumes approximately 56 seconds of total CPU time and 176 MB of memory.

The same calculation on the aggregated ReposWeeklyActivity table consumes only about 3 seconds of total CPU time and 16 MB of memory.
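For reference, the caching and retention split described in the list above maps to control commands along these lines. This is a minimal sketch assuming 30 days for the raw table and roughly two years (730 days) for the aggregated table:

// Hot cache: keep 30 days of raw events and two years of aggregates in hot storage.
.alter table GithubEvent policy caching hot = 30d
.alter table ReposWeeklyActivity policy caching hot = 730d

// Retention: soft-delete raw events after 30 days, aggregates after two years.
.alter-merge table GithubEvent policy retention softdelete = 30d
.alter-merge table ReposWeeklyActivity policy retention softdelete = 730d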

Next steps

Azure Data Explorer leverages cloud elasticity to scale out to petabyte-size data, deliver exceptional performance, and handle high query workloads. In this blog, I’ve described how to implement down-sampling and aggregation to control the costs associated with large datasets.

To find out more about Azure Data Explorer you can:

Try Azure Data Explorer in preview now.
Find pricing information for Azure Data Explorer.
Access documentation for Azure Data Explorer.

Source: Azure

Azure IoT Hub Java SDK officially supports Android Things platform

Connectivity is often the first challenge in the Internet of Things (IoT) world, which is why we released the Azure IoT SDKs more than three years ago. The Azure IoT SDKs enable developers to build IoT applications that interact with IoT Hub and the IoT Hub Device Provisioning Service. The SDKs cover the most popular languages in IoT development, including C, .NET, Java, Python, and Node.js, as well as popular platforms like Windows, Linux, OSX, and MBED. Since April 2018, we have added official support for iOS and Android to enable mobile IoT scenarios.

Today, we are happy to share that Azure IoT Hub Java SDK will officially support the Android Things platform. This announcement showcases our commitment to enable greater choice and flexibility in IoT deployments. Developers can leverage the benefits of the Android Things operating system on the device side, while using Azure IoT Hub as the central message hub that scales to millions of simultaneously connected devices.

All features in the Java SDK will be available on the Android Things platform, including Azure IoT Hub features we support and SDK-specific features such as retry policy for network reliability. In addition, the Android Things platform will be tested with every release. Our test suites include unit tests, integration tests, and end-to-end tests, all available on GitHub. We also publish the exact platform version we are testing on. 

Learn more about building applications for IoT Hub on Android Things by visiting the resources below:

A tutorial on building application for Android Things using Azure IoT Java SDK
A device client sample using Azure IoT Java SDK on Android Things
A repository of samples using Azure IoT Java SDK on GitHub
Source code for the Java SDK is available on GitHub

Source: Azure

Actuating mobility in the enterprise with new Azure Maps services and SDKs

The mobility space is at the forefront of the most complex challenges faced by cities and urban areas today. The movement of people and things is as much a driver of opportunity as it is an agent of chaos, aggravating existing challenges of traffic, pollution, and unbalanced livelihoods. Today, Azure Maps is continuing to expand the offerings of our platform, introducing a new set of capabilities in the form of SDKs and cloud-based services, to enable enterprises, partners, and cities to build the solutions that will help visualize, analyze, and optimize these mobility challenges.

The services we’re introducing are designed exclusively for the needs of the modern enterprise customer – powerful, real-time analytics and seamless cross-screen experiences, fortified by robust security services.

First, we’re officially moving the following services from public preview to general availability: Route Range (Isochrones), Get Search Polygon, and Satellite and Hybrid Imagery. Furthermore, we’re introducing multiple new services. We’re enhancing our map canvas by introducing a stunning set of natural earth map tiles and an image compositor to make interaction with our maps more aesthetic, useful, and powerful. We’re also introducing Spatial Operations services that offer powerful analytics used by mobility applications and other industries today, as well as a new Android SDK and a Web SDK that equip Azure customers with the tools necessary to make smarter, faster, more informed decisions. And because privacy and security are top of mind, Azure Maps is now natively integrated with Azure Active Directory, making access to our services more secure while enabling roles and restrictions to our customers.

These services will provide Azure customers with the ability to offload map data processing and hosting costs all while getting the benefits of a rich set of maps and mapping services, securely with the fastest map data refresh rate available today. This refresh rate of data and services is bolstered by the recently announced partnership with TomTom who has committed to moving their map-making workloads to the Azure cloud significantly shortening the time from impetus to end user.

Android SDK (Preview)

While the Azure Maps Web SDK can work within a web control on mobile platforms, many developers prefer native support to interoperate with other native components and have these capabilities in native code… Java != JavaScript. In support of our customers who rely on applications running on Android, Azure Maps is distributing an Android SDK, complete with rendering maps and traffic, drawing, event handling, and using the variety of our map canvases. You can also connect to other Azure Maps services such as Search and Routing though the Azure Maps services APIs.

Spatial Operations (Preview)

Data analysis is central to the Internet of Things (IoT). Azure Maps Spatial Operations will take location information and analyze it on the fly to help inform our customers of ongoing events happening in time and space, enabling near real-time analysis and predictive modeling of events. Today Spatial Operations includes the following services:

Geofencing. A geofence is an “invisible fence” around a particular area. These “fences” exist in coordinates in the shape of customizable polygons and can be associated with temporal constraints so that fences are evaluated only when relevant. Furthermore, you can store a geofence in the Azure Maps Data Service (more on that below). With Azure Event Grid integration, you can create notifications whenever the position of an object changes with respect to a geofence – including entry, exit, or changing proximity to a geofence.

This has multiple applications. In transportation and logistics, it can be used to create alerts when cargo ships have arrived at waypoints along routes – critical for creating delivery alerts as well as for anticipating incidents of piracy. For drones, a geofence could enforce the limitation of where a drone is permitted to fly.

However, geofencing isn’t limited to transportation. In agriculture, it can enable real-time notifications when a herd has left a field. In construction, you can receive a notification when expensive equipment leaves a construction site that it shouldn’t be leaving or when it is parked in an area that may cause damage to equipment. It can also be used to warn site visitors when entering hazardous zones on sites, as implemented by Scottish IoT technology firm, Tagdat. In services, vending machines in college dorms often disappear. The servicing company can be notified if the machine leaves the premises. Geofencing is an amazingly powerful service to provide notifications and analytics when objects are moving. And, even when objects aren’t moving and should be!

Another example of a customer that is using this service is GovQA who state, “We use geofencing to identify when a requester is trying to submit a request which is outside the customer's pre-defined limits. The limits are defined by the city or county during the set up of the system. This allows the city to correctly handle the request and communicate with the requester appropriately using rules and configurations within GovQA."

Buffer. Build an area around points, lines, and polygons based on a given distance. Define the area in proximity of powerlines that should be kept clear from vegetation or create route buffers in fleet management for managing route deviations.

Closest Point. Returns the closest points between a base point and a given collection of points. It can be used to quickly identify the closest stores, charging stations, or in mobility scenarios, it could be used to identify the closest devices.

Great-circle distance. Returns the great-circle, or shortest, distance between two points on the surface of a sphere. In the context of drone delivery services, this API can be used to calculate the distance between an origin and a destination as the crow flies, accounting for the curvature of the earth, so that an accurate delivery time estimate can be used to optimize operations.
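For reference, the standard haversine form of the great-circle distance (general spherical geometry, not anything specific to the Azure Maps API) is:

d = 2r · arcsin( √( sin²(Δφ/2) + cos φ₁ · cos φ₂ · sin²(Δλ/2) ) )

where φ₁ and φ₂ are the latitudes of the two points, Δφ and Δλ are their differences in latitude and longitude, and r is the sphere's radius.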

Point in Polygon. Returns a Boolean indicating whether the location is inside a given set of Polygon and MultiPolygon geometries. For example, the point in polygon API can be used to determine whether a home for sale is in the preferred area of customers.

Data Service (Preview)

Data is an imperative for maps, and bringing customer data closer to the Azure Maps service will reduce latency, increase productivity, and create powerful, new scenarios to light up in your applications. As such, Azure Maps now allows customers to upload and store up to 50 MB of geospatial data for use with other Azure Maps services, such as geofencing or image composition.

Azure Active Directory Integration (Preview)

Security and role-based access control have been of paramount concern for the modern enterprise. As such, we’re proud to announce that Azure Active Directory (AD) is now a core capability of Azure Maps. Use Azure AD to protect your customers’ information and implement secure access by providing role-based access control (RBAC).  Whether you have public applications or applications requiring a login, Azure AD and Azure Maps will support your security needs by authenticating your applications and Azure AD user(s). Additionally, this Azure AD implementation supports managed identities for Azure resources which provide Azure services (Azure App service, Azure Functions, Virtual Machines, etc.) with an automatically managed identity that can be authorized for access to Azure Maps services. 

Azure Maps Web SDK 2.0

Today we’re announcing a new module for accessing Azure Maps services to use in conjunction with the Azure Map Control. The new Services Module allows you to work natively and directly with the Azure Maps services. This new module, plus the aforementioned adoption of Azure Active Directory, warranted a new version encapsulating them into a single Web SDK. Henceforth, we’ll package our services for web developers into the Azure Maps Web SDK 2.0. Note that the Azure Map Control 1.x will continue to be operational; however, we will innovate on 2.0 moving forward. The upgrade path from 1.x to 2.0 is as simple as changing the version number! As a part of the new Azure Maps Web SDK 2.0 we’re also including some new client features:

Azure Active Directory (AAD). Azure Maps now natively supports Azure Active Directory to keep your access to Azure Maps secure. With native AAD integration, ensure your access is protected when your applications call Azure Maps.

Services Module. The new Services Module adds support for AAD and a much cleaner API interface for accessing Azure Maps services. It works with the Web SDK and also in NodeJS. Being part of the Azure family of products, the Azure Maps Services Module was designed to align with an initiative to unify Azure SDKs and was required in order to add support for AAD.

Stroke gradients. There are times in location services development when you’d want to have gradient colors throughout the stroke of a line. Azure Maps Web SDK 2.0 now natively supports the ability to fill a line with a gradient of colors to show transition from one segment of a line to the next. As an example, these gradient lines can represent changes over time and distance, or different temperatures across a connected line of objects.

Shaded Relief map style. Below you’ll read about the new Shaded Relief Map Style. This beautiful, new map style is available immediately in the Azure Maps Web SDK 2.0.

Polygon Fill Patterns. Representations of polygons on a map can be done in a plethora of ways. In many scenarios, there will be a need to create polygons on the map. With the Azure Maps Web SDK 2.0 there is native control for shapes, borders, and fills. Now, we natively support patterns as a fill in addition to a single color fill. Patterns provide a unique way to highlight a specific area and really make it stand out, especially if that area is surrounded by other color-shaded polygons. As an example, patterns can be used to show an area in transition, an area significantly different from other areas (in finances, population, or land use, for example), or areas highlighting facets of mobility such as no-fly zones or areas where permits are required.

Shaded Relief map style

Azure Maps comes complete with a few map styles including road, dark gray, night, and satellite/hybrid. We’re adding a new map style – Shaded Relief – to complement the existing map styles. Shaded Relief is just that – an elegantly designed map canvas complete with the contours of the Earth. The Azure Maps Web SDK comes complete with the Shaded Relief canvas, and all functions work seamlessly atop it.

Image composition

Azure Maps is introducing a new image compositor that allows customers to render raster map images annotated with points, lines, and polygons. In many circumstances you can submit your request along with your respective point data to render a map image. For more complex implementations, you’ll want to use the map image compositor in conjunction with data stored in the aforementioned Azure Maps Data Service.

We always appreciate feedback from our customers. Feel free to comment below or post questions to Stack Overflow or our Azure Maps Feedback Forums.
Source: Azure

AWS Elemental MediaLive now supports resource tagging

Starting today, you can add tags to your AWS Elemental MediaLive resources. With MediaLive tags, you can categorize your inputs, channels, and input security group resources in various ways, for example by cost center or owner. This simplifies cost allocation for live channels.
Source: aws.amazon.com

Get started quickly using templates in Azure Data Factory

Cloud data integration helps organizations integrate data of various forms and unify complex processes in a hybrid data environment. Different organizations often have similar data integration needs and repeated business processes, and their data engineers and developers want to get started quickly with building data integration solutions instead of building the same workflows over and over. Today, we are announcing support for templates in Azure Data Factory (ADF) to help you get started quickly with building data factory pipelines, improve developer productivity, and reduce development time for repeat processes. The template feature provides a ‘Template gallery’ containing use-case-based templates, data movement templates, SSIS templates, and transformation templates that you can use to get hands-on with building your data factory pipelines.

Simply click Create pipeline from template on the Overview page or click +-> Pipeline from template on the Author page in your data factory UX to get started.

Select any template from the gallery and provide the necessary inputs to use the template. You can also read a detailed description of the template or visualize the end-to-end data factory pipeline.

You can also create new connections to your data store or compute while providing the template inputs.

Once you click Use this template, you are taken to the template validation output. This guides you to fill in the required properties needed to publish and run the pipeline created from the template.

In addition to using out-of-the-box templates from the Template gallery, you might want to save your existing pipelines as templates, for example when different business units within your organization want to use the same pipeline with different inputs. The templates feature in data factory lets you do this as well.

Saving your pipeline as a template requires you to enable Git integration (Azure DevOps Git or GitHub) in your data factory.

The template is then saved in your Git repo under the templates folder.

The template is now visible to anyone who has access to your Git repo. It can be seen in the Templates section of the resource explorer.

You can also see the template under the My templates section in the template gallery.

Saving the template to the Git repository generates two files: an ARM template and a manifest file. The ARM template contains all information about your data factory pipeline, including pipeline activities, linked services, and datasets. The manifest file contains the template description, template author, template tile icons, and other metadata about the template.

All the ARM template and manifest files for the out-of-the-box official templates provided in the Template gallery can be seen in the official data factory GitHub location. In the future, we will be working with our partners on a certification process wherein anyone can submit a template to be enabled in the Template gallery. The data factory team will certify the pull request corresponding to the submitted template and make it available in the Template gallery.

Find more information about the templates feature in data factory.

Our goal is to continue adding features to improve the usability of Data Factory tools. Get started building pipelines easily and quickly using Azure Data Factory. If you have any feature requests or want to provide feedback, please visit the Azure Data Factory forum.
Source: Azure

Azure.Source – Volume 69

Now in preview

Account failover now in public preview for Azure Storage

If you want to control storage account failover yourself, so that you can determine when write access to the storage account is required and the secondary's replication state is understood, account failover for geo-redundant storage (GRS) enabled storage accounts is now available in preview. If the primary region for your geo-redundant storage account becomes unavailable for an extended period of time, you can force an account failover. As is the case with most previews, account failover should not be used with production workloads. There is no production SLA until the feature becomes generally available.

Azure Stream Analytics now supports Azure SQL Database as reference data input

Reference data is a static or slowly changing dataset that you can correlate with real-time data streams to augment the data. Stream Analytics leverages versioning of reference data to augment streaming data with the reference data that was valid at the time the event was generated. You can try using Azure SQL Database as a source of reference data input to your Stream Analytics job today. This feature is available for public preview in all Azure regions, and is also available in the latest release of Stream Analytics tools for Visual Studio.

Also in preview

Public preview: Azure Service Bus for Go
Public preview: Azure Service Bus for Python
Azure Database for PostgreSQL read replica preview
MongoDB to Azure Cosmos DB online migration is in preview

Now generally available

Individually great, collectively unmatched: Announcing updates to 3 great Azure Data Services

Azure Data Lake Storage Gen2 and Azure Data Explorer are now generally available, while Azure Data Factory Mapping Data Flow is available in preview. Azure Data Lake Storage (ADLS) combines the scalability, cost effectiveness, security model, and rich capabilities of Azure Blob Storage with a high-performance file system that is built for analytics and is compatible with the Hadoop Distributed File System. Azure Data Explorer (ADX) is a fast, fully managed data analytics service for real-time analysis on large volumes of streaming data. ADX is capable of querying 1 billion records in under a second with no modification of the data or metadata required. ADX also includes native connectors to Azure Data Lake Storage, Azure SQL Data Warehouse, and Power BI and comes with an intuitive query language so that customers can get insights in minutes. With Mapping Data Flow in Azure Data Factory, customers can visually design, build, and manage data transformation processes without learning Spark or having a deep understanding of their distributed infrastructure.

Azure Data Lake Storage Gen2 is now generally available
General Availability: Azure Data Explorer

An overview of Azure Data Explorer (ADX) | Azure Friday

Manoj Raheja joins Lara Rubbelke to demonstrate Azure Data Explorer (ADX) and provide an overview of the service from provisioning to querying. ADX is a fast, fully managed data analytics service for real-time analysis on large volumes of streaming data. It brings together a highly performant and scalable cloud analytics service with an intuitive query language to deliver near-instant insights.

Code-free data transformation at scale using Azure Data Factory | Azure Friday

Learn about the new code-free visual data transformation capabilities in Azure Data Factory as Gaurav Malhotra joins Lara Rubbelke to demonstrate how you can visually design, build, and manage data transformation processes without learning Spark or having a deep understanding of the distributed infrastructure.

Azure Cost Management now generally available for Enterprise Agreements and more!

Azure Cost Management is now generally available for all our Enterprise Agreement (EA) customers from within the Azure portal. Azure Cost Management enables you to monitor all your spend through easy-to-use dashboards, create budgets, and optimize your cloud spend. This post also announces the public preview for web direct Pay-As-You-Go customers and the Azure Government cloud. Azure Cost Management is available for free to all customers and partners to manage Azure costs. The Cloudyn portal will continue to be available to customers while we integrate all relevant functionality into native Azure Cost Management.

Azure Cost Management preview for Azure Government
Azure Cost Management preview for pay-as-you-go

Microsoft Healthcare Bot brings conversational AI to healthcare

The Microsoft Healthcare Bot is a white-labeled cloud service that powers conversational AI for healthcare. It’s designed to empower healthcare organizations to build and deploy compliant, AI-powered virtual health assistants and chatbots that help them put more information in the hands of their users, enable self-service, drive better outcomes, and reduce costs. The Microsoft Healthcare Bot is now available in the Azure Marketplace.

Also generally available

QnA Maker simplifies knowledge base management for your Q&A bot
Azure Automation: Update Azure Modules runbook is open source
Language improvements in Azure Stream Analytics are now generally available
Blob output partitioning in Azure Stream Analytics
TomTom is expanding its partnership with Microsoft
New Lsv2 Azure virtual machines (VMs) for big data, databases, and data warehousing are now generally available
Logic Apps is now available in US Gov Arizona region
SQL Server Migration Assistant support for Azure SQL Database Managed Instance
Support for Amazon RDS SQL Server to Azure SQL Database Managed Instance online migrations
M-series virtual machines (VMs) are now available in Australia Central 2 region
Azure DevTest Labs: Configure resource group control for your lab
Azure HDInsight available in US DoD East, US DoD Central, US Gov Texas
Azure Log Analytics is available in West US 2

News and updates

Analytics in Azure is up to 14x faster and costs 94% less than other cloud providers. Why go anywhere else?

Julia White, Corporate Vice President, Microsoft Azure, covers how Azure provides the most comprehensive set of analytics services, from data ingestion to storage to data warehousing to machine learning and BI. Each of these services has been finely tuned to provide industry-leading performance, security, and ease of use, at unmatched value. A recent study by GigaOm found that Azure SQL Data Warehouse outperforms the competition by up to 14x while costing up to 94% less.

Configure resource group control for your Azure DevTest Lab

You now have the option to configure all your lab virtual machines (VMs) to be created in a single resource group. Learn how you can improve governance of your development and test environments by using Azure policies that you can apply at the resource group level. This enables you to use a script to specify either a new or an existing resource group within your Azure subscription in which to create all your lab VMs. ARM environments created in your lab will continue to remain in their own resource groups and will not be affected by any option you select while working with this API.

Reserved instances now applicable to classic VMs, cloud services, and Dev/Test subscriptions

Two new Azure Reserved VM Instance (RI) features are now available that can provide you with additional savings and purchase controls. Classic VMs and Cloud Services users can now benefit from the RI discounts. In addition, Enterprise Dev/Test and Pay-As-You-Go Dev/Test subscriptions can now benefit from the RI discounts.

New connectors added to Azure Data Factory empowering richer insights

Azure Data Factory (ADF) is a fully managed data integration service for analytic workloads in Azure that empowers you to copy data from more than 80 data sources with a simple drag-and-drop experience. With its flexible control flow, rich monitoring, and CI/CD capabilities, you can operationalize and manage the ETL/ELT flows to meet your SLAs. A set of eight new Azure Data Factory connectors are now available that enable more scenarios and possibilities for your analytic workloads, including the ability to ingest data from Google Cloud Storage and Amazon S3.

Intelligent Edge support grows – Azure IoT Edge now available on virtual machines

Azure IoT Edge enables you to bring cloud intelligence to the edge and act immediately on real-time data. Azure IoT Edge already supports a variety of Linux and Windows operating systems as well as a spectrum of hardware from devices smaller than a Raspberry Pi to servers. Supporting IoT Edge in VMware vSphere offers you even more choice if you want to run AI on infrastructure you already own. VMware simplified the deployment process of Azure IoT Edge to VMs using VMware vSphere. Additionally, vSphere 6.7 and later provide passthrough support for Trusted Platform Module (TPM), allowing Azure IoT Edge to maintain its industry leading security framework by leveraging the hardware root of trust.

Completers in Azure PowerShell

Since version 3.0, PowerShell has supported applying argument completers to cmdlet parameters. We added argument completers to the Azure PowerShell modules so that you can select valid parameter values without having to look them up; the completers make the required calls to Azure to obtain them. Argument completers added include: Location, Resource Group Name, Resource Name, and Resource Id.

Simplify Always On availability group deployments on Azure VM with SQL VM CLI

Always On availability groups (AG) provide high availability and disaster recovery capabilities to your SQL Server database, whether on-premises, in the cloud, or a combination of both. Deploying an Always On Availability Group configuration for SQL Server on Azure VM is now possible with a few simple steps using the expanded capabilities enabled by SQL VM resource provider and Azure SQL VM CLI.

Help us shape new Azure migration capabilities: Sign up for early access!

We are enhancing Azure Migrate to deliver a unified and extensible migration experience, with the goal of enabling customers and partners to plan, execute, and track their end-to-end migration journey using Azure Migrate: an integrated experience to discover, assess, and migrate servers to Azure. Sign up for the private preview to try the enhanced assessment and migration capabilities.

Microsoft Azure portal February 2019 update

In February, the Azure portal will bring you updates to several compute (IaaS) resources, the ability to export contents of lists of resources and resource groups as CSV files, an improvement to the layout of essential properties on overview pages, enhancements to the experience on recovery services pages, and expansions of setting options in Microsoft Intune.

Azure Monitor January 2019 updates

Azure Monitor now integrates the capabilities of Log Analytics and Application Insights for powerful, end-to-end monitoring of your applications. Learn what was added throughout the month of January to Application Insights, Log Analytics, Azure Monitor Workbooks, and Azure Metrics. In addition, Workbooks are now available in Azure Monitor for VMs.

Lighting up healthcare data with FHIR: Announcing the Azure API for FHIR

The healthcare industry is rapidly adopting the emerging standard HL7 FHIR®, or Fast Healthcare Interoperability Resources. This robust, extensible data model standardizes semantics and data exchange so all systems using FHIR can work together. Azure API for FHIR® enables rapid exchange of data in the FHIR format and is backed by a managed Platform-as-a Service (PaaS) offering in the cloud. Simplify data management with a single, consistent solution for protected health information.

Azure DevOps Projects supporting Azure Cosmos DB and Azure Functions

In the latest deployment of Azure DevOps Projects now available to all customers, we have added support for Azure Cosmos DB and Azure Functions as target destinations for your application. This builds on the existing Azure App Service, Azure SQL Database, and Azure Kubernetes Service (AKS) support.

Find out when your virtual machine hardware is degraded with Scheduled Events

Scheduled Events are now triggered when Azure predicts that hardware issues will require a redeployment to healthy hardware in the near future, and they provide a time window within which Azure will redeploy the VMs if live migration is not possible. You can initiate the redeployment of your VMs ahead of Azure doing it automatically.

Additional updates

Data Migration Assistant support for target readiness assessment for Azure SQL DB Managed Instance
NuGet, npm, and other Artifacts tasks support proxies – Sprint 147 Update
Ev3 and ESv3 series VMs are available in Azure HDInsight
Azure StorSimple 5000/7000 series will no longer be supported starting July 9, 2019

Technical content

Processing trillions of events per day with Apache Kafka on Azure

The Siphon team shares their experiences and learnings from running one of the world's largest Kafka deployments. Besides underlying infrastructure considerations, they discuss several tunable Kafka broker and client configurations that affect message throughput, latency, and durability. After running hundreds of experiments, they standardized the Kafka configurations required to achieve maximum utilization for various production use cases. They also explain how to tune a Kafka cluster to configure producers, brokers, and consumers for the best possible performance.

Best practices to consider before deploying a network virtual appliance

A network virtual appliance (NVA) is a virtual appliance primarily focused on network functions virtualization. A typical network virtual appliance involves various layer 4 to layer 7 functions like firewall, WAN optimizer, application delivery controllers, routers, load balancers, IDS/IPS, proxies, SD-WAN edge, and more. Common best practices include: Azure-accelerated networking support, multi-NIC support, using Azure Load Balancer with a high availability (HA) ports load-balancing rule, and support for Virtual Machine Scale Sets (VMSS).

Build your own deep learning models on Azure Data Science Virtual Machines

The Practical Deep Learning for Coders 2019 course from fast.ai helps software developers start building their own state-of-the-art deep learning models. Developers who complete this course will become proficient in deep learning techniques in multiple domains including computer vision, natural language processing, recommender algorithms, and tabular data. Learn how you can run this course using the Azure Data Science Virtual Machines (DSVM).

Performance best practices for using Azure Database for PostgreSQL – Connection Pooling

This blog is a continuation of a series of blog posts to share best practices for improving performance and scale when using Azure Database for PostgreSQL service. This post focuses on the benefits of using connection pooling and provides recommendations to improve connection resiliency, performance, and scalability of applications running on Azure Database for PostgreSQL.

Azure Event Grid: The Whole Story

As promised, Jeremy Likness, a Microsoft Cloud Advocate, takes a thorough look at the serverless backbone for all your event-driven computing needs: Azure Event Grid, a single service for managing routing of all events from any source to any destination.

Pentesting Azure — Thoughts on Security in Cloud Computing

Tanya Janca, a Microsoft Cloud Advocate, shares a list of her thoughts on penetration testing (pentesting) Azure applications as she sets out to read Pentesting Azure Applications by Matt Burrough. She promises a future post once she finishes reading the book.

Azure shows

Episode 265 – Azure DevOps Server | The Azure Podcast

Cynthia and Evan talk to Jamie Cool, Director of Program Management at Microsoft, who gives us all the details and potential use-cases for the Azure DevOps Server in your organization.

An overview of Azure Blueprints | Azure Friday

Alex Frankel joins Scott Hanselman to discuss Azure Blueprints. Environment creation can be a long and error prone process. Azure Blueprints helps you deploy and update cloud environments in a repeatable manner using composable artifacts such as policies, role-based access control, and Azure Resource Manager templates.

Enhanced monitoring capabilities and tags/annotations in Azure Data Factory | Azure Friday

Gaurav Malhotra and Scott Hanselman explore tagging support and enhanced monitoring capabilities, including dashboards and improved debugging support in Azure Data Factory. Data integration is complex and the ability to monitor your data factory pipelines is a key requirement for dev ops personnel inside an enterprise. Now, you can tag/annotate your data factory pipelines to monitor all the pipeline executions with that particular tag. In addition, Data Factory visual tools provide dashboards to monitor your pipelines and an ability to monitor your pipeline execution by the Integration Runtime (IR) upon which the activities execute.

Logic Apps Connector to Ethereum Blockchain Networks | Block Talk

This episode provides an overview of how to use our serverless Ethereum Connector to transform smart contracts into an automated, visual workflow using the rich Azure Logic Apps Connectors ecosystem. We introduce the core concepts of Logic Apps and demonstrate a sample workflow triggered by a Solidity event, including how to read smart contract properties and write them to Azure Blob storage.

Azure IoT Device Agent for Windows | Internet of Things Show

Customers across industries, whether in an industrial setting or retail environment, are looking for ways to remotely provision and manage their IoT devices. Direct device access may not always be feasible when IoT devices are out in the field or on the factory floor. Microsoft Azure IoT Device Agent enables operators to configure, monitor and manage their devices remotely from their Azure dashboard. In this episode of the #IoTShow you will get an overview of Microsoft Azure IoT Device Agent with a demo.

Visual Studio for Mac: Publish to Azure | Visual Studio Toolbox

In this video, Cody Beyer will demonstrate how to log in and publish a web project to Azure. Join him and learn how to get the most out of Visual Studio for Mac by combining it with the power of Azure.

How to manage your Kubernetes clusters | Kubernetes Best Practices Series

Learn best practices on how to manage your Kubernetes clusters from field experts in this episode of the Kubernetes Best Practices Series. In this intermediate-level deep dive, you will learn about cluster management and multi-tenancy in Kubernetes.

How to add the Azure Cloud Shell to Visual Studio Code | Azure Tips and Tricks

In this edition of Azure Tips and Tricks, learn how to add the Azure Cloud Shell to Visual Studio Code. To add the Azure Cloud Shell, make sure you have the “Azure Account” extension installed in Visual Studio Code.

Overview of VS Code Extensions for Azure Government

In this episode of the Azure Government video series, Steve Michelotti sits down with Yujin Hong, Program Manager on the Azure Government Engineering team, to discuss many of the incredible VS Code extensions for Azure. VS Code has quickly become the most popular editor in the world, and there are numerous reasons for this, but one of the key reasons is VS Code’s extensibility. There are numerous VS Code extensions available for Azure, and now these same extensions can be utilized for Azure Government! In this demo-heavy video, Yujin shows the unified authentication experience that enables all these extensions to seamlessly authenticate to Azure Government. She then walks through several demos that show how easy these extensions make it for developers to work with Storage, App Service, Cosmos DB, and Azure Functions in Azure Government. If you’re a developer who works with Azure Government, this video is for you!

Modern Data Warehouse overview | Azure SQL Data Warehouse

How do you think about building out your data pipeline in Azure? Discover how the Modern Data Warehouse solution pattern can modernize your data infrastructure in the cloud and enable new business scenarios. This is the first episode of an 8-part series on Azure SQL Data Warehouse.

Paul Stovell on Octopus Deploy – Episode 22 | The Azure DevOps Podcast

Paul Stovell, the founder and CEO of Octopus Deploy, joins the podcast today. Paul is an expert on all things automated deployment and Cloud operations. He started Octopus Deploy back in 2011, but prior to that, he worked as a consultant for about five years. Octopus Deploy is a pretty major player in the market. Their mission? To do automated deployments really, really well. Today, it helps over 20,000 customers automate their deployments, and employs 40 brilliant people. It can be integrated with Azure DevOps services and many other build services. On this week’s episode, Paul talks about his career journey and what led him to create Octopus Deploy; his accomplishments, goals, and visions for Octopus Deploy; which build servers integrate best with Octopus Deploy; his tips and tricks for how to best utilize it; and his vision for the future of DevOps.

Events

Cloud Commercial Communities webinar and podcast newsletter–February 2019

In this Cloud Commercial Communities monthly webinar and podcast update, get links to the upcoming webinars and podcasts for February as well as to those from January. Each month the team focuses on core programs, updates, trends, and technologies that Microsoft partners and customers need to know to increase success using Azure and Dynamics. While much of the content is available for on-demand consumption, attending live webinars enables you to participate in Q&A with the webinar hosts.

Customers, partners, and industries

Modernizing payment management for online merchants

Learn about Guru, which is Newgen's fully-integrated portal that enables merchants to have a complete view of their payments, generate reports, capture/void transactions, and perform refunds. Guru is a fully cloud-based solution hosted completely on Microsoft Azure. It is a fully-managed SaaS solution that comes as a value addition with Newgen's Payment Gateway—a cutting edge payment technology for merchants.

Azure IoT drives next-wave innovation in infrastructure and energy

Last week at the DistribuTECH conference in New Orleans, Azure IoT partners showcased new solutions that bring the next level of “smart” to our grids. We invited eight partners to the Microsoft booth to demonstrate their approach to modernizing infrastructure, and how Azure IoT dramatically accelerates time to results. Learn how each partner showed new use cases for utilities, infrastructure, and cities that take advantage of cloud, AI, and IoT. With solutions that take full advantage of the intelligent cloud and intelligent edge, we continue to demonstrate how cloud, IoT, and AI have the power to drastically transform every industry. Smart grids will drive efficiencies to power and utility companies, grid operators, and energy prosumers.

Advancing tactical edge scenarios with Dell EMC Tactical Microsoft Azure Stack and Azure Data Box family

Microsoft, working with partners like Dell EMC, shared new capabilities that continue to deliver the power of the intelligent cloud and intelligent edge to government customers and their partners. Last year, we announced Azure Stack availability for Azure Government customers. With Azure Stack for Azure Government, agencies can efficiently modernize their on-premises legacy applications that are not ready or a fit for the public cloud due to cyber defense concerns, regulations, or other requirements. Data Box products help agencies to migrate large amounts of data, for example backup, archive or big data analytics, to Azure when they are limited by time, network availability, or costs.

Investing in our partners’ success

While Microsoft has long been a partner-oriented organization, some things are different with cloud. Specifically, partners need Microsoft to be more than just a great technology provider; you need us to be a trusted business partner. This requires long-term commitment and the ability to continually adapt and innovate as the market shifts. This has been, and continues to be, our commitment. Our partnership philosophy is grounded in the foundation that we can only deliver on our mission if there is a strong and successful ecosystem around us. Julia White, Corporate Vice President, Microsoft Azure, highlights our key partner-oriented investments and some of the resources to help our partners successfully grow their businesses.

Azure Marketplace new offers – Volume 31

The Azure Marketplace is the premier destination for all your software needs – certified and optimized to run on Azure. Find, try, purchase, and provision applications & services from hundreds of leading software providers. You can also connect with Gold and Silver Microsoft Cloud Competency partners to help your adoption of Azure. In the first half of January we published 67 new offers.

Colfax amplifies the power of its ESAB product portfolio with IoT

With the evolution of the Internet of Things (IoT), Colfax saw an opportunity to transform its businesses. What was unique about Colfax’s IoT initiative – named Data Driven Advantage (DDA) – was their vision of enabling customers to leverage the extensive ESAB portfolio. They selected PTC Thingworx for Azure and the Microsoft Azure IoT platform. With ESAB Digital Solutions, customers now have data to understand how processes, labor, and material contribute to the cost of each part.

This Week in Azure – 8 February 2019 | A Cloud Guru – This Week in Azure

This time on Azure This Week, Lars covers the general availability of Lsv2-series VMs, a new version of the Microsoft Threat Modelling Tool, the return of Azure trivia every Monday, and a chance to meet the team from A Cloud Guru next week at the Ignite Tour in Sydney.

Source: Azure

Amazon Elasticsearch Service now supports three Availability Zone deployments

With Amazon Elasticsearch Service, you can now deploy your instances across three Availability Zones (AZs) for better availability of your domains. If you enable replicas for your Elasticsearch indices, Amazon Elasticsearch Service distributes the primary and replica shards across nodes in different AZs to maximize availability. If you have configured dedicated master nodes as part of a multi-AZ deployment, they are automatically placed into three AZs to ensure that your cluster can elect a new master even in the rare event of an AZ disruption.
Source: aws.amazon.com