Stackdriver usage and costs: a guide to understanding and optimizing spending

Google Stackdriver is a cloud-based managed services platform designed to give you visibility into app and infrastructure services. Stackdriver’s monitoring, logging, and APM tools make it easy to navigate between data sources to view performance details and find the root causes of any issues.

One of the benefits of cloud-based, managed services is that you pay only for what you use. While this usage-based pricing model can provide a cost benefit compared to standard software licensing, it can sometimes be challenging to optimize and control costs, particularly if you’re new to the cloud. We’ve worked across our organization here at Google to develop a Stackdriver cost optimization solution guide to help you understand and optimize your Stackdriver usage and costs.

Stackdriver, like other Google Cloud Platform (GCP) services, provides detailed usage information and granular billing. You can use these reporting features to understand your product usage and the resulting billing; the reporting view shows Stackdriver pulling data from multiple sources into one dashboard, along with sizing information.

Each of the products in the Stackdriver suite provides configuration capabilities that you can use to adjust the volume of metrics, logs, or traces ingested into the platform, which can help you save on usage costs. Here are a few ways to configure usage volumes for Logging, Monitoring, and Trace:

Logging: You can employ exclusion filters and sampling for high-volume logs to reduce your log ingestion volume (see the sketch below).
Monitoring: You can carefully design or redesign your metric labels to prevent high-cardinality labels from creating a large number of time series.
Trace: You can use sampling and span quotas to stay within the desired volume of traces.

The Stackdriver cost optimization solution guide describes these and other optimization strategies, explains what generates Stackdriver costs, and shows how to identify your usage in the first place. You can use the solution guide to understand your own product usage and then implement strategies to meet your usage or cost objectives. Be sure to let us know about other guides and tutorials you’d like to see by clicking the “Send Feedback” button at the bottom of the solution page.
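As a concrete illustration of the Logging technique above, here is a minimal Python sketch that creates an exclusion filter to drop 99 percent of high-volume HTTP load balancer request logs. Treat it as a hedged example rather than the guide’s own method: the client surface (logging_v2.ConfigServiceV2Client) and the project name my-project are assumptions, while sample() is the standard sampling function of the Logging query language.

from google.cloud import logging_v2

# Config client that manages sinks and exclusions (assumed v2 surface).
client = logging_v2.ConfigServiceV2Client()

exclusion = logging_v2.types.LogExclusion(
    name="drop-lb-request-logs",
    description="Drop 99% of HTTP load balancer request logs",
    # sample() matches the given fraction of entries; because this is an
    # exclusion filter, matched entries are discarded at ingestion time.
    filter='resource.type="http_load_balancer" AND sample(insertId, 0.99)',
)

client.create_exclusion(parent="projects/my-project", exclusion=exclusion)

Because excluded entries are never ingested, they do not count toward Logging ingestion volume, which is what drives cost.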
Source: Google Cloud Platform

Exploring container security: Encrypting Kubernetes secrets with Cloud KMS

Editor’s note: This post picks up again on our blog post series on container security at Google.

At Google Cloud, we care deeply about protecting your data. That’s why we encrypt data at rest by default, including data in Google Kubernetes Engine (GKE). For Kubernetes secrets (small bits of data your application needs at build or run time), your threat model might be different, and storage-layer encryption alone may not be enough. Today, we’re excited to announce the beta of GKE application-layer secrets encryption, using the same keys you manage in our hosted Cloud Key Management Service (KMS).

Secrets in Kubernetes

In a default Kubernetes installation, Kubernetes secrets are stored in etcd in plaintext. In GKE, this is managed for you: GKE encrypts these secrets on disk and monitors this data for insider access. But this might not be enough to protect those secrets from a potentially malicious insider, or from a malicious application in your environment.

Kubernetes 1.7 first introduced application-layer encryption of secrets for differential protection. Using this feature, you can encrypt your secrets locally, with a key also stored in etcd. If you’re running Kubernetes in an environment that doesn’t provide encryption by default, this helps meet security best practices; however, if a malicious intruder gained access to etcd, they would still effectively have access to your secrets.

A few releases later, Kubernetes 1.10 introduced envelope encryption of secrets with a KMS provider: a local key (a “data encryption key”) encrypts the secrets, and that key is itself encrypted with another key not stored in Kubernetes (a “key encryption key”). This model means that you can regularly rotate the key encryption key without having to re-encrypt all the secrets, and that you can rely on an external root of trust for your secrets in Kubernetes, using systems like Cloud KMS or HashiCorp Vault.

Using Cloud KMS to protect secrets in Kubernetes

Application-layer secrets encryption is now in beta in GKE, so you can protect secrets with envelope encryption: your secrets are encrypted locally in AES-CBC mode with a local data encryption key, and the data encryption key is encrypted with a key encryption key you manage in Cloud KMS as the root of trust. It’s pretty simple, as all the hard work is done for you; all you have to do is choose the key you want to use for a particular cluster.

This approach provides flexibility in your security model, to meet specific requirements you may have:

Root of trust: You can choose whether to protect your secrets using only Kubernetes, with application-layer software-based encryption using a key from Cloud KMS, or with hardware-based encryption from Cloud HSM.
Key rotation: You can implement best practices for regular key rotation for your root of trust.
Separation of duties: You can separate who manages your cluster from who manages and protects your secrets.
Centralized auditing: You can manage and audit key accesses centrally, and use the same key for your secrets in Kubernetes as you use for other secrets in GCP.

Getting started

To enable application-layer secrets encryption for a new cluster, specify the --database-encryption-key flag as part of cluster creation, with your Cloud KMS key KMS_KEY_ID, as shown in the sketch below. Note that you must use a Cloud KMS key in the same location as your cluster, and that you need to grant the GKE service account access to use your key for these operations.

That’s how easy it is to encrypt your Kubernetes secrets with GKE.
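Here is a hedged sketch of that cluster-creation command, assuming the beta gcloud surface at the time; KMS_KEY_ID stands for the key’s full resource name (projects/PROJECT/locations/LOCATION/keyRings/RING/cryptoKeys/KEY), and the cluster name and zone are placeholders:

gcloud beta container clusters create my-cluster \
    --cluster-version=latest \
    --zone=us-central1-a \
    --database-encryption-key=KMS_KEY_ID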
For more detail, check out Application-layer Secrets Encryption, or this Cloud KMS hands-on lab [1]. We also gave a talk on Kubernetes secrets at KubeCon China that provides more color. Now encrypt those secrets!

[1] Take this lab for free before February 28, 2019 with the code 1j-shh-441.
Source: Google Cloud Platform

Azure DevOps Projects supporting Azure Cosmos DB and Azure Functions

With Azure DevOps Projects, we want to make it easy for you to set up a fully functional DevOps pipeline tailored to the development language and application platform you want to leverage.

We have been making continuous enhancements to Azure DevOps Projects and in the latest deployment now available to all customers, we have added support for Azure Cosmos DB and Azure Functions as target destinations for your application. This builds on the existing Azure App Service, Azure SQL Database, and Azure Kubernetes Service (AKS) support.

The support for Azure Cosmos DB in Azure DevOps Projects means you can now create a skeleton two-tier Node.js application backed by Azure Cosmos DB in just a few clicks. Azure DevOps Projects creates all the scaffolding for your pipeline, giving you everything you need to develop, deploy, and monitor your application, including:

A Git code repository hosted in Azure Repos with a skeleton Node.js application
A CI/CD pipeline in Azure Pipelines for deploying the database tier to Azure Cosmos DB and the web tier to Web Apps for Containers, Azure Kubernetes Service, or a Windows Web App in Azure
Provisioning of all the Azure resources in your subscription that the application requires
Application Insights integration for monitoring your application

After using Azure DevOps Projects to scaffold your application from the Azure portal you can then access the code and the CI/CD pipeline using Azure Repos and Azure Pipelines respectively.

With support for Azure Functions in Azure DevOps Projects, you can create a skeleton .NET or a sample Node.js serverless application in just a few clicks. As with Azure Cosmos DB, this workflow gives you everything you need to develop, deploy, and monitor your application, including the Git code repo, CI/CD pipeline, Application Insights integration, and the necessary Azure resources.

These features are now available in the Azure portal. Get started by creating Azure DevOps Projects now. To learn more, please take a look at the documentation, “Azure DevOps Projects.”
Source: Azure

Analytics in Azure is up to 14x faster and costs 94% less than other cloud providers. Why go anywhere else?

It’s true. With the volume and complexity of data rapidly increasing, performance and security are critical requirements for analytics. But not all analytics services are built equal. And not all cloud storage is built for analytics.

Only Azure provides the most comprehensive set of analytics services, from data ingestion to storage to data warehousing to machine learning and BI. Each of these services has been finely tuned to provide industry-leading performance, security, and ease of use, at unmatched value. In short, Azure has you covered.

Unparalleled price-performance

When it comes to analytics, price-performance is key. In July 2018, GigaOm published a study showing that Azure SQL Data Warehouse was 67 percent faster and 23 percent cheaper than Amazon Web Services (AWS) Redshift.

That was then. Today, we’re even better!

In the most recent GigaOm study, Azure SQL Data Warehouse now outperforms the competition by up to a whopping 14x. No one else has produced independent, industry-accepted benchmarks like these. Not AWS Redshift. Not Google BigQuery. And the best part? Azure is up to 94 percent cheaper.

This industry-leading price-performance extends to the rest of our analytics stack, including Azure Data Lake Storage, our cloud data storage service, and Azure Databricks, our big data processing service. Customers like Newell Brands (worldwide marketer of consumer and commercial products such as Rubbermaid, Mr. Coffee, and Oster) recently moved their workloads to Azure and realized significant improvements.

“Azure Data Lake Storage will streamline our analytics process and deliver better end to end performance with lower cost.” 

– Danny Siegel, Vice President of Information Delivery Systems, Newell Brands

Secure cloud analytics

All the price-performance in the world means nothing without security. Make the comparison and you will see that Azure is the most trusted cloud on the market. Azure has the most comprehensive set of compliance offerings, including more certifications than any other cloud vendor, combined with advanced identity governance and access management through Active Directory integration.

For analytics, we have developed additional capabilities to meet customers’ most stringent security requirements. Azure Data Lake Storage provides multi-layered security, including POSIX-compliant file and folder permissions and at-rest encryption. Similarly, Azure SQL Data Warehouse utilizes machine learning to provide the most comprehensive set of security capabilities across data protection, access control, authentication, network security, and automatic threat detection.

Insights for all

What’s the best complement to Azure Analytics’ unmatched price-performance and security? The answer is Microsoft Power BI.

Power BI’s ease of use enables everyone in your organization to benefit from our analytics stack. Employees can get their insights in seconds from all enterprise data stored in Azure. And without limitations on concurrency, Power BI can be used across teams to create the most beautiful visualizations that deliver powerful insights.

Leveraging Microsoft’s Common Data Model, Power BI users can easily access and analyze enterprise data using a common data schema without needing complex data transformation. Customers looking for petabyte-scale analytics can leverage Power BI Aggregations with Azure SQL Data Warehouse for rapid queries. Better yet, Power BI users can easily apply sophisticated AI models built with Azure. Powerful insights, easily accessible to all.

Customers like Heathrow Airport, one of the busiest airports in the world, are empowering their employees with powerful insights:

“With Power BI, we can very quickly connect to a wide range of data sources with very little effort and use this data to run Heathrow more smoothly than ever before. Every day, we experience a huge amount of variability in our business. With Azure, we’re getting to the point where we can anticipate passenger flow and stay ahead of disruption that causes stress for passengers and employees.”

– Stuart Birrell, Chief Information Officer, Heathrow Airport

Future-proof

We continue to focus on making Azure the best place for your data and analytics. Our priority is to meet your needs for today and tomorrow.

So, we are excited to make the following announcements:

General availability of Azure Data Lake Storage: The first cloud storage that combines the best of a hierarchical file system and blob storage.
General availability of Azure Data Explorer: A fast, fully managed service that simplifies ad hoc and interactive analysis over telemetry, time-series, and log data. This service, which powers other Azure services such as Log Analytics, Application Insights, and Time Series Insights, lets you query streaming data to identify trends, detect anomalies, and diagnose problems.
Preview of the new Mapping Data Flow capability in Azure Data Factory: Mapping Data Flow provides a visual, zero-code experience that helps data engineers easily build data transformations. This complements Azure Data Factory’s code-first experience and enables data engineers of all skill levels to collaborate and build powerful hybrid data transformation pipelines.

Azure provides the most comprehensive platform for analytics. With these updates, Azure solidifies its leadership in analytics.

Watch this space. There’s more to come!

Get started today

To learn more about how Azure provides the best price-performance, get started today.
Source: Azure

Individually great, collectively unmatched: Announcing updates to 3 great Azure Data Services

As Julia White mentioned in her blog today, we’re pleased to announce the general availability of Azure Data Lake Storage Gen2 and Azure Data Explorer. We also announced the preview of Azure Data Factory Mapping Data Flow. With these updates, Azure continues to be the best cloud for analytics with unmatched price-performance and security. In this blog post we’ll take a closer look at the technical capabilities of these new features.

Azure Data Lake Storage – The no-compromise data lake

Azure Data Lake Storage (ADLS) combines the scalability, cost effectiveness, security model, and rich capabilities of Azure Blob Storage with a high-performance file system that is built for analytics and is compatible with the Hadoop Distributed File System. Customers no longer have to trade off between cost effectiveness and performance when choosing a cloud data lake.

One of our key priorities was to ensure that ADLS is compatible with the Apache ecosystem. We accomplished this by developing the Azure Blob File System (ABFS) driver. The ABFS driver is officially part of Apache Hadoop and Spark and is incorporated in many commercial distributions. The ABFS driver defines a URI scheme that allows files and folders to be distinctly addressed in the following manner:

abfs[s]://file_system@account_name.dfs.core.windows.net/<path>/<path>/<filename>

It is important to note that the file system semantics are implemented server-side. This approach eliminates the need for a complex client-side driver and ensures high fidelity file system transactions.
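To make the URI scheme concrete, here is a short PySpark sketch that reads a CSV file from ADLS by its abfss URI. The account name (myaccount), file system (myfs), and path are placeholders, and the snippet assumes the cluster has already been configured with credentials for the storage account:

from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("adls-abfs-example").getOrCreate()

# The ABFS driver resolves this URI to the 'myfs' file system in the
# 'myaccount' storage account; authentication is assumed to be set up.
df = spark.read.csv(
    "abfss://myfs@myaccount.dfs.core.windows.net/raw/sales/2019-01.csv",
    header=True,
)
df.show(5)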

To further boost analytics performance, we implemented a hierarchical namespace (HNS), which supports atomic file and folder operations. This is important because it reduces the overhead associated with processing big data on blob storage. This speeds up job execution and lowers cost because fewer compute operations are required.

The ABFS driver and HNS significantly improve ADLS’ performance, removing scale and performance bottlenecks. This performance enhancement is now available at the same low cost as Azure Blob Storage.

ADLS offers the same powerful data security capabilities built into Azure Blob Storage, such as:

Encryption of data in transit (via TLS 1.2) and encryption at rest
Storage account firewalls
Virtual network integration
Role-based access control

In addition, ADLS’ file system provides support for POSIX-compliant access control lists (ACLs). With this approach, you can provide granular security protection that restricts access to only authorized users, groups, or service principals, and provides file and object data protection.
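As a sketch of what these POSIX-style ACLs look like in practice, here is a hypothetical Python example using the azure-storage-file-datalake client library (which shipped after this post was written); the account, file system, folder, and credential are all placeholders:

from azure.storage.filedatalake import DataLakeServiceClient

service = DataLakeServiceClient(
    account_url="https://myaccount.dfs.core.windows.net",
    credential="<storage-account-key>",  # placeholder credential
)

# Restrict a folder to its owning user and group: the owner gets full
# access, the owning group read/execute, and everyone else no access.
directory = service.get_file_system_client("myfs").get_directory_client("raw/sales")
directory.set_access_control(acl="user::rwx,group::r-x,other::---")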

ADLS is tightly integrated with Azure Databricks, Azure HDInsight, Azure Data Factory, Azure SQL Data Warehouse, and Power BI, enabling an end-to-end analytics workflow that delivers powerful business insights throughout all levels of your organization. Furthermore, ADLS is supported by a global network of big data analytics ISVs and system integrators, including Cloudera and Hortonworks.

Next steps

Visit the Azure Data Lake Storage product page to learn more.
Access documentation, quick starts, and tutorials.
Find pricing information for Azure Data Lake Storage.
Get started with Azure Data Lake Storage now.

Azure Data Explorer – The fast and highly scalable data analytics service

Azure Data Explorer (ADX) is a fast, fully managed data analytics service for real-time analysis on large volumes of streaming data. ADX is capable of querying 1 billion records in under a second with no modification of the data or metadata required. ADX also includes native connectors to Azure Data Lake Storage, Azure SQL Data Warehouse, and Power BI and comes with an intuitive query language so that customers can get insights in minutes.

Designed for speed and simplicity, ADX is architected with two distinct services that work in tandem: the Engine service and the Data Management (DM) service. Both are deployed as clusters of compute nodes (virtual machines) in Azure.

The Data Management (DM) service ingests various types of raw data and manages failure, backpressure, and data grooming tasks when necessary. The DM service also enables fast data ingestion through a unique method of automatic indexing and compression.

The Engine service is responsible for processing the incoming raw data and serving user queries. It uses a combination of auto scaling and data sharding to achieve speed and scale. The read-only query language is designed to make the syntax easy to read, author, and automate. The language provides a natural progression from one-line queries to complex data processing scripts for efficient query execution.
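To give a feel for the query language, here is a hedged Python sketch that runs a read-only query against Microsoft’s public help cluster and its Samples database; the client names assume a recent release of the azure-kusto-data package:

from azure.kusto.data import KustoClient, KustoConnectionStringBuilder

# Public demo cluster; this authentication flow prompts with a device code.
kcsb = KustoConnectionStringBuilder.with_aad_device_authentication(
    "https://help.kusto.windows.net"
)
client = KustoClient(kcsb)

# Count storm events per US state and keep the five largest counts.
query = "StormEvents | summarize EventCount = count() by State | top 5 by EventCount"

response = client.execute("Samples", query)
for row in response.primary_results[0]:
    print(row["State"], row["EventCount"])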

ADX is available in 41 Azure regions and is supported by a growing ecosystem of partners, including ISVs and system integrators.

Next steps

Visit the Azure Data Explorer product page to learn more.
Access documentation, quick starts, and tutorials.
Find pricing information for Azure Data Explorer.
Get started with Azure Data Explorer now.

Azure Data Factory Mapping Data Flow – Visual, zero-code experience for data transformation

Azure Data Factory (ADF) is a hybrid cloud-based data integration service for orchestrating and automating data movement and transformation. ADF provides over 80 built-in connectors to structured, semi-structured, and unstructured data sources.

With Mapping Data Flow in ADF, customers can visually design, build, and manage data transformation processes without learning Spark or having a deep understanding of their distributed infrastructure.

Mapping Data Flow combines a rich expression language with an interactive debugger to easily execute, trigger, and monitor ETL jobs and data integration processes.

Azure Data Factory is available in 21 regions and expanding, and is supported by a broad ecosystem of partners, including ISVs and system integrators.

Next steps

Visit the Azure Data Factory product page to learn more.
Access documentation, quick starts, and tutorials.
Find pricing information on Azure Data Factory.
Learn more about Mapping Data Flow.
Get started and sign up for the preview of Azure Data Factory – Mapping Data Flow.

Azure is the best place for data analytics

With these technical innovations announced today, Azure continues to be the best cloud for analytics. Learn more about why analytics in Azure is simply unmatched.
Source: Azure