Updated data processing terms to reflect new EU Standard Contractual Clauses

For years, Google Cloud customers who are subject to European data protection laws[1] have relied on our Standard Contractual Clauses (SCCs), as previously approved by regulators, to legitimize overseas transfers of their customer personal data when using our services. Today, we are glad to announce an update to our data processing terms for Google Cloud Platform, Google Workspace (including Workspace for Education), and Cloud Identity to incorporate various modules of the new EU SCCs approved by the European Commission on June 4, 2021, as well as separate UK SCCs. For all Google Cloud customers, this new approach:

- Offers clear and transparent support for their compliance with applicable European data protection laws;
- Simplifies the entities involved in contracting by no longer requiring any customer to deal with an additional Google entity only for SCC purposes; and
- Aligns more closely with potential flows of data within the services.

For customers located in Europe[2], Google has further simplified data transfer compliance by assuming all the responsibilities imposed by the new SCCs.

We have also published a new whitepaper that outlines the European legal rules for data transfers and explains our approach to implementing the new EU SCCs – as well as separate UK SCCs – so that our customers can better understand what our updated terms mean for them and their privacy compliance.

We remain committed to helping all customers who rely on our cloud services meet applicable regulatory requirements by protecting any international transfers of their data.

Footnotes:
[1] We refer specifically to the EU GDPR, UK GDPR, and Swiss Federal Data Protection Act (FDPA).
[2] We refer specifically to customers located in the EEA, UK, and Switzerland.
Source: Google Cloud Platform

Introducing Quota Monitoring Solution: Single Dashboard with Alerting capabilities

If you are looking for an automated way to manage quotas over a large number of projects, we are excited to introduce a Quota Monitoring Solution from Google Cloud Professional Services. By default, Google Cloud employs resource quotas to restrict how much of a particular shared Google Cloud resource you can use. Each quota represents a specific countable resource, such as API calls to a particular service or the number of compute cores used concurrently by your project.

Quotas are enforced for a variety of reasons, including:

- To protect the community of Google Cloud users by preventing unforeseen spikes in usage and overloaded services.
- To help you manage resources. For example, you can set your own limits on service usage while developing and testing your applications to avoid unexpected bills from using expensive resources.

The Quota Monitoring Solution benefits anyone who manages quotas across projects, folders, or organizations. It offers an easy, centralized way to view and monitor quota usage in a single dashboard and to use default alerting capabilities across all quotas. Specifically, the solution provides:

- Automated aggregation of quotas across all projects in given organizations or folders, with a recurring scan at a defined frequency (e.g., hourly or daily) that automatically captures the quotas of new projects.
- A dashboard that provides visibility into recent resource usage against the individual quotas across all projects.
- Preconfigured alerting through email or other communication channels (e.g., SMS, Pub/Sub) when a resource reaches a certain threshold of its quota.
- Easy deployment through Terraform, so you can adopt the solution into your project with minimal time investment.

Outside of the Quota Monitoring Solution, there are additional ways of viewing your quota information, such as using the Google Cloud Console or the gcloud command-line tool.
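The core check behind threshold-based quota alerting can be sketched in a few lines. The helper below is purely illustrative; the field names and the 75% default are assumptions for the example, not the solution's actual schema:

```python
# Illustrative sketch: flag quota records whose usage meets or exceeds a
# threshold fraction of their limit. Field names are hypothetical stand-ins,
# not the Quota Monitoring Solution's actual data model.

def find_exceeded_quotas(quotas, threshold=0.75):
    """Return quota records whose usage/limit ratio meets or exceeds threshold."""
    exceeded = []
    for q in quotas:
        if q["limit"] > 0 and q["usage"] / q["limit"] >= threshold:
            exceeded.append(q)
    return exceeded

sample = [
    {"project": "proj-a", "metric": "compute.googleapis.com/cpus", "usage": 90, "limit": 100},
    {"project": "proj-b", "metric": "compute.googleapis.com/cpus", "usage": 10, "limit": 100},
]
print([q["project"] for q in find_exceeded_quotas(sample)])  # ['proj-a']
```

The same comparison is what an alerting policy or scheduled query evaluates at scale, once per quota metric per project.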
You can also manually define alerting policies in Cloud Monitoring to send out notifications when a resource reaches a certain threshold of its quota. For example, you can define an alerting policy that triggers when the CPU usage of Compute Engine VM instances goes above 75% of the quota in any region.

If your project needs more of a particular resource than your quota allows, you can request a quota limit increase for the majority of quotas directly in the Google Cloud Console. In the vast majority of cases, quota increase requests are evaluated and processed automatically. However, depending on the nature of your request, a small number of quota increase requests need to be handled by human reviewers. They typically process your request within 2-3 business days, so it is important to plan ahead.

Ineffective quota management can lead to many different problems. For example, a lack of sufficient quota can prevent you from consuming additional resources that may be needed for auto-scaling events or for performing a GKE cluster upgrade. This can cause outages or service degradation, which could affect your customers' experience and potentially your business revenue.

Please note: Many services also have limits that are unrelated to the quota system. These are fixed constraints, such as maximum file sizes or database schema limitations, which cannot be increased or decreased. You can find out about these on the relevant service's Quotas and limits page (for example, Cloud Storage quotas and limits).

1. Technical Architecture

The diagram below shows the Quota Monitoring Solution architecture flow, which you can deploy in minutes using the deployment guide and accompanying Terraform scripts. The solution includes a Terraform script that provisions the following resources in a GCP project:

- Cloud Scheduler – a fully managed, enterprise-grade cron-job scheduler. It is used to trigger Cloud Functions at scheduled intervals.
- Cloud Functions – an event-driven serverless compute platform. It contains the code logic to scan project quotas.
- Pub/Sub – allows services to communicate asynchronously with very low latency. It is used to support event-driven application design and high scalability.
- BigQuery – a serverless, highly scalable data warehouse. It is used to store project quota data.
- Data Studio – a dashboarding and reporting tool. It is used to display quotas across multiple projects in a single view. You can configure other visualization tools of your choice, such as Looker; in the future, we will also provide a Looker-based dashboard.
- Cloud Monitoring custom log metric and alert – Google Cloud Monitoring offers logging and alerting capabilities. These are used to enable alerting that gives timely awareness of quota issues in your cloud applications so you can request a quota increase quickly.

In this solution, Cloud Scheduler also works as an interface for you to provide configuration:

- You can provide folder IDs or organization IDs as the parent nodes whose quotas you want to monitor. Parent nodes can be a single folder or organization ID, or a list of folder or organization IDs.
- You can also configure the metric threshold and email addresses for alerting. Currently, you can receive alerts via email, the mobile app, PagerDuty, SMS, Slack, webhooks, and Pub/Sub.
- The metric threshold is used to generate alerts. For example, you can choose 70% or 80% as the threshold; alerts will be sent to the configured alerting channel.

Please note:

- Any changes in the projects, such as the addition of new projects or the deletion of existing ones, will be reflected automatically in the subsequent scheduled scan.
- Any changes in the GCP Cloud Monitoring APIs, such as the introduction of new quota metrics, will be reflected automatically in the solution without any code changes.
- Any changes in the Cloud Monitoring alert notification channels will also be available automatically without any code changes.

1.1 Workflow

Putting all these components together, the workflow starts at Cloud Scheduler, where you provide your preferences for folder and organization IDs, the metric threshold, and email addresses, and configure the job to run at scheduled intervals.

1. Create a service account and grant it access to view quota usage in the target organization(s) or folder(s).
2. Cloud Scheduler runs automatically at the configured frequency (for example, daily) and passes the user configuration to Cloud Functions.
3. Based on the service account's access, the first Cloud Functions instance lists the projects under the parent node, generates the list of project IDs, and publishes them to the Pub/Sub topic.
4. The second Cloud Functions instance is triggered by the messages generated in the previous step and receives the project IDs as separate messages from the topic.
5. Upon receiving the project IDs, the second Cloud Functions instance fetches the quotas for each project using the publicly available GCP Cloud Monitoring APIs and loads each project's quota data into the BigQuery table.

Alerting

- A scheduled query on the BigQuery table filters all quotas with usage greater than the configured threshold.
- The third Cloud Functions instance logs the data for these metrics in Cloud Logging.
- Preconfigured custom log metrics in Google Cloud Monitoring create alerts, which are sent to the configured notification channel.

Reporting

Once the data is loaded into BigQuery, Data Studio fetches it and displays it in the dashboard in a single view across all projects and metrics. The Data Studio dashboard is easy to use, customize, and share. You can also configure scheduled reporting from the dashboard to receive quota monitoring reports by email as a PDF.
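The fan-out in the workflow above can be illustrated with a small, purely local sketch. Real deployments use Pub/Sub topics and Cloud Functions; the function names, the in-memory queue, and the quota data below are hypothetical stand-ins for those pieces:

```python
# Local illustration of the workflow's fan-out: a "lister" function publishes
# project IDs, a "worker" function fetches each project's quotas, and the
# collected rows are filtered against the alerting threshold. Everything here
# is a stand-in for the real Pub/Sub + Cloud Functions + BigQuery pipeline.
from queue import SimpleQueue

FAKE_QUOTAS = {  # stand-in for Cloud Monitoring API responses
    "proj-a": [{"metric": "cpus", "usage": 95, "limit": 100}],
    "proj-b": [{"metric": "cpus", "usage": 20, "limit": 100}],
}

def list_projects(parent_node, topic):
    """First function: enumerate projects under a parent node, publish their IDs."""
    for project_id in sorted(FAKE_QUOTAS):  # real code calls the Resource Manager API
        topic.put(project_id)

def fetch_quotas(topic, table):
    """Second function: consume project IDs and load quota rows into the table."""
    while not topic.empty():
        project_id = topic.get()
        for row in FAKE_QUOTAS[project_id]:
            table.append({"project": project_id, **row})

topic, table = SimpleQueue(), []
list_projects("organizations/000000", topic)
fetch_quotas(topic, table)
# Stand-in for the scheduled query that filters rows above the threshold:
alerts = [r for r in table if r["usage"] / r["limit"] >= 0.8]
print([a["project"] for a in alerts])  # ['proj-a']
```

Decoupling the lister from the worker via a topic is what lets the real solution scan many projects in parallel and pick up new projects automatically on each run.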
2. Deployment Process

The Quota Monitoring Solution can be deployed in a few minutes using the deployment guide and accompanying Terraform scripts. There are three simple steps:

1. Create a service account and grant access.
2. Run the Terraform script to provision all resources in a Google Cloud project.
3. Configure the Data Studio dashboard.

For step-by-step details, please see the Readme section of the deployment guide on the PSO GitHub repository.

3. How to Customize?

The solution's code is open source and available in the Google Cloud PSO GitHub repository. You can fork the repository and customize it to your requirements. During deployment, Terraform downloads the code; the code and the data stay in the customer's environment.

4. Summary

The Quota Monitoring Solution lets you view and monitor quotas for Google Cloud services from one central location for an organization or folder. It enables you to track quota usage easily and consistently and to receive alerts, saving you the time and effort of setting up quota monitoring for new projects.

If you have questions or suggestions for this solution, please fill out this form or reach out to us at pso-quota-monitoring@google.com.

Acknowledgement

We would like to thank Darpan Shah, Raghavendra Kyatham, and Marlon Pimentel for their contributions to building the solution, as well as Rahul Khandkar, Karina Rivera, and Rohan Karne for sponsoring this project. We would also like to extend our gratitude to all Technical Account Managers who helped prototype and roll out the solution globally, especially Naveen Nelapudi, Vijay Narayanan, Rohit Iyer, Emily Wong, and Isha Rana.
Source: Google Cloud Platform

Amazon Redshift cross-account data sharing is now generally available in the AWS GovCloud (US) Regions

With Amazon Redshift data sharing, you can share live, transactionally consistent data across different Redshift clusters without the complexity and delays associated with data copies and data movement. The ability to share data across clusters within the same AWS account is already available in the AWS GovCloud (US) Regions. Now, data sharing across Redshift clusters in different AWS accounts is also generally available in the AWS GovCloud (US) Regions. Cross-account data sharing is supported on all Amazon Redshift RA3 node types. There is no additional charge for using cross-account data sharing on your Amazon Redshift clusters.
Source: aws.amazon.com

AWS Elastic Beanstalk supports dynamic instance type selection

AWS Elastic Beanstalk now supports dynamic instance type selection for Elastic Beanstalk environments. This means Elastic Beanstalk automatically retrieves all compatible instance types so that you can run a wide variety of applications. With dynamic instance types, you can select the most suitable instance type to optimize your application's performance. For example, if you run machine learning applications, you can optimize performance by selecting an accelerated computing instance type such as p3 or p4d.
Source: aws.amazon.com

Amazon Detective supports S3 and DNS finding types and adds finding details

Amazon Detective expands security investigation support to Amazon Simple Storage Service (S3) and DNS-related findings in Amazon GuardDuty, providing full coverage of all GuardDuty detections. In addition, Detective now makes it even easier for a security analyst to investigate entities and behaviors with a revamped user experience.
Source: aws.amazon.com

Amazon Connect Chat now supports passing a customer name and contact attributes through the chat user interface

Amazon Connect Chat now supports passing a customer name and contact attributes through the chat user interface so that you can personalize the chat experience. Contact attributes are relevant metadata associated with the contact, such as a customer ID, loyalty status, or even context about the web page the customer was on when starting the chat. Contact attributes are available in Amazon Connect flows and make it easy to create unique and compelling customer experiences, such as prioritizing a platinum customer or populating an agent's screen with the relevant customer information. In addition, you can share the customer's name through the chat user interface, ensuring that the name is visible to both the agent and the customer throughout the interaction and allowing your agents to personalize the conversation.
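As a rough sketch of how this looks in practice, a chat is typically started via the StartChatContact API, which accepts contact attributes and a participant display name. The IDs and attribute keys below are placeholders; in real code you would pass the resulting dict to `boto3.client("connect").start_chat_contact(**request)`:

```python
# Illustrative StartChatContact request. Instance ID, contact flow ID, and
# attribute keys are placeholder values, not a working configuration.

def build_chat_request(instance_id, flow_id, customer_name, attributes):
    return {
        "InstanceId": instance_id,
        "ContactFlowId": flow_id,
        # DisplayName is shown to both the agent and the customer in the chat UI.
        "ParticipantDetails": {"DisplayName": customer_name},
        # Attributes are key/value metadata made available in Connect flows.
        "Attributes": dict(attributes),
    }

request = build_chat_request(
    "11111111-2222-3333-4444-555555555555",  # placeholder instance ID
    "aaaaaaaa-bbbb-cccc-dddd-eeeeeeeeeeee",  # placeholder contact flow ID
    "Jane Doe",
    {"loyaltyStatus": "platinum", "sourcePage": "/checkout"},
)
print(request["ParticipantDetails"])  # {'DisplayName': 'Jane Doe'}
```

A Connect flow could then branch on `loyaltyStatus` to route the platinum customer to a priority queue, as described above.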
Source: aws.amazon.com

Tune your Amazon Forecast model with the accuracy metric of your choice

We are excited to announce that you can now select the accuracy metric of your choice in Amazon Forecast, directing AutoML to optimize predictor training for that metric. In addition, we have added three more accuracy metrics for evaluating your predictor: average weighted quantile loss (Average wQL), mean absolute percentage error (MAPE), and mean absolute scaled error (MASE).
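Two of the newly added metrics have compact textbook definitions, sketched below. Amazon Forecast's exact computation (for example, its handling of zero actuals or seasonality in MASE) may differ, so treat this only as an illustration of the formulas:

```python
# Textbook definitions of MAPE and MASE, two of the accuracy metrics above.
# MAPE averages the absolute percentage errors; MASE scales the forecast MAE
# by the in-sample MAE of a one-step naive forecast on the training series.

def mape(actual, forecast):
    """Mean absolute percentage error; assumes no actual value is zero."""
    return sum(abs((a - f) / a) for a, f in zip(actual, forecast)) / len(actual)

def mase(actual, forecast, training):
    """Mean absolute scaled error (non-seasonal variant)."""
    mae = sum(abs(a - f) for a, f in zip(actual, forecast)) / len(actual)
    naive_mae = sum(abs(training[i] - training[i - 1])
                    for i in range(1, len(training))) / (len(training) - 1)
    return mae / naive_mae

actual, forecast = [100, 110, 120], [90, 115, 130]
training = [80, 90, 100, 95, 105]
print(round(mape(actual, forecast), 4))  # 0.0763
```

A MASE below 1 means the forecast beats the naive "repeat the last value" baseline on average, which makes it easier to compare predictors across series of different scales.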
Source: aws.amazon.com