IPO prospectus: Porsche has agreed on an exit clause with Cariad
The sports car maker Porsche does not want to rely solely on Cariad's software. If problems arise, an exit is still possible. (Cariad, Börse)
Source: Golem
After 30 years, the James Webb Space Telescope once again delivers images of all the rings of the planet Neptune to science. (James-Webb-Teleskop, Nasa)
Source: Golem
The first pilots are at 50G and 100G, a Nokia CTO explained. Huawei reported on 50G and Fiber To The Room using flexible fiber optics attached with hot glue. (Nokia, Huawei)
Source: Golem
Before Amazon unveils new Echo products at the end of September, there are new hints of an Echo Auto 2. An Alexa soundbar would also be conceivable. (Echo, Amazon)
Source: Golem
Free on YouTube, today at 4 p.m.: Azure experts Aaron Siller and Andreas Zeisler explain how to secure Office files with the access control of Azure Information Protection. (Golem Karrierewelt, Internet)
Source: Golem
The G Cloud is a lightweight handheld developed specifically for cloud gaming. The mobile device therefore strictly requires an internet connection. (Logitech, Steam)
Source: Golem
In a collaboration, Microsoft and Canonical have reworked the Windows Subsystem for Linux so that systemd is used. (WSL, Ubuntu)
Source: Golem
The aviation startup launched by Google founder Larry Page is being shut down. There had been problems before. (Flugauto, Technologie)
Source: Golem
Analytics data is growing exponentially, and so is the dependence on that data for critical business and product decisions. In fact, the best decisions are said to be the ones backed by data. In data, we trust! But do we trust the data?

As data volumes have grown, one of the key challenges organizations face is how to maintain data quality in a scalable and consistent way across the organization. Data quality is not a newly found need, but the need used to be contained when the data footprint was small and data consumers were few. In such a world, data consumers knew who the producers were, and producers knew what the consumers needed. Today, data ownership is getting distributed and data consumption is finding new users and use cases, so existing data quality approaches find themselves limited and isolated to certain pockets of the organization. This often exposes data consumers to inconsistent and inaccurate data, which ultimately impacts the decisions made from that data. As a result, organizations today are losing tens of millions of dollars due to low-quality data. These organizations are looking for solutions that empower their data producers to consistently create high-quality data at cloud scale.

Building Trust with Dataplex data quality

Earlier this year at Google Cloud, we launched Dataplex, an intelligent data fabric that enables governance and data management across distributed data at scale. One of the key things Dataplex enables out of the box is for data producers to build trust in their data with built-in data quality. The Dataplex data quality task delivers a declarative, DataOps-centric experience for validating data across BigQuery and Google Cloud Storage. Producers can now easily build and publish quality reports, or include data validations as part of their data production pipelines. Reports can be aggregated across various data quality dimensions, and the execution is entirely serverless.

The Dataplex data quality task provides:

A declarative approach for defining "what good looks like" that can be managed as part of a CI/CD workflow.
Serverless, managed execution with no infrastructure to provision.
The ability to validate across data quality dimensions such as freshness, completeness, accuracy, and validity.
Flexibility in execution: either via the Dataplex serverless scheduler (at no extra cost) or as part of a pipeline (e.g., Apache Airflow).
Incremental execution, so you save time and money by validating new data only.
Secure and performant execution with zero data copies out of BigQuery environments and projects.
Programmatic consumption of quality metrics for DataOps workflows.

Users can also execute these checks on data that is stored in BigQuery and Google Cloud Storage but is not yet organized with Dataplex. For Google Cloud Storage data that is managed by Dataplex, Dataplex auto-detects and auto-creates tables for structured and semi-structured data; these tables can be referenced by the Dataplex data quality task as well.

Behind the scenes, Dataplex uses an open source data quality engine, the Cloud Data Quality Engine (CloudDQ), to run these checks. Providing an open platform is one of our key goals, and we have contributed to this engine so that it integrates seamlessly with Dataplex's metadata and serverless environment. You can learn more about this in our product documentation.
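To make the declarative approach concrete, here is a minimal sketch of a rule specification in the YAML style used by the open source CloudDQ engine. The project, dataset, table, column, and rule names are hypothetical, and the exact schema may vary between engine releases; the product documentation has the authoritative format.

```yaml
# Minimal CloudDQ-style sketch; all names below are hypothetical placeholders.
entities:
  ITEM_MASTER:
    source_database: BIGQUERY
    project_name: my-project      # hypothetical project
    dataset_name: retail          # hypothetical dataset
    table_name: item_master
    columns:
      ITEM_ID:
        name: item_id
        data_type: STRING

row_filters:
  NONE:
    filter_sql_expr: "True"      # no filtering: validate every row

rules:
  NOT_NULL:
    rule_type: NOT_NULL           # built-in completeness rule
  POSITIVE_PRICE:
    rule_type: CUSTOM_SQL_EXPR    # custom validity rule
    params:
      custom_sql_expr: "$column > 0"   # $column is bound by the rule binding

rule_bindings:
  ITEM_ID_COMPLETENESS:
    entity_id: ITEM_MASTER
    column_id: ITEM_ID
    row_filter_id: NONE
    rule_ids:
      - NOT_NULL
```

A binding like ITEM_ID_COMPLETENESS can be parameterized and reused across tables, which is what makes the specification manageable in a CI/CD workflow.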
Building enterprise trust at American Eagle Outfitters

One of our enterprise customers, American Eagle Outfitters (AEO), is continuing to build trust in their critical data using the Dataplex data quality task. Kanhu Badtia, lead data engineer at AEO, shares their rationale and experience:

“AEO is a leading global specialty retailer offering high-quality and on-trend clothing under its American Eagle® and Aerie® brands. Our company operates stores in the United States, Canada, Mexico, and Hong Kong, and ships to 81 countries worldwide through its websites. We are a data-driven organization that utilizes data from physical and digital storefronts, from social media channels, from logistics and delivery partners, and from many other sources through established compliant processes. We have a team of data scientists and analysts who create models, reports, and dashboards that inform responsible business decision-making on matters such as inventory, promotions, new product launches, and other internal business reviews. As the data engineering team at AEO, our goal is to provide highly trusted data for our internal data consumers.

Before Dataplex, AEO had methods for maintaining data quality that were effective for their purpose. However, those methods did not scale with the continual expansion of data volume and our data consumers' demand for quality results. Internal data consumers identified and reported quality issues where 'bad data' was impacting business-critical dashboards and reports. As a result, our teams were often in 'fire-fighting' mode, finding and fixing bad data. We were looking for a solution that would standardize and scale data quality across the production data pipelines.

The majority of AEO's business data is in Google's BigQuery or in Google Cloud Storage (GCS). When Dataplex launched its data quality capabilities, we immediately started a proof of concept. After a careful evaluation, we decided to use it as the central data quality framework for production pipelines. We liked that:

It provides an easy, declarative (YAML), and flexible way of defining data quality. We were able to parameterize it for use across multiple tables.
It allows validating data in any BigQuery table with completely serverless, native execution using existing slot reservations.
It allows executing these checks as part of ETL pipelines using the Dataplex Airflow operators. This is a huge win, as pipelines can now pause further processing if critical rules do not pass.
Data quality checks are executed in parallel, which gives us the required execution efficiency in pipelines.
Data quality results are stored centrally in BigQuery and can be queried to identify which rules failed or succeeded and how many rows failed. This enables defining custom thresholds for success.
Organizing data in Dataplex lakes is optional when using Dataplex data quality.

Our team truly believes that data quality is an integral part of any data-driven organization, and the Dataplex data quality capabilities align perfectly with that fundamental principle. For example, the sketch below outlines a Google Cloud Composer / Airflow DAG that loads and validates the item_master table and stops downstream processing if the validation fails. It includes simple rules for uniqueness and completeness, and more complex rules for referential integrity or business rules such as checking daily price variance.
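A minimal sketch of such a load-and-validate DAG follows, using the BigQuery and Dataplex operators from the Google Airflow provider. The project, region, lake, task body, and results-table names are hypothetical placeholders; this is illustrative rather than AEO's actual pipeline.

```python
# Hypothetical sketch of a load-then-validate Composer/Airflow DAG.
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.operators.bigquery import (
    BigQueryCheckOperator,
    BigQueryInsertJobOperator,
)
from airflow.providers.google.cloud.operators.dataplex import (
    DataplexCreateTaskOperator,
)

PROJECT = "my-project"   # hypothetical
REGION = "us-central1"   # hypothetical
LAKE = "retail-lake"     # hypothetical Dataplex lake

# Hypothetical Dataplex task body that runs the CloudDQ engine as a
# serverless Spark task; see the Dataplex docs for the exact spec.
DQ_TASK_BODY = {
    "trigger_spec": {"type_": "ON_DEMAND"},
    "execution_spec": {"service_account": "dq-runner@my-project.iam.gserviceaccount.com"},
    "spark": {"python_script_file": "gs://my-bucket/clouddq_executable.zip"},  # placeholder
}

with DAG(
    dag_id="item_master_load_and_validate",
    start_date=datetime(2022, 9, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    # 1) Load the item_master table (placeholder load procedure).
    load_item_master = BigQueryInsertJobOperator(
        task_id="load_item_master",
        configuration={
            "query": {
                "query": "CALL `my-project.retail.sp_load_item_master`()",
                "useLegacySql": False,
            }
        },
    )

    # 2) Run the Dataplex data quality task against the loaded table.
    run_dq_checks = DataplexCreateTaskOperator(
        task_id="run_dq_checks",
        project_id=PROJECT,
        region=REGION,
        lake_id=LAKE,
        dataplex_task_id="item-master-dq",
        body=DQ_TASK_BODY,
        asynchronous=False,  # block until the checks finish
    )

    # 3) Fail the pipeline if any critical rule reported failed rows,
    #    so bad data never reaches downstream consumers.
    halt_on_critical_failures = BigQueryCheckOperator(
        task_id="halt_on_critical_failures",
        sql="""
            SELECT COUNT(*) = 0
            FROM `my-project.dq_results.dq_summary`   -- hypothetical results table
            WHERE table_id = 'item_master'
              AND criticality = 'CRITICAL'
              AND failed_count > 0
        """,
        use_legacy_sql=False,
    )

    load_item_master >> run_dq_checks >> halt_on_critical_failures
```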
We publish all data quality results centrally to a BigQuery table, such as this:

[Sample data quality output table]

We query this output table for data quality issues and fail the pipeline in case of a critical rule failure. This stops low-quality data from flowing downstream. We now have a repeatable process for data validation that can be used across the key data production pipelines. It standardizes the data production process and effectively ensures that bad data doesn't break downstream reports and analytics.”

Learn more

Here at Google, we are excited to enable our customers' journey to high-quality, trusted data. To learn more about our current data quality capabilities, refer to:

Dataplex Data Quality Overview
Sample Airflow DAG with Dataplex Data Quality task
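As a sketch of the programmatic consumption of these results, the following snippet queries such an output table with the google-cloud-bigquery client. The dataset, table, and column names are hypothetical placeholders mirroring the DAG sketch above.

```python
# Minimal sketch: inspect recent rule failures in the published results table.
from google.cloud import bigquery

client = bigquery.Client(project="my-project")  # hypothetical project

query = """
    SELECT rule_binding_id, rule_id, failed_count
    FROM `my-project.dq_results.dq_summary`      -- hypothetical results table
    WHERE execution_ts >= TIMESTAMP_SUB(CURRENT_TIMESTAMP(), INTERVAL 1 DAY)
      AND failed_count > 0
    ORDER BY failed_count DESC
"""

# Print each failing rule binding with its failed-row count.
for row in client.query(query).result():
    print(f"{row.rule_binding_id}/{row.rule_id}: {row.failed_count} failed rows")
```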
Source: Google Cloud Platform
This blog post has been co-authored by Darius Ryals, General Manager of Partner Promises and Azure Chief Information Security Officer.
Today we're announcing that Azure Payment HSM has achieved Payment Card Industry Personal Identification Number (PCI PIN) certification, making Azure the first hyperscale cloud service provider to obtain this certification.
Financial technology has rapidly disrupted the payments industry, and securing payment transactions is of the utmost importance. Azure helps customers secure their critical payment infrastructure in the cloud and streamlines global payments security compliance. Azure remains committed to helping customers achieve compliance with the Payment Card Industry's leading compliance certifications.
Enhanced security and compliance through Azure Payment HSM
Azure Payment HSM is a bare-metal infrastructure as a service (IaaS) offering that provides cryptographic key operations for real-time payment transactions in Azure. The service empowers financial institutions and service providers to accelerate their digital payment strategy through the cloud. Azure Payment HSM is certified against stringent security and compliance requirements established by the PCI Security Standards Council (PCI SSC), including PCI DSS, PCI 3DS, and PCI PIN, and offers HSMs certified to FIPS 140-2 Level 3 and PCI HSM v3.
Azure Payment HSM enables a wide range of use cases, including:

Payment processing: card and mobile payment authorization and 3D-Secure authentication.
Payment credential issuing: cards, wearables, and connected devices.
Securing keys and authentication data: POS and mPOS, remote key loading, PIN generation, and PIN routing.
Sensitive data protection: point-to-point encryption, security tokenization, and EMV payment tokenization.
Azure Payment HSM is designed to meet the low-latency and high-performance requirements of mission-critical payment applications. The service consists of single-tenant HSMs that offer customers complete remote administrative control and exclusive access. HSMs are provisioned and connected directly to users' virtual networks and remain under users' sole administrative control. HSMs can be easily provisioned as a pair of devices and configured for high availability.
Azure Payment HSM provides great benefits both for payment HSM users with a legacy on-premises HSM footprint and for new entrants to the payment ecosystem who may choose a cloud-native approach from the outset. The customer could be a payment service provider acting on behalf of multiple financial institutions or a financial institution that wishes to access Azure Payment HSM directly.
Leverage Azure Payment HSM PCI PIN certification
PINs are used to verify cardholder identity during online and offline payment card transactions.
The PCI PIN Security Standard contains requirements for the secure management, processing, and transmission of PIN data and applies to merchants and service providers that store, process, transmit, or can impact the security of PIN data.
Azure Payment HSM customers can reduce their compliance burden by leveraging Azure’s PCI PIN Attestation of Compliance (AOC) which addresses Azure’s portion of responsibility for each PCI PIN requirement and contains the list of certified Azure regions. The Azure Payment HSM Shared Responsibility Matrix is also available to help customers significantly reduce time, effort, and cost during their own PCI PIN assessments by simplifying the compliance process.
Learn more
When moving payment systems to the cloud, payment security must adhere to the payment industry's compliance mandates without exception. Financial institutions and service providers across the payment ecosystem, including issuers, service providers, acquirers, processors, and payment networks, would benefit from Azure Payment HSM. To learn how Microsoft Azure capabilities can help, see the resources below:
Azure Payment HSM
Azure Payment HSM documentation
Azure PCI PIN AOC
Azure PCI DSS AOC
Azure PCI 3DS AOC
Source: Azure