Change streams for Cloud Spanner: now generally available

At this year’s Google Data Cloud Summit, we announced Cloud Spanner change streams. Today, we are thrilled to announce the general availability of change streams. With change streams, Spanner users can now track and stream out changes (inserts, updates, and deletes) from their Cloud Spanner database in near real time.

Change streams provide a wide range of options for integrating change data with other Google Cloud services. Common use cases include:

Analytics: Send change events to BigQuery to ensure that BigQuery has the most recent data available for analytics.
Event triggering: Send data change events to Pub/Sub for further processing by downstream systems.
Compliance: Save change events to Google Cloud Storage for archiving purposes.

Getting started with change streams

This section walks you through a simple example of creating a change stream, reading its data, and sending the data to BigQuery for analytics. If you haven’t already done so, familiarize yourself with Cloud Spanner basics with the Spanner Qwiklab.

Creating a change stream

Spanner change streams are created with DDL, just like tables and indexes. Change stream DDL requires the same IAM permission as any other schema change (spanner.databases.updateDdl). A change stream can track changes on a set of columns, a set of tables, or an entire database. Each change stream can have a retention period of anywhere from one day to seven days, and you can set up multiple change streams to track exactly what you need for your specific objectives. Learn more about creating and managing change streams.

Suppose you have a table Orders like this:

    CREATE TABLE Orders (
      OrderID INT64 NOT NULL,
      CustomerID INT64 NOT NULL,
      ProductId INT64 NOT NULL,
      OrderDate DATE,
      Price INT64,
    ) PRIMARY KEY(OrderID);

The DDL to create a change stream that tracks the entire Orders table, with an implicit default retention of one day, would be:

    CREATE CHANGE STREAM OrdersStream FOR Orders;

Creating a change stream is a long-running operation. You can check its progress on the change streams page in the Cloud console, and once the stream is created you can click its name to view more details, including the resulting database DDL. Now that the change stream has been created, you can process its data.

Streaming data to BigQuery

There are several ways to process change stream data. The easiest is to use the Spanner connector for Apache Beam, which lets you build scalable data processing pipelines for Google Cloud Dataflow. We provide Dataflow templates for processing change data and writing it to BigQuery or to Google Cloud Storage. Learn more about how Cloud Spanner change streams work with Dataflow.

In this example, we will use the Spanner change streams to BigQuery template to write change stream data to BigQuery. First, navigate to your project’s Dataflow Jobs page in the Google Cloud console. Click CREATE JOB FROM TEMPLATE, choose the Change streams to BigQuery template, and fill in the required fields. Click RUN JOB, and wait for Dataflow to build the pipeline and launch the job.

Once your Dataflow pipeline is running, you can view the job graph, execution details, and metrics on the Dataflow Jobs page. Now let’s write some data into the tracked table Orders.
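As an illustration (the values below are made up), a single DML insert like this is enough to generate a change record on OrdersStream:

    INSERT INTO Orders (OrderID, CustomerID, ProductId, OrderDate, Price)
    VALUES (1, 1001, 2002, DATE '2022-06-15', 150);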
Under the hood, when Spanner detects a data change in a data set tracked by a change stream, it writes a data change record synchronously with that data change, within the same transaction. Spanner co-locates both of these writes so they are processed by the same server, minimizing write processing. Learn more about how Spanner writes and stores change streams.

Finally, when you view your BigQuery dataset, you will see the row that you just inserted, along with some additional information from the change stream records.

You are all set! As long as your Dataflow pipeline is running, data changes to the tracked tables will be seamlessly streamed to your BigQuery dataset. Learn more about monitoring your pipeline.

More ways to process change stream data

Instead of using the Google-provided Dataflow templates for BigQuery and Google Cloud Storage, you can build a custom Dataflow pipeline that processes change data with Apache Beam. For this case, we provide the SpannerIO Dataflow connector, which outputs change data as an Apache Beam PCollection of DataChangeRecord objects. This is a great choice if you want to define your own data transforms or want a different sink than BigQuery or Google Cloud Storage. Learn more about how to create custom Dataflow pipelines that consume and forward change stream data.

Alternatively, you can process change streams with the Spanner API. This approach, which is particularly well suited for latency-sensitive applications, does not rely on Dataflow. The Spanner API is a powerful interface that lets you read directly from a change stream to implement your own connector and stream changes to the pipeline of your choice. With the Spanner API, a change stream is divided into multiple partitions, each of which can be used to query the change stream in parallel for higher throughput. Spanner dynamically creates these partitions based on load and size. Each partition is associated with a Spanner database split, allowing change streams to scale as effortlessly as the rest of Spanner. Learn more about using the change stream query API; a minimal query sketch appears at the end of this post.

What’s next

Spanner change streams are available to all customers today at no additional cost: you pay only for the extra compute and storage of the change stream data at the regular Spanner rates. Since change streams are built right into Spanner, there’s no software to install, and you get external consistency, industry-leading availability, and effortless scale with the rest of the database. Start exploring change streams from the change stream overview today!
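To make the query API option above concrete, here is a minimal sketch of reading a change stream directly with the GoogleSQL read function that Spanner generates for each stream (the function name is derived from the stream name, and the timestamp is illustrative; check the change stream query API documentation for exact semantics):

    -- Initial query: a NULL partition_token returns child partition records,
    -- whose tokens are then passed to subsequent READ_OrdersStream calls
    -- so that each partition can be read in parallel.
    SELECT ChangeRecord
    FROM READ_OrdersStream (
      start_timestamp => '2022-06-15 00:00:00-00',
      end_timestamp => NULL,
      partition_token => NULL,
      heartbeat_milliseconds => 10000
    );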
Related article: Boost the power of your transactional data with Cloud Spanner change streams. Change streams track changes in your Spanner database and integrate this data with other systems for analytics, event triggering, and com…

Source: Google Cloud Platform

Application Rationalization through Google Cloud’s CAMP Framework

On April 6, 2022, Google Cloud established a new partnership with CAST to help accelerate the migration and application modernization programs of customers worldwide, complementing the capabilities already available through the Google Cloud Application Modernization Program (CAMP).

Application Rationalization (App Rat) is the first step of a cloud adoption or migration journey, in which you review the application inventory to determine which applications should be Retired, Retained, Refactored, Replatformed, or Reimagined.

Why is this important to you?

Have the majority of your in-house applications still not moved to the cloud?
How much time does your development team spend on support (bug fixes, tickets, etc.) versus feature development?
Have infrastructure or platform dependencies ever delayed a product rollout?
Would an auto-scalable, managed cloud increase stakeholder buy-in?
Can Google simplify this journey?

The Google Cloud Application Modernization Program (CAMP) is designed as an end-to-end framework that guides organizations through their modernization journey by assessing where they are today and providing a path forward. How you approach application rationalization depends on your role.

Step 1 (Assess): Who is the target audience, the platform team or the application team?

This determines what kind of challenges we are trying to solve. For example, a centralized platform team wants to set guardrails on how application teams deploy their apps; streamlining this lets the platform team mature into SRE territory. The application team, on the other hand, values flexibility and the ability to perform continuous delivery.

These examples are only the tip of the iceberg. Most enterprise customers still have a majority of their applications in legacy environments, and unless those business-critical applications move to the cloud, it is very hard to mature as an enterprise. For more information, see the State of DevOps 2021 report.

Step 2 (Analyze): Google Cloud offers the tooling and the framework to analyze your legacy applications.

Platform owners (persona) usually have very little information about which workloads are a good fit for modernization. Google’s StratoZone® SaaS platform provides customers with a data-driven cloud decision framework, and the StratoProbe® Data Collector Application makes it easy to deploy and scale discovery of a customer’s IT environment for private, public, or hybrid cloud planning. To ease and accelerate the VM migration journey, Google Cloud offers assistance and guidance in making the right decisions about moving to the cloud.

Google’s mFit aims to unblock customers in their transformation by providing workload selection for successful onboarding, at scale, to Anthos, GKE, and Cloud Run, in both pre-sales (e.g. proof-of-concept/proof-of-value) and post-sales (e.g. pilot and at-scale execution) scenarios.

Application and business owners (persona) take part in a one-week workshop using CAST Highlight, which provides a rapid portfolio assessment through automated source code analysis for cloud readiness, open source risks, resiliency, and agility.

Step 3 (Plan & execute): Each organization is different. Some follow a “Migration Factory” approach, some follow a “Modernization Factory” approach, and some follow both. Irrespective of which approach you choose, it is important to plan just enough so that you can start your execution.
Be sure to set OKRs that support the right measurements before you start the execution. What the teams learn during execution helps them understand the cloud migration process better and refine it for their organization.

Using CAST Highlight in the assessment step, we get recommendations for the analyzed applications. From there, for certain workloads, we can use Migrate to Containers to automate the containerization of suitable workloads. However, some applications require manual code changes. You have a few options for that: our experts can help you get started, and our partners can help you as well.

Step 4 (Measure & reiterate): Measure progress using the metrics defined in the previous step. Celebrate the wins. Consistently share the learnings and best practices with the developer community. Then pick the next challenge.

Take the next step

Tell us what you’re solving for. A Google Cloud expert will help you find the best solution.

Related article: Google Cloud Application Modernization Program: Get to the future faster. The Google Cloud App Modernization Program (CAMP) can help you accelerate your modernization process and adopt DevOps best practices.
Source: Google Cloud Platform

Why IT leaders choose Google Cloud certification for their teams

As organizations worldwide move to the cloud, it’s become increasingly crucial to give teams the confidence and the right skills to get the most out of cloud technology. With demand for cloud expertise exceeding the supply of talent, many businesses are looking for new, cost-effective ways to keep up.

When ongoing skills gaps stifle productivity, it can cost you money. In Global Knowledge’s 2021 report, 42% of IT decision-makers reported having “difficulty meeting quality objectives” as a result of skills gaps, and in an IDC survey cited in the same Global Knowledge report, roughly 60% of organizations described a lack of skills as a cause of lost revenue. In today’s fast-paced environment, businesses with cloud knowledge are in a stronger position to achieve more. So what more could you be doing to develop and showcase cloud expertise in your organization?

Google Cloud certification helps validate your teams’ technical capabilities while demonstrating your organization’s commitment to the fast pace of the cloud.

“What certification offers that experience doesn’t is peace of mind. I’m not only talking about self-confidence, but also for our customers. Having us certified, working on their projects, really gives them peace of mind that they’re working with a partner who knows what they’re doing.” Niels Buekers, managing director at Fourcast BVBA

Why get your team Google Cloud certified?

When you invest in cloud, you also want to invest in your people. Google Cloud certification equips your teams with the skills they need to support your growing business.

Speed up technology implementation

Organizations want to speed up transformation and make the most of their cloud investment. Nearly 70% of partner organizations recognize that certifications speed up technology implementation and lead to greater staff productivity, according to a May 2021 IDC Software Partner Survey. The same report also found that 85% of partner IT consultants agree that “certification represents validation of extensive product and process knowledge.”

Improve client satisfaction and success

Getting your teams certified can be the first step to improving client satisfaction and success. Research covering more than 600 IT consultants and resellers in a September 2021 IDC study found that “fully certified teams met 95% of their clients’ objectives, compared to a 36% lower average net promoter score for partially certified teams.”

Motivate your team and retain talent

In today’s age of the ongoing Great Resignation, IT leaders are rightly concerned about employee attrition, which can result in stalled projects, unmet business objectives, and new or overextended team members needing time to ramp up. In other words, attrition hurts. But when IT leaders invest in skills development for their teams, talent tends to stick around. According to a business value paper from IDC, comprehensive training leads to 133% greater employee retention compared to untrained teams. When organizations help people develop skills, people stay longer, morale improves, and productivity increases. Organizations wind up with a classic win-win situation as business value accelerates.

Finish your projects ahead of schedule

With your employees feeling supported and well equipped to handle workloads, they can also stay engaged and innovate faster with Google Cloud certifications.
“Fully certified teams are 35% more likely than partially certified teams to finish projects ahead of schedule, typically reaching their targets more than two weeks early,” according to research in an IDC InfoBrief.

Certify your teams

Google Cloud certification is more than a seal of approval: it can be your framework to increase staff tenure, improve productivity, satisfy your customers, and gain other key advantages to launch your organization into the future. Once you get your teams certified, they’ll join a trusted network of IT professionals in the Google Cloud certified community, with access to resources and continuous learning opportunities.

To discover more about the value of certification for your team, download the IDC paper today and invite your teams to join our upcoming webinar to get started on their certification journey.

Related article: How to become a certified cloud professional.
Source: Google Cloud Platform

Built with BigQuery: Gain instant access to comprehensive B2B data in BigQuery with ZoomInfo

Editor’s note: This post is part of a series highlighting our partners, and their solutions, that are Built with BigQuery.

To fully leverage the data that’s critical for modern businesses, it must be accurate, complete, and up to date. Since 2007, ZoomInfo has provided B2B teams with the accurate firmographic, technographic, contact, and intent data they need to hit their marketing, sales, and revenue targets. Smart analytics teams have long used ZoomInfo data sets in Google BigQuery, integrating them with other sources to deliver reliable, actionable insights powered by machine learning. Now, Google Cloud and ZoomInfo have partnered to give organizations even richer data sets and more powerful analytics tools.

Today, customers have instant access to ZoomInfo data and intelligence directly within Google BigQuery. ZoomInfo is available as a virtual view in BigQuery, so analysts can explore the data there even before importing it. Once ZoomInfo data has been imported into BigQuery, data and operations teams can use it in their workflows quickly and easily, saving their sales and marketing teams time, money, and resources.

ZoomInfo data sets include:

Contact and company: Capture essential prospect and customer data, from verified email addresses and direct-dial business phone and mobile numbers to job responsibilities and web mentions. Get B2B company insights, including organizational charts, employee and revenue growth rates, and look-alike companies.
Technographics and scoops: Uncover the technologies that prospects use, and how they use them, to inform your marketing and sales efforts. Discover trends to shape the right outreach messaging and determine a buyer’s needs before making the first touch.
Buyer intent: ZoomInfo’s buyer intent engine captures real-time buying signals from companies researching relevant topics and keywords related to your business solution across the web.
Website IP traffic: Enrich data around traffic from your digital properties, so your customer-facing teams can take immediate action and turn traffic into sales opportunities.

In the future, ZoomInfo data sets will also be available in the Google Cloud Marketplace and Google Cloud Analytics Hub (now in preview), alongside popular Google and third-party data sets including Google Trends, Google Analytics, and Census Bureau data.

The new features will help ZoomInfo and Google Cloud customers such as Wayfair Professional, one of the world’s largest home retailers. Wayfair Professional is a long-time user of ZoomInfo’s Company Data Brick, API, and enrichment services. It has historically accessed ZoomInfo data through file transfer, which involved shuffling encrypted CSVs back and forth over SFTP and manually processing files to ingest them into Google BigQuery. Ryan Sigurdson, senior analytics manager at Wayfair Professional, shared that moving their monthly offline company enrichment workflow to BigQuery could save them weeks of manual work and maintenance every month.
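To illustrate the kind of workflow this unlocks, here is a sketch of enriching a first-party accounts table with ZoomInfo firmographic data directly in BigQuery. The project, dataset, table, and column names are hypothetical, not ZoomInfo’s actual schema:

    -- Hypothetical example: join first-party CRM accounts to an imported
    -- ZoomInfo company table on the normalized company domain.
    SELECT
      a.account_id,
      a.domain,
      z.company_name,
      z.employee_count,
      z.revenue_range
    FROM `my_project.crm.accounts` AS a
    JOIN `my_project.zoominfo.company_enrichment` AS z
      ON LOWER(a.domain) = LOWER(z.company_domain);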
Built with BigQuery

ZoomInfo is one of more than 700 tech companies powering their products and businesses using data cloud products from Google, such as BigQuery, Looker, Spanner, and Vertex AI. At the recent Data Cloud Summit, Google Cloud announced Built with BigQuery, which helps ISVs like ZoomInfo get started building applications using data and machine learning products. By providing dedicated access to technology, expertise, and go-to-market programs, this initiative can help tech companies accelerate, optimize, and amplify their success.

ZoomInfo’s SaaS solutions have been built on Google Cloud for years. By partnering with Google Cloud, ZoomInfo can leverage an all-in-one cloud platform to develop its data collection, data processing, data storage, and data analytics solutions.

“Enabling customers to gain superior insights and intelligence from data is core to the ZoomInfo strategy. We are excited about the innovation Google Cloud is bringing to market and how it is creating a differentiated ecosystem that allows customers to gain insights from their data securely, at scale, and without having to move data around,” says Henry Schuck, ZoomInfo’s chief executive officer. “Working with the Built with BigQuery team enables us to rapidly gain deep insight into the opportunities available and accelerate our speed to market.”

Google Cloud provides a platform for building data-driven applications like ZoomInfo’s, from simplified data ingestion, processing, and storage to powerful analytics, AI/ML, and data sharing capabilities, all integrated with the open, secure, and sustainable Google Cloud platform. With a diverse partner ecosystem and support for multicloud, open source tools, and APIs, Google Cloud gives technology companies the portability and extensibility they need to avoid data lock-in.

To learn more about ZoomInfo on Google Cloud, visit https://www.zoominfo.com/offers/google-bigquery. To learn more about Built with BigQuery, visit https://cloud.google.com/solutions/data-cloud-isvs.

Related article: Get value from data quickly with Informatica Data Loader for BigQuery. With Informatica’s Data Loader on Google Cloud, accelerate data uploads and keep data flowing to get insights and answers faster.
Source: Google Cloud Platform

Enterprise DevOps Guidebook – Chapter 1

The Google Cloud DORA team has been hard at work releasing our yearly Accelerate State of DevOps report. This research provides an independent view into the practices and capabilities that organizations, irrespective of their size, industry, and region, can employ to drive better performance. Year over year, the State of DevOps report helps organizations benchmark themselves against others in the industry as elite, high, medium, or low performers, and provides recommendations for how they can continually improve. The most recent report includes a table that highlights elite, high, medium, and low performers at a glance.

To give more prescriptive advice on how to successfully implement DORA best practices with Google Cloud, we are excited to announce the Enterprise DevOps Guidebook. The guidebook will be your resource for a concrete action plan that turns the recommendations from Google Cloud’s DORA research into performance improvements.

We will release the guidebook in chapter increments. The goal of this first chapter is to give your organization a better understanding of how to use DORA’s resources to measure your performance and to begin your first DevOps team experiment. These resources include the DevOps Quick Check, where you can measure your team’s software delivery performance in less than a minute with just five multiple-choice questions, and a more in-depth capabilities assessment, which we deploy in your organization to give a robust measurement of your capabilities as they pertain to software delivery.

Future chapters will touch on other main topics we have identified in the State of DevOps reports, such as shifting left on security, cloud adoption, and easy-to-use DevOps tools. We want to make it easy for your organization to get the most out of investing in DevOps, and with the launch of the guidebook we believe the focused recommendations will help more organizations successfully implement DevOps practices that lead to business and organizational success.

2022 State of DevOps Survey

For the 2022 State of DevOps report we will be focusing on a topic that has been top of mind recently: security. This year we are doing a deeper investigation into how security practices and capabilities predict overall software delivery and operations performance. We invite you to join the over 32,000 professionals worldwide who have participated in DORA’s research by completing our 2022 State of DevOps survey.

The survey will remain open until midnight PDT on July 22, 2022. Please help us encourage more voices by sharing this survey with your network, especially with your colleagues from underrepresented parts of our industry. We look forward to hearing from you and your teams!

Related article: 2021 Accelerate State of DevOps report addresses burnout, team performance. The SODR is continually one of the most downloaded assets on the GCP website. We are releasing the updated version of the report with new…
Source: Google Cloud Platform