Four steps to managing your Cloud Logging costs on a budget

As part of our ongoing series on cost management for observability data in Google Cloud, we're going to share four steps for getting the most out of your logs while on a budget. While we'll focus on optimizing your costs within Google Cloud, we've found that this approach also works for customers with infrastructure and logs on-premises and in other clouds.

Step 1: Analyze your current spending on logging tools

To get started, create an itemized list of what volume of data is going where and what it costs. We'll start with the billing report and the obvious line items, including those under Operations Tools/Cloud Logging:

- Log Volume – the cost to write log data to disk once (see our previous blog post for an explanation)
- Log Storage Volume – the cost to retain logs for more than 30 days

If you're using tools outside Cloud Logging, you'll also need to include any costs related to those solutions. Here's a list to get you started:

- Log vendor and hardware costs — what are you paying to observability vendors? If you're running your own logging solution, include the cost of compute and disk.
- If you export logs within Google Cloud, include Cloud Storage and BigQuery costs.
- Processing costs — consider the costs for Kafka, Pub/Sub, or Dataflow to process logs. Network egress charges may apply if you're moving logs outside Google Cloud.
- Engineering resources dedicated to managing your logging tools across your enterprise are often significant too!

Step 2: Eliminate waste — don't pay for logs you don't need

While not all costs scale directly with volume, optimizing your log volume is often the best way to reduce spend. Even if a vendor contract locks you into a fixed price for a period of time, you may still have reducible costs in your pipeline, such as Kafka, Pub/Sub, or Dataflow charges incurred by wasteful logs.
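If you export billing data to BigQuery, you can itemize these line items with a query. The following is a rough sketch assuming the standard Cloud Billing export schema; the table name is a hypothetical placeholder you would replace with your own export table:

```shell
# Sketch: itemize Cloud Logging charges over the last 30 days from a
# Cloud Billing BigQuery export. The dataset/table name below is a
# hypothetical example -- substitute your actual export table.
bq query --use_legacy_sql=false '
SELECT
  sku.description AS sku,
  SUM(cost) AS total_cost
FROM `my-project.billing.gcp_billing_export_v1_XXXXXX`
WHERE service.description = "Cloud Logging"
  AND usage_start_time >= TIMESTAMP_SUB(CURRENT_TIMESTAMP(), INTERVAL 30 DAY)
GROUP BY sku
ORDER BY total_cost DESC'
```

Grouping by SKU separates the "Log Volume" and "Log Storage Volume" charges so you can see which one dominates your bill.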
Finding chatty logs in Google Cloud

The easiest way to understand which sources are generating the highest volume of logs within Google Cloud is to start with our pre-built dashboards in Cloud Monitoring. To access the available dashboards:

1. Go to Monitoring -> Dashboards
2. Select "Sample Library" -> "Logging"

This blog post has some specific recommendations for optimizing logs for GKE and GCE using prebuilt dashboards.

As a second option, you can use Metrics Explorer and system metrics to analyze the volume of logs. For example, type "log bytes ingested" into the filter. This specific metric corresponds to the Cloud Logging "Log Volume" charge. There are many ways to filter this data. To get the big picture, we often start by grouping by both "resource_type" and "project_id". To narrow down the resource type in a particular project, add a "project_id" filter. Under Advanced Options, click Aligner and select "sum". Sort by volume to see the resources with the highest log volume.

While these rich metrics are great for understanding volumes, you'll probably want to eventually look at the logs themselves to see whether they're critical to your observability strategy. In Logs Explorer, the log fields on the left side help you understand volumes and filter logs from a resource type.

Reducing log volume with the Log Router

Now that we understand which types of logs are expensive, we can use the Log Router and our sink definitions to reduce these volumes. Your strategy will depend on your observability goals, but here are some general techniques we've found to work well.

The most obvious way to reduce your log volume is not to send the same logs to multiple storage destinations. One common example is when a central security team uses an aggregated log sink to centralize audit logs but individual projects still ingest these logs. Instead, use exclusion filters on the _Default log sink and any other log sinks in each project to avoid paying for these logs twice.
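As a sketch, exclusions like these can be added to the _Default sink with gcloud. The filter values below are illustrative examples: the first drops data-access audit logs that a central aggregated sink already collects, and the second keeps only a ~10% sample of successful load balancer entries:

```shell
# Sketch: exclude data-access audit logs from this project's _Default
# sink (they remain available to an aggregated org-level sink).
gcloud logging sinks update _Default \
  --add-exclusion=name=drop-data-access-audit,filter='logName:"cloudaudit.googleapis.com%2Fdata_access"'

# Sketch: exclude ~90% of 2XX load balancer log entries, keeping a
# deterministic 10% sample based on each entry's insertId.
gcloud logging sinks update _Default \
  --add-exclusion=name=sample-lb-2xx,filter='resource.type="http_load_balancer" AND httpRequest.status>=200 AND httpRequest.status<300 AND sample(insertId, 0.9)'
```

Remember that an exclusion filter matches the entries to drop, so to keep a 10% sample you exclude the other 90%.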
Exclusion filters also work on log sinks to BigQuery, Pub/Sub, or Cloud Storage.

Similarly, if you're paying to store logs in an external log management tool, you don't have to save those same logs to Cloud Logging. We recommend keeping a small set of system logs from GCP services such as GKE in Cloud Logging in case you need assistance from GCP support, but what you store is up to you, and you can still export logs to the destination of your choice!

Another powerful way to reduce log volume is to sample a percentage of chatty logs. This can be particularly useful with 2XX load balancer logs, for example. Sampling can be a powerful tool, but we recommend you design a sampling strategy based on your usage, security, and compliance requirements and document it clearly.

Step 3: Optimize costs over the lifecycle of your logs

Another way to reduce costs is to avoid storing logs for longer than you need them. Cloud Logging charges based on the volume of logs retained per month. There's no need to switch between hot and cold storage in Cloud Logging; doubling the default retention period only increases the cost by 2%. You can change your custom log retention at any time.

If you are storing your logs outside of Cloud Logging, it's a good idea to compare the cost of retaining logs in each destination and decide accordingly.

Step 4: Set up alerts to avoid surprise bills

Once you are confident that the volume of logs being routed through your log sinks fits within your budget, set up alerts so that you can detect any spikes before you get a large bill. To alert based on the volume of logs ingested into Cloud Logging:

1. Go to the Logs-based metrics page.
2. Scroll to the bottom of the page and click the three dots next to "billing/bytes_ingested" under System-defined metrics.
3. Click "Create alert from metric".
4. Optionally, add filters (for example, on resource_id or project_id).
5. Select the logs-based metric for the alert policy.

You can also set up similar alerts on the volume for log sinks to Pub/Sub, BigQuery, or Cloud Storage.

Conclusion

One final way to stretch your observability budget is to make greater use of Cloud Operations. We're always working to bring our customers the most value possible for their budget, such as our latest feature, Log Analytics, which adds querying capabilities and makes the same data available for analytics, reducing the need for data silos. Many small customers can operate entirely on our free tier. Larger customers have expressed their appreciation for the scalable Log Router functionality, available at no extra charge, that would otherwise require an expensive event store to process data. So it's no surprise that a 2022 IDC report showed that more than half of respondents surveyed stated that managing and monitoring tools from public cloud platforms provide more value compared to third-party tools. Get started with Cloud Logging and Monitoring today.
Source: Google Cloud Platform

How an open data cloud is enabling Airports of Thailand and EVme to reshape the future of travel

Aviation and accommodation play a big role in the tourism economy, but analysis of recent data also highlights tourism's impact on other sectors, from financial services and healthcare to retail and transportation. With travel recovery in full swing post-pandemic, Google search queries related to "travel insurance" and "medical tourism" in Thailand have increased by more than 900% and 500% respectively. Financial institutions and healthcare providers must therefore find ways to deliver tailored offerings to travelers who are seeking peace of mind from unexpected changes or visiting the country to receive specialized medical treatment.

Interest in visiting Thailand for "gastronomy tourism" is also growing, with online searches increasing by more than 110% year-on-year. Players in the food and beverage industry should therefore be looking at ways to better engage tourists keen on authentic Thai cuisine.

Most importantly, digital services will play an integral role in travel recovery. More than one in two consumers in Thailand are already using online travel services, with this category expected to grow 22% year-on-year and contribute US$9 billion to Thailand's digital economy by 2025. To seize growth opportunities amidst the country's tourism rebound, businesses cannot afford to overlook the importance of offering always-on, simple, personalized, and secure digital services.

That is why Airports of Thailand (AOT), SKY ICT (SKY), and EVME PLUS (EVme) are adopting Google Cloud's open data cloud to deliver sustainable, digital-first travel experiences.

Improving the passenger experience in the cloud

With Thailand reopening its borders, there has been an upturn in both inbound and outbound air travel.
To accommodate these spikes in passenger traffic across its six international airports, AOT migrated its entire IT footprint to Google Cloud, which offers an open, scalable, and secure data platform, with implementation support from its partner SKY, an aviation technology solutions provider.

Tapping Google Cloud's dynamic autoscaling capabilities, the IT systems underpinning AOT's ground aviation services and the SAWASDEE by AOT app can now accommodate up to 10 times their usual workloads. AOT can also automatically scale down resources to reduce costs when they are no longer in use. Using Google Cloud's database management services to eliminate data silos, the organization has enhanced its capacity to deliver real-time airport and flight information to millions of passengers. As a result, travelers enjoy a smoother passenger experience, from check-in to baggage collection.

At the same time, SKY uses Google Kubernetes Engine (GKE) to transform SAWASDEE by AOT into an essential, all-in-one travel app that offers a full range of tourism-related services. GKE allows AOT to automate application deployment and upgrades without causing downtime. This frees up time for the tech team to accelerate the launch of new in-app features, such as a baggage tracker service, airport loyalty programs, curated travel recommendations, an e-payment system, and more.

EVme drives sustainable travel with data

Being able to travel more efficiently is only one part of the future of travel. More than ever, sustainability is becoming a priority for consumers when they plan their travel itineraries.
For instance, search queries related to "sustainable tourism" in Thailand have increased by more than 200% in the past year, with close to four in 10 consumers sharing that they are willing to pay more for a sustainable product or service.

To meet this increasing demand and support Thailand's national efforts to become a low-carbon society, EVme, a subsidiary of PTT Group, is building its electric vehicle lifestyle app on Google Cloud, the industry's cleanest cloud. It has also deployed Google Cloud's advanced analytics and business intelligence tools to give its employees improved access to data-driven insights, which helps them better understand customer needs and deliver personalized interactions. These insights have helped EVme determine the range of electric vehicle models it offers for rental via its app, so as to cater to different preferences. At the same time, the app can also share crucial information, such as the availability of public electric vehicle charging stations, while providing timely support and 24-hour emergency assistance to customers.

As we empower organizations across industries with intelligent, data-driven capabilities to make smarter business decisions and be part of an integrated ecosystem that delivers world-class visitor experiences, our collaborations with AOT, SKY, and EVme will enhance their ability to serve travelers with personalized, digital-first offerings powered by our secure and scalable open data cloud.
Source: Google Cloud Platform

How to easily migrate your apps to containers — free deep dive and workshop

Just here for the event registration link? Click here.

Are you looking to migrate your applications to Google Cloud? Thinking about using containers for some of those apps? If so, you're in luck! Google Cloud is hosting a free workshop on May 24th, 2023, that will teach you everything you need to know about migrating your apps to containers in Google Cloud.

The workshop starts at 9AM PST and will be led by Google Cloud experts who will walk you through some of your migration options, the costs involved, and the security considerations. We'll also feature a hands-on lab so you can get familiar with some of the tools we use to achieve your migration goals. And we'll wrap up with a live Q&A so you have the opportunity to ask the experts your specific questions and get them answered.

Whether you're a developer, a system administrator, or a business decision-maker, this workshop will give you the insights you need to make an informed decision about how to migrate your apps to Google Cloud. Click here to register for this free workshop. We hope to see you there!

Need a bit more info before you sign up? No problem. Let's chat about some of the benefits of migrating on-prem workloads to containers in Google Cloud:

- Wide range of container services: Choose between Google Kubernetes Engine (GKE), Cloud Run, and Anthos, giving you the flexibility to pick the container service that best meets your needs.
- Global network infrastructure: With our global network of data centers, you can deploy containers close to your users. This improves performance and reduces latency.
- Tools and resources: There's a variety of tools and resources to help you manage and deploy your containers, including the Google Cloud console, the gcloud command-line tool, and the GKE dashboard.
- Commitment to security: Google Cloud takes security seriously, and our container services are built on a secure foundation. This includes features like role-based access control (RBAC), network policies, and encryption.

Still have questions? We've got answers, and we hope you'll join us for this free workshop on May 24th at 9AM PST. And if you can't wait until then, you can also check out our new whitepaper: The future of infrastructure will be containerized.

We hope to see you on the 24th!
Source: Google Cloud Platform

At Google I/O, generative AI gets to work

Over the past decade, artificial intelligence has evolved from experimental prototypes and early successes to mainstream enterprise use. And the recent advancements in generative AI have begun to change the way we create, connect, and collaborate. As Google CEO Sundar Pichai said in his keynote, every business and organization is thinking about how to drive transformation. That's why we're focused on making it easy and scalable for others to innovate with AI.

In March, we announced exciting new products that infuse generative AI into our Google Cloud offerings, empowering developers to responsibly build with enterprise-level safety, security, and privacy. They include Gen App Builder, which lets developers quickly and easily create generative chat and enterprise search applications, and Generative AI support in Vertex AI, which expands our machine learning development platform with access to foundation models from Google and others to quickly build, customize, and deploy models. We also introduced our vision for Google Workspace, and delivered generative AI features to trusted testers in Gmail and Google Docs that help people write.

Last month we introduced Security AI Workbench, an industry-first extensible platform powered by our new LLM security model, Sec-PaLM, which incorporates Google's unique visibility into the evolving threat landscape and is fine-tuned for cybersecurity operations.

Today at Google I/O, we are excited to share the next steps not only in our own AI journey, but also those of our customers and partners. We've already seen a number of organizations begin to develop with and deploy our generative AI offerings. These organizations have been able to move their ideas from experimentation to enterprise-ready applications with the training models, security, compute infrastructure, and cost controls needed to provide their customers with transformative experiences.
Our open ecosystem, which provides opportunities for every kind of partner, continues to grow as well. And we are also pleased to share new services and capabilities across Google Cloud and Workspace, including Duet AI, our AI-powered collaborator, to enable more users and developers to start seeing the impact AI can have on their organization.

Customers bringing ideas to life with generative AI

Leading companies in a variety of industries, like eDreams ODIGEO, GitLab, Oxbotica, and more, are using our generative AI technologies to create engaging content, synthesize and organize information, automate business processes, and build amazing customer experiences. A few examples we showcased today include:

- Adore Me, a New York-based intimate apparel brand, is creating production-worthy copy with generative AI features in Docs and Gmail. This is accelerating projects and processes in ways that even surprised the company.
- Canva, the visual communication platform, uses Google Cloud's rich generative AI capabilities in language translation to better support its non-English speaking users. Users can now easily translate presentations, posters, social media posts, and more into over a hundred languages. The company is also testing ways that Google's PaLM technology can turn short video clips into longer, more compelling stories. The result will be a more seamless design experience while growing the Canva brand.
- Character.AI, a leading conversational AI platform, selected Google Cloud as its preferred cloud infrastructure provider because we offer the speed, security, and flexibility required to meet the needs of its rapidly growing community of creators. We are enabling Character.AI to train and infer LLMs faster and more efficiently, and enhancing the customer experience by inspiring imagination, discovery, and understanding.
- Deutsche Bank is testing Google's generative AI and large language models (LLMs) at scale to provide new insights to financial analysts, driving operational efficiencies and execution velocity. There is an opportunity to significantly reduce the time it takes to perform banking operations and financial analysts' tasks, empowering employees by increasing their productivity while helping to safeguard customer data privacy, data integrity, and system security.
- Instacart is always looking for opportunities to adopt the latest technological innovations, and by joining the Workspace Labs program, its teams have access to the new features and can discover how generative AI will make an impact for them.
- Orange is exploring a next-generation contact center with Google Cloud. With customers in 26 countries, the global telecommunications firm is testing generative AI to transcribe calls, summarize the exchange between customers and service representatives, and suggest possible follow-up actions to the agent based on the discussion. This experiment has the potential to dramatically improve both the efficiency and quality of customer interactions. Orange is working closely with Google to help ensure data protection and to make sure that systematic employee review of generative AI output and transparency can be implemented.
- Replit is developing a collaborative software development platform powered by AI. Developers using Replit's Ghostwriter coding AI already have 30% of their code written by generative AI today. With real-time debugging of the code output and context awareness of the program's files, Ghostwriter frees up developers' time for more challenging and creative aspects of programming.
- Uber is creating generative AI for customer-service chatbots and agent-assist capabilities, which handle a range of common service issues with human-like interactions, with the aim of achieving greater customer satisfaction and cost efficiency.
Additionally, Uber is working on using our synthetic data systems (a technique for improving the quality of LLMs) in areas like product development, fraud detection, and employee productivity.
- Wendy's is working with Google Cloud on a groundbreaking AI solution, Wendy's FreshAI, designed to revolutionize the quick-service restaurant industry. The technology is transforming Wendy's drive-thru food ordering experience with Google Cloud's generative AI and LLMs, with the ability to discern the billions of possible order combinations on the Wendy's menu. In June, Wendy's plans to launch its first pilot of the technology in a Columbus, Ohio-area restaurant, before expanding to more drive-thru locations.

Leading companies build with generative AI on Google Cloud

Partnering creates a strong ecosystem of real-world options for customers

At Google Cloud, we are dedicated to being the most open hyperscale cloud provider, and that includes our AI ecosystem. Today, we are excited to expand upon the partnerships announced earlier this year for every layer of the AI stack: chipmakers, companies building foundation models and AI platforms, technology partners enabling companies to develop and deploy machine learning (ML) models, app builders solving customer use cases with generative AI, and global services and consulting firms that help enterprise customers implement all of this technology at scale. We announced new or expanded partnerships with SaaS companies like Box, Dialpad, Jasper, Salesforce, and UKG; and consultancies including Accenture, BCG, Cognizant, Deloitte, and KPMG. Together with our previous announcements with companies like AI21 Labs, Aible, Anthropic, Anyscale, Bending Spoons, Cohere, Faraday, Glean, Gretel, Labelbox, Midjourney, Osmo, Replit, Snorkel AI, Tabnine, Weights & Biases, and many more, they provide a wide range of options for businesses and governments looking to bring generative AI into their organizations.
Introducing new generative AI capabilities for Google Cloud

To help cloud users of all skill levels solve their everyday work challenges, we're excited to announce Duet AI for Google Cloud, a new generative AI-powered collaborator. Duet AI serves as your expert pair programmer and assists cloud users with contextual code completion, offering suggestions tuned to your code base, generating entire functions in real time, and assisting you with code reviews and inspections. It can fundamentally transform the way cloud users of all skill sets build new experiences, and is embedded across Google Cloud interfaces: within the integrated development environment (IDE), the Google Cloud console, and even chat.

For developers looking to create generative AI applications more simply and efficiently, we are also introducing new foundation models and capabilities across our Google Cloud AI products. And to continue to enable and inspire more customers and partners, we are opening up generative AI support in Vertex AI and expanding access to many of these new innovations to more organizations.

New foundation models are now available in Vertex AI. Codey, our code generation foundation model, helps accelerate software development with code generation, code completion, and code chat. Imagen, our text-to-image foundation model, lets customers generate and customize studio-grade images. And Chirp, our state-of-the-art speech model, allows customers to more deeply engage with their customers and constituents inclusively in their native languages with captioning and voice assistance. They can each be accessed via APIs, tuned through our intuitive Generative AI Studio, and feature enterprise-grade security and reliability, including encryption, access control, content moderation, and recitation capabilities that let organizations see the sources behind model outputs.
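As a rough sketch of what API access looks like, the request below calls a Vertex AI text model through the REST predict endpoint. The project ID, region, prompt, and parameter values are illustrative placeholders, and the model name reflects the API surface as of this writing:

```shell
# Sketch: call a Vertex AI foundation model via the predict endpoint.
# PROJECT_ID and the prompt are placeholders; authentication uses your
# active gcloud credentials.
curl -X POST \
  -H "Authorization: Bearer $(gcloud auth print-access-token)" \
  -H "Content-Type: application/json" \
  "https://us-central1-aiplatform.googleapis.com/v1/projects/PROJECT_ID/locations/us-central1/publishers/google/models/text-bison:predict" \
  -d '{
    "instances": [{"prompt": "Write a release note for a new baggage tracker feature."}],
    "parameters": {"temperature": 0.2, "maxOutputTokens": 256}
  }'
```

The response contains the generated text in the predictions field, along with safety and citation metadata.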
Text Embeddings API is a new API endpoint that lets developers build recommendation engines, classifiers, question-answering systems, similarity matching, and other sophisticated applications based on semantic understanding of text or images. Reinforcement Learning from Human Feedback (RLHF) allows organizations to incorporate human feedback to deeply customize and improve model performance.

Underpinning all of these innovations is our AI-optimized infrastructure. We provide the widest choice of compute options among leading cloud providers and are excited to continue to build them out with the introduction of new A3 Virtual Machines based on NVIDIA's H100 GPU. These VMs, alongside the recently announced G2 VMs, offer a comprehensive range of GPU power for training and serving AI models.

Extending generative AI across Google Workspace

Earlier this year, we shared our vision for bringing generative AI to Workspace, and gave many users early access to features that helped them write in Gmail and Google Docs. Today, we are excited to announce Duet AI for Google Workspace, which brings together our powerful generative AI features and lets users collaborate with AI so they can get more done every day. We're delivering the following features to trusted testers via Workspace Labs:

- In Gmail, we're adding the ability to draft responses that consider the context of your existing email thread, and making the experience available on mobile.
- In Google Slides and Meet, we're enabling you to easily generate images from text descriptions. Custom images in slides can help bring your story to life, and in Meet they can be used to create custom backgrounds.
- In Google Sheets, we're automating data classification and the creation of custom plans, helping you analyze and organize data faster than ever.
Moving the industry forward, responsibly

Customers continue to amaze us with their ideas and creativity, and we look forward to continuing to help them discover their own paths forward with generative AI. While the potential for business impact is great, we remain committed to taking a responsible approach, guided by our AI Principles. As we gather more feedback from our customers and users, we will continue to bring new innovations to market, with the goal of enabling organizations of every size and industry to increase efficiency, connect with customers in new ways, and unlock entirely new revenue streams.
Source: Google Cloud Platform

Dialing up the impact of digital natives in the MENA region with Google Cloud

Entrepreneurs with the passion to drive positive impact have been choosing the Middle East and North Africa (MENA) region as the launchpad for their businesses since 2015, based on insights from Google Cloud's digital natives unit. Today, the region is taking center stage thanks to the thousands of startups, digital natives, and web3 companies thriving there. With more than 5,500 technology startups in the region, a favorable business climate, ample access to venture capital, and digital transformation a top priority on government agendas, digital natives in the region have been on the rise.

Forbes recently announced the top 50 most-funded startups in the MENA region. Collectively, these companies raised a whopping USD 3.2 billion in 2022, with startups in the region continuing to attract significant funding to date in comparison to other regions. The list also highlighted that UAE-based companies were the most represented, raising USD 964 million in total funding that year, followed by the Kingdom of Saudi Arabia (KSA), where USD 946.7 million was raised, with Egypt in third place at USD 508.5 million.

On the list, UAE-based fintech Tabby ranked second with USD 275 million in funds, and Sary, a Saudi-based online marketplace, came in seventh after securing USD 112 million. Breadfast, an on-demand supermarket and household essentials provider based in Egypt, also secured USD 26 million in funds during the same year.

Tech-enabled success for digital natives in the MENA region

The common factor between companies such as Tabby, Sary, and Breadfast is that they are all fully tech-enabled businesses running on Google Cloud.
These three companies leverage Google Cloud's scalable, secure, and reliable platform and innovative cloud solutions to create seamless experiences every day for their customers across KSA, the United Arab Emirates (UAE), Egypt, Kuwait, and Pakistan.

Tabby provides "buy now, pay later" solutions via an online application that has been built on Google Cloud from day one. Tabby has successfully grown a customer base of 2.5 million active shoppers in the region since its start in 2019, supported by the scalability of Google Cloud, which provides uninterrupted and secure financial services to customers. With an online retail boom on the horizon for the MENA region, Tabby is poised for a growth trajectory as the volume of active e-shoppers continues to rise and more markets become activated in the region's digital economy. Tabby's development team is able to stay several strides ahead of market demand by developing a seamless and innovative product that can accommodate an average of 10 million shoppers per day. By running the entire IT infrastructure on Google Cloud, the team dedicates its time and resources to what is important to the business, namely providing a product that caters to customer and market requirements, rather than exhausting resources on time-consuming tasks such as the daily management of IT assets.

Tabby also believes in the power of big data and turns to Google Cloud's data analytics solutions such as BigQuery to roll out new monetary policies for customers. Before a new credit policy is introduced to shoppers, Tabby tests its viability on BigQuery and analyzes different implementation scenarios in real time to gauge its effectiveness. This helps the team roll out policies that have been proven to be effective with shoppers.

Throughout the year, the MENA region experiences peaks in shopping cycles connected with local festivities such as the holy month of Ramadan, White Friday, and Christmas.
It is around these high-peak shopping periods that Tabby's application experiences significant spikes, as the team manages 140 million requests per day compared with 80 million requests on a regular day. Nonetheless, with the support of Google Cloud's scalable infrastructure, Tabby holds a record of zero downtime during peak periods and can scale operations successfully with low latency, ultimately locking in an excellent service for customers.

"From the first day Tabby went live in 2019 to date, we have experienced zero downtime in our systems during high-traffic periods because of Google Cloud's scalable and flexible infrastructure. We are able to support 2.5 million shoppers across the Middle East because we run on a robust and reliable infrastructure. Scalability is key for the team at Tabby. We are able to build new products very quickly on Google Cloud in comparison to other cloud providers."

A report by eCommerce DB revealed that Saudi Arabia is the 27th-largest market globally for e-commerce, with a projected revenue of USD 11,977 million by the end of 2023. Mordor Intelligence also revealed that the Saudi e-commerce market is expected to show a compound annual growth rate (CAGR 2023-2027) of 13.9%, resulting in a projected market volume of USD 20,155.8 million by 2027. Enter Sary, a Saudi-based B2B marketplace that connects businesses of all sizes to millions of shoppers in Saudi Arabia, Egypt, and Pakistan via mobile and web applications.
Sary is not a typical marketplace: it aims to support local businesses and empower homegrown names to reach customers at scale via its platform in the countries where it operates.

Sary is home to 70,000 businesses from all walks of life, and as the company set out to expand its footprint, it was time to move away from an unsophisticated cloud setup to a more advanced and robust cloud provider offering the security and scalability to support its plans to tap into new markets.

Sary attributes a big part of its success to running a robust infrastructure on Google Cloud, having witnessed an 84% increase in operational system throughput since migrating its entire IT infrastructure. This means that businesses relying on the platform as their main marketplace are able to process orders at scale without any downtime or system interruptions, and to generate positive revenue streams. Sary also leverages Google Kubernetes Engine (GKE) to automatically scale system bandwidth based on the volume of traffic the website or application receives. This solution helps the company manage IT costs effectively while still delivering an uncompromised service to customers.

"The support we receive every day from the Google Cloud team has been phenomenal. They have been with us every step of the way. We are able to free up time to focus on what is important, and that is to deliver business value to our customers who depend on Sary for their success."

Egypt is another country that has risen as a strategic player in the MENA digital natives scene over recent years. The 2022 Egypt Venture Investment Report revealed that the startup ecosystem observed a 168% year-on-year increase in capital investments to reach a new all-time-high record of USD 491 million. Breadfast is one of the companies disrupting the scene in Egypt as an early adopter of a cloud-native supply chain, before the arrival of rapid online grocery delivery companies in the country.
Now a household name, Breadfast is a cloud-native on-demand supermarket and household essentials provider that delivers to over 200,000 homes in Cairo. The team at Breadfast built a fully tech-enabled business across all operational touchpoints, comprising manufacturing facilities, supply fulfillment points, 30 dark stores, 15 specialized coffee outlets and last-mile delivery. Running a tech-driven business generates additional costs that can be optimized by working with a cloud provider. Since Breadfast migrated its entire IT infrastructure to Google Cloud in 2022, the company has become more profitable, as operating costs were reduced by 35% while system throughput improved with the support of Google Cloud’s scalable and secure infrastructure. To fulfill its brand promise of product delivery within 60 minutes anywhere in Cairo, Breadfast also turns to Google Cloud’s resilient infrastructure, which delivers efficient operational throughput to ensure no interruptions affect server availability or impact order-processing timelines. Breadfast increased system uptime to 99.5% since it migrated to Google Cloud, and was able to deliver six million orders across the city within a span of 30 minutes in 2022. “In our line of business, time is of the essence. Two minutes of downtime in our systems takes 12 hours to fix on ground, which can have a downward impact on our customers. We decided to migrate our IT infrastructure to Google Cloud as the trusted cloud provider because of its resilience, and our operational uptime is now at 99.5% ever since we made the move. This enabled Breadfast to deliver millions of orders in 2022.”

Build your business with Google Cloud
Google Cloud opened up its secure and scalable infrastructure to businesses in the Middle East and North Africa region, where artificial intelligence (AI) and machine learning (ML) are embedded in cloud solutions that bring meaning to data and can help automate almost everything.
Google Cloud also provides digital natives with the freedom to run applications where they need them with open, hybrid, and multi-cloud solutions. This way, an application is built once and can run anywhere, even on-premises. With no configuration required, digital natives can access data at scale with Google Cloud solutions such as BigQuery and Looker. These data analytics solutions act as a single source of truth, relying on AI and ML to provide a deep understanding of customer data. Powered by a data-driven understanding of customers, businesses today can preempt customer trends and bring them the right products and solutions based on their needs. Businesses can also accurately track granular information, such as whether a driver delivered an order on time, or which item needs to be restocked in a warehouse. Google Cloud provides data loss prevention solutions that help digital natives protect critical data like customer information and financial records. Businesses can also discover, classify and protect their most sensitive data, and detect customer churn or fraudulent activity using machine learning capabilities embedded in BigQuery. To help entrepreneurs in the MENA region supercharge business growth, Google Cloud runs the Google for Startups Cloud Program, which offers access to startup experts, cloud cost coverage of up to USD 100,000 for each of the first two years, technical training, business support, and Google-wide offers. Sign up here for the program. Note: All customer metrics featured in this blog post were derived from direct customer interviews with Google Cloud.
Quelle: Google Cloud Platform

Committed use discounts for RHEL and RHEL for SAP now available on Compute Engine

Optimizing your costs is a major priority for us here at Google Cloud. We are pleased to announce the general availability of committed use discounts (“CUDs”) for Red Hat Enterprise Linux and Red Hat Enterprise Linux for SAP. If you run consistent and predictable workloads on Compute Engine, you can use CUDs to save as much as 24% on Red Hat Enterprise Linux subscription costs compared to on-demand (or “PAYG”) prices. “Red Hat Enterprise Linux on Google Cloud provides a consistent foundation for hybrid cloud environments and a reliable, high-performance operating environment for applications and cloud infrastructure. The introduction of committed use discounts for Red Hat Enterprise Linux for Google Cloud makes it even easier for customers to deploy on the world’s leading enterprise Linux platform to unlock greater business value in the cloud.” — Gunnar Hellekson, Vice President and General Manager, Red Hat Enterprise Linux Business Unit, Red Hat

What are committed use discounts for Red Hat Enterprise Linux?
Red Hat Enterprise Linux and Red Hat Enterprise Linux for SAP committed use discounts (collectively referred to as “Red Hat Enterprise Linux CUDs”) are resource-based commitments available for purchase in one- or three-year terms. When you purchase Red Hat Enterprise Linux CUDs, you commit to paying the monthly Red Hat Enterprise Linux subscription fees for the term you select and the number of licenses you specify, regardless of your actual usage. In exchange, you can save as much as 24% on Red Hat Enterprise Linux subscription costs compared to on-demand rates. Because you are billed monthly regardless of actual Red Hat Enterprise Linux usage, CUDs are ideal for predictable, steady-state usage, maximizing your savings and making budget planning easier.
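Since a commitment is billed for every month of the term regardless of usage, whether it saves money reduces to simple arithmetic: the CUD wins once the instance runs for more than (1 - discount) of the term. Here is a minimal sketch of that logic, using an illustrative placeholder hourly rate rather than actual list prices:

```python
# When does a license CUD beat on-demand? On-demand cost scales with the
# fraction of the term the instance actually runs; a CUD costs a fixed
# (1 - discount) of the full-time on-demand price, regardless of usage.

HOURS_PER_MONTH = 730  # the convention used in this article's calculations

def on_demand_cost(hourly_rate: float, utilization: float, months: int) -> float:
    """Pay-as-you-go cost; utilization is the fraction of time running."""
    return hourly_rate * utilization * HOURS_PER_MONTH * months

def cud_cost(hourly_rate: float, discount: float, months: int) -> float:
    """Commitment cost: billed for every hour of the term at the discounted rate."""
    return hourly_rate * (1.0 - discount) * HOURS_PER_MONTH * months

def breakeven_utilization(discount: float) -> float:
    """Utilization above which the CUD is cheaper than on-demand."""
    return 1.0 - discount

# Figures from the article: ~20% off for one-year, ~24% off for three-year terms.
rate = 0.10  # hypothetical $0.10/hr license rate, a placeholder only
print(breakeven_utilization(0.20))  # 1-year term: run >= ~80% of the time
print(breakeven_utilization(0.24))  # 3-year term: run >= ~76% of the time
print(cud_cost(rate, 0.20, 12) < on_demand_cost(rate, 0.90, 12))
```

This reproduces the break-even figures quoted later in the article: a 20% discount implies saving once utilization exceeds ~80%, and a 24% discount implies ~76%.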
How do committed use discounts work for Red Hat Enterprise Linux?
Red Hat Enterprise Linux CUDs are project- and region-specific, similar to the other software license CUDs available today. This means you need to purchase Red Hat Enterprise Linux CUDs in the same region and project as the instances consuming these subscriptions. After you purchase Red Hat Enterprise Linux CUDs, discounts automatically apply to any running virtual machine (VM) instances within the selected project in the specified region. If you have multiple projects under the same billing account, commitments can also be shared across projects by turning on billing account sharing. When commitments expire, your running VMs continue to run at on-demand rates. It is important to note that after you purchase a commitment, you cannot edit or cancel it. You must pay the agreed-upon monthly amount for the duration of the commitment. Refer to Purchasing commitments for licenses for more information.

How much can I save by using committed use discounts for Red Hat Enterprise Linux?
By purchasing Red Hat Enterprise Linux CUDs, you can save as much as 20% with one-year commitments and up to 24% with three-year commitments compared to current on-demand prices. However, remember that with CUDs, you will be charged the monthly subscription fees regardless of your actual Red Hat Enterprise Linux usage. Therefore, to maximize your discounts, we recommend purchasing CUDs for steady and predictable workloads. Here is a comparison of the maximum discounts possible with CUDs versus the corresponding on-demand prices: Prices as of this article’s publish date. Hourly costs are approximate. Calculations are based on the full CUD prices, assuming VMs running 730 hours per month, 12 months per year.
Discounts compared to current on-demand pricing, rounded to the nearest whole number. Based on our research, CUDs are a good fit for many Red Hat Enterprise Linux VMs, the majority of which run 24/7 workloads. When evaluating whether purchasing a Red Hat Enterprise Linux CUD is a good choice for you, consider the following: based on list prices for a one-year term, Red Hat Enterprise Linux CUDs can help you save on subscription costs if you use a Red Hat Enterprise Linux instance for ~80% or more of the time within the one-year CUD term. For a three-year Red Hat Enterprise Linux CUD, you start saving when a Red Hat Enterprise Linux instance runs for ~76% or more of the time. Additionally, remember that Red Hat Enterprise Linux CUDs automatically apply to all running VM instances within the same region and project. (However, one Red Hat Enterprise Linux CUD can only be applied to one VM instance at a time.) *Savings are estimates only. This analysis assumes only one Red Hat Enterprise Linux (large) instance running under the CUD project and region.

What if I need to upgrade my Red Hat Enterprise Linux version after purchasing a commitment?
Red Hat Enterprise Linux CUDs are version-agnostic and are not affected when you perform operating system (OS) upgrades or downgrades. For example, if you purchased a commitment for Red Hat Enterprise Linux 7, you may upgrade to Red Hat Enterprise Linux 8 and continue to use the same commitment without any action on your end. Additionally, commitments are not affected by future changes to on-demand prices for Compute Engine resources.

How can I purchase committed use discounts for Red Hat Enterprise Linux?
The easiest way to purchase Red Hat Enterprise Linux CUDs is through the Google Cloud console. In the Google Cloud console, go to the Committed Use Discounts page. Click Purchase commitment to purchase a new commitment. Click New license committed use discount to purchase a new license commitment.
Name your commitment and choose the region where you want it to apply. Choose the duration of the commitment, either one or three years. Choose a License family. Choose the License type. Choose the Number of licenses. Click Purchase. You can also purchase Red Hat Enterprise Linux commitments using the Google Cloud CLI or the Compute Engine API. For more information, refer to Purchasing commitments for licenses. We hope this helps you find the most cost-effective plan for your Red Hat Enterprise Linux deployment needs.
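As a hedged sketch of the API route mentioned above, the Compute Engine regionCommitments.insert method accepts a commitment body along the lines of the one built below. The license URL, project, region, and name here are hypothetical placeholders, and the actual insert call (commented out) would require the google-api-python-client library and valid credentials; verify field names against the Compute Engine API reference before use.

```python
# Sketch of a license-commitment request body for the Compute Engine
# regionCommitments.insert API. All identifiers below are hypothetical
# placeholders; check the "Purchasing commitments for licenses" docs
# for the license URLs and field values valid in your project.

def license_commitment_body(name: str, plan: str, license_url: str,
                            license_count: int, cores_per_license: str) -> dict:
    if plan not in ("TWELVE_MONTH", "THIRTY_SIX_MONTH"):
        raise ValueError("plan must be a one-year or three-year term")
    return {
        "name": name,
        "plan": plan,                      # commitment duration
        "category": "LICENSE",             # a license CUD, not a machine CUD
        "licenseResource": {
            "license": license_url,
            "amount": str(license_count),  # number of licenses
            "coresPerLicense": cores_per_license,
        },
    }

body = license_commitment_body(
    name="my-rhel-commitment",  # hypothetical
    plan="TWELVE_MONTH",
    license_url="projects/rhel-cloud/global/licenses/rhel-8-server",  # hypothetical
    license_count=10,
    cores_per_license="1-2",
)

# Actual purchase (not executed here; requires google-api-python-client):
# compute = googleapiclient.discovery.build("compute", "v1")
# compute.regionCommitments().insert(
#     project="my-project", region="us-central1", body=body).execute()
print(body["licenseResource"]["amount"])
```

Remember that, as described above, the purchase is irreversible once the insert succeeds.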

Introducing BigQuery Partner Center — a new way to discover and leverage integrated partner solutions

At Google, we are committed to building the most open and extensible Data Cloud. We want to provide our customers with more flexibility, interoperability and agility when building analytics solutions using BigQuery and tightly integrated partner products. We have therefore significantly expanded our Data Cloud partner ecosystem, and are increasing our investment in technology partners in a number of new areas. At the Google Data Cloud & AI Summit in March, we introduced BigQuery Partner Center, a new user interface in the Google Cloud console that enables our customers to easily discover, try, purchase and use a diverse range of partner products that have been validated through the Google Cloud Ready – BigQuery program. Google Cloud Ready – BigQuery is a program in which Google Cloud engineering teams evaluate and validate BigQuery partner integrations and connectors using a series of tests and benchmarks based on standards set by Google, so customers can be assured of the quality of these integrations when using partner products with BigQuery.
These validated partners and solutions are now accessible directly from BigQuery Partner Center.

Navigating in BigQuery Partner Center
Customers can start exploring BigQuery Partner Center by launching the BigQuery Cloud Console. A video demo of how to discover and install a free trial of Confluent Cloud from the BigQuery Partner Center.
Discover: In the Partner Center, you can find a list of validated partners organized in the following categories: BI, ML, and Advanced Analytics; Connectors & Development Tools; Data Governance, Master Data Management; Data Quality & Observability; ETL & Data Integration.
Try: You have the option to try out a product by signing up for a free trial version offered by the partner.
Buy: If you choose to purchase any of the partner products, you can do so directly from Google Cloud Marketplace by clicking on the Marketplace hyperlink tag.
Here’s an overview of how you can discover and use some of BigQuery’s partner solutions. Confluent Cloud is now available in BigQuery Partner Center to help customers easily connect and create Confluent streaming data pipelines into BigQuery, and extract real-time insights to support proactive decision-making while offloading the operational burdens associated with managing open-source Kafka. Fivetran offers a trial experience through the BigQuery Partner Center, which allows customers to replicate data from key applications, event streams, and file stores to BigQuery continuously. Moreover, customers can actively monitor their connectors’ performance and health using logs and metrics provided through Google Cloud Monitoring. Neo4j provides an integration through BigQuery Partner Center that allows users to extend SQL analysis with graph-native data science and machine learning by working seamlessly between BigQuery and Neo4j Graph Data Science, whether using BigQuery SQL or notebooks.
Data science teams can now improve and enrich existing analysis and ML using the graph-native data science capabilities within Neo4j by running in-memory graph analysis directly from BigQuery.

Expanding the partner ecosystem through Google Cloud Ready – BigQuery
We are also excited to share that, since we introduced the Google Cloud Ready – BigQuery initiative last year, we have recognized over 50 technology partners that have successfully met a core set of integration requirements with BigQuery. To unlock more use cases that are critical to customers’ data-driven transformation journeys, Google Cloud engineering teams worked closely with partners across many categories to test compatibility, tune functionality, and optimize integrations to ensure our customers have the best experience when using these partner products with BigQuery. For example, in the Data Quality & Observability category, we have most recently validated products from Anomalo, Datadog, Dynatrace, Monte Carlo and New Relic to enable better detection and remediation of data quality issues. In the Reverse ETL & Master Data Management category, we worked with Hightouch and Tamr to expand data management use cases for data cleansing, preparation, enrichment and data synchronization from BigQuery back to SaaS-based applications. Data Governance and Security partners like Immuta and Privacera provide enhanced data access controls and management capabilities for BigQuery, while Carto offers advanced geospatial and location intelligence capabilities that are well integrated with BigQuery. We also continue to expand partnerships in key categories such as Advanced Analytics and Data Integration with industry-leading partners like Starburst, Sisense, Hex, and Hevo Data to ensure our customers have flexibility and options in choosing the right partner products to meet their business needs. With the general availability of BigQuery Partner Center, customers can now conveniently discover, try out and install a growing
list of Google Cloud Ready – BigQuery validated partners directly from the BigQuery Cloud Console.

Getting started
To explore the new Partner Center, launch the BigQuery Cloud Console. To see a full list of partner solutions and connectors that have been validated to work well with BigQuery, visit here. To learn more about the Google Cloud Ready – BigQuery validation program, visit our documentation page. If you are a partner interested in becoming “Google Cloud Ready” for BigQuery, please fill out this intake form. If you have any questions, feel free to contact us.

Bringing our world-class expertise together under Google Cloud Consulting

Every day, we see how much our customers value Google Cloud experts working alongside their teams to drive innovation. We also know that being connected to the right services and partners at the right time accelerates customer success. Last year, we expanded our custom AI solution practice and launched our Global Delivery Center to deliver deep product expertise at a global scale. Today, we’re excited to announce the next step on our journey to bring all our services together with the launch of Google Cloud Consulting and our unified services portfolio at cloud.google.com/consulting. The Google Cloud Consulting portfolio brings together offerings across multiple specializations into a single place, including services spanning learning, technical account management, professional services and customer success. Through this single portfolio, you’ll have access to detailed descriptions of the various services, with examples of how you can leverage them to solve specific business challenges. This makes it easy to identify the right package of services for your business and helps ensure you get the most out of your investment. At Google Cloud, we always work closely with our ecosystem of partners to deliver innovation and value to our customers, and Google Cloud Consulting further reinforces our commitment to being partner-first. By bringing together capabilities across the customer lifecycle, from onboarding to enablement to co-delivery and assurance, this unified portfolio makes it simpler for partners to work with Google Cloud Consulting and helps drive the best outcomes for customers. “Our partnership with Google Cloud Consulting is helping us to grow our Google Cloud practice globally and accelerate our customers’ adoption of the platform.
We are pushing the bounds of innovation together as the AI wave approaches,” said Ankur Kashyap, SVP and Global Head of Google Ecosystem Unit, HCLTech. Broadcom, a provider of enterprise security solutions, recently worked with Google Cloud Consulting to migrate its infrastructure from Amazon Web Services (AWS), and found the combination of technology and expertise critical for success. “Google’s deep technical skills and its data, security and AI offerings have accelerated our transformation towards becoming a software-led company,” said Andy Nallappan, Vice President, CTO and CSO, Broadcom. Kroger, the American retailer, worked with Google Cloud Consulting and Deloitte to accelerate its technical objectives. “Google Cloud Consulting and Deloitte brought us a technology architecture and application framework that we could implement in record time. We’re already seeing results across our stores, with associate tasks being optimized and overall productivity increasing,” said Jim Clendenen, VP, Enterprise Retail Systems, Kroger. Whether you’re just getting started in the cloud or seeking new ways to innovate, our portfolio of offerings is built to help you:
Leverage Google Cloud professional service engineers and consultants to kickstart your cloud journey, from testing, planning and executing migrations to optimizing your operations.
Work alongside our partners to provide expertise and assurance services.
Benefit from access to cutting-edge tools, including best-in-class Artificial Intelligence and Machine Learning (AI/ML) solutions, data resources, and security services that you can use to build robust data platforms and protect your business from security threats.
Receive bespoke, hands-on guidance from our Technical Account Managers, who build familiarity with your applications, systems, and business goals, and proactively advise on and accelerate your digital transformation.
Train and certify your teams in Google Cloud with a range of learning services that can boost your long-term self-sufficiency and help you foster a culture of innovation.
These end-to-end capabilities are designed to meet you wherever you are in your cloud journey, so you can both build your business in the cloud and make digital breakthroughs safely. At Google Cloud, we’re committed to providing technology and services to help you grow and succeed. From developing innovative solutions, to pioneering with generative AI, to securely managing your data in the cloud, to transforming user experiences, we’re with you at all the key moments of your cloud journey. As we look forward to 2023, we’ll continue to expand the service catalog, focus on making it even easier to find and transact these services, and further streamline the experience of engaging with our services. Click here to see Google Cloud Consulting’s full portfolio of offerings.

Accelerate time to value with Google’s Data Cloud for your industry

Many data analytics practitioners today are interested in ways to accelerate new scenarios and use cases that enable business outcomes and competitive advantage. As many enterprises look at rationalizing their data investments and modernizing their data analytics platform strategies, the prospect of migrating to a new cloud-first data platform like Google’s Data Cloud can be perceived as a risky and daunting task, not to mention the expense of the transition, from redesigning and remodeling legacy data models in traditional data warehouse platforms to refactoring analytics dashboards and reporting for end users. The time and cost of this transition are not trivial, and many enterprises are looking for ways to deliver innovation at cloud speed without the time and costs of traditional replatforming, where millions are spent on this type of transition. When access to all data within the enterprise and beyond is the future, it’s a big problem if you can’t leverage all of your data for its insights at cloud scale because you’re stuck in technologies and approaches that aren’t designed to match your unique industry requirements. So, what is out there to address these challenges? Google’s Data Cloud for industries combines pre-built industry content, ecosystem integrations, and solution frameworks to accelerate your time to value. Google has developed a set of solutions and frameworks to address these issues as part of its latest offering, Google Cloud Cortex Framework, which is part of Google’s Data Cloud. Customers like Camanchaca accelerated build time for analytical models by 6x, integrated Cortex content for improved supply chain and sustainability insights, and saved 12,000 hours by deploying 60 data models in less than six months.

Accelerating time to value with Google Cloud Cortex Framework
Cortex Framework provides accelerators to simplify your cloud transition and data analytics journey in your industry.
This blog explores the essentials you need to know about Cortex and how you can adopt and leverage its content to rapidly on-ramp your enterprise data from key applications such as SAP and Salesforce, along with data from Google, third-party, public and community data sets. Cortex is available today, and it allows enterprises to accelerate time to value by providing endorsed connectors delivered by Google and our partners, reference architectures, ready-to-use data models and templates for BigQuery, Vertex AI examples, and an application layer that includes microservices templates for data sharing with BigQuery that developers can easily deploy, enhance, and make their own, depending on the scope of their data analytics project or use case. Cortex content helps you get there faster, with less time and complexity to implement. Let’s now explore some details of Cortex and how you can best take advantage of it with Google’s Data Cloud. First, Cortex is both a framework for data analytics and a set of deployable accelerators; the image below provides an overview of the essentials of Cortex Framework, focusing on key areas of endorsed connectors, reference architectures, deployment templates, and innovative solution accelerators delivered by Google and our partners. We’ll explore each of these focus areas of Cortex in greater depth below.

Why Cortex Framework?
Leading connectors: First, Cortex provides leading connectors delivered by Google and our partners. These connectors have been tested and validated to provide interoperability with Cortex data models in BigQuery, Google’s cloud-scale enterprise data warehouse. By taking the guesswork out of selecting which tooling integrates Cortex with BigQuery, we take the time, effort, and cost out of evaluating the various tooling available in the market.
Deployment accelerators: Cortex provides a set of predefined, deployable templates and content for enterprise use cases with SAP and Salesforce, including BigQuery data models, Looker dashboards, Vertex AI examples, and microservices templates for synchronous and asynchronous data sharing with surrounding applications. These accelerators are available free of charge today via Cortex Foundation and can easily be deployed in hours. The figure below provides an overview of Cortex Foundation and the focus areas for templates and content available today.
Reference architectures: Cortex provides reference architectures for integrating with leading enterprise applications such as SAP and Salesforce, as well as Google and third-party data sets and data providers. Reference architectures include blueprints for integration and deployment with BigQuery that are based on best practices for integration with Google’s Data Cloud and partner solutions, drawn from real-world deployments. Examples include best practices and reference architectures for CDC (Change Data Capture) processing and for BigQuery architecture and deployment. The image below shows an example of reference architectures based on Cortex published best practices and options for CDC processing with Salesforce. You can take advantage of reference architectures such as this one today and benefit from these best practices to reduce the time, effort and cost of implementation, based on what has worked in real-world customer deployments.
Innovative solutions: Cortex Foundation includes support for various use cases and insights across a variety of data sources.
For example, Cortex Demand Sensing is a solution accelerator that leverages Google Cloud Cortex Framework to deliver accelerated value to Consumer Packaged Goods (CPG) customers looking to infuse innovation into their Supply Chain Management and Demand Forecasting processes. An accurate forecast is critical to reducing costs and maximizing profitability. One gap for many CPG organizations is a near-term forecast that leverages all of the available information from various internal and external data sources to predict near-term changes in demand. As an enhanced view of demand materializes, CPG companies also need to manage and match demand and supply to identify near-term changes in demand and their root cause, and then shape supply and demand to improve SLAs and increase profitability. Our approach for Demand Sensing, shown below, integrates SAP ERP and other data sets (e.g., weather trends, demand plans) together with our Data Cloud solutions like BigQuery, Vertex AI and Looker to deliver extended insights and value to demand planners, improving the accuracy of demand predictions and helping to defer cost and drive new revenue opportunities.

The ecosystem advantage
Building an ecosystem means connections with a diverse set of partners that accelerate your time to value. Google Cloud is excited to announce a range of new partner innovations that bring you more choice and optionality. Over 900 partners put their trust in BigQuery and Vertex AI to power their business as part of the “Built with” Google Cloud initiative. These partners build their business on top of our data platform, enabling them to scale both their technology and their business at high performance. In addition, more than 50 data platform partners offer fully validated integrations through our Google Cloud Ready – BigQuery initiative.
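As a purely illustrative aside (this is not Cortex code), the demand-sensing idea described above, adjusting a baseline demand plan with external signals such as weather, can be sketched in a few lines; the signal names and weights below are hypothetical:

```python
# Toy demand sensing: multiply a baseline demand plan by weighted external
# signal adjustments. Signal names and weights are hypothetical; a real
# Cortex-based solution would derive these with BigQuery and Vertex AI.

def sense_demand(baseline: list, signals: dict, weights: dict) -> list:
    """Adjust each period of the baseline plan by every weighted signal."""
    adjusted = list(baseline)
    for name, series in signals.items():
        weight = weights.get(name, 0.0)
        adjusted = [demand * (1.0 + weight * signal)
                    for demand, signal in zip(adjusted, series)]
    return adjusted

baseline_plan = [100.0, 100.0, 120.0]          # units per day, from the demand plan
signals = {"heatwave_index": [0.0, 0.5, 1.0]}  # 0 = normal weather, 1 = extreme
sensed = sense_demand(baseline_plan, signals, {"heatwave_index": 0.2})
print(sensed)  # near-term demand is lifted where the heatwave hits
```

The multiplicative form keeps the baseline plan authoritative while letting each external signal nudge the near-term periods up or down.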
A look ahead
Our solutions roadmap targets expanding Cortex Foundation templates and content to additional solutions in sales and marketing and supply chain, and expanding the use cases and models for finance. You will also see significant expansion of predefined BigQuery data models and content for Google Ads, Google Marketing Platform, and other cross-media platforms and applications, along with improvements to the deployment experience and expansion into analytical accelerators that span data sets and industries. If you would like to connect with us to learn more about what we are working on and our roadmap, we’re happy to engage with you! Please feel free to contact us at cortex-framework@google.com to learn more about the work we are doing and how we might help with your specific use cases or project. We’d love to hear from you!

Ready to start your journey?
With Cortex Framework, you can be among the first to benefit from our open-source Data Foundation solution content and packaged industry solution content, available on Google Cloud Marketplace and Looker. The Cortex content is available free of charge, so you can easily get started on your Google Data Cloud journey today! Learn more about Google Cloud Cortex Framework and how you can accelerate business outcomes with less risk, complexity and cost. Cortex will help you get there faster with your enterprise data sources and establish a cloud-first data foundation with Google’s Data Cloud. Join the Data Cloud Summit to learn how customers like Richemont & Cartier use Cortex Framework to speed up time to value.

Pub/Sub schema evolution is now GA

Pub/Sub schemas are designed to allow safe, structured communication between publishers and subscribers. In particular, the use of schemas guarantees that any message published adheres to a schema and encoding, which the subscriber can rely on when reading the data. Schemas tend to evolve over time. For example, a retailer capturing web events and sending them to Pub/Sub for downstream analytics with BigQuery may need to add fields to the schema and propagate them through Pub/Sub. Until now, Pub/Sub has not allowed the schema associated with a topic to be altered; instead, customers had to create new topics. That limitation changes today: the Pub/Sub team is excited to introduce schema evolution, designed to allow the safe and convenient update of schemas with zero downtime for publishers or subscribers.

Schema revisions
A new revision of a schema can now be created by updating an existing schema. Most often, schema updates only include adding or removing optional fields, which is considered a compatible change. All the revisions of a schema are available on the schema details page. You can delete one or more schema revisions from a schema; however, you cannot delete a revision if the schema has only one revision. You can also quickly compare two revisions using the view diff functionality.

Topic changes
Currently, you can attach an existing schema, or create a new one, to be associated with a topic so that all messages published to the topic are validated against the schema by Pub/Sub. With the schema evolution capability, you can now update a topic to specify a range of schema revisions against which Pub/Sub will try to validate messages, starting with the last revision and working towards the first.
If first-revision is not specified, any revision <= last revision is allowed, and if last-revision is not specified, then any revision >= first revision is allowed.

Schema evolution example
Let’s take a look at a typical way schema evolution may be used. You have a topic T that has a schema S associated with it. Publishers publish to the topic and subscribers subscribe to a subscription on the topic. Now you wish to add a new field to the schema and you want publishers to start including that field in messages. As the topic and schema owner, you may not necessarily have control over updates to all of the subscribers, nor the schedule on which they get updated. You may also not be able to update all of your publishers simultaneously to publish messages with the new schema. You want to update the schema and allow publishers and subscribers to be updated at their own pace to take advantage of the new field. With schema evolution, you can perform the following steps to ensure a zero-downtime update to add the new field:
1. Create a new schema revision that adds the field.
2. Ensure the new revision is included in the range of revisions accepted by the topic.
3. Update publishers to publish with the new schema revision.
4. Update subscribers to accept messages with the new schema revision.
Steps 3 and 4 can be interchanged, since all schema updates ensure backwards and forwards compatibility. Once your migration to the new schema revision is complete, you may choose to update the topic to exclude the original revision, ensuring that publishers only use the new schema. These steps work for both protocol buffer and Avro schemas. However, some extra care needs to be taken when using Avro schemas. Your subscriber likely has a version of the schema compiled into it (the “reader” schema), but messages must be parsed with the schema that was used to encode them (the “writer” schema). Avro defines the rules for translating from the writer schema to the reader schema.
Pub/Sub only allows schema revisions where both the new schema and the old schema could be used as the reader or writer schema. However, you may still need to fetch the writer schema from Pub/Sub using the attributes passed in to identify the schema, and then parse using both the reader and writer schemas. Our documentation provides examples of the best way to do this.

BigQuery subscriptions
Pub/Sub schema evolution is also powerful when combined with BigQuery subscriptions, which allow you to write messages published to Pub/Sub directly to BigQuery. When using the topic schema to write data, Pub/Sub ensures that at least one of the revisions associated with the topic is compatible with the BigQuery table. If you want to update your messages to add a new field that should be written to BigQuery, do the following:
1. Add the OPTIONAL field to the BigQuery table schema.
2. Add the field to your Pub/Sub schema.
3. Ensure the new revision is included in the range of revisions accepted by the topic.
4. Start publishing messages with the new schema revision.
With these simple steps, you can evolve the data written to BigQuery as your needs change.

Quotas and limits
The schema evolution feature comes with the following limits:
20 revisions per schema name are allowed at any time.
Each individual schema revision does not count against the maximum of 10,000 schemas per project.

Additional resources
Please check out these additional resources to explore this feature further:
Documentation
Client libraries
Samples
Quotas
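To make the topic revision-range behavior described above concrete, here is a small illustrative check. This is not Pub/Sub's implementation; real revision IDs are opaque strings, modeled here as ordered integers for simplicity:

```python
# Sketch of a topic's schema revision-range check: a message's schema
# revision is accepted when it lies within [first_revision, last_revision];
# leaving a bound unset leaves that side of the range open.
from typing import Optional

def revision_accepted(revision: int,
                      first: Optional[int] = None,
                      last: Optional[int] = None) -> bool:
    """Return True if the revision falls inside the topic's accepted range."""
    if first is not None and revision < first:
        return False
    if last is not None and revision > last:
        return False
    return True

# Topic pinned to revisions 2..3: publishers still on revision 1 are rejected.
print(revision_accepted(2, first=2, last=3))  # True
print(revision_accepted(1, first=2, last=3))  # False
# If first is unset, any revision <= last is allowed.
print(revision_accepted(1, last=3))           # True
# If last is unset, any revision >= first is allowed.
print(revision_accepted(5, first=2))          # True
```

This mirrors the zero-downtime migration flow: widen the range to include the new revision, move publishers and subscribers over, then narrow the range to exclude the old revision.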