Influence the Future of WordPress by Taking the 2023 Annual Survey Today

Each year, the WordPress community—users, site builders, contributors, and more—gives valuable feedback through an annual survey. Note: this questionnaire is provided by the WordPress project, not by WordPress.com. (Learn more about that distinction here.)

The results can influence the direction of the WordPress project by identifying areas that need attention, as well as helping track trends over time. This survey helps those who build WordPress—including our engineers here at WordPress.com—understand more about how the software is used and by whom.

To ensure your WordPress experience gets represented in the 2023 survey results, take the survey now.

Take the 2023 survey

The survey will be open for five weeks. Results will be published on the WordPress News blog in early December.

Please help spread the word about the survey by sharing it with your network. The more people who complete the survey and share their experience with WordPress, the more the project will benefit.

Thank you in advance for your valuable input on the future of WordPress!

A note about security and privacy

Data security and privacy are paramount to the WordPress project and community. With this in mind, all data will be anonymized: no email addresses or IP addresses will be associated with published results. To learn more about WordPress.org’s privacy practices, view the privacy policy.
Source: RedHat Stack

Welcome to Google Cloud Next ’23

Editor’s note: Content updated at 9am PT to reflect announcements made on stage in the opening keynote at Google Cloud Next ’23.

This week, Google Cloud will welcome thousands of people to San Francisco for our first in-person Google Cloud Next event since 2019. I am incredibly excited to bring so many of our customers and partners together to showcase the amazing innovations we have been working on across our entire portfolio of Infrastructure, Data and AI, Workspace Collaboration, and Cybersecurity solutions.

It’s been an exciting year so far for Google Cloud. We’ve achieved some noteworthy milestones, including reaching a $32B annual revenue run rate in Q2 2023 and posting our second quarter of profitability, all built on the success of our customers across every industry. This year, we have shared some incredible stories about how we are working with leading organizations like Culture Amp, Deutsche Börse, eDreams ODIGEO, HSBC, IHOP, IPG Mediabrands, John Lewis Partnership, The Knot Worldwide, Macquarie Bank, Priceline, Shopify, the Singapore Government, U.S. Steel, and Wendy’s. Today, we are announcing new or expanded relationships with The Estée Lauder Companies, FOX Sports, GE Appliances, General Motors, HCA Healthcare, and more. I’d like to thank all of these customers and the millions of others around the world for trusting us as they progress on their digital transformation journeys.

Today at Google Cloud Next ’23, we’re proud to announce new ways we’re helping every business, government, and user benefit from generative AI and leading cloud technologies, including:

AI-optimized Infrastructure: The most advanced AI-optimized infrastructure for companies to train and serve models. We offer this infrastructure in our cloud regions, to run in your data centers with Google Distributed Cloud, and on the edge.
Vertex AI: Developer tools to build models and AI-powered applications, with major advancements to Vertex AI for creating custom models and building custom Search and Conversation apps with enterprise data.

Duet AI: An always-on AI collaborator that is deeply integrated in Google Workspace and Google Cloud. Duet AI in Workspace gives every user a writing helper, a spreadsheet expert, a project manager, a note taker for meetings, and a creative visual designer, and is now generally available. Duet AI in Google Cloud collaborates like an expert coder, a software reliability engineer, a database pro, an expert data analyst, and a cybersecurity adviser; it is expanding its preview and will be generally available later this year.

We are also making many more significant announcements across Developer Tools, Data, Security, Sustainability, and our fast-growing cloud ecosystem.

New infrastructure and tools to help customers

The advanced capabilities and broad applications that make gen AI so revolutionary demand the most sophisticated and capable infrastructure. We have been investing in our data centers and network for 25 years, and now have a global network of 38 cloud regions, with a goal to operate entirely on carbon-free energy 24/7 by 2030.

Our AI-optimized infrastructure is a leading choice for training and serving gen AI models. In fact, more than 70% of gen AI unicorns are Google Cloud customers, including AI21, Anthropic, Cohere, Jasper, MosaicML, Replit, Runway, and Typeface; and more than half of all funded gen AI startups are Google Cloud customers, including companies like Copy.ai, CoRover, Elemental Cognition, Fiddler AI, Fireworks.ai, PromptlyAI, Quora, Synthesized, Writer, and many others.

Today we are announcing key infrastructure advancements to help customers, including:

Cloud TPU v5e: Our most cost-efficient, versatile, and scalable purpose-built AI accelerator to date.
Now, customers can use a single Cloud TPU platform to run both large-scale AI training and inference. Cloud TPU v5e scales to tens of thousands of chips and is optimized for efficiency. Compared to Cloud TPU v4, it provides up to a 2x improvement in training performance per dollar and up to a 2.5x improvement in inference performance per dollar.

A3 VMs with NVIDIA H100 GPU: Our A3 VMs powered by NVIDIA’s H100 GPU will be generally available next month. The A3 is purpose-built with high-performance networking and other advances to enable today’s most demanding gen AI and large language model (LLM) innovations, allowing organizations to achieve three times better training performance than the prior-generation A2.

GKE Enterprise: This enables the multi-cluster horizontal scaling required for the most demanding, mission-critical AI/ML workloads. Customers are already seeing productivity gains of 45%, while decreasing software deployment times by more than 70%. Starting today, the benefits that come with GKE, including autoscaling, workload orchestration, and automatic upgrades, are available with Cloud TPU v5e.

Cross-Cloud Network: A global networking platform that helps customers connect and secure applications across clouds. It is open, workload-optimized, and offers ML-powered security to deliver zero trust. Designed to give customers easier access to Google services from any cloud, Cross-Cloud Network reduces network latency by up to 35%.

Google Distributed Cloud: Designed to meet the unique demands of organizations that want to run workloads at the edge or in their own data centers. In addition to next-generation hardware and new security capabilities, we’re enhancing the GDC portfolio to bring AI to the edge, with Vertex AI integrations and a new managed offering of AlloyDB Omni on GDC Hosted.
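Claims like these are easiest to read as cost ratios. As a back-of-the-envelope illustration (our own arithmetic, not Google's pricing methodology), a given performance-per-dollar improvement shrinks the cost of a fixed workload by its reciprocal:

```python
def relative_cost(perf_per_dollar_gain: float) -> float:
    """Cost of a fixed workload relative to the old platform.

    A 2x performance-per-dollar improvement means the same job
    costs 1/2 as much; 2.5x means 1/2.5 = 40% of the old cost.
    """
    return 1.0 / perf_per_dollar_gain

# Claimed Cloud TPU v5e gains over v4, from the announcement above:
print(relative_cost(2.0))   # training at half the v4 cost
print(relative_cost(2.5))   # inference at 40% of the v4 cost
```

So an "up to 2x" training gain means a run that cost $100,000 on v4 could cost as little as $50,000 on v5e, all else being equal.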
Our Vertex AI platform gets even better

On top of our world-class infrastructure, we deliver what we believe is the most comprehensive AI platform — Vertex AI — which enables customers to build, deploy, and scale machine learning (ML) models. We have seen tremendous usage, with the number of gen AI customer projects growing more than 150 times from April to July this year.

Customers have access to more than 100 foundation models, including third-party and popular open-source versions, in our Model Garden. They are all optimized for different tasks and different sizes, including text, chat, images, speech, software code, and more. We also offer industry-specific models like Sec-PaLM 2 for cybersecurity, which empowers global security providers like Broadcom and Tenable, and Med-PaLM 2, which assists leading healthcare and life sciences companies including Bayer Pharmaceuticals, HCA Healthcare, and Meditech.

Vertex AI Search and Conversation are now generally available, enabling organizations to create Search and Chat applications using their data in just minutes, with minimal coding and enterprise-grade management and security built in. In addition, Vertex AI Generative AI Studio provides user-friendly tools to tune and customize models, all with enterprise-grade controls for data security. These include developer tools like the Text Embeddings API, which lets developers build sophisticated applications based on semantic understanding of text or images, and Reinforcement Learning from Human Feedback (RLHF), which incorporates human feedback to deeply customize and improve model performance.

Today, we’re excited to announce several new models and tools in the Vertex AI platform:

PaLM 2, Imagen, and Codey upgrades: We’re updating PaLM 2 to 32k context windows so enterprises can easily process longer-form documents like research papers and books.
We’re also improving Imagen’s visual appeal and extending support for new languages in Codey.

Tools for tuning: We’re making adapter tuning generally available for PaLM 2 and available in preview for Codey; it can help improve LLM performance with as few as 100 examples. We’re also introducing a new tuning method for Imagen, called Style Tuning, so enterprises can create images aligned to their specific brand guidelines or other creative needs with a small number of reference images.

New models: We’re announcing availability of Llama 2 and Code Llama from Meta and the Technology Innovation Institute’s Falcon LLM, a popular open-source model, as well as pre-announcing Claude 2 from Anthropic. In the case of Llama 2, we will be the only cloud provider offering both adapter tuning and RLHF.

Vertex AI extensions: Developers can access, build, and manage extensions that deliver real-time information, incorporate company data, and take action on the user’s behalf. This opens up endless new possibilities for gen AI applications that can operate as an extension of your enterprise, enabled by the ability to access proprietary information and take action on third-party platforms like your CRM system or email.

Grounding: We are announcing an enterprise grounding service that works across Vertex AI foundation models, Search, and Conversation, giving customers the ability to ground responses in their own enterprise data for more accurate answers. We are also working with a few early customers to test grounding with the technology that powers Google Search.

Digital watermarking on Vertex AI: Powered by Google DeepMind SynthID, this state-of-the-art technology embeds the watermark directly into the image’s pixels, making it invisible to the human eye and difficult to tamper with. Digital watermarking provides customers with a scalable approach to creating and identifying AI-generated images responsibly.
We are the first hyperscale cloud provider to offer this technology for AI-generated images.

Colab Enterprise: This managed service combines the ease of use of Google’s Colab notebooks with enterprise-level security and compliance capabilities. Data scientists can use Colab Enterprise to collaboratively accelerate AI workflows with access to the full range of Vertex AI platform capabilities, integration with BigQuery, and even code completion and generation.

Equally important to discovering and training the right model is controlling your data. From the beginning, we designed Vertex AI to give you full control and segregation of your data, code, and IP, with zero data leakage. When you customize and train your model with Vertex AI — with private documents and data from your SaaS applications, databases, or other proprietary sources — you are not exposing that data to the foundation model. We take a snapshot of the model, allowing you to train and encapsulate it together in a private configuration, giving you complete control over your data. Your prompts and data, as well as user inputs at inference time, are not used to improve our models and are not accessible to other customers.

Duet AI in Workspace and Google Cloud

We unveiled Duet AI at I/O in May, introducing powerful new features across Workspace and showcasing developer features such as code and chat assistance in Google Cloud. Since then, trusted testers around the world have experienced the power of Duet AI while we worked on expanding capabilities and integrating it across a wide range of products and services throughout Workspace and Google Cloud.

Let’s start with Workspace, the world’s most popular productivity tool, with more than 3 billion users and more than 10 million paying customers who rely on it every day to get things done.
With the introduction of Duet AI just a few months ago, we delivered a number of features to make your teams more productive, like helping you write and refine content in Gmail and Google Docs, create original images in Google Slides, turn ideas into action and data into insights with Google Sheets, foster more meaningful connections in Google Meet, and more. Since then, thousands of companies and more than a million trusted testers have used Duet AI as a powerful collaboration partner — a coach, source of inspiration, and productivity booster — all while helping to ensure every user and organization has control over their data. Today, we are introducing a number of new enhancements:

Duet AI in Google Meet: Duet AI will take notes during video calls, send meeting summaries, and even automatically translate captions in 18 languages. In addition, to ensure every meeting participant is clearly seen, heard, and understood, Duet AI in Meet now offers studio look, studio lighting, and studio sound.

Duet AI in Google Chat: You’ll be able to chat directly with Duet AI to ask questions about your content, get a summary of documents shared in a space, and catch up on missed conversations. We’ve also delivered a refreshed user interface, new shortcuts, and enhanced search to help you stay on top of conversations, as well as huddles in Chat, which let teams start meetings from the place where they are already collaborating.

Workspace customers of all sizes and from all industries are using Duet AI and seeing improvements in customer experience, productivity, and efficiency. Instacart is creating enhanced customer service workflows, and industrial technology company Trimble can now deliver solutions faster to its clients. Adore Me, Uniformed Services University, and Thoughtworks are increasing productivity by using Duet AI to quickly write content such as emails, campaign briefs, and project plans from a simple prompt.
Today, we are making Duet AI in Google Workspace generally available, while expanding the preview capabilities of Duet AI in Google Cloud, with general availability coming later this year. Beyond Workspace, Duet AI can now provide AI assistance across a wide range of Google Cloud products and services — as a coding assistant to help developers code faster, as an expert adviser to help operators quickly troubleshoot application and infrastructure issues, as a data analyst to provide quick and better insights, and as a security adviser to recommend best practices that help prevent cyber threats.

Customers are already realizing value from Duet AI in Google Cloud: L’Oréal is able to make better and faster business decisions from its data, and Turing, in early testing, is reporting engineering productivity gains of one-third.

Our Duet AI in Google Cloud announcements include advancements for:

Software development: Duet AI provides expert assistance across your entire software development lifecycle, enabling developers to stay in a flow state longer by minimizing context switching. In addition to code completion and code generation, it can help you modernize applications faster by assisting with code refactoring; and with Duet AI in Apigee, any developer can now easily build APIs and integrations using simple natural language prompts.

Application and infrastructure operations: Operators can chat with Duet AI in natural language across a number of services directly in the Google Cloud Console to quickly retrieve “how to” information about infrastructure configuration, deployment best practices, and expert recommendations on cost and performance optimization.
Data analytics: Duet AI in BigQuery provides contextual assistance for writing SQL queries as well as Python code, generates full functions and code blocks, auto-suggests code completions, explains SQL statements in natural language, and can generate recommendations based on your schema and metadata. These capabilities allow data teams to focus more on outcomes for the business.

Accelerating and modernizing databases: Duet AI in Cloud Spanner, AlloyDB, and Cloud SQL helps generate code to structure, modify, or query data using natural language. We’re also bringing the power of Duet AI to Database Migration Service (DMS), helping automate the conversion of database code, such as stored procedures, functions, triggers, and packages, that could not be converted with traditional translation technologies.

Security operations: We are bringing Duet AI to our security products, including Chronicle Security Operations, Mandiant Threat Intelligence, and Security Command Center, to empower security professionals to more efficiently prevent threats, reduce toil in security workflows, and uplevel security talent.

Duet AI delivers contextual recommendations from PaLM 2 LLM models and expert guidance, trained and tuned with Google Cloud-specific content such as documentation, sample code, and Google Cloud best practices. In addition, Duet AI was designed using Google’s comprehensive approach to help protect customers’ security and privacy, as well as our AI principles. With Duet AI, your data is your data. Your code, your inputs to Duet AI, and the recommendations generated by Duet AI will not be used to train any shared models nor used to develop any products.

Simplify analytics at scale with a unified data and AI foundation

Data sits at the center of gen AI, which is why we are bringing new capabilities to Google’s Data and AI Cloud that will help unlock new insights and boost productivity for data teams.
In addition to the launch of Duet AI, which assists data engineers and data analysts across BigQuery, Looker, Spanner, Dataplex, and our database migration tools, we have several other important announcements today in data and analytics:

BigQuery Studio: A single interface for data engineering, analytics, and predictive analysis, BigQuery Studio helps increase efficiency for data teams. In addition, with new integrations to Vertex AI foundation models, we are helping organizations AI-enable their data lakehouse with innovations for cross-cloud analytics, governance, and secure data sharing.

AlloyDB AI: Today we’re introducing AlloyDB AI, an integral part of AlloyDB, our PostgreSQL-compatible database service. AlloyDB AI offers an integrated set of capabilities for easily building gen AI apps, including high-performance vector queries that are up to 10x faster than standard PostgreSQL. In addition, with AlloyDB Omni, you can run AlloyDB virtually everywhere: on-premises, on Google Cloud, on AWS or Azure, or through Google Distributed Cloud.

Data Cloud partners: Our open data ecosystem is an asset for customers’ gen AI strategies, and we’re continuing to expand the breadth of partner solutions and datasets available on Google Cloud. Partners like Confluent, DataRobot, Dataiku, DataStax, Elastic, MongoDB, Neo4j, Redis, SingleStore, and Starburst are all launching new capabilities to help customers accelerate and enhance gen AI development with data. Our partners are also adding more datasets to Analytics Hub, which customers can use to build and train gen AI models; this includes trusted data from Acxiom, Bloomberg, TransUnion, ZoomInfo, and more.

These innovations help organizations harness the full potential of data and AI through a unified data foundation. With Google Cloud, companies can now run their data anywhere and bring AI and machine learning tools directly to their data, which can lower the risk and cost of data movement.
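Under the hood, the vector queries that AlloyDB AI accelerates are nearest-neighbor searches over embedding vectors. Here is a minimal, self-contained sketch of the idea in plain Python (toy 3-dimensional vectors and made-up document names; real systems use model-generated embeddings with hundreds of dimensions in an indexed vector column, so this is not AlloyDB's implementation):

```python
import math

def cosine_similarity(a: list[float], b: list[float]) -> float:
    """Cosine similarity: 1.0 = same direction, 0.0 = orthogonal."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

# Toy "embeddings" for three hypothetical enterprise documents.
documents = {
    "refund policy":  [0.9, 0.1, 0.0],
    "shipping times": [0.1, 0.9, 0.1],
    "privacy notice": [0.0, 0.2, 0.9],
}
query = [0.8, 0.2, 0.1]  # embedding of a question like "how do I get my money back?"

# Nearest neighbor = the document whose embedding points the same way as the query.
best = max(documents, key=lambda name: cosine_similarity(query, documents[name]))
print(best)
```

A database with vector support does this ranking server-side over millions of rows, so the application only sends the query embedding.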
Addressing top security challenges

Google Cloud is the only leading security provider that brings together the essential combination of frontline intelligence and expertise, a modern SecOps platform, and a trusted cloud foundation, all infused with the power of gen AI, to help drive the security outcomes you’re looking to achieve. Earlier this year, we introduced Security AI Workbench, an industry-first extensible platform powered by our next-generation security LLM, Sec-PaLM 2, which incorporates Google’s unique visibility into the evolving threat landscape and is fine-tuned for cybersecurity operations. And just a few weeks ago, we announced Chronicle CyberShield, a security operations solution that allows governments to break down information silos, centralize security data to help strengthen national situational awareness, and initiate a united response.

In addition to the Duet AI innovations mentioned earlier, today we are also announcing:

Mandiant Hunt for Chronicle: This service integrates the latest insights into attacker behavior from Mandiant’s frontline experts with Chronicle Security Operations’ ability to quickly analyze and search security data, helping customers gain elite-level support without the burden of hiring, tooling, and training.

Agentless vulnerability scanning: These posture management capabilities in Security Command Center detect operating system, software, and network vulnerabilities on Compute Engine virtual machines.
Network security advancements: Cloud Firewall Plus adds advanced threat protection and next-generation firewall (NGFW) capabilities to our distributed firewall service, powered by Palo Alto Networks; and Network Service Integration Manager lets network admins easily integrate trusted third-party NGFW virtual appliances for traffic inspection.

Assured Workloads Japan regions: Customers can have controlled environments that enforce data residency in our Japanese regions, options for local control of encryption keys, and administrative access transparency. We also continue to grow our Regulated and Sovereignty solutions partner initiative to bring innovative third-party solutions to customers’ regulated cloud environments.

Expanding our ecosystem

Our ecosystem is already delivering real-world value for businesses with gen AI, and bringing new capabilities, powered by Google Cloud, to millions of users worldwide. Partners are also using Vertex AI to build their own features for customers, including Box, Canva, Salesforce, UKG, and many others. Today at Next ‘23, we’re announcing:

DocuSign is working with Google to pilot how Vertex AI could be used to generate smart contract assistants that can summarize, explain, and answer questions about complex contracts and other documents.

SAP is working with us to build new solutions that combine SAP data with Vertex AI, helping enterprises apply gen AI to important business use cases, like streamlining automotive manufacturing or improving sustainability.

Workday’s applications for Finance and HR are now live on Google Cloud, and the company is working with us to develop new gen AI capabilities within the flow of Workday as part of its multicloud strategy.
This includes the ability to generate high-quality job descriptions and to bring Google Cloud gen AI to app developers via the skills API in Workday Extend, while helping to ensure the highest levels of data security and governance for customers’ most sensitive information.

In addition, many of the world’s largest consulting firms, including Accenture, Capgemini, Deloitte, and Wipro, plan to collectively train more than 150,000 experts to help customers implement Google Cloud gen AI.

We are in an entirely new era of digital transformation, fueled by gen AI. This technology is already improving how businesses operate and how humans interact with one another. It’s changing the way doctors care for patients, the way people communicate, and even the way workers are kept safe on the job. And this is just the beginning.

Together, we are creating a new way to cloud. We are grateful for the opportunity to be on this journey with our customers. Thank you for your partnership, and have a wonderful Google Cloud Next ‘23.
Source: Google Cloud Platform

Accelerate your cloud transformation with Delivery Navigator

Our goal at Google Cloud Consulting is to make it easy for our partners and customers to transform, create, and innovate on Google Cloud. Too often we find that cloud migrations and transformations aren’t as efficient as they could be because the latest tools, techniques, and ways of working aren’t known or easily accessible. This takes away from our partners’ and customers’ ability to focus on what’s important to their businesses. Now imagine a world where that cloud transformation expertise—and other leading technical practices—is in a single place, at your fingertips, whenever you need it. That’s why today we’re announcing that we’re opening up our internal, integrated platform for delivering cloud projects, called Delivery Navigator, to our partners. You can learn more about the product in our 90-second overview video here.

Created by uniting Google technology and methodologies

We started building Delivery Navigator almost two years ago as a way for our practitioners to create consistent, repeatable, agile, high-quality experiences for our customers. By uniting our technology with implementation methodologies honed across thousands of projects, we’re providing our partners with the same methods and assets, so they can accelerate delivery readiness with customers. Specifically, we’re bringing together a library of transformation methods with project-management tool integration and telemetry, all supported by helpful features that leverage our in-house generative AI technology.

As with many Google products, we believe that if we focus on the user, everything else will follow. This includes our ecosystem of delivery partners, who deliver value to our customers every day. We know first-hand how difficult it can be to build momentum for your transformation when valuable time is spent looking for the right template or tracking project hygiene.
We also understand that each of our customers may experience these transformations differently, with different industry standards leading to variations in delivery approaches, nomenclature, and scoping estimates. Delivery Navigator aims to keep practitioners focused on driving creative solutions, innovation, and other value-added customer outcomes by:

Compiling standards, technical knowledge, and leading delivery practices: We want to save your teams time by making it easy to find standard, reusable delivery methods and code snippets for everything from establishing Cloud Foundations to building a Data Science Development Platform, reducing variability in scoping estimates and the need to start from scratch. We also think it’s important to establish a common vocabulary when talking about scope and deliverables with our partners and customers.

Providing helpful project telemetry, so you can keep your team on track: We want to help you mitigate delivery risk by enabling timely project visibility across key health performance indicators, while reducing the toil of generating regular project status updates through our standardized metrics and reports.

Integrating with your project management tools: Day to day, we want you to be able to manage your project, your way. Standard Delivery Navigator methods are designed to connect to popular project management tools such as Asana, Jira, and Smartsheet, along with project health and status integration. We recognize every customer has their own tooling for project management, so we have built the solution to allow that to continue.

Help us build a new cloud methodology community

We believe what’s good for cloud adoption, and the ecosystem at large, is good for us all. Our goal is to co-create value and great experiences for our customers, faster. Our vision is for Delivery Navigator to become a vibrant cloud delivery methodology community that includes our partners, and eventually our customers, too.
We see the platform as a differentiated opportunity for Google, our partners, and our customers to come together, collaborate, share ideas, and drive continuous improvement in the cloud ecosystem. While the platform will initially contain only a portion of our delivery knowledge, if you have more to contribute, we’d love to talk with you about adding to the breadth and depth of the content. Delivery Navigator will first open to partners through our public preview launch, scheduled for early Q4. You can learn more about Delivery Navigator and subsequent product launch phases on our Partner Advantage portal, or join our broader Partner Advantage program as a new user, here.
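To make the "key health performance indicators" idea concrete, here is a purely hypothetical sketch of a traffic-light project health check (a toy metric of our own; it is not Delivery Navigator's actual scoring, which this post does not describe):

```python
from dataclasses import dataclass

@dataclass
class ProjectStatus:
    tasks_done: int
    tasks_total: int
    days_elapsed: int
    days_planned: int

def health(p: ProjectStatus) -> str:
    """Hypothetical traffic-light check: compare work completed
    against schedule consumed. Real tools weigh many more signals."""
    progress = p.tasks_done / p.tasks_total
    schedule = p.days_elapsed / p.days_planned
    if progress >= schedule:
        return "green"          # on or ahead of schedule
    if progress >= 0.8 * schedule:
        return "amber"          # slightly behind
    return "red"                # at risk

# 40% of tasks done at the halfway mark of the schedule:
print(health(ProjectStatus(tasks_done=40, tasks_total=100,
                           days_elapsed=30, days_planned=60)))  # prints "amber"
```

The value of standardized telemetry is that the same small set of ratios can be reported identically across every project, instead of each team hand-assembling a status deck.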
Source: Google Cloud Platform

What’s new with Google Cloud

Want to know the latest from Google Cloud? Find it here in one handy location. Check back regularly for our newest updates, announcements, resources, events, learning opportunities, and more. Tip: Not sure where to find what you’re looking for on the Google Cloud blog? Start here: Google Cloud blog 101: Full list of topics, links, and resources.

Week of Sep 18 – Sep 22

Meet the inaugural cohort of the Google for Startups Accelerator: AI First program, featuring groundbreaking businesses from eight countries across Europe and Israel using AI and ML to solve complex problems. Learn how Google Cloud empowers these startups and check out the selected ventures here.

BigQuery is introducing new SQL capabilities for improved analytics flexibility, data quality, and security. Examples include schema support for flexible column names, authorized stored procedures, ANY_VALUE (HAVING), also known as MAX_BY and MIN_BY, and many more. Check out the full details here.

Cloud Logging is introducing, in Preview, the ability to save charts from Cloud Logging’s Log Analytics to a custom dashboard in Cloud Monitoring. Viewing, copying, and sharing the dashboards are supported in Preview. For more information, see Save a chart to a custom dashboard.

Cloud Logging now supports customizable dashboards in its Logs Dashboard. Now you can add your own charts to see what’s most valuable to you on the Logs Dashboard. Learn more here.

Cloud Logging launches several usability features for effective troubleshooting. Learn more in this blog post.

Search your logs by service name with the new option in Cloud Logging. You can now use the Log fields pane to select by service, which makes it easier to quickly find your Kubernetes container logs. Check out the details here.

Community Security Analytics (CSA) can now be deployed via Dataform to help you analyze your Google Cloud security logs. Dataform simplifies deploying and operating CSA on BigQuery, with significant performance gains and cost savings.
Learn more about why and how to deploy CSA with Dataform in this blog post.

Dataplex data profiling and AutoDQ are powerful new features that can help organizations improve their data quality and build more accurate and reliable insights and models. These features are now generally available. Read more in this blog post.

Week of Sep 4 – Sep 8

Introducing Looker’s Machine Learning Accelerator. This easy-to-install extension allows business users to train, evaluate, and predict with machine learning models right in the Looker interface.

Learn how Freestar built a super-low-latency, globally distributed application powered by Memorystore and the Envoy proxy. This reference walks users through the finer details of the architecture and configuration, which they can easily replicate for their own needs.

Week of Aug 28 – Sep 1

You can access comprehensive and up-to-date environmental information to develop sustainability solutions and help people adapt to the impacts of climate change through Google Maps Platform’s environment APIs. The Air Quality and Solar APIs are generally available today. Get started or learn more in this blog post.

Google Cloud’s Global Partner Ecosystems & Channels team launched the Industry Value Networks (IVN) initiative at Google Cloud Next ’23. IVNs combine expertise and offerings from systems integrators (SIs), independent software vendors (ISVs), and content partners to create comprehensive, differentiated, repeatable, and high-value solutions that accelerate time-to-value and reduce risk for customers. To learn more about the IVN initiative, please see this blog post.

Week of Aug 21 – Aug 25

You can now easily export data from Earth Engine into BigQuery with our new connector. This feature allows for improved workflows and new analyses that combine geospatial raster and tabular data. This is the first step toward deeper interoperability between the two platforms, supporting innovations in geospatial sustainability analytics.
Learn more in this blog post or join our session at Cloud Next.

Week of Aug 14 – Aug 18

You can now view your log query results as a chart in the Log Analytics page in Cloud Logging. With this new capability, available in Preview, users can write a SQL filter and then use the charting configuration to build a chart. For more information, see Chart query results with Log Analytics.

Week of Aug 7 – Aug 11

You can now use Network Analyzer and the Recommender API to query the IP address utilization of your GCP subnets, to identify subnets that might be full or oversized. Learn more in a dedicated blog post here.

Memorystore has introduced version support for Redis 7.0. Learn more about the included features and upgrade your instance today!

Week of July 31 – August 4

Attack Path Simulation is now generally available in Security Command Center Premium. This new threat-prevention capability automatically analyzes a customer’s Google Cloud environment to discover attack pathways and generate attack exposure scores to prioritize security findings. Learn more or get started now.

Week of July 24 – 28

Cloud Deploy has updated the UI with the ability to create a pipeline along with a release. The feature is now GA. Read more.

Our newly published Data & Analytics decision tree helps you select the services on Google Cloud that best match your data workload needs, and the accompanying blog provides an overview of the services offered for data ingestion, processing, storage, governance, and orchestration.

Customer expectations of ecommerce platforms are at an all-time high: customers now demand a seamless shopping experience across platforms, channels, and devices. Establishing a secure and user-friendly login platform can make it easier for users to self-identify and help retailers gain valuable insights into customers’ buying habits. Learn more about how retailers can better manage customer identities to support an engaging ecommerce user experience using Google Cloud Identity Platform.
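The subnet IP-utilization figure that Network Analyzer reports (see the Aug 7 – 11 items above) amounts to dividing used addresses by usable capacity. A rough sketch with the standard library, assuming the four addresses GCP reserves per primary IPv4 subnet range:

```python
# Hedged sketch: ratio of used addresses to usable addresses in a subnet.
# The `reserved=4` default reflects the four addresses GCP reserves in each
# primary IPv4 subnet range (network, gateway, second-to-last, broadcast).
import ipaddress

def subnet_utilization(cidr, used_addresses, reserved=4):
    net = ipaddress.ip_network(cidr)
    usable = net.num_addresses - reserved
    return used_addresses / usable

# A /24 has 256 addresses, 252 usable; 200 in use is roughly 79% utilization.
print(f"{subnet_utilization('10.0.0.0/24', 200):.1%}")
```

Network Analyzer does this continuously across all subnets and surfaces the ones approaching exhaustion or sitting mostly empty.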
Our latest Cloud Economics post just dropped, exploring how customers can benchmark their IT spending against peers to optimize investments. Comparing metrics like tech spend as a percentage of revenue and OpEx uncovers opportunities to increase efficiency and business impact. This data-driven approach is especially powerful for customers undergoing transformation.

Week of July 17 – 21

Cloud Deploy now supports deploy parameters. With deploy parameters you can pass parameters for your release, and those values are provided to the manifest or manifests before those manifests are applied to their respective targets. A typical use would be to apply different values to manifests for different targets in a parallel deployment. Read more.

Cloud Deploy is now listed among other Google Cloud services that can be configured to meet data residency requirements. Read more.

Log Analytics in Cloud Logging now supports most regions. Users can now upgrade buckets to use Log Analytics in Singapore, Montréal, London, Tel Aviv, and Mumbai. Read more for the full list of supported regions.

Week of July 10 – 14

Cloud CDN now supports private origin authentication in GA. This capability improves security by allowing only trusted connections to access the content on your private origins and preventing users from directly accessing it.

Workload Manager – Guided Deployment Automation is now available in Public Preview, with initial support for SAP solutions. Learn how to configure and deploy SAP workloads directly from a guided user interface, leveraging end-to-end automation built on Terraform and Ansible.

Artifact Registry now supports cleanup policies, in Preview. Cleanup policies help you manage artifacts by automatically deleting artifacts that you no longer need, while keeping artifacts that you want to store. Read more.

Week of July 3 – 7

Cloud Run jobs now supports long-running tasks. A single Cloud Run jobs task can now run for up to 24 hours.
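Conceptually, the deploy parameters feature above behaves like per-target placeholder substitution into manifests before they are applied. This toy renderer is only an analogy; the `${...}` syntax and parameter names are illustrative, not Cloud Deploy's exact mechanism:

```python
# Illustrative sketch: substitute per-target parameter values into a manifest
# template before "applying" it, the way deploy parameters are resolved per
# target in a parallel deployment.
import re

def render_manifest(template, params):
    """Replace ${name} placeholders with the values supplied for one target."""
    return re.sub(r"\$\{(\w+)\}", lambda m: str(params[m.group(1)]), template)

template = "replicas: ${replica_count}\nenv: ${env_name}"
print(render_manifest(template, {"replica_count": 3, "env_name": "staging"}))
print(render_manifest(template, {"replica_count": 10, "env_name": "prod"}))
```

The point of the feature is exactly this: one manifest, different values resolved for each target in the pipeline.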
Read more.

How Google Cloud NAT helped strengthen Macy’s security. Read more.

Week of June 26 – 30

Cloud Deploy parallel deployment is now generally available. You can deploy to a target that’s configured to represent multiple targets, and your application is deployed to those targets concurrently. Read more.

Cloud Deploy canary deployment strategy is now generally available. A canary deployment is a progressive rollout of an application that splits traffic between an already-deployed version and a new version. Read more.

Week of June 19 – June 23

Google Cloud’s Managed Service for Prometheus now supports Prometheus exemplars. Exemplars provide cross-signal correlation between your metrics and your traces so you can more easily pinpoint root-cause issues surfaced in your monitoring operations.

Managing logs across your organization is now easier with the general availability of user-managed service accounts. You can now choose your own service account when sending logs to a log bucket in a different project.

Data Engineering and Analytics Day – Join Google Cloud experts on June 29th to learn about the latest data engineering trends and innovations, participate in hands-on labs, and learn best practices for Google Cloud’s data analytics tools. You will gain a deeper understanding of how to centralize, govern, secure, streamline, analyze, and use data for advanced use cases like ML processing and generative AI.

Week of June 5 – June 9

TMI: Shifting Down, Not Left – The first post in our new modernization series, The Modernization Imperative. Here, Richard Seroter talks about the strategy of ‘shifting down’ and relying on managed services to relieve burdens on developers.

Cloud Econ 101: The first in a new series on optimizing cloud tools to achieve greater return on your cloud investments.
Join us biweekly as we explore ways to streamline workloads and look at successful cases of aligning technology goals to drive business value.

Global External HTTP(S) Load Balancer and Cloud CDN’s advanced traffic management using flexible pattern matching is now GA. This allows you to use wildcards anywhere in your path matcher. You can use this to customize origin routing for different types of traffic, request and response behaviors, and caching policies. In addition, you can now use results from your pattern matching to rewrite the path that is sent to the origin.

Dataform is generally available. Dataform offers an end-to-end experience to develop, version control, and deploy SQL pipelines in BigQuery. Using a single web interface, data engineers and data analysts of all skill levels can build production-grade SQL pipelines in BigQuery while following software engineering best practices such as version control with Git, CI/CD, and code lifecycle management. Learn more.

The Public Preview of frontend mutual TLS support on Global External HTTPS Load Balancing is now available. You can now use Global External HTTPS Load Balancing to offload mutual TLS authentication for your workloads. This includes client mTLS for Apigee X northbound traffic using the Global HTTPS Load Balancer.

FinOps from the field: How to build a FinOps roadmap – In a world where cloud services have become increasingly complex, how do you take advantage of the features without the nasty bill shock at the end? Learn how to build your own FinOps roadmap step by step, with helpful tips and tricks from FinOps workshops Google has completed with customers.

Security Command Center (SCC) Premium, our built-in security and risk management solution for Google Cloud, is now generally available for self-service activation for full customer organizations. Customers can get started with SCC in just a few clicks in the Google Cloud console.
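The flexible pattern matching announced for the load balancer above (wildcards anywhere in a path matcher) is glob-like. A rough analogy using Python's `fnmatch`; the load balancer's actual matcher syntax differs in detail, and the route table below is invented:

```python
# Toy route table: first glob-style pattern that matches the request path wins.
# This only illustrates the idea of wildcards anywhere in a path matcher.
from fnmatch import fnmatch

routes = [
    ("/videos/*/hd", "video-backend"),   # wildcard in the middle of the path
    ("/images/*", "image-backend"),
    ("/*", "default-backend"),           # catch-all
]

def pick_backend(path):
    for pattern, backend in routes:
        if fnmatch(path, pattern):
            return backend
    return None

print(pick_backend("/videos/clip1/hd"))   # video-backend
print(pick_backend("/images/logo.png"))   # image-backend
```

The GA feature additionally lets you rewrite the matched path before it is sent to the origin, which a real matcher would do as a second step after selection.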
There is no commitment requirement, and pricing is based on a flexible pay-as-you-go model.

Week of May 29 – June 2

Google Cloud Deploy: the price of an active delivery pipeline is reduced, and single-target delivery pipelines no longer incur a charge. Underlying service charges continue to apply. See the pricing page for more details.

Week of May 22 – 26

Security Command Center (SCC) Premium pricing for project-level activation is now 25% lower for customers who use SCC to secure Compute Engine, GKE Autopilot, App Engine, and Cloud SQL. Please see our updated rate card. Also, we have expanded the number of finding types available for project-level Premium activations to help make your environment more secure. Learn more.

Vertex AI Embeddings for Text: Grounding LLMs made easy. Many people are now starting to think about how to bring generative AI and large language models (LLMs) to production services. You may be wondering “How do I integrate LLMs or AI chatbots with existing IT systems, databases, and business data?”, “We have thousands of products. How can I let an LLM memorize them all precisely?”, or “How do I handle the hallucination issues in AI chatbots to build a reliable service?”. Here is a quick solution: grounding with embeddings and vector search. What is grounding? What are embeddings and vector search? In this post, we cover these crucial concepts for building reliable generative AI services for enterprise use, with live demos and source code.

Week of May 15 – 19

Introducing the date/time selector in Log Analytics in Cloud Logging.
You can now easily customize the date and time range of your queries in the Log Analytics page by using the same date/time-range selector used in Logs Explorer, Metrics Explorer, and other Cloud Ops products. There are several time range options, such as preset times, custom start and end times, and relative time ranges. For more information, see Filter by time in the Log Analytics docs.

Cloud Workstations is now GA. We are thrilled to announce the general availability of Cloud Workstations, with a list of new enhanced features, providing fully managed integrated development environments (IDEs) on Google Cloud. Cloud Workstations enables faster developer onboarding and increased developer productivity while helping support your compliance requirements with an enhanced security posture. Learn more.

Week of May 8 – 14

Google is partnering with regional carriers Chunghwa Telecom, Innove (a subsidiary of Globe Group), and AT&T to deliver the TPU (Taiwan-Philippines-U.S.) cable system, connecting Taiwan, the Philippines, Guam, and California, to support growing demand in the APAC region. We are committed to providing Google Cloud customers with a resilient, high-performing global network. NEC is the supplier, and the system is expected to be ready for service in 2025.

Introducing BigQuery differential privacy, SQL building blocks that analysts and data scientists can use to anonymize their data. We are also partnering with Tumult Labs to help Google Cloud customers with their differential privacy implementations.

Scalable electronic trading on Google Cloud: a business case with BidFX. Working with Google Cloud, BidFX has been able to develop and deploy a new product called Liquidity Provision Analytics (“LPA”), launching to production within roughly six months, to solve the transaction cost analysis challenge in an innovative way.
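The grounding-with-embeddings pattern from the Vertex AI Embeddings item above can be sketched as a toy retrieval step: embed documents, find the nearest one to the query, and prepend it to the prompt. Everything here is illustrative; real systems use learned text embeddings and an approximate-nearest-neighbor index, not bag-of-words counts:

```python
# Toy grounding sketch: retrieve the closest document by cosine similarity,
# then build a prompt that grounds the model in that retrieved context.
import math
from collections import Counter

def embed(text):
    # Stand-in for a real embedding model: bag-of-words token counts.
    return Counter(text.lower().split())

def cosine(a, b):
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

docs = [
    "Our return policy allows refunds within 30 days",
    "The store opens at 9am on weekdays",
]

def ground(query):
    best = max(docs, key=lambda d: cosine(embed(query), embed(d)))
    return f"Context: {best}\nQuestion: {query}"

print(ground("when can I get refunds under the return policy"))
```

Feeding the retrieved context alongside the question is what keeps the model's answer anchored to your own data instead of hallucinated knowledge.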
LPA will offer features such as skew detection for liquidity providers, execution time optimization, pricing comparison, top-of-book analysis, and feedback to counterparties. Read more here.

AWS EC2 VM discovery and assessment – mFit can discover EC2 VM inventory in your AWS region and collect guest-level information from multiple VMs to provide a technical fit assessment for modernization. See the demo video.

Generate assessment reports as Microsoft Excel files – mFit can generate a detailed assessment report in Microsoft Excel (XLSX) format; an XLSX report can handle thousands of VMs in a single report, which an HTML report might not be able to handle.

Regulatory Reporting Platform: Regulatory reporting remains a challenge for financial services firms. We share our point of view on the main challenges and opportunities in our latest blog, accompanied by an infographic and a customer case study from ANZ Bank. We also wrote a white paper for anyone looking for a deeper dive into our Regulatory Reporting Platform.

Week of May 1 – 5

Microservices observability is now generally available for C++, Go, and Java. This release includes a number of new features and improvements, making it easier than ever to monitor and troubleshoot your microservices applications. Learn more in our user guide.

Google Cloud Deploy now supports Skaffold 2.3 as the default Skaffold version for all target types. Release notes.

Cloud Build: You can now configure Cloud Build to continue executing a build even if specified steps fail. This feature is generally available. Learn more here.

Week of April 24 – 28

General Availability: Custom Modules for Security Health Analytics is now generally available.
Author custom detective controls in Security Command Center using the new custom module capability.

Next-generation Confidential VM is now available in Private Preview with a Confidential Computing technology called AMD Secure Encrypted Virtualization-Secure Nested Paging (AMD SEV-SNP) on general-purpose N2D machines. Confidential VMs with AMD SEV-SNP enabled build upon memory encryption and add new hardware-based security protections such as strong memory integrity, encrypted register state (thanks to AMD SEV-Encrypted State, SEV-ES), and hardware-rooted remote attestation. Sign up here!

Selecting Tier_1 networking for your Compute Engine VM can give you the bandwidth you need for demanding workloads. Check out this blog on increasing bandwidth to Compute Engine VMs with TIER_1 networking.

Week of April 17 – 21

Use Terraform to manage Log Analytics in Cloud Logging: You can now configure Log Analytics on Cloud Logging buckets and BigQuery linked datasets by using the google_logging_project_bucket_config and google_logging_linked_dataset Terraform resources.

Week of April 10 – 14

Assured Open Source Software is generally available for the Java and Python ecosystems. Assured OSS is offered at no charge and provides an opportunity for any organization that utilizes open source software to take advantage of Google’s expertise in securing open source dependencies.

BigQuery change data capture (CDC) is now in public preview. BigQuery CDC provides a fully managed method of processing and applying streamed UPSERT and DELETE operations directly into BigQuery tables in real time through the BigQuery Storage Write API. This further enables the real-time replication of more classically transactional systems into BigQuery, which empowers cross-functional analytics between OLTP and OLAP systems. Learn more here.

Week of April 3 – 7

Now available: Google Cloud Deploy now supports canary release as a deployment strategy. This feature is supported in Preview.
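The apply semantics behind the BigQuery CDC item above are simple to state: a stream of UPSERT and DELETE operations, keyed by primary key, folds into the table's current state. BigQuery does this for you via the Storage Write API; this sketch just shows the fold, with invented sample rows:

```python
# Sketch of CDC apply semantics: later UPSERTs on a key overwrite earlier ones,
# and a DELETE removes the key entirely.
def apply_cdc(state, ops):
    for op, key, row in ops:
        if op == "UPSERT":
            state[key] = row
        elif op == "DELETE":
            state.pop(key, None)
    return state

state = apply_cdc({}, [
    ("UPSERT", 1, {"name": "alice", "tier": "free"}),
    ("UPSERT", 2, {"name": "bob", "tier": "pro"}),
    ("UPSERT", 1, {"name": "alice", "tier": "pro"}),  # later upsert wins
    ("DELETE", 2, None),
])
print(state)
```

This is the logic that previously required self-managed staging tables and DML merge statements; CDC performs it directly on the destination table.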
Learn more.

General Availability: Cloud Run services as backends to Internal HTTP(S) Load Balancers and Regional External HTTP(S) Load Balancers. Internal load balancers allow you to establish private connectivity between Cloud Run services and other services and clients on Google Cloud, on-premises, or on other clouds. In addition, you get custom domains, tools to migrate traffic from legacy services, Identity-Aware Proxy support, and more. The regional external load balancer, as the name suggests, is designed to reside in a single region and connect with workloads only in the same region, thus helping you meet your regionalization requirements. Learn more.

New visualization tools for Compute Engine fleets: The Observability tab in the Compute Engine console VM List page has reached General Availability. The new Observability tab is an easy way to monitor and troubleshoot the health of your fleet of VMs.

Datastream for BigQuery is generally available, offering a unique, truly seamless and easy-to-use experience that enables near-real-time insights in BigQuery with just a few steps. Using BigQuery’s newly developed change data capture (CDC) and the Storage Write API’s UPSERT functionality, Datastream efficiently replicates updates directly from source systems into BigQuery tables in real time. You no longer have to waste valuable resources building and managing complex data pipelines, self-managed staging tables, tricky DML merge logic, or manual conversion from database-specific data types into BigQuery data types. Just configure your source database, connection type, and destination in BigQuery and you’re all set. Datastream for BigQuery will backfill historical data and continuously replicate new changes as they happen.

Now available: the Build an analytics lakehouse on Google Cloud whitepaper. The analytics lakehouse combines the benefits of data lakes and data warehouses without the overhead of each.
In this paper, we discuss the end-to-end architecture that enables organizations to extract data in real time regardless of which cloud or datastore the data resides in, and to use the data in aggregate for greater insight and artificial intelligence (AI), all with governance and unified access across teams. Download now.

Week of March 27 – 31

Faced with strong data growth, Squarespace made the decision to move away from on-premises Hadoop to a cloud-managed solution for its data platform. Learn how they reduced the number of escalations by 87% with the analytics lakehouse on Google Cloud. Read now.

Last chance: Register to attend Google Data Cloud & AI Summit. Join us on Wednesday, March 29, at 9 AM PDT/12 PM EDT to discover how you can use data and AI to reveal opportunities to transform your business and make your data work smarter. Find out how organizations are using Google Cloud data and AI solutions to transform customer experiences, boost revenue, and reduce costs. Register today for this no-cost digital event.

New BigQuery editions: flexibility and predictability for your data cloud. At the Data Cloud & AI Summit, we announced BigQuery pricing editions (Standard, Enterprise, and Enterprise Plus) that allow you to choose the right price-performance for individual workloads. Along with editions, we also announced autoscaling capabilities that ensure you only pay for the compute capacity you use, and a new compressed storage billing model that is designed to reduce your storage costs. Learn more about the latest BigQuery innovations and register for the upcoming BigQuery roadmap session on April 5, 2023.

Introducing Looker Modeler: a single source of truth for BI metrics. At the Data Cloud & AI Summit, we introduced a standalone metrics layer we call Looker Modeler, available in preview in Q2.
With Looker Modeler, organizations can benefit from consistent governed metrics that define data relationships and progress against business priorities, and consume them in BI tools such as Connected Sheets, Looker Studio, Looker Studio Pro, Microsoft Power BI, Tableau, and ThoughtSpot.

Bucket-based log-based metrics, now generally available, allow you to track, visualize, and alert on important logs in your cloud environment from many different projects or across the entire organization, based on what logs are stored in a log bucket.

Week of March 20 – 24

Chronicle Security Operations feature roundup – Bringing a modern and unified security operations experience to our customers is, and has been, a top priority for the Google Chronicle team. We’re happy to show continuing innovation and even more valuable functionality. In our latest release roundup we highlight a host of new capabilities focused on delivering improved context, collaboration, and speed to handle alerts faster and more effectively. Learn how our newest capabilities enable security teams to do more with less here.

Announcing Google’s Data Cloud & AI Summit, March 29th! Can your data work smarter? How can you use AI to unlock new opportunities? Join us on Wednesday, March 29, to gain expert insights, new solutions, and strategies to reveal opportunities hiding in your company’s data. Find out how organizations are using Google Cloud data and AI solutions to transform customer experiences, boost revenue, and reduce costs. Register today for this no-cost digital event.

Artifact Registry feature Preview – Artifact Registry now supports immutable tags for Docker repositories. If you enable this setting, an image tag always points to the same image digest, including the default latest tag. This feature is in Preview.
Learn more.

Week of March 13 – 17

A new era for AI and Google Workspace – Google Workspace is using AI to become even more helpful, starting with new capabilities in Docs and Gmail to write and refine content. Learn more.

Building the most open and innovative AI ecosystem – In addition to this week’s news on AI products, Google Cloud has also announced new partnerships, programs, and resources. This includes bringing the best of Google’s infrastructure, AI products, and foundation models to partners at every layer of the AI stack: chipmakers, companies building foundation models and AI platforms, technology partners enabling companies to develop and deploy machine learning (ML) models, app builders solving customer use cases with generative AI, and global services and consulting firms that help enterprise customers implement all of this technology at scale. Learn more.

From Microbrows to Microservices – Ulta Beauty is building its digital store of the future, but to maintain control over its newly modernized application it turned to Anthos and GKE, Google Cloud’s managed container services, to provide an ecommerce experience as beautiful as its guests. Read our blog to see how a newly minted Cloud Architect learned Kubernetes and Google Cloud to provide the best possible architecture for his developers. Learn more.

Now generally available: understand and trust your data with Dataplex data lineage, a fully managed Dataplex capability that helps you understand how data is sourced and transformed within the organization. Dataplex data lineage automatically tracks data movement across BigQuery, BigLake, Cloud Data Fusion (Preview), and Cloud Composer (Preview), eliminating operational hassles around manual curation of lineage metadata. Learn more here.

Rapidly expand the reach of Spanner databases with read-only replicas and zero-downtime moves.
Configurable read-only replicas let you add read-only replicas to any Spanner instance to deliver low-latency reads to clients in any geography. Alongside Spanner’s zero-downtime instance move service, you have the freedom to move your production Spanner instances from any configuration to another on the fly, with zero downtime, whether it’s regional, multi-regional, or a custom configuration with configurable read-only replicas. Learn more here.

To prepare for the busiest shopping season of the year, Black Friday and Cyber Monday, Lowe’s relies heavily on Google’s agile SRE framework to ensure business and technical alignment, manage bots, and create an always-available shopping experience. Read more.

Week of March 6 – 10

Automatically blocking project SSH keys in Dataflow is now GA. This service option allows Dataflow users to prevent their Dataflow worker VMs from accepting SSH keys that are stored in project metadata, and results in improved security. Getting started is easy: enable the block-project-ssh-keys service option while submitting your Dataflow job.

Celebrate International Women’s Day: Learn about the leaders driving impact at Google Cloud and creating pathways for other women in their industries. Read more.

Google Cloud Deploy now supports parallel deployment to GKE and Cloud Run workloads. This feature is in Preview. Read more.

Sumitovant doubles medical research output in one year using Looker. Sumitovant is a leading biopharma research company that has doubled its research output in one year alone. By leveraging modern cloud data technologies, Sumitovant supports its globally distributed workforce of scientists developing next-generation therapies using Google Cloud’s Looker for trusted self-service data research. To learn more about Looker, check out https://cloud.google.com/looker

Week of Feb 27 – Mar 3, 2023

Accelerate queries on your BigLake tables with cached metadata (Preview!)
Make your queries on BigLake tables go faster by enabling metadata caching. Your queries will avoid expensive LIST operations for discovering files in the table and experience faster file and Hive partition pruning. Follow the documentation here.

Add geospatial intelligence to your retail use cases by leveraging the CARTO platform on top of your data in BigQuery. Location data adds a new dimension to retail use cases like site selection, geomarketing, and logistics and supply chain optimization. Read more about the solution and various customer implementations in the CARTO for Retail Reference Guide, and see a demonstration in this blog.

Google Cloud Deploy support for deployment verification is now GA! Read more or try the demo.

Week of Feb 20 – Feb 24, 2023

Logs for Network Load Balancing and logs for Internal TCP/UDP Load Balancing are now GA! Logs are aggregated per connection and exported in near real time, providing useful information, such as the 5-tuple of the connection, received bytes, and sent bytes, for troubleshooting and monitoring the pass-through Google Cloud load balancers. Further, customers can include additional optional fields, such as annotations for client-side and server-side GCE and GKE resources, to obtain richer telemetry.

The newly published Anthos hybrid cloud architecture reference design guide provides opinionated guidance to deploy Anthos in a hybrid environment to address some common challenges that you might encounter. Check out the architecture reference design guide here to accelerate your journey to hybrid cloud and containerization.

Week of Feb 13 – Feb 17, 2023

Deploy PyTorch models on Vertex AI in a few clicks with prebuilt PyTorch serving containers, which means less code, no need to write Dockerfiles, and faster time to production.

Confidential GKE Nodes on Compute-Optimized C2D VMs are now GA.
Confidential GKE Nodes help to increase the security of your GKE clusters by leveraging hardware to ensure your data is encrypted in memory, helping to defend against accidental data leakage, malicious administrators, and “curious neighbors”. Getting started is easy, as your existing GKE workloads can run confidentially with no code changes required.

Announcing Google’s Data Cloud & AI Summit, March 29th! Can your data work smarter? How can you use AI to unlock new opportunities? Register for Google Data Cloud & AI Summit, a digital event for data and IT leaders, data professionals, developers, and more to explore the latest breakthroughs. Join us on Wednesday, March 29, to gain expert insights, new solutions, and strategies to reveal opportunities hiding in your company’s data. Find out how organizations are using Google Cloud data and AI solutions to transform customer experiences, boost revenue, and reduce costs. Register today for this no-cost digital event.

Running SAP workloads on Google Cloud? Upgrade to our newly released Agent for SAP to gain increased visibility into your infrastructure and application performance. The new agent consolidates several of our existing agents for SAP workloads, which means less time spent on installation and updates, and more time for making data-driven decisions. In addition, there is new optional functionality that powers exciting products like Workload Manager, a way to automatically scan your SAP workloads against best practices. Learn how to install or upgrade the agent here.

Leverege uses BigQuery as a key component of its data and analytics pipeline to deliver innovative IoT solutions at scale. As part of the Built with BigQuery program, this blog post goes into detail about the Leverege IoT Stack, which runs on Google Cloud to power business-critical enterprise IoT solutions at scale.
Download the white paper Three Actions Enterprise IT Leaders Can Take to Improve Software Supply Chain Security to learn how and why high-profile software supply chain attacks like SolarWinds and Log4j happened, the key lessons learned from these attacks, as well as actions you can take today to prevent similar attacks from happening to your organization.

Week of Feb 3 – Feb 10, 2023

Immersive Stream for XR leverages Google Cloud GPUs to host, render, and stream high-quality photorealistic experiences to millions of mobile devices around the world, and is now generally available. Read more here.

Reliable and consistent data presents an invaluable opportunity for organizations to innovate, make critical business decisions, and create differentiated customer experiences. But poor data quality can lead to inefficient processes and possible financial losses. Today we announce new Dataplex features: automatic data quality (AutoDQ) and data profiling, available in public preview. AutoDQ offers automated rule recommendations, built-in reporting, and serverless execution to construct high-quality data. Data profiling delivers richer insight into the data by identifying its common statistical characteristics. Learn more.

Cloud Workstations now supports customer-managed encryption keys (CMEK), which provide user encryption control over Cloud Workstations persistent disks. Read more.

Google Cloud Deploy now supports Cloud Run targets in General Availability. Read more.

Learn how to use NetApp Cloud Volumes Service as datastores for Google Cloud VMware Engine to expand storage capacity. Read more.

Week of Jan 30 – Feb 3, 2023

Oden Technologies uses BigQuery to provide real-time visibility, efficiency recommendations, and resiliency in the face of network disruptions in manufacturing systems.
As part of the Built with BigQuery program, this blog post describes the use cases, challenges, solution, and solution architecture in great detail.

Manage table- and column-level access permissions using attribute-based policies in Dataplex. The Dataplex attribute store provides a unified place where you can create and organize a Data Class hierarchy to classify your distributed data and assign behaviors such as table ACLs and column ACLs to the classified data classes. Dataplex will propagate IAM roles to tables, across multiple Google Cloud projects, according to the attributes assigned to them, and a single merged policy tag to columns according to the attributes attached to them. Read more.

Lytics is a next-generation composable CDP that enables companies to deploy a scalable CDP around their existing data warehouses and lakes. As part of the Built with BigQuery program for ISVs, Lytics leverages Analytics Hub to launch a secure data sharing and enrichment solution for media and advertisers. This blog post goes over Lytics Conductor on Google Cloud and its architecture in great detail.

Now available in public preview, Dataplex business glossary offers users a cloud-native way to maintain and manage business terms and definitions for data governance, establishing consistent business language, improving trust in data, and enabling self-serve use of data. Learn more here.

Security Command Center (SCC), Google Cloud’s native security and risk management solution, is now available via self-service to protect individual projects from cyber attacks. It’s never been easier to secure your Google Cloud resources with SCC. Read our blog to learn more. To get started today, go to Security Command Center in the Google Cloud console for your projects.

Global External HTTP(S) Load Balancer and Cloud CDN now support advanced traffic management using flexible pattern matching in public preview. This allows you to use wildcards anywhere in your path matcher.
You can use this to customize origin routing for different types of traffic, request and response behaviors, and caching policies. In addition, you can now use results from your pattern matching to rewrite the path that is sent to the origin.

Run large Pods on GKE Autopilot with the Balanced compute class. When you need computing resources on the larger end of the spectrum, we're excited that the Balanced compute class, which supports Pod resource sizes up to 222 vCPU and 851 GiB, is now GA.

Week of Jan 23 – Jan 27, 2023

Starting with Anthos version 1.14, Google supports each Anthos minor version for 12 months after its initial release, or until the release of the third subsequent minor version, whichever is longer. We plan three Anthos minor releases a year, around April, August, and December in 2023, with monthly patch releases (the z in version x.y.z) for supported minor versions. For more information, read here.

Anthos Policy Controller enables the enforcement of fully programmable policies for your clusters across environments. We are thrilled to announce the launch of our new built-in Policy Controller Dashboard, a powerful tool that makes it easy to manage and monitor the policy guardrails applied to your fleet of clusters. New policy bundles are available to help audit your cluster resources against Kubernetes standards, industry standards, or Google recommended best practices. The easiest way to get started with Anthos Policy Controller is to install Policy Controller and apply a policy bundle to audit your fleet of clusters against a standard such as the CIS Benchmark.

Dataproc is an important service in any data lake modernization effort. Many customers begin their journey to the cloud by migrating their Hadoop workloads to Dataproc and continue to modernize their solutions by incorporating the full suite of Google Cloud's data offerings.
Check out this guide that demonstrates how you can optimize Dataproc job stability, performance, and cost-effectiveness.

Eventarc adds support for 85+ new direct events from the following Google services in Preview: API Gateway, Apigee Registry, BeyondCorp, Certificate Manager, Cloud Data Fusion, Cloud Functions, Cloud Memorystore for Memcached, Database Migration, Datastream, Eventarc, and Workflows. This brings the total of pre-integrated events offered in Eventarc to over 4,000 events from 140+ Google services and third-party SaaS vendors.

The mFit 1.14.0 release adds support for JBoss and Apache workloads by including fit analysis and framework analytics for these workload types in the assessment report. See the release notes for important bug fixes and enhancements.

Google Cloud Deploy now supports Skaffold version 2.0. Release notes.

Cloud Workstations: labels can now be applied to Cloud Workstations resources. Release notes.

Cloud Build: Cloud Build repositories (2nd gen) lets you easily create and manage repository connections, not only through the Cloud Console but also through gcloud and the Cloud Build API. Release notes.

Week of Jan 17 – Jan 20, 2023

Cloud CDN now supports private origin authentication for Amazon Simple Storage Service (Amazon S3) buckets and compatible object stores in Preview. This capability improves security by allowing only trusted connections to access the content on your private origins and preventing users from directly accessing it.

Week of Jan 9 – Jan 13, 2023

Revionics partnered with Google Cloud to build a data-driven pricing platform for speed, scale, and automation with BigQuery, Looker, and more. As part of the Built with BigQuery program, this blog post describes the use cases, problems solved, solution architecture, and key outcomes of hosting Revionics' product, Platform Built for Change, on Google Cloud.

Comprehensive guide for designing reliable infrastructure for your workloads in Google Cloud.
The guide combines industry-leading reliability best practices with the knowledge and deep expertise of reliability engineers across Google. Understand the platform-level reliability capabilities of Google Cloud, the building blocks of reliability in Google Cloud, and how these building blocks affect the availability of your cloud resources. Review guidelines for assessing the reliability requirements of your cloud workloads. Compare architectural options for deploying distributed and redundant resources across Google Cloud locations, and learn how to manage traffic and load for distributed deployments. Read the full blog here.

GPU Pods on GKE Autopilot are now generally available. Customers can now run ML training, inference, video encoding, and all other workloads that need a GPU, with the convenience of GKE Autopilot's fully managed Kubernetes environment.

Kubernetes v1.26 is now generally available on GKE. GKE customers can now take advantage of the many new features in this exciting release. This release continues Google Cloud's goal of making Kubernetes releases available to Google customers within 30 days of the Kubernetes OSS release.

Event-driven transfer for Cloud Storage: customers have told us they need an asynchronous, scalable service to replicate data between Cloud Storage buckets for a variety of use cases, including aggregating data in a single bucket for data processing and analysis, keeping buckets across projects/regions/continents in sync, and more. Google Cloud now offers Preview support for event-driven transfer: a serverless, real-time replication capability to move data from AWS S3 to Cloud Storage and copy data between multiple Cloud Storage buckets. Read the full blog here.

Pub/Sub Lite now offers export subscriptions to Pub/Sub. This new subscription type writes Lite messages directly to Pub/Sub, with no code development or Dataflow jobs needed. It is great for connecting disparate data pipelines and for migrating from Lite to Pub/Sub. See here for documentation.
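Conceptually, an event-driven transfer reduces to mapping each object-change notification to a replication action on the destination. The sketch below illustrates that mapping; the event shape and action tuple are hypothetical simplifications for illustration, not the Storage Transfer Service API.

```python
# Illustrative sketch: the core decision an event-driven transfer makes when
# it receives an object-change notification. The event dict and action tuple
# are hypothetical, simplified from real S3/Cloud Storage notifications.

def plan_action(event: dict, dest_bucket: str) -> tuple:
    """Map a change notification to a replication action on the destination."""
    key = event["name"]
    if event["type"] == "OBJECT_FINALIZE":  # object created or overwritten
        return ("copy", f"gs://{dest_bucket}/{key}")
    if event["type"] == "OBJECT_DELETE":    # object removed at the source
        return ("delete", f"gs://{dest_bucket}/{key}")
    return ("ignore", key)                  # e.g. metadata-only updates
```

Because each notification is handled independently, this style of replication scales out naturally and keeps destination buckets in near-real-time sync.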
Source: Google Cloud Platform

Actuate your data in real time with new Bigtable change streams

Cloud Bigtable is a highly scalable, fully managed NoSQL database service that offers single-digit millisecond latency and an availability SLA of up to 99.999%. It is a good choice for applications that require high throughput and low latency, such as real-time analytics, gaming, and telecommunications.

Cloud Bigtable change streams is a feature that allows you to track changes to your Bigtable data and easily access and integrate this data with other systems. With change streams, you can replicate changes from Bigtable to BigQuery for real-time analytics, trigger downstream application behavior using Pub/Sub (for event-based data pipelines), or capture database changes for multi-cloud scenarios and migrations to Bigtable. Cloud Bigtable change streams is a powerful tool that can help you unlock new value from your data.

NBCUniversal's streaming service Peacock uses Bigtable for identity management across their platform. The Bigtable change streams feature helped them simplify and optimize their data pipeline. "Bigtable change streams was simple to integrate into our existing data pipeline, leveraging the Dataflow Beam connector to alert on changes for downstream processing. This update saved us significant time and processing in our data normalization objectives." – Baihe Liu, Peacock

Actuating your data changes

Enabling a change stream on your table can easily be done through the Google Cloud console, or via the API, client libraries, or declarative infrastructure tools like Terraform. Once enabled on a particular table, all data changes to the table will be captured and stored for up to seven days. This is useful for tracking changes to data over time, or for auditing purposes. The retention period can be customized to meet your specific needs.

You can build custom processing pipelines using the Bigtable connector for Dataflow. This allows you to process data in Bigtable in a variety of ways, including batch processing, streaming processing, and machine learning.
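The per-record work such a pipeline does can be sketched in a few lines. The record structure below is a simplified, hypothetical stand-in for the actual Beam connector types, shown only to illustrate the flattening step.

```python
# Illustrative sketch: the kind of per-record transform you might run in a
# pipeline reading a Bigtable change stream. The record dict is a
# hypothetical simplification, not the real Dataflow connector's types.

def to_downstream_event(record: dict) -> dict:
    """Flatten a change record into one event listing the changed cells."""
    return {
        "row_key": record["row_key"],
        "commit_time": record["commit_timestamp"],
        "cells": [
            {"family": m["family"], "qualifier": m["qualifier"], "value": m["value"]}
            for m in record["mutations"]
            if m["kind"] == "set_cell"  # skip deletions for this sink
        ],
    }
```

An event in this shape can then be published to Pub/Sub or appended to BigQuery by the rest of the pipeline.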
Or, you can have even more flexibility and control by integrating with the Bigtable API directly.

Cloud Bigtable change streams use cases

Change streams can be leveraged for a variety of use cases and business-critical workloads.

Analytics and ML

Collect event data and analyze it in real time. This can be used to track customer behavior and update feature-store embeddings for personalization, to monitor system performance in IoT services for fault detection, to identify security threats, or to monitor events to detect fraud. In the context of BigQuery, change streams can be used to track changes to data over time, identify trends, and generate reports. There are two main ways to send change records to BigQuery: as a set of change logs, or by mirroring your data on BigQuery for large-scale analytics.

Event-based applications

Leverage change streams to trigger downstream processing of certain events, for example, in gaming, to keep track of player actions in real time. This can be used to update game state, provide feedback to players, or detect cheating. Retail customers leverage change streams to monitor catalog changes like pricing or availability to trigger updates and alert customers.

Migration and multi-cloud

Capture Bigtable changes for multi-cloud or hybrid cloud scenarios. For example, leverage Bigtable HBase replication tooling and change streams to keep your data replicated across clouds or on-premises databases. This topology can also be leveraged for online migrations to Bigtable without disruption to serving activity.

Compliance

Compliance often refers to meeting the requirements of specific regulations, such as HIPAA or PCI DSS. Retaining the change log can help you demonstrate compliance by providing a record of all changes that have been made to your data.
This can be helpful in the event of an audit or if you need to investigate a security incident.

Learn more

Change streams is a powerful feature providing additional capability to actuate your data on Bigtable to meet your business requirements and optimize your data pipelines. To get started, check out our documentation for more details on Bigtable change streams, along with these additional resources:

Expanding your Bigtable architecture with change streams
Process a Bigtable change stream tutorial
Create a change stream-enabled table and capture changes quickstart
Bigtable change streams code samples
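As a closing illustration of the two BigQuery delivery modes mentioned in the use cases above, the "mirroring" approach amounts to replaying an ordered changelog to reconstruct current row state. This is a minimal sketch under a hypothetical, simplified change-record shape, not the actual change stream format.

```python
# Illustrative sketch: keeping the raw changelog vs. mirroring current state.
# Mirroring replays ordered change records into a row_key -> values map.
# The change-record shape here is hypothetical and simplified.

def mirror_from_changelog(changelog: list) -> dict:
    """Replay an ordered changelog into a mirror of the latest row values."""
    state = {}
    for change in changelog:
        row = state.setdefault(change["row_key"], {})
        if change["op"] == "set":
            row[change["column"]] = change["value"]
        elif change["op"] == "delete":
            row.pop(change["column"], None)  # no-op if the column is absent
    return state
```

Keeping the changelog itself preserves full history for auditing, while the replayed mirror supports large-scale analytics on the current state.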
Source: Google Cloud Platform