Contact Center AI reimagines the customer experience through full end-to-end platform expansion

Providing best-in-class customer service is crucial to the success of your business. Contact centers are a critical touchpoint: they have to balance representing your brand with prioritizing customer care. When your customers seek help and support, they expect efficient service that is accessible through modern voice and digital channels. In short, customer expectations are rising, and that’s a problem if your contact center infrastructure and solutions are becoming outdated.

All of these factors are why today we’re announcing Google Cloud Contact Center AI Platform, an expansion to Contact Center AI that offers an out-of-the-box, end-to-end solution for the contact center. It brings together the advantages of AI, cloud scalability, multi-experience capabilities, and tight integration with customer relationship management (CRM) platforms to unify sales, marketing, and support teams around data across the customer journey.

Improving customer experiences from all angles

Google Cloud’s Contact Center AI helps you leverage AI to scale your contact center interactions while maintaining a high level of customer satisfaction. Over the last two years, we have built a large group of partners, including the largest contact center and customer experience ISVs and our system integrator ecosystem, to bring Contact Center AI to customers. Today, we are helping enterprises across industries and geographies cost-effectively reimagine contact center experiences. For example, Marks & Spencer reduced in-store call volume by 50% and The Home Depot improved call containment by 185%, all while significantly increasing customer self-service engagement.

Adding to our Contact Center AI capabilities, Contact Center AI Platform is purpose-built for customer relationship management, extending your ability to offer personalized customer experiences that are consistent across your brand, whether delivered through a virtual agent, a human agent, or a combination of both. It eliminates many long-running pain points, from managing data fragmentation to replacing rigid customer experience flows with more engaging, personalized, and flexible support. With this addition, Contact Center AI now lets you:

Orchestrate the customer journey by creating modern experiences that can be embedded in customers’ chosen channels with mobile and web software development kits (SDKs), compatible with iOS and Android.
Leverage the CRM as a single source of insight into the customer experience to unify content, increase personalization, and automate processing with CRM data unification.
Manage multiple channels without pivoting across voice, SMS, and chat support.
Predict customer needs and route calls appropriately with AI-driven routing, based on both historical CRM data and real-time interactions.
Automate scheduling, monitor schedule adherence, and manage employee scheduling preferences with Workforce Optimization (WFO) integration.
Provide customers with self-service via web or mobile interfaces using Visual Interactive Voice Response (IVR).

Helping you do more with contact centers

The addition of Contact Center AI Platform gives your partners the ability to integrate with Contact Center AI, so you can enjoy a more seamless experience operating your customer service center, with a complete view of the customer in a single workspace that includes real-time AI intelligence, native agent call controls, and real-time call transcription.
For example, we are expanding our partnership with Salesforce to integrate Contact Center AI with Service Cloud Voice to deliver a unified Service Cloud agent console and Customer 360. “Customers are continually raising their service expectations, and our research tells us 79% of consumers believe the experience a company provides is as important as its products and services,” said Ryan Nichols, SVP & GM, Contact Center, for Salesforce Service Cloud. “Through intelligence, workflows, and a deeper understanding of the customer, Salesforce’s Service Cloud Voice paired with Google’s Contact Center AI will empower agents with a seamless experience to help them wow customers.”

We are also excited to partner with UJET, an innovative and experienced Contact Center as a Service (CCaaS) provider. UJET offers secure, user-centric design, scalability, and a mobile-focused solution, with turnkey implementation, strong omnichannel capabilities, and a best-in-class user experience, making its product a natural fit for Google’s contact center vision. To learn more about the partnership, see here.

Delivering impact for customers

Contact Center AI is already making a difference for our customers, such as OneUnited Bank, the largest Black-owned bank in the U.S. “OneUnited Bank has been in partnership with Google Cloud and UJET, as well as a long-standing customer of Salesforce. The expansion and enhancements of Google Cloud’s Contact Center AI, along with its deeper integration with Salesforce, means better return on investment as we drive towards evolving our contact center to deliver exceptional client experiences,” said Teri Williams, President and Chief Operating Officer at OneUnited Bank.

Fitbit, which boasts more than 29 million active users, is also reaping the benefits. “Fitbit relies on Google Cloud and UJET to provide support to our customers with a mobile-first approach. This collaboration, in combination with a strong Salesforce integration, has helped us modernize our entire customer support experience,” said Cassandra Johnson, VP, Devices & Services Customer Care & Vendor Management Office, at Google.

According to industry analyst Sheila McGee-Smith of McGee-Smith Analytics, “Google Cloud’s Contact Center AI is already a force in the contact center industry thanks to its early focus on AI for customer experience.” She continued, “Through their partnerships with UJET and Salesforce, as well as these expanded capabilities, Google Cloud’s Contact Center AI Platform will help define the future of customer service by powering more secure, engaging, and personalized customer experiences.”

Contact Center AI Platform is supported by a host of integration partners, including Accenture, CDW, Cognizant, Deloitte, HCL, IBM, Infosys, Quantiphi, Tata Consultancy Services, and Wipro. We will also continue to partner closely with the contact center and customer experience (CX) ISVs that our customers already rely on. If you already have a contact center solution provider, you can still integrate Google Cloud’s Contact Center AI into your existing environment.

To learn more about how you can leverage the power of AI to reimagine your contact center experience, visit our Contact Center AI page.
Source: Google Cloud Platform

Go 1.18 and Google Cloud: Go now with Google Cloud

On March 15th, the Go team announced the general availability of Go 1.18, the latest release of the Go programming language. The culmination of more than a decade of design, it delivers the features developers have been asking for: generics, fuzzing, and module workspaces. With this release, Go becomes the first major language to integrate fuzz testing into its core toolchain without relying on third-party support, further establishing Go as a preferred language for developing secure applications.

Go was created at Google in 2007 and designed to help developers build fast, reliable, and secure software. Unlike traditional languages, Go was built for the modern multi-core computing world, and it has emerged as a modern language for developing cloud applications, services, and infrastructure. Today Go powers several of Google’s largest products and is used by many customers to scale their businesses. Organizations big and small love Go, and the community of Go developers, known as “gophers,” has grown into a global network of more than 2 million users worldwide.

Using the power of Go in the Cloud

Looking at public repos, over 75% of CNCF projects, including Kubernetes and Istio, are written in Go, and 10% of developers worldwide are writing in Go (as of May 2021). Google delivers high-performance infrastructure to run key cloud-native open source projects. Our modern cloud infrastructure is based on Kubernetes at its core, and our strong support for Istio and Knative forms the base of some of our leading services, such as Google Kubernetes Engine (GKE), our managed application platform with Anthos, Cloud Functions, and Cloud Run. Google uses Go extensively for a wide range of applications, from the indexing platform that powers Google Search, to the server-side optimizations that power Chrome’s 1B+ users, to the infrastructure on which Google Cloud is built.

Release Highlights

Generics are the biggest change to Go since the language was created. Go developers told us they felt Go lacked critical features, with generics being the main missing piece. With Go 1.18, new and existing Go developers can take advantage of the productivity, performance, and maintenance benefits that generics can bring. Even during the short beta period we began to see new kinds of libraries and projects gophers are building with generics, and we expect this creativity to grow as time goes on. A minimal sketch of what generic code looks like appears at the end of this section.

This release also brings native support for fuzzing. Fuzzing is a type of vulnerability testing that throws arbitrary data at a piece of software to expose unknown errors, and it is emerging as a common testing scheme in enterprise development. Go is now the first major language to provide fuzzing support with no third-party integrations necessary, allowing developers to start building secure software with minimal additional cost. Go’s approach to fuzzing provides not only security for current code but also ongoing protection as code and dependencies evolve. With attacks on software becoming more common and complex, vulnerability detection can be a critical part of the enterprise development lifecycle, and Go’s fuzzing capabilities catch vulnerabilities earlier in that lifecycle. A sample fuzz target also appears below.
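To give a feel for the new syntax, here is a minimal, self-contained sketch of generic Go code. The Number constraint and the Sum and Map functions are illustrative examples written for this summary, not code from the Go release itself.

```go
package main

import "fmt"

// Number is a type constraint satisfied by common numeric types.
// The ~ also admits named types whose underlying type matches (e.g. type Celsius float64).
type Number interface {
	~int | ~int64 | ~float64
}

// Sum adds the elements of a slice of any Number type.
func Sum[T Number](values []T) T {
	var total T
	for _, v := range values {
		total += v
	}
	return total
}

// Map applies fn to every element of in, for any pair of element types.
func Map[T, U any](in []T, fn func(T) U) []U {
	out := make([]U, 0, len(in))
	for _, v := range in {
		out = append(out, fn(v))
	}
	return out
}

func main() {
	fmt.Println(Sum([]int{1, 2, 3}))      // 6
	fmt.Println(Sum([]float64{1.5, 2.5})) // 4
	fmt.Println(Map([]int{1, 2, 3}, func(i int) string { return fmt.Sprintf("#%d", i) })) // [#1 #2 #3]
}
```

Before 1.18, each of these helpers would have needed a separate copy per element type or a reflection-based workaround; with generics, one definition covers them all.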
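And here is a minimal sketch of a native fuzz target, modeled on the pattern used in the Go fuzzing documentation; the Reverse function and the properties it checks are illustrative, not part of this announcement. Running `go test -fuzz=FuzzReverse` lets the engine mutate the seed inputs and report any input that breaks the checks.

```go
// File: reverse_fuzz_test.go — fuzz targets live in *_test.go files.
package reverse

import (
	"testing"
	"unicode/utf8"
)

// Reverse returns s with its runes in reverse order.
func Reverse(s string) string {
	r := []rune(s)
	for i, j := 0, len(r)-1; i < j; i, j = i+1, j-1 {
		r[i], r[j] = r[j], r[i]
	}
	return string(r)
}

// FuzzReverse checks two properties for arbitrary inputs:
// reversing twice round-trips, and reversing valid UTF-8 yields valid UTF-8.
func FuzzReverse(f *testing.F) {
	// Seed corpus entries give the fuzzing engine a starting point.
	for _, seed := range []string{"", "hello", "Gopher!"} {
		f.Add(seed)
	}
	f.Fuzz(func(t *testing.T, s string) {
		if !utf8.ValidString(s) {
			t.Skip("ignore invalid UTF-8 inputs")
		}
		if got := Reverse(Reverse(s)); got != s {
			t.Errorf("Reverse(Reverse(%q)) = %q, want %q", s, got, s)
		}
		if !utf8.ValidString(Reverse(s)) {
			t.Errorf("Reverse(%q) produced invalid UTF-8", s)
		}
	})
}
```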
Build securely using Go

At Google, we are helping to make open source software secure. Open source software is connective tissue for much of the online world, and we have been working to raise awareness of the state of open source security and are committed to helping secure the software supply chain for organizations.

Go was designed for building secure applications, helping to minimize risk as much as possible. Go applications compile down to a single binary without local dependencies, and it’s not uncommon to see an application built using only the standard library or a couple of well-vetted Go dependencies. Go’s dependency management uses a tamper-evident transparency log, with built-in tooling that helps ensure your dependencies are what you expect. Go has native encryption in its standard library, which is used across much of the internet, including key components of Google. Go even supports distroless containers, where there are zero local dependencies to worry about. Google Cloud products like Cloud Build (for CI/CD) and Artifact Registry (for container management) have direct access to Go’s vulnerability database and can give you instant warnings about security threats.
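As a small illustration of that standard-library cryptography, here is a minimal sketch of an HTTPS client built with nothing but Go’s built-in net/http and crypto/tls packages; the target URL and the TLS version floor are illustrative choices made for this summary, not settings prescribed by the release.

```go
package main

import (
	"crypto/tls"
	"fmt"
	"net/http"
	"time"
)

func main() {
	// TLS comes from the standard library; no external crypto dependency is needed.
	client := &http.Client{
		Timeout: 10 * time.Second,
		Transport: &http.Transport{
			TLSClientConfig: &tls.Config{
				MinVersion: tls.VersionTLS12, // refuse anything older than TLS 1.2
			},
		},
	}

	resp, err := client.Get("https://go.dev")
	if err != nil {
		fmt.Println("request failed:", err)
		return
	}
	defer resp.Body.Close()

	// resp.TLS describes the negotiated connection (version, cipher suite, and so on).
	fmt.Printf("status %s over TLS version 0x%04x\n", resp.Status, resp.TLS.Version)
}
```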
“At Google we are committed to helping to secure the online infrastructure and applications upon which the world depends. A critical aspect of this mission is being able to understand and verify the security of open source dependency chains. The 1.18 release of Go is an important step towards helping to ensure that developers are able to build secure applications, understand risk when vulnerabilities are discovered, and reduce the impact of cybersecurity attacks,” said Eric Brewer, VP Infrastructure and Google Fellow.

This launch is a significant milestone for Go that helps developers around the world build more performant and secure applications that run on any infrastructure. For more information on this release and how to get started with Go, please visit go.dev.

Source: Google Cloud Platform

Azure HBv3 VMs for HPC now generally available with AMD EPYC CPUs with AMD 3D V-Cache

Azure HBv3 virtual machines (VMs) are now upgraded to, and generally available with, 3rd Gen AMD EPYC™ processors with AMD 3D V-Cache™ technology, formerly codenamed “Milan-X”, in the Azure East US, South Central US, and West Europe regions. In addition, we are announcing that HBv3 VMs will also soon come to the Central India, UK South, China North 3, Southeast Asia, and West US 3 Azure regions. Customers can view the estimated time of arrival for these new regions at Azure Availability by region.

To access these enhanced CPUs, customers need only deploy new HBv3 VMs, as all VM deployments from today onward will occur on machines featuring the new processors. Existing HBv3 VMs deployed prior to today’s launch will continue to run on 3rd Gen AMD EPYC processors, formerly codenamed “Milan”, until they are deallocated and a customer creates new VMs in their place.

Significant performance upgrade for all HBv3 customers

As previously detailed, EPYC processors with AMD 3D V-Cache can significantly improve the performance, scaling efficiency, and cost-effectiveness of a variety of memory-performance-bound workloads such as CFD, explicit finite element analysis, computational geoscience, weather simulation, and silicon design register transfer level (RTL) workflows.

Compared to the performance the HBv3-series delivered prior to the upgrade to the new processors, customers will experience up to:

80 percent higher performance for CFD.
60 percent higher performance for EDA RTL.
50 percent higher performance for explicit FEA.
19 percent higher performance for weather simulation.

HBv3-series VMs retain their existing pricing and do not require changes to customer workloads. No other changes are being made to the HBv3-series VM sizes customers already know and rely on for their critical research and business workloads. For more information, please see the official documentation for the Azure HBv3-series virtual machines.

The highest performance, most cost-effective cloud HPC

Based on testing of a broad array of customer HPC workloads against the best publicly demonstrated performance from other major cloud providers, Azure HBv3-series VMs with 3rd Gen AMD EPYC processors with AMD 3D V-Cache and InfiniBand from NVIDIA Networking deliver 2.23-3.88 times higher performance.

Figure 1: Relative at-scale workload performance in CFD, molecular dynamics, and weather simulation.

For more performance, scalability, and cost information, see our detailed blog here.

Continuous improvement for Azure HPC customers

Microsoft and AMD share a vision for a new era of high-performance computing in the cloud, one defined by continuous improvements to the critical research and business workloads that matter most to our customers. Azure has teamed with AMD to make this vision a reality by raising the bar on the performance, scalability, and value we deliver with every release of Azure HB-series virtual machines.

Figure 2: Azure HB-Series virtual machine generational performance improvement.

“Rescale is excited to see the dedication by Microsoft to continually raise the bar, the new Azure HBv3 VMs featuring AMD EPYC™ CPUs with AMD 3D V-Cache™ technology specifically targets memory bandwidth bottlenecks impacting the most widely used commercial CFD codes on the Rescale platform. Preliminary testing has demonstrated a 25 percent performance boost across three of the most common CFD applications and a positive impact on virtually all software running on the upgraded instances,” said Chris Langel, HPC Engineering Manager at Rescale and Mulyanto Poort, VP of HPC Engineering at Rescale. “We are seeing a strong customer demand for “Milan-X” and are excited to offer the updated Azure HBv3 VMs to our customers,” said Ethan Rasa, Senior Director of Strategic Alliances at Rescale.

“Ansys Fluent is the industry-leading computational fluid dynamics tool and our customers are always looking for ways to run larger problems more quickly, or with more granularity.  The super-linear scaling we are seeing with the AMD Milan-X chip on the Azure HBv3 virtual machines will be received with a lot of excitement by our user base across many industries.”—Jeremy McCaslin, Product Manager, Fluids, Ansys

"Customers who require high-fidelity production simulations in demanding industries rely on Siemens Simcenter STAR-CCM+ software,” said Patrick Niven, Senior Director of Fluid and Thermal Product Management, Siemens Digital Industries Software. “Customers usually need those results quickly, so Siemens and Microsoft collaborate to ensure Azure HB-series instances deliver true HPC-class performance. The new Azure HBv3 instances featuring 3rd Gen AMD EPYC™ CPUs with AMD 3D V-Cache™ technology can accelerate simulations by up to 50 percent, so Microsoft can offer Simcenter STAR-CCM+ users cutting-edge performance on an accessible platform.”

Learn more

Azure Docs—HBv3-series Virtual Machines.
Azure HBv3-series with Milan-X processors launch video.
Watch the announcement at the AMD Acceleration Datacenter Premier.
See additional information on performance, scalability, and cost.
Performance and Scalability of HBv3-series with Milan-X processors.
Find out more about high-performance computing in Azure.
AMD Launch Hub: 3rd Gen AMD EPYC with AMD 3D V-Cache.
Azure HPC optimized OS images.
Azure HPC virtual machines.

Source: Azure