What’s new in Azure Data, AI, and Digital Applications: Are you ready to go from GenAI experimentation to solutions deployed at scale?

For many organizations, implementing AI is not an “if” but a “when.” Determining the “when” depends on many factors, but one that can be tricky to scope is “When are we ready?” Is it when your workloads are in the cloud? When your team has the skills? When it’s a good fit for your culture and customers? All of the above, and …?

Prior to working at Microsoft, I was a Microsoft Systems Integrator (SI) partner implementing Microsoft Azure solutions. A big reason we were exclusive with Microsoft was the investment in tooling and expertise Microsoft makes to help customers assess their transformation readiness. Going back to our very beginning, Microsoft’s roots are in helping developers and organizations use technology to take on some of the world’s biggest opportunities and challenges. In this era of AI, we once again find ourselves seeking ways to empower our customers to adopt GenAI widely and at a pace not seen before in the technology industry. In fact, in a recent IDC study, we found that pace is a huge part of the GenAI equation.

As someone who has spent plenty of time at a keyboard building software solutions, I can tell you that having enterprise-ready tooling and seamless integration with the broader cloud services needed to build applications is at the top of the list for delivering on time and with quality, no doubt about it.

We are now in the second year of the era of AI. The first year was full of excitement and experimentation, giving all of us a glimpse into the powerful potential of AI to revolutionize experiences for customers, improve employee productivity, and ignite a sense of wonder. The focus this year is on implementation, realizing value, and discovering where AI can take your organization. Read on for what’s new across the business, with a special focus on updates that will help ensure you’re ready for what’s next.

What’s new in AI readiness

Evaluate apps in Azure AI Studio before deploying

Ensuring an AI-powered app is ready to deploy means evaluating model accuracy and robustness, response quality, scalability, compliance, and other items critical to launching a successful app. With Azure AI Studio, generative AI app developers can build and evaluate applications for safety, reliability, and performance before deploying. If needed, they can fine-tune models and reorchestrate prompt application components. The platform facilitates scalability, transforming proof-of-concepts into full production with ease. And continuous monitoring and refinement supports long-term success. To get a look at this in action, watch our good friend Seth Juarez in this Microsoft Mechanics episode, “Build your own copilots with Azure AI Studio,” to see how evaluation is built into the workflow.
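For readers who want to see the shape of such a pre-deployment check in code, here is a minimal, hypothetical evaluation loop. It is not Azure AI Studio’s built-in evaluation tooling; it simply calls an Azure OpenAI chat deployment with the openai Python package and uses a second call as an LLM judge. The endpoint and key environment variables, the deployment name, the tiny test set, and the 1-to-5 rubric are all assumptions.

```python
# Minimal pre-deployment evaluation sketch (not Azure AI Studio's built-in evaluators).
# Assumes an Azure OpenAI chat deployment named "gpt-4" and the environment variables
# AZURE_OPENAI_ENDPOINT / AZURE_OPENAI_API_KEY.
import os
from openai import AzureOpenAI

client = AzureOpenAI(
    azure_endpoint=os.environ["AZURE_OPENAI_ENDPOINT"],
    api_key=os.environ["AZURE_OPENAI_API_KEY"],
    api_version="2024-02-01",
)

# A tiny hand-written test set; in practice this would come from logs or a JSONL file.
test_set = [
    {"question": "What is our return window?", "expected": "30 days from delivery."},
]

def ask_app(question: str) -> str:
    """Stand-in for the app under test: a single call to the deployed chat model."""
    resp = client.chat.completions.create(
        model="gpt-4",  # deployment name, an assumption
        messages=[{"role": "user", "content": question}],
    )
    return resp.choices[0].message.content

def judge(question: str, expected: str, answer: str) -> int:
    """LLM-as-judge: rate the answer 1-5 for correctness against the expected answer."""
    resp = client.chat.completions.create(
        model="gpt-4",
        messages=[
            {"role": "system", "content": "Rate the answer 1-5 for correctness versus the expected answer. Reply with the digit only."},
            {"role": "user", "content": f"Q: {question}\nExpected: {expected}\nAnswer: {answer}"},
        ],
    )
    return int(resp.choices[0].message.content.strip()[0])

scores = [judge(t["question"], t["expected"], ask_app(t["question"])) for t in test_set]
print(f"mean score: {sum(scores) / len(scores):.2f}")  # gate the rollout on a threshold
```

In Azure AI Studio, the same idea is handled for you, with built-in metrics such as groundedness, relevance, and safety computed over much larger test datasets.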

Calling all retailers! AI-ready data solutions in Microsoft Fabric now in public preview

In January we unveiled new generative AI and data solutions across the shopper journey, offering copilot experiences through Microsoft Cloud for Retail. A new retail industry data model can be used for data governance, reporting, business intelligence, and advanced analytics. A data connector brings e-commerce data from Sitecore OrderCloud into Microsoft Fabric in real time. Analytics templates, such as frequently bought together, provide actionable, data-driven recommendations to help retailers improve product upselling and shelf optimization. New copilot templates on Azure OpenAI Service allow retailers to build personalized shopping experiences and support store operations. Add to that new copilot features in Microsoft Dynamics 365 Customer Insights and the launch of Retail Media Creative Studio in the Microsoft Retail Media Platform, and Microsoft Cloud for Retail now offers retailers even more options for infusing copilot experiences throughout the shopper journey. Learn more.

Advancing through the LLMOps Maturity Model: A roadmap for generative AI operational excellence 

The latest in our ongoing series on LLMOps for business leaders delves into how to use the LLMOps Maturity Model to methodically advance from theoretical to practical with powerful generative AI models. The LLMOps Maturity Model is not just a roadmap from foundational LLM utilization to mastery in deployment and operational management; it’s a strategic guide that underscores why understanding and implementing this model is essential for navigating the ever-evolving landscape of AI. This fourth blog in the series offers a practical roadmap for businesses to skillfully navigate the world of generative AI and Large Language Models. It’s all about moving from the basics of using LLMs to mastering deployment and management.

New Azure SQL Database Hyperscale pricing and Azure AI Search integration

AI applications need high-performance databases that can handle large volumes of data and complex queries. Azure SQL Database Hyperscale’s unique architecture provides the needed flexibility and scalability for AI-ready cloud applications of any size and I/O requirement. And new, reduced compute pricing gives you the performance and security of Azure SQL at commercial open-source prices. Learn more.

New, integrated vectorization capability in Azure AI Search means customers can now do vector search using data stored in Azure SQL Database. This new capability opens up new application scenarios for integrating vectors into traditional search as well as Retrieval-Augmented Generation (RAG) applications. Learn more.
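To make that integration concrete, here is a hedged sketch of a vector query using the azure-search-documents Python package (version 11.4 or later). The index name, vector field, and selected fields are assumptions, and the index is presumed to have been populated from Azure SQL Database. With integrated vectorization the service can also embed the query text for you; this sketch supplies a precomputed query embedding to keep the moving parts visible.

```python
# Vector query sketch against Azure AI Search (azure-search-documents >= 11.4).
# Index and field names are assumptions; the index is presumed to be populated
# from Azure SQL Database (for example, through an indexer with a vector field).
import os
from azure.core.credentials import AzureKeyCredential
from azure.search.documents import SearchClient
from azure.search.documents.models import VectorizedQuery

search = SearchClient(
    endpoint=os.environ["SEARCH_ENDPOINT"],  # e.g. https://<name>.search.windows.net
    index_name="products-index",             # assumed index name
    credential=AzureKeyCredential(os.environ["SEARCH_KEY"]),
)

query_embedding = [0.0] * 1536  # in practice, embed the user's query (e.g. with Azure OpenAI)

results = search.search(
    search_text=None,  # pure vector search; pass query text here for hybrid search
    vector_queries=[
        VectorizedQuery(vector=query_embedding, k_nearest_neighbors=5, fields="contentVector")
    ],
    select=["id", "name", "description"],
)
for doc in results:
    print(doc["name"], doc["@search.score"])
```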

Microsoft is a leader in the 2023 IDC MarketScape for AI Governance Platforms  

We are proud of the work we put into ensuring our AI products and services empower you to deploy solutions that are safe, responsible, and effective. So, we are honored to be recognized as a Leader in the inaugural IDC MarketScape Worldwide AI Governance Platforms 2023 Vendor Assessment (doc #US50056923, November 2023). Learn more about this recognition.

Azure Partners: Faster app delivery, cost savings, and funding with Azure Migrate and Modernize and Azure Innovate

Whether your customers are migrating to gain a secure and AI-ready foundation, modernizing their app portfolio, or are ready to build new intelligent apps, it is now easier to bring value to our mutual customers and help them capitalize on the technological innovations transforming every industry.

Achieve faster application delivery times and significant cost savings, and access funding through Azure Migrate and Modernize and Azure Innovate. Why is this so important for getting ready for AI? You must get to the cloud before that innovation can begin!

Formerly the Azure Migration and Modernization Program, Azure Migrate and Modernize is an expanded offering for customer scenarios across apps, data, and infrastructure that includes support for more workloads, streamlined access to specialized partners, incentives to offset migration costs, and security guidance built into every engagement. Azure Innovate is a new offering focused on building new solutions and modernizing existing ones on Azure to meet the demand for AI transformation.

Both offerings provide end-to-end coverage of customer needs, from migration and modernization scenarios to AI innovation, and are built to scale as customer requirements and priorities evolve. Partners providing services, and ISVs building new or modernizing existing applications, have access to assessments, pilot proofs of concept, tooling, funding, and Microsoft expert guidance when needed, all designed to help accelerate the cloud journey and drive growth and impact.

Visit Azure Migrate and Modernize and Azure Innovate for more info. 

Ready for the cloud and AI? Find out with a free Azure Expert Assessment

Looking for the best way to leverage the scale and compute horsepower of the cloud to optimize your IT infrastructure, data, and applications? Want a personal recommendation on that best way? For free? Check out Azure Expert Assessment, a one-to-one offer to collaborate with a Certified Azure Expert who will personally guide you through the assessment and make recommendations for your organization’s cloud adoption plan. You’ll get a clear technical roadmap and a comprehensive business case to support your cloud strategy. You will also get access to best practices, tools, and resources to help you implement your cloud solutions. Sound good? Go: Azure Expert Assessment.

Build your AI skills

Coming soon: Industry AI Implementation Workshops for partners

A new series of AI workshops to help partners implement industry-specific AI solutions is launching soon. The goal is to inform, educate, and accelerate our industry-specific partners on generative AI and equip them with what’s needed to go from concept to market. Workshop benefits include:

Architectural guidance supporting customer/partner adoption of our Generative AI stack (Azure OpenAI, Copilot, etc.)

An approach to Code ‘For’ / Code ‘With’ depending on the relationship

Offer feature requests/improvements to our horizontal platforms (Azure OpenAI, Copilot for Microsoft 365, Copilot for Dynamics 365, etc.)

The Retail workshop is currently in pilot, with Manufacturing, Healthcare, and Sustainability following soon. Stay updated on the Industry AI Workshops.

Workshop: Get started with Responsible AI

The Responsible AI framework helps AI developers identify and mitigate risks and harms that could impact people, businesses, and society. This hands-on workshop gives participants experience using Responsible AI to debug a machine learning model and improve its performance so it is more fair, inclusive, safe, reliable, and transparent. Learn more on the Responsible AI Framework.

Opportunities to connect

Register for the Microsoft Fabric Community Conference

Unifying data from across a sprawling infrastructure is critical to AI readiness. Microsoft Fabric helps you connect and curate data from anywhere, apply powerful analytics, and share insights across your organization, all while governing and protecting your data. Come see for yourself: join us at the Microsoft Fabric Community Conference in Las Vegas March 26-28 and see firsthand how Fabric and the rest of our data and AI products can help your organization prepare for the era of AI. You’ll hear from leading Microsoft and community experts from around the world and get hands-on experience with the latest features from Microsoft Fabric, Power BI, Azure Databases, Azure Databricks, Azure AI, Microsoft Purview, and so much more. Use discount code MSCUST to save $100. Register today.

Azure Cosmos DB Conf 2024—Call for speakers

Azure Cosmos DB is the database for the era of AI and modern app development. From ChatGPT to TomTom Digital Cockpit—an immersive, conversational in-car infotainment system—Azure Cosmos DB powers responsive and intelligent apps with real-time data, ingested and processed at any scale.

There are many, many examples of developers building innovative apps with Azure Cosmos DB, and if you’re one of them, we invite you to showcase your work at Azure Cosmos DB Conf 2024 on April 16. This free, virtual developer event is co-organized by Microsoft and the Azure Cosmos DB community. This year’s theme is “Building the next generation of AI Apps with Azure Cosmos DB.” We seek stories about customers using Azure Cosmos DB and AI to power next-gen intelligent apps, focusing on unique customer scenarios/use cases, and on integrating with open-source APIs like MongoDB, PostgreSQL, and Apache Cassandra. Customer demo sessions showcasing innovative AI use cases will get priority. This call for speakers is open until Feb. 15. Get the details and submit a session.

AI-era-ready customers are making transformative moves

Walmart unveils new generative AI-powered capabilities for shoppers and associates

Microsoft and Walmart established a strategic partnership in 2018 that has accelerated innovation on several fronts. At CES, Walmart unveiled the latest result of our ongoing collaboration: an all-new generative AI-powered search function that works across iOS, Android, and Walmart’s own website. The new capability is specifically designed to understand the context of a customer’s query and generate personalized responses. Soon, customers will have a more interactive and conversational experience, get answers to specific questions, and receive personalized product suggestions. Read the story.

Windstream uses Azure OpenAI Service to empower employees and transform business outcomes

Windstream, a leading telecommunications services provider in the U.S., was ready for the era of AI and has fully embraced it to revolutionize operations. The company uses Azure OpenAI Service to extract valuable insights from customer interactions and call transcripts to improve customer service, and to analyze technical information and error codes, transforming them into customer-friendly messages that tell users about issues and expected resolution times. Windstream also uses Azure AI, including cognitive search and OpenAI’s Davinci model, to index its vast amount of internal social media data and documents, including approximately 100,000 indexed documents on its Confluence wiki. This indexed data is made available to Windstream’s custom-built GPT (Generative Pre-trained Transformer) platform, hosted within Azure Kubernetes Service (AKS). This allows employees to access indexed knowledge and answer questions, leading to increased efficiency and more informed decision-making. And that’s not all. Read more about Windstream’s AI story.

Ally Financial empowers customer service associates to focus on human engagement by using Azure OpenAI Service

Ally, a digital financial services firm, wanted to give its call-center customer service associates more time to spend with customers. Those associates had to write detailed notes after customer calls, taking time away from customers. To free up that time while maintaining detailed call documentation, Ally used Microsoft Azure and Azure OpenAI Service to automate note-taking. Now associates can quickly review the AI-generated summary after each call and turn their attention back to serving customers. This solution cut associates’ post-call effort by 30 percent. As Ally improves call summary accuracy, which is already over 85 percent, it expects to reduce associate post-call effort by 50 percent. Read more on Ally’s AI integration.
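Ally’s system is its own, but the underlying pattern is straightforward: send the call transcript to an Azure OpenAI chat deployment with a summarization prompt and let the associate review the draft. The sketch below is a generic illustration of that pattern, not Ally’s implementation; the deployment name, prompt, and transcript source are placeholders.

```python
# Hypothetical post-call summarization sketch (not Ally's actual implementation).
# Assumes an Azure OpenAI chat deployment and standard environment variables.
import os
from openai import AzureOpenAI

client = AzureOpenAI(
    azure_endpoint=os.environ["AZURE_OPENAI_ENDPOINT"],
    api_key=os.environ["AZURE_OPENAI_API_KEY"],
    api_version="2024-02-01",
)

transcript = open("call_transcript.txt").read()  # placeholder transcript source

resp = client.chat.completions.create(
    model="gpt-35-turbo",  # deployment name, an assumption
    messages=[
        {
            "role": "system",
            "content": (
                "Summarize this customer call for the case notes: reason for the call, "
                "actions taken, and any follow-up owed to the customer. Be concise."
            ),
        },
        {"role": "user", "content": transcript},
    ],
    temperature=0.2,
)
print(resp.choices[0].message.content)  # associate reviews and saves the draft summary
```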

Microsoft and Cognite build industrial data operations platform on Microsoft Fabric and Azure OpenAI Service

Another partnership transforming industry with AI is our partnership with Cognite, which recently expanded to converge enterprise and industrial data operations into a scalable, AI-driven platform that meets the demands of modern industries from the shop floor to the top floor. The solution integrates the flagship Cognite Data Fusion with Microsoft Fabric to deliver a unified enterprise DataOps solution with capabilities for vertical industrial data workloads enabled through copilots. Customers can use Cognite Data Fusion to drive decisions in asset-centric scenarios, such as asset performance optimization, and Fabric to generate insights to run their business. Read more on Cognite’s story.

How Microsoft’s AI screened over 32 million candidates to find a better battery—in 80 hours

In an awe-inspiring example of how AI can reshape our world, the Microsoft Quantum team used advanced AI to screen over 32 million candidates to discover and synthesize a new material that holds the potential for better batteries—the first real-life example of many that will be achieved in a new era of scientific discovery driven by AI. Joining forces with the Department of Energy’s Pacific Northwest National Laboratory (PNNL), the team accomplished in days something that would have taken traditional science and lab experimentation multiple lifespans to achieve. The discovery is important for many reasons. Solid-state batteries are safer than traditional liquid or gel-like lithium batteries and provide more energy density. Lithium is scarce, expensive, and environmentally and geopolitically problematic. Creating a battery that reduces lithium requirements by approximately 70% could have tremendous environmental, safety, and economic benefits. This achievement is a glimpse at how the convergence of high-performance computing (HPC) and AI is accelerating scientific discovery across industries. Microsoft puts the power of these breakthroughs into customers’ hands with our Azure Quantum Elements platform. Read the AI breakthrough battery story.

Are you ready for the era of AI?

With an average return of $3.50 for every $1 invested and firms beginning to see returns in just 14 months, as reported by IDC,1 AI is rerouting technology roadmaps everywhere as the focus sharpens around when to deploy AI-powered solutions. The era of AI dramatically expands that next horizon and moves it much, much closer, accelerating transformation and value realization timelines.

If you are assessing your organization’s readiness milestones on a revised roadmap, include room for experimentation and the unknown. With AI, there’s what you want it to do, and in many cases, what more you realize it can do once you begin implementation. Not quite serendipity, but close.

Microsoft Azure was purpose-built with your limitless innovation in mind. With the most comprehensive and responsible AI toolset in the market, the most advanced supercomputer in the world, and a team of the best and brightest on hand to help you plan and execute, Microsoft is the trusted partner to empower you to make the most of the AI era.

1. IDC study: The Business Opportunity of AI (microsoft.com)

The post What’s new in Azure Data, AI, and Digital Applications: Are you ready to go from GenAI experimentation to solutions deployed at scale? appeared first on Azure Blog.
Source: Azure

Reflecting on 2023—Azure Storage

The beginning of a new year often prompts reflection along with planning for the future. At the forefront of our priorities lies the commitment to enhance the Azure platform and its ecosystem, fuel groundbreaking AI solutions, and facilitate seamless migrations to the cloud. We achieve this through purpose-built storage solutions tailored to meet the unique demands of your workloads.

2023 was an emblematic year of growth for storage and data services with AI at the top of mind for many customers. Azure customers’ overall data estate continued to grow, powered by the emergence of new workloads and access patterns. Notable Azure Storage figures include:

The storage platform now processes more than 1 quadrillion (that’s 1000 trillion!) transactions a month with over 100 exabytes of data read and written each month. Both numbers were sharply higher compared to the beginning of 2023.

Premium SSD v2 disks, our new general-purpose block storage tailored for SQL and NoSQL databases and SAP, grew capacity by 100 times.

The total transactions of Premium Files, our file offering for Azure Virtual Desktop (AVD) and SAP, grew by more than 100% year over year (YoY).

In 2023, advancements and investments were made in five strategic areas aligning with customer workload patterns, cloud computing trends, and the evolution of AI.

Focused innovations for new workloads

We’ve made advancements in offering end-to-end solutions with unique storage capabilities, tailored to help customers bring new workloads to Azure, without the need to retrofit them.

As customers modernize applications with Kubernetes, we have observed workloads moving from “stateless” to “stateful.” Azure Container Storage is custom designed to meet the needs of hosting stateful workloads where storage is tightly integrated with containers, and it offers outstanding data management, lifecycle management, price-performance, and scale. Azure Container Storage is the industry’s first platform-managed container-native storage service in the public cloud. It offers a path to manage persistent volumes and the backend storage options entirely through the Kubernetes-native experience.

Microsoft’s Azure Elastic SAN is a fully managed storage area network (SAN) service that simplifies deploying, scaling, managing, and configuring a SAN in the cloud. It is designed to address the service-management challenges customers face when migrating a large-scale SAN appliance to the cloud. We are the first in the market to provide a fully managed SAN offering in the cloud, with built-in cloud scale and high availability options. It adopts a SAN-like hierarchy, provisioning resources at the top-level resource, dynamically sharing them across workloads, and managing security policies at the workload level. In addition, it enables customers to bring new data-intensive workloads to Azure VMware Solution, using Elastic SAN to serve their storage needs. Elastic SAN and Azure Container Storage, both in public preview, showcase a paradigm shift by simplifying service management, purpose-built for cloud-native storage scenarios. General availability is planned for the upcoming months.

AI is leading a fresh wave of innovation powered by data. Azure Blob Storage stands at the forefront of this data explosion, providing excellent performance and high bandwidth access for all AI training needs. Azure Premium Block Blob, generally available since 2019, provides extremely low latency at competitive pricing for AI and machine learning (ML), IoT and streaming analytics, and interactive workloads. We also released Azure Managed Lustre in July which offers a fully managed distributed parallel file system tailored for high performance computing (HPC) and AI training workloads.

OneLake (the storage layer for Microsoft Fabric) is powered by the scale, performance, and rich capabilities of Azure Data Lake Storage (ADLS). ADLS has been a game changer for analytics with differentiated storage features, including native file system capabilities that enable atomic metadata operations, resulting in a significant performance boost, especially when used with Azure Premium Block Blob. Multi-protocol access is another unique capability that allows customers to work with their data using a variety of access protocols (including REST, SFTP, and NFS v3), eliminating data silos, costly data movement, and duplication.
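As a small illustration of those hierarchical-namespace semantics, the sketch below uses the azure-storage-file-datalake Python package to create a directory, upload a file, and list paths; the account URL, filesystem name, and paths are assumptions.

```python
# Sketch of ADLS Gen2 file-system operations (azure-storage-file-datalake package).
# The account URL, container (filesystem), and path names are assumptions.
import os
from azure.storage.filedatalake import DataLakeServiceClient

service = DataLakeServiceClient(
    account_url="https://<account>.dfs.core.windows.net",  # placeholder account
    credential=os.environ["ADLS_ACCOUNT_KEY"],
)
fs = service.get_file_system_client("training-data")

# Directory creation and renames are atomic metadata operations on a hierarchical namespace.
directory = fs.get_directory_client("raw/2024-02-01")
directory.create_directory()

file_client = directory.get_file_client("events.jsonl")
file_client.upload_data(b'{"event": "page_view"}\n', overwrite=True)

for path in fs.get_paths(path="raw"):
    print(path.name, path.is_directory)
```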

Optimizations for mission critical workloads

We continue to add differentiated capabilities to enable enterprises to optimize their mission critical workloads hosted on Azure.  

With the general availability of Premium SSD v2, we have been actively onboarding solutions focused on mission critical data. Gigaom highlighted the superior performance and reduced total cost of ownership of SQL Server 2019 running on Azure Virtual Machines (VMs) with Premium SSD v2. Similar benefits apply to other database workloads like PostgreSQL. We will continue enhancing Premium SSD v2 capabilities and expand regional coverage in the upcoming year.

Azure Blob Storage powers data-intensive mission critical workloads globally, including some of the largest AI model training and inferencing. Cloud-native companies host exabytes of consumer-generated data and rely on the platform’s scale-out capabilities. Major retailers store trillions of user activities to power recommendation engines and inventory systems. The Cold storage tier, released in 2023, enables cost-efficient retention of this data.

Enterprises migrate Windows Server workloads to the cloud for modernization, supported by enhanced Azure Files and Azure File Sync capabilities. The preview of Microsoft Entra ID (previously named Azure Active Directory) support for the Azure Files REST API enables modern applications to use identity-based authentication and authorization for accessing SMB file shares. Azure Files offers comprehensive authentication support, including on-premises AD and Microsoft Entra ID for both Kerberos and REST. Unique to Azure, we also offer Microsoft Entra ID Kerberos authentication, enabling seamless migration from on-premises Windows servers to Azure that keeps your data and permissions intact while eliminating complex domain-join setups in the cloud. Enterprises migrating mission-critical applications such as SAP and Oracle rely on Azure NetApp Files (ANF) to meet their performance and data protection requirements. In strategic partnership with NetApp, we introduced the public preview of ANF large volumes up to 500 TiB, addressing the needs of workloads that require larger capacity under a single namespace, such as HPC in electronic design automation and oil and gas applications, and the public preview of Cool Access, improving the TCO for infrequently accessed data.

Efficient data migration at scale is critical for successful cloud workload migration. We offer a comprehensive suite of migration solutions. Azure Storage Mover, a fully managed data migration service, improves the overall migration experience for large data sets. Azure File Sync allows hybrid synchronization between on-premises Windows Servers and Azure Files. AzCopy, a command-line utility, enables quick script-based migration jobs, while Azure Data Box facilitates offline data transfer.

Expanding partner ecosystem

We transition customer workloads to Azure in collaboration with our partners, who complement our platform capabilities with innovative solutions. We have fostered numerous successful strategic partnerships and are delighted to see the partner community investing in exclusive Azure solutions.

Qumulo announced the next version of Azure Native Qumulo Scalable File Service, exclusive to Azure and built to leverage Azure Blob Storage’s scale and cost efficiency. This service supports exabytes of data under a single namespace, surpassing other public cloud file solutions, and facilitates high-scale enterprise network attached storage (NAS) data migration to Azure.

Commvault unveiled a purpose-built cyber resilience solution, exclusively on Azure. It empowers users to predict threats faster, achieve clean recoveries, and accelerate response times. It integrates seamlessly with Microsoft Azure OpenAI Service.

Since the launch of Pure Storage Cloud Block Storage offering on Azure in 2021, we have partnered closely with Pure Storage. They use unique Azure Storage features such as shared disks and were early adopters of Ultra and Premium SSD v2 disk storage. Our collaboration has expanded to containers through Portworx and the Azure VMware Solution.

Besides storage and backup partners, we collaborate with Atempo, Data Dynamics, Komprise, and Cirrus Data to deliver storage migration solutions, providing access to industry-leading file and block solutions to migrate petabytes of NAS and storage area network (SAN) data to Azure at no additional charge.

Industry contributions

We actively participate in industry organizations such as the Storage Networking Industry Association (SNIA) and the Open Compute Project (OCP), collaborating with industry leaders to share insights, influence standards, and contribute to the development of innovative solutions and best practices.

At the 2023 Storage Developer Conference, we presented Massively Scalable Storage for Stateful Containers on Azure, emphasizing the synergy between Azure Container Storage and Elastic SAN. This combination offers unified volume management across diverse storage backends and delivers a highly available block storage solution designed to scale to millions of input/output operations per second (IOPS) with fast pod attach and detach experience.

At the Flash Memory Summit (FMS) conference, we charted the course for the future of flash as an HDD displacement technology, in alignment with Azure Storage’s long-range plans.

We are committed to sustainability, striving for net-zero carbon goals in the next decade. Through the OCP Sustainability Project, we provide an open framework for the datacenter industry to adopt best practices for reusability and circularity, and we take a leading role on the steering committee to drive progress.

Unparalleled commitment to quality

Our primary focus is on delivering a robust foundation for our storage services, adapting to evolving workloads like the emergence of AI and ML. This ongoing commitment covers reliability, durability, scalability, and performance for our storage solutions, while maintaining a low total cost of ownership (TCO) for Azure customers.

We assure customers of high data durability, backed by substantial investments in infrastructure, hardware, software, and streamlined processes. Our Zone-redundant Storage (ZRS) ensures data durability of at least 99.9999999999% (12 9’s) over a given year—the highest standard among the major cloud service providers (CSPs). Notably, we are the sole cloud provider among the major CSPs maintaining a 0% Annual Failure Rate (AFR) with our block storage offerings since launch.

Our unique ZRS offering provides a simple way to establish highly available solutions across three zones at a fraction of the cost. You can deploy a cross-zonal AKS cluster with persistent volumes hosted on ZRS disks to ensure data durability and availability during zonal outages. Clustered applications like SQL failover cluster instances (FCI) leveraging Windows Server Failover Cluster (WSFC) can also benefit from ZRS, with high resiliency provided out of the box.

In a concerted effort to elevate the customer support experience, we’ve integrated with Azure Copilot, empowering support engineers to troubleshoot more efficiently while delivering quality responses quickly. This has resulted in a noteworthy increase in customer satisfaction scores.

In closing, 2023 has been a year of profound learning and substantial progress. Explore our feature releases in Azure Updates. Our commitment is to empower you with Azure Storage innovations and enable the seamless execution of a diverse range of workloads. We invite you to continue trusting us as we anticipate exciting developments in 2024!
The post Reflecting on 2023—Azure Storage appeared first on Azure Blog.
Source: Azure

Microsoft Cost Management updates—January 2024

Whether you’re a new student, part of a thriving startup, or the largest enterprise, you have financial constraints, and you need to know what you’re spending, where it’s being spent, and how to plan. Nobody wants a surprise when it comes to the bill, and this is where Microsoft Cost Management comes in.

We’re always looking for ways to learn more about your challenges and how Cost Management can help you better understand where you’re accruing costs in the cloud, identify and prevent bad spending patterns, and optimize costs to empower you to do more with less. Here are a few of the latest improvements and updates based on your feedback:

Billing Tags

Exports (preview)

Pricing updates on Azure.com

Help shape the future of Pricing calculator

What’s new in Cost Management Labs

New ways to save money with Microsoft Cloud

New videos and learning opportunities

Documentation updates

Let’s dig into the details.

Billing tags

Tags are a great way to group and allocate costs based on different engineering departments, business units, and organizational hierarchies, for example. There are many tools available in Microsoft Azure to implement your tagging strategy for cost visibility (showback) and accountability (chargeback). You can create Azure policies to ensure that your Azure resources are tagged in a certain way, enable the tag inheritance feature in cost management to use subscription and resource group tags for cost reporting, and create cost allocation rules using tags to manage shared costs, among other actions.
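As a simple illustration of the showback piece, the sketch below applies cost-allocation tags to a resource group with the azure-mgmt-resource Python package; the subscription, resource group name, and tag values are placeholders. With tag inheritance enabled, these tags then appear on the usage records of the resources in that group.

```python
# Sketch: apply cost-allocation tags to a resource group (azure-mgmt-resource package).
# The subscription ID, resource group name, and tag values are placeholders.
import os
from azure.identity import DefaultAzureCredential
from azure.mgmt.resource import ResourceManagementClient

client = ResourceManagementClient(
    DefaultAzureCredential(), os.environ["AZURE_SUBSCRIPTION_ID"]
)

client.resource_groups.create_or_update(
    "rg-payments-prod",
    {
        "location": "eastus",
        "tags": {"cost-center": "CC-1234", "business-unit": "payments", "env": "prod"},
    },
)

# With tag inheritance enabled in Cost Management, these resource group tags flow
# down to the group's resources for showback and chargeback reporting.
```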

We’re very pleased to announce one more capability that you can potentially use to implement your tagging strategy for your cost allocation needs. In addition to tagging Azure resources, you can now also tag billing profiles and invoice sections and use them for cost reporting and analysis by enabling tag inheritance at the billing profile level. If you already have tag inheritance enabled, any tag applied to billing profiles or invoice sections will automatically be inherited.  

Learn more on the application of billing tags.

Exports (preview)

In our year-end blog, I mentioned the opportunity to sign up for the much-improved Exports experience with access to more datasets including reservation recommendations, reservation transactions, reservation details, and price sheets. Now, you can directly enable the new functionality through Cost management labs.

I strongly encourage you to take advantage of this preview and get access to additional datasets along with an improved user experience and new functionality. With features like file partitioning and file overwriting, you can also potentially save on your network and storage costs. We have also added support for the FinOps Open Cost and Usage Specification (FOCUS) dataset, which combines amortized and actual costs, eliminating the need to ingest and process multiple datasets.

For more details, please refer to this article.

Pricing updates for Azure.com

It’s been a few months since we last shared an update on our Azure pricing experiences, and we’re excited to tell you about the new pricing updates we’ve launched. These enhancements aim to streamline your cost estimation process for Azure solutions.

We’ve integrated Azure Savings Plan into the Azure Container Apps service, both on its pricing page and calculator.

The virtual machines selector tool now includes Arm-based burstable virtual machines (VMs).

Our suite of AI-related services has seen numerous pricing updates. This includes Microsoft Azure OpenAI Service (new “whisper” speech model), Microsoft Azure AI Content Safety (jailbreak risk detection and protected material detection features), Microsoft Azure AI Speech (Avatar Text-to-Speech feature and enhanced add-on features for commitment tiers), Microsoft Azure AI Vision (‘video retrieval’ and ‘image retrieval’ features), and Azure AI Search (general availability of the Semantic Ranker feature).

We’ve launched new pages for Microsoft Azure HDInsight on AKS (preview), Microsoft Azure Managed Confidential Consortium Framework (as a pricing page and as a calculator service), Microsoft Fabric (as a calculator service), and Microsoft Azure IoT Operations (pricing page for preview). We’ve also revamped several pages for improved user experience, including Microsoft Azure AI Vision, Microsoft Azure AI Language, Microsoft Azure App Service, and the Microsoft Azure Elastic SAN pricing calculator.

We’ve updated the pricing of recently generally available services and features in Azure, including Microsoft Azure Chaos Studio, the Microsoft Azure Monitor SCOM Managed Instance feature, Azure Communication’s Job router feature, and Microsoft Azure Blob Storage’s Cold tier.

Many other services have seen updates in features and pricing, including Microsoft Azure Virtual Machines, Microsoft Azure Cosmos DB, Microsoft Azure VMware Solution, Microsoft Fabric, Microsoft Azure Red Hat OpenShift, Microsoft Azure SQL Database, Microsoft Azure SQL Managed Instance, Microsoft Defender for Cloud, Microsoft Azure Databricks, Microsoft Azure Database for MySQL, Microsoft Azure Backup, Microsoft Azure Managed Grafana, Microsoft Azure Blob Storage (block blobs), Azure Container Apps, Microsoft Azure Update Manager, Microsoft Azure Active Directory, Microsoft Azure Notification Hubs, and Azure App Configuration.

We continue to strive for improvements in our pricing tools to make them more accessible and user-friendly. We hope these changes assist you in estimating costs for your Azure Solutions. Your feedback and suggestions for future improvements are always welcome.

Help shape the future of Pricing calculator

We know that the Pricing calculator is an important tool for estimating the costs of your Azure resources and planning your cloud spend. We would love to hear from you as we create new designs for the calculator to improve your experience. Please share your feedback in this study and feel free to share it within your organization.

This is an unmoderated study and should take less than 30 minutes.

What’s new in Cost Management Labs?

With Cost Management Labs, you get a sneak peek at what’s coming in Cost Management and can engage directly with us to share feedback and help us better understand how you use the service so we can deliver more tuned and optimized experiences. Here are a few features you can see in Cost Management Labs:

New: Exports (preview)

As mentioned above, use Exports (preview) for additional datasets and functionality.

Currency selection in Cost analysis smart views: View your non-USD charges in USD, or switch between the currencies in which you have charges to view the total cost for that currency only. To change currency, select Customize at the top of the view and select the currency you would like to apply. Currency selection is not applicable to those with only USD charges. Currency selection is enabled by default in Cost Management Labs. If not enabled for you, enable currency selection from the Try preview menu.

Streamlined Cost Management menu: Organize Cost Management tools into related sections for reporting, monitoring, optimization, and configuration settings. You can enable this option from the Try preview menu.

Recommendations view: View a summary of cost recommendations that help you optimize your Azure resources in the cost analysis preview. You can opt in using the Try preview menu.

Forecast in Cost analysis smart views: Show your forecast cost for the period at the top of the Cost analysis preview. You can opt in using Try preview.

Group related resources in Cost analysis smart views: Group related resources, like disks under virtual machines or web apps under App Service plans, by adding a “cm-resource-parent” tag to the child resources with a value of the parent resource ID (see the tagging sketch after this list).

Charts in Cost analysis smart views: View your daily or monthly cost over time in Cost analysis smart views. You can opt in using Try preview.

View cost for your resources: The cost for your resources is one click away from the resource overview in the preview portal. Just click View cost to quickly jump to the cost of that resource.
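Picking up the resource-grouping item above, here is a hedged sketch that adds the cm-resource-parent tag to a child resource through the Azure Resource Manager tags endpoint; the resource IDs are placeholders, and the API version shown is an assumption worth checking against current documentation.

```python
# Sketch: tag a disk with "cm-resource-parent" so Cost analysis groups it under its VM.
# The resource IDs are placeholders, and the tags API version is an assumption.
import requests
from azure.identity import DefaultAzureCredential

token = DefaultAzureCredential().get_token("https://management.azure.com/.default").token

disk_id = "/subscriptions/<sub>/resourceGroups/rg-app/providers/Microsoft.Compute/disks/data-disk-1"
vm_id = "/subscriptions/<sub>/resourceGroups/rg-app/providers/Microsoft.Compute/virtualMachines/vm-app-1"

resp = requests.patch(
    f"https://management.azure.com{disk_id}/providers/Microsoft.Resources/tags/default",
    params={"api-version": "2021-04-01"},
    headers={"Authorization": f"Bearer {token}"},
    json={"operation": "Merge", "properties": {"tags": {"cm-resource-parent": vm_id}}},
)
resp.raise_for_status()
```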

Of course, that’s not all. Every change in Cost Management is available in Cost Management Labs a week before it’s in the full Azure portal or Microsoft 365 admin center. We’re eager to hear your thoughts and understand what you’d like to see next. What are you waiting for? Try Cost Management Labs today.

New ways to save money with Microsoft Cloud

Lots of cost management improvements over the last month! Here are new and updated offers you might be interested in:

Azure Advisor integration with Azure Monitor Log Analytics Workspace

Public Preview: Free SQL Managed Instance

Azure Spring Apps Enterprise is now eligible for Azure savings plan for compute

Dedicated clusters in Azure Monitor logs now support any commitment tier

General availability: “As on-premises” sizing in Azure Migrate SQL Discovery and Assessment

Generally available: Zone Redundant Storage for Azure Disks is now available in more regions

New videos and learning opportunities

Here are a few new videos you might be interested in:

Use your Marketplace Rewards Azure sponsorship benefit to increase your marketplace sales (youtube.com)

Tools and Tips for Unparalleled Cost Transparency on AKS (youtube.com)

A couple of blogs for your reading:

Take control of your cloud spend with Microsoft Cost Management

Interconnected guidance for an optimized cloud journey

Follow the Microsoft Cost Management YouTube channel to stay in the loop with new videos as they’re released and let us know what you’d like to see next.

Want a more guided experience? Start with: Control Azure spending and manage bills with Microsoft Cost Management.

Documentation updates

Here are a few documentation updates you might be interested in:

Azure EA pricing – Microsoft Cost Management

How an Azure savings plan discount is applied – Microsoft Cost Management

View your Azure usage summary details and download reports for EA enrollments – Microsoft Cost Management

Azure EA agreements and amendments – Microsoft Cost Management | Microsoft Learn

Want to keep an eye on all documentation updates? Check out the Cost Management and Billing documentation change history in the azure-docs repository on GitHub. If you see something missing, select Edit at the top of the document and submit a quick pull request. You can also submit a GitHub issue. We welcome and appreciate all contributions!

What’s next?

These are just a few of the big updates from last month. Don’t forget to check out the previous Cost Management updates. We’re always listening and making constant improvements based on your feedback, so please keep the feedback coming.

Follow @MSCostMgmt on Twitter and subscribe to the YouTube channel for updates, tips, and tricks. You can also share ideas and vote up others in the Cost Management feedback forum or join the research panel to participate in a future study and help shape the future of Cost Management.

Best wishes from the Cost Management team.
The post Microsoft Cost Management updates—January 2024 appeared first on Azure Blog.
Source: Azure

Harmonizing AI-enhanced physical and cloud operations 

Azure’s adaptive cloud approach builds on our work with Azure Arc to make it easier for customers to accelerate their adoption of cloud innovation throughout their entire IT estate. In my last blog, I talked about how our adaptive cloud approach unifies siloed teams, distributed sites, and sprawling systems with a collaborative innovation and operations platform—enhanced by AI—across hybrid, multicloud, edge, and IoT. You can accomplish a lot with Azure Arc today, but you will be able to amplify your impact by leveraging AI across existing and new scenarios such as:

Operating with AI-enhanced central management and security. 

Rapidly developing and scaling applications across boundaries. 

Cultivating data and insights across physical operations. 

Customers continue to indicate that building cloud-native apps that can take advantage of data distributed throughout their physical operations will be central to their next digital transformation phase. However, when we work with them to implement these new solutions, the challenges customers face are evident. Issues like legacy applications, heterogeneous environments, fragmented data, and a lack of standardization are slowing their progress. Accordingly, I want to focus on the requirements for the second and third scenarios above. 

What is needed, and what we are building in Azure, is a single AI-enhanced operations and innovation platform for developing highly distributed applications that reveal insights and accelerate transformation. 

Turning vision into results

We have learned a great deal about what is needed to bring cloud-native practices, patterns, and approaches into any environment from working with customers, like DICK’s Sporting Goods, who are digitally re-imagining the in-person shopping experience, as well as manufacturers, like Grupo Bimbo, who are creating the factories of the future. 

Rapidly develop and scale applications across boundaries 

Cloud-native architecture and engineering bring significant benefits to physical operations. To realize these benefits, resources and applications must be tailored to work with physical processes, efficiently managing what workloads run in the public cloud and at operational sites. 

By leveraging cloud-native application development patterns everywhere, organizations can focus on creating small, modular, and repeatable component-level investments. These components can be updated frequently, exchanged, and deployed across many environments consistently. This approach helps organizations move away from creating unique solutions for each factory or store by providing a more scalable, standardized foundation for solution development and management. 

Drawing inspiration from DICK’s Sporting Goods’ “One Store” vision, the same principle applies: managing, updating, and deploying to each of their 800-plus stores as if they were a single entity. This streamlines DevOps integration and sets a precedent for central application deployment. It establishes a robust foundation for data processing across cloud and edge, with intent to empower decision-making at the edge. This, in turn, boosts efficiency, agility, and sustainability within physical operations. Here is how: 

Azure Arc helps you scale cloud-native applications to locations like factories or branch offices.

Microservices architecture breaks down complex systems into smaller, independent services, enhancing agility and ease of updates. For IT and developers, this approach boosts development speed and application updates, and fosters innovation in OT systems. Azure Arc supports this by consistently deploying and managing these microservices across various environments, ensuring flexibility and integration in hybrid landscapes. 

Streamlined DevOps integration emphasizes collaborative and automated workflows that enhance the development lifecycle and significantly reduce the time-to-market for new features and updates. Azure facilitates this through Azure Arc by extending Azure management services to your infrastructure, enabling DevOps practices to be implemented more consistently across hybrid, multicloud, edge, and IoT-heavy environments. 

Central application deployment ensures consistent and efficient application rollouts across the entire network, which is crucial for maintaining a harmonized operational state and reducing deployment variance. Azure Arc integrates with GitOps workflows, allowing for configuration management and application deployment practices to be centralized through Git repositories, which Azure Arc uses to automate updates and maintain consistency across environments. 

Global orchestration and resiliency allows for a coordinated and reliable operation of systems up to an international scale, enhancing the overall uptime and fault tolerance of applications. Adopting CI/CD practices enables continuous updates and improvements. This ensures that systems are up to date with the latest features and security patches, reducing vulnerabilities. Azure Arc brings Azure’s orchestration capabilities to a broader range of environments, providing the tools needed to manage and maintain applications with the same resiliency as in Azure’s native cloud environment. 

Kubernetes everywhere offers scalability and deployment agility through the use of container orchestration to manage applications seamlessly in any environment. Azure Arc enables this by allowing organizations to run Azure services on Kubernetes clusters anywhere, extending Azure’s management capabilities to any infrastructure, and enabling customers to use Azure’s first party Kubernetes service (AKS) to run anywhere. 

Hyperscale cloud services to the edge extend the vast capabilities of cloud computing to local edge devices, optimizing performance and responsiveness, and allowing for real-time data processing and analytics at the source of data generation. Azure Arc supports this by bringing Azure services to the edge, such as AKS, AI/ML, App Service, Functions, Logic Apps, IoT Operations, and more, enabling applications to leverage Azure’s hyperscale capabilities on-premises and in edge environments. 

AI-assisted development and resource management bring a new level of intelligence and automation to cloud environments. By leveraging AI, systems can anticipate needs, optimize resources, and automate routine tasks to significantly improve efficiency and accuracy. GitHub Copilot helps on the development side, and Microsoft Copilot for Azure brings AI assistance to operating application resources in the Azure platform, whether they are in Azure regions or connected via Azure Arc. 

Cultivate data and insights across physical operations 

Physical operations, including production lines, processing facilities, retail locations, medical equipment, and shop floors, are pivotal to an enterprise’s operational framework. These operations are typically spread across several geographically diverse sites, each characterized by distinct equipment types, personnel, and processes. Therefore, enterprises have historically adopted a decentralized model for digitizing their physical operations. This approach involves independent, site-specific strategies for technology integration, application development, and data management. 

While this method provides local autonomy, it often leads to inefficiencies and challenges in scaling operations. Extending a cloud-native approach to data is the other critical component for developing highly distributed applications that unlock insights and accelerate transformation. 

A common data foundation enables you to start generating insights centrally across dispersed data.

Standards-based data flow ensures seamless communication and integration between distributed data, systems, and devices. This means creating interoperable systems that can exchange data effectively for IT, developers, and OT. Azure IoT Operations enables secure and standardized communication between IoT devices, helping maintain consistent and efficient data flow. Additionally, Microsoft’s existing library of connectors to line-of-business applications allows Microsoft Fabric’s data factory tools to manage data from multiple sources for use in Fabric’s data lake.
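As a concrete example of that standards-based flow, the sketch below publishes a telemetry sample over MQTT with the paho-mqtt Python package; the broker address, port, and topic are assumptions standing in for the MQTT broker that Azure IoT Operations exposes at the edge.

```python
# Sketch: publish edge telemetry over MQTT (paho-mqtt, v1.x client API shown).
# The broker host, port, and topic are assumptions standing in for the edge MQTT broker.
import json
import time
import paho.mqtt.client as mqtt

client = mqtt.Client(client_id="line-3-temperature-sensor")
client.connect("mq-broker.factory.local", 1883)  # assumed edge/in-cluster broker address
client.loop_start()

sample = {"assetId": "line-3", "temperatureC": 71.4, "timestamp": time.time()}
client.publish("factory/line-3/telemetry", json.dumps(sample), qos=1)

client.loop_stop()
client.disconnect()
```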

Common data foundation provides unified data management across diverse environments, which is key for consistency and reliability in analytics and decision-making. This foundation is valuable because it eliminates data silos, allowing for comprehensive insights and better data governance. With Azure Arc, companies can extend Azure data services and tools to most infrastructure, ensuring a consistent data estate spanning on-premises, multicloud, and edge environments. Azure IoT Operations, built to be open and extensible, leverages open-source standards and specifications, including MQTT, OPC UA, OpenTelemetry, Akri, and Kubernetes, simplifying integration across a customer’s solution ecosystem and existing infrastructure. It enables a cloud-to-edge data plane with local data processing and analytics to transfer enhanced, useful data to hyperscale cloud services such as Microsoft Fabric, Azure Event Grid, and Azure Digital Twins. A common data foundation is key to making your data more available, enabling cross-team collaboration, and accelerating decision-making. 

Decentralized decision-making enhances responsiveness and reduces latency, which is vital in dynamic operational environments. In IT and OT, the ability to respond faster to operational signals in near real time, without reliance on centralized control systems is often preferred. Azure and Azure IoT Operations support this by enabling edge computing, where operational decisions are made closer to the data source. Azure Arc helps by managing resources in a distributed manner across various locations while making access to information available to all stakeholders from a central, single source of truth. 

Contextualized data to information transforms raw data into easily referenced information and is important for making informed operational decisions. Azure Arc adds value because you can bridge data sources running anywhere, facilitating the integration of local data points with data in the cloud, like supply chain information or customer order data. Then, you can use shortcuts in Microsoft OneLake, within Microsoft Fabric, to unify your data across domains, clouds, and accounts by creating a single virtual data lake for your entire enterprise. This is possible because Azure Arc allows for creating data pipelines so that data aggregations from multiple factories, clinics, or stores can be brought into OneLake. 

Going forward 

While we witnessed a surge in cloud adoption for IT infrastructures over the past decade, signals indicate the next decade will be a transformative era: building applications that are designed to be distributed across public cloud and operational environments with a set of consistent AI-enhanced services that run anywhere. From brick-and-mortar facilities to supply chains, all the way through personnel and services, cloud-native technology and practices will become the enabler of transformation and will be woven into the physical operations of every organization. 

As we navigate this transformative landscape, the call to action for technical leaders is clear: embrace the adaptive cloud approach to act on the changing digital landscape and actively shape it. To get started, bookmark the Azure Arc Jumpstart page. We have ready-to-go environments that you can evaluate and spin up easily, without the work of creating a complex hybrid infrastructure just to test things out. 
The post Harmonizing AI-enhanced physical and cloud operations  appeared first on Azure Blog.
Source: Azure

How H&R Block is using Azure OpenAI Service to bring ease to American taxpayers this season

In 2023, we saw generative AI usher in a wave of new innovations, from Microsoft Copilot to the “last song” from The Beatles to helping farmers in rural India gain easier access to government services. These AI innovations are now starting to show up in our everyday lives and have a positive impact for millions of people around the world.

Last week, as part of Microsoft’s FY24 Q2 Earnings, we announced we now have 53,000 Azure AI customers. Over one-third are new to Azure this calendar year.

And we have great momentum with Azure OpenAI Service, too, which became generally available just over 12 months ago, empowering developers and organizations to build AI applications with OpenAI’s latest models (including GPT-4 Turbo with Vision, fine-tuning for the foundational models, and DALL-E 3) backed by the enterprise capabilities of our Azure cloud. We continue to introduce new innovation with Azure OpenAI Service including Assistants API in preview, text-to-speech capabilities, support for new models, and updates to fine-tuning API.

Today, thousands of organizations around the world are using Azure OpenAI service to drive meaningful business impact for people and society. We are seeing increased usage from AI-first startups like DeepBrain, Perplexity, and SymphonyAI, as well as the world’s biggest companies. Over half of the Fortune 500 use Azure OpenAI today, including AT&T, Walmart, Ally Financial, and Carmax.

Helping taxpayers with AI Tax Assist

It’s no secret that filing taxes can be stressful, overwhelming, and time-consuming. According to the IRS, the average taxpayer spends 13 hours preparing their return. As we enter this year’s tax season (the IRS officially began accepting eFiling from individuals on January 29), it’s especially exciting to see how H&R Block is leveraging the generative AI capabilities of Azure to help ease the filing process for taxpayers this year. 

Almost half (46%) of the 146 million tax filers in the U.S. prepare and file their own taxes. This season, those 69 million DIYers will have the ability to simplify the tax preparation process with H&R Block AI Tax Assist.1

Sound familiar? You may have caught a glimpse of AI Tax Assist commercials last month during the NFL playoffs, or when the solution first launched in December 2023.

H&R Block AI Tax Assist is a generative artificial intelligence (GenAI) experience that harnesses the power of Azure OpenAI Service to help streamline the tax preparation process for individuals, the self-employed and small business owners to file and manage their taxes confidently.

“With H&R Block AI Tax Assist, we are enhancing the DIY tax filing experience. By applying generative AI tools with our unique ability to provide human expertise at scale, we’re taking a step forward in innovation to help make the tax filing process easier. We’re thrilled to work with Microsoft and leverage the power of Azure OpenAI Service to transform how Americans do their taxes.”
– Heather Watts, Senior Vice President of Consumer Tax Products, H&R Block 

Leveraging data from H&R Block’s The Tax Institute and the experience of more than 60,000 tax professionals, H&R Block AI Tax Assist helps DIY customers efficiently work through the tax preparation process by assisting with:

Tax Information: AI Tax Assist can provide information on tax forms, deductions, and credits, maximizing potential refunds and minimizing tax liability.

Tax Preparation: AI Tax Assist can guide individuals through questions as they prepare and file their taxes, answer tax theory questions and offer navigation instructions when needed. 

Tax Knowledge: AI Tax Assist can answer free-form tax-related questions, providing dynamic responses that clarify tax terms and give guidance on specific tax rules or general information about the U.S. and state tax system. 

Tax Changes: AI Tax Assist can answer questions about the tax code, recently changed laws and tax policies. 

“H&R Block has built a long-standing partnership with Microsoft starting with migration to Azure as a part of our broader digital transformation. As we considered platform partners for building the AI Tax Assist and AI Platforms more broadly, we indexed heavily on speed-to-market while ensuring we did it safely, responsibly, and honoring the trust our clients place in our brand every day. The capabilities of Azure OpenAI Service to enable responsible deployment of AI were critical in helping build confidence in this new solution and allowed us to deploy this new technology at scale to our clients early in our journey.”
– Alan Lowden, Chief Information Officer, H&R Block

Driving business impact with Azure AI

Like H&R Block, organizations including AT&T, Inflection AI, PwC, Siemens, and Volvo are using Azure AI to reimagine customer experiences and remove friction in business-critical processes, helping end users and employees focus their time and energy on valuable work.

Whether building a copilot, a chatbot, or a creative assistant, businesses are taking advantage of Azure AI’s model choice, flexibility, and multimodal capabilities through Azure AI Studio to innovate faster and more responsibly across a diverse range of industries and scenarios, including education, financial services, healthcare, manufacturing, and risk management.

Our commitment to responsible AI

With Responsible AI tools in Azure, Microsoft is empowering organizations to build the next generation of AI apps safely and responsibly. Azure AI Content Safety is a state-of-the-art AI system that helps organizations keep AI-generated content safe and create better online experiences for everyone. Customers—from startup to enterprise—are applying the capabilities of Azure AI Content Safety to social media, education, and employee engagement scenarios to help construct AI systems that operationalize fairness, privacy, security, and other responsible AI principles.
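As a small example of putting that guardrail in front of a model, the sketch below screens user text with the azure-ai-contentsafety Python package; the endpoint and key are placeholders, and the severity threshold and attribute names reflect one recent API version, so treat them as assumptions to verify.

```python
# Sketch: screen text with Azure AI Content Safety (azure-ai-contentsafety package).
# Endpoint and key are placeholders; attribute names may vary by SDK/API version.
import os
from azure.core.credentials import AzureKeyCredential
from azure.ai.contentsafety import ContentSafetyClient
from azure.ai.contentsafety.models import AnalyzeTextOptions

client = ContentSafetyClient(
    endpoint=os.environ["CONTENT_SAFETY_ENDPOINT"],
    credential=AzureKeyCredential(os.environ["CONTENT_SAFETY_KEY"]),
)

result = client.analyze_text(AnalyzeTextOptions(text="user-submitted text goes here"))

# Block the request if any harm category exceeds a chosen severity threshold.
blocked = any(c.severity and c.severity >= 4 for c in result.categories_analysis)
print("blocked" if blocked else "allowed")
```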

Get started with Azure OpenAI Service 

Apply for access to Azure OpenAI Service by completing this form. 

Build with Azure OpenAI Service models using Azure AI Studio.

Learn about Azure OpenAI Service’s latest enhancements and get started with GPT-4.

Join Microsoft for Startups Founders Hub if you’re a startup founder.

Learn more about Azure AI Content Safety.

1. Internal Revenue Service Data Book
The post How H&R Block is using Azure OpenAI Service to bring ease to American taxpayers this season appeared first on Azure Blog.
Source: Azure

AWS Private CA now supports revocation of Matter certificates

AWS Private Certificate Authority (AWS Private CA) now supports revocation of Matter certificates. Matter is an industry standard for smart home devices that provides seamless, secure, cross-vendor connectivity for devices such as light bulbs, door locks, and media devices. You can use AWS Private CA to issue digital certificates that identify Matter devices. To strengthen the security of the smart home standard, Matter 1.2 introduced support for revoking device attestation certificates (DACs). With this new revocation support from AWS Private CA, you can maintain Matter standards compliance without disrupting existing Matter-compliant certificate authorities (CAs).
Source: aws.amazon.com

Amazon Q in QuickSight is now available in preview in the Europe (Frankfurt) Region

Amazon Q in QuickSight is now available in preview in the Europe (Frankfurt) Region (eu-central-1), in addition to the existing US East (N. Virginia) and US West (Oregon) Regions. Amazon Q in QuickSight, powered by the large language models (LLMs) of Amazon Bedrock, makes it quick and easy to explore data, derive insights from it, and share results with others. QuickSight Q subscribers can get started now by enabling the preview through QuickSight's preview manager.
Source: aws.amazon.com

Introducing the AWS Small and Medium Business (SMB) Competency

We are excited to introduce the AWS Small and Medium Business (SMB) Competency, the first AWS specialization in the market designed for partners serving small and medium-sized customers. The SMB Competency provides additional benefits to AWS Partners that invest in and focus on the SMB customer segment, and it sets an ambitious bar for aspiring partners to reach. Benefits of the SMB Competency include partners becoming the default for participation in new pilot and sales initiatives and gaining unique access to scalable demand-generation engines.
Source: aws.amazon.com

Amazon RDS for Db2 now supports the EBCDIC collation sequence

Amazon Relational Database Service (Amazon RDS) for Db2 now supports the EBCDIC (Extended Binary Coded Decimal Interchange Code) collation sequence. With this launch, you can preserve the collation sequence when migrating from Db2 on z/OS, which supports EBCDIC collation, to Amazon RDS for Db2. You can specify the EBCDIC collation sequence when creating a database with the rdsadmin.create_database stored procedure provided by Amazon RDS for Db2.
Source: aws.amazon.com