AWS Firewall Manager launches in AWS Asia Pacific (New Zealand) Region

AWS Firewall Manager is now available in the AWS Asia Pacific (New Zealand) Region. AWS Firewall Manager helps cloud security administrators and site reliability engineers protect applications while reducing the operational overhead of manually configuring and managing rules. With AWS Firewall Manager, customers hosting applications and workloads in the Asia Pacific (New Zealand) Region can apply defense-in-depth policies across the full range of supported AWS security services. Customers who want to secure assets using AWS WAF can create and maintain security policies with AWS Firewall Manager. To learn more about how AWS Firewall Manager works, see the AWS Firewall Manager documentation, and see the AWS Region Table for the list of Regions where AWS Firewall Manager is currently available. To learn more about AWS Firewall Manager, its features, and its pricing, visit the AWS Firewall Manager website.
Source: aws.amazon.com

Amazon EKS announces 99.99% Service Level Agreement and new 8XL scaling tier for Provisioned Control Plane clusters

Amazon Elastic Kubernetes Service (Amazon EKS) now offers a 99.99% Service Level Agreement (SLA) for clusters running on Provisioned Control Plane, up from the 99.95% SLA offered on standard control plane. Amazon EKS is also introducing the 8XL scaling tier, the largest available Provisioned Control Plane tier. Provisioned Control Plane gives you the ability to select your cluster’s control plane capacity from a set of well-defined scaling tiers, ensuring the control plane is pre-provisioned and ready to handle traffic spikes or unpredictable bursts. The higher 99.99% SLA is measured in 1-minute intervals, providing a more granular and stringent availability commitment for mission-critical workloads. The new 8XL tier offers double the Kubernetes API server request processing capacity of the next lower 4XL tier, enabling workloads such as ultra-scale AI/ML training, high-performance computing (HPC), and large-scale data processing. Both the 99.99% SLA and the 8XL tier are available today in all AWS regions where Amazon EKS Provisioned Control Plane is offered. To learn more about the SLA, see the Amazon EKS Service Level Agreement. For 8XL pricing and capabilities, see the EKS pricing and EKS Provisioned Control Plane documentation.
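The two SLA levels translate into very different monthly downtime budgets. As a rough back-of-the-envelope sketch (assuming a 30-day month; the SLA itself defines the official measurement terms):

```python
# Downtime budget implied by an availability SLA, assuming a 30-day month.
# This is illustrative arithmetic, not the SLA's official calculation.
MINUTES_PER_MONTH = 30 * 24 * 60  # 43,200 minutes

def downtime_budget_minutes(sla_percent: float) -> float:
    """Maximum minutes of downtime per month while still meeting the SLA."""
    return MINUTES_PER_MONTH * (1 - sla_percent / 100)

print(round(downtime_budget_minutes(99.95), 2))  # 21.6 minutes/month
print(round(downtime_budget_minutes(99.99), 2))  # 4.32 minutes/month
```

At 99.99%, the budget shrinks to roughly a fifth of what 99.95% allows, which is why the finer 1-minute measurement interval matters for mission-critical workloads.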
Source: aws.amazon.com

Amazon Bedrock AgentCore Runtime adds WebRTC support for real-time bidirectional streaming

Amazon Bedrock AgentCore Runtime now supports WebRTC for real-time bidirectional streaming between clients and agents, adding to the existing WebSocket protocol support. With WebRTC, developers can build voice agents for browser and mobile applications that stream audio and video bidirectionally with low latency using peer-to-peer, UDP-based transport, enabling natural, real-time conversational experiences.
WebRTC joins WebSocket as the second bidirectional streaming protocol supported by AgentCore Runtime. While WebSocket provides persistent, full-duplex connections for text and audio streaming over TCP, WebRTC is optimized for real-time media delivery where low latency is critical, such as voice agents in browser and mobile applications. WebRTC requires a TURN relay for media traffic, and AgentCore Runtime gives you flexibility in how you set that up: Amazon Kinesis Video Streams managed TURN for a fully managed experience with native AWS IAM integration, a third-party provider, or your own self-hosted TURN infrastructure. Both protocols benefit from AgentCore Runtime session isolation, observability, and scaling.
WebRTC is supported in AgentCore Runtime across fourteen AWS Regions: US East (N. Virginia), US East (Ohio), US West (Oregon), Asia Pacific (Mumbai), Canada (Central), Asia Pacific (Seoul), Asia Pacific (Singapore), Asia Pacific (Sydney), Asia Pacific (Tokyo), Europe (Frankfurt), Europe (Ireland), Europe (London), Europe (Paris), and Europe (Stockholm).
To get started, see Bidirectional streaming in the Amazon Bedrock AgentCore documentation, which includes ready-to-deploy examples for both protocols: an Amazon Nova Sonic voice agent with KVS TURN server, Pipecat voice agents with WebSocket, WebRTC, and Daily transport, a LiveKit voice agent, and a Strands Agents SDK voice agent.
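To make the streaming model concrete, here is a minimal sketch of one common pattern for carrying audio over a message-framed protocol such as WebSocket: raw PCM chunks are base64-encoded into JSON events. The event type and field names below are illustrative assumptions, not the AgentCore Runtime wire format; the Bidirectional streaming documentation defines the real contract.

```python
# Hypothetical framing of PCM audio chunks as JSON events over WebSocket.
# Event names ("audio.input") and fields are illustrative assumptions only.
import base64
import json

def encode_audio_event(pcm_chunk: bytes, sample_rate: int = 16000) -> str:
    """Wrap a raw PCM chunk in a JSON envelope suitable for a text frame."""
    return json.dumps({
        "type": "audio.input",          # hypothetical event type
        "sampleRate": sample_rate,
        "payload": base64.b64encode(pcm_chunk).decode("ascii"),
    })

def decode_audio_event(message: str) -> bytes:
    """Recover the raw PCM bytes from a received JSON event."""
    event = json.loads(message)
    return base64.b64decode(event["payload"])

# 20 ms of 16-bit mono PCM at 16 kHz = 320 samples = 640 bytes.
chunk = b"\x00\x01" * 320
assert decode_audio_event(encode_audio_event(chunk)) == chunk
```

Over WebRTC, by contrast, audio travels as SRTP media over UDP rather than as framed messages, which is where the latency advantage for voice agents comes from.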
Source: aws.amazon.com

Modernizing regulated industries with cloud and agentic AI

Organizations today face mounting pressure to grow revenue, strengthen security, and innovate—often all at the same time. To meet these demands, many are accelerating cloud migration as a way to unlock greater business outcomes. According to the IDC White Paper,1 sponsored by Microsoft, the top driver for moving to the cloud is operational efficiency, with 46% of organizations prioritizing reductions in IT operating costs. Beyond cost savings, cloud infrastructure is also enabling organizations to prepare for increased use of AI (37%), launch new performance intensive applications (30%), improve resilience (26%), and meet governance, risk, and compliance requirements (24%).

Yet despite broad cloud adoption, migration and modernization remain complex. Legacy architectures, fragmented environments, and persistent skills gaps continue to slow progress, pushing organizations to find ways to migrate faster while minimizing operational risk.

The IDC study highlights agentic AI as a critical unlock. These intelligent systems automate assessments, orchestrate migration and modernization efforts, and optimize operations across hybrid environments—helping organizations shift from periodic, manual initiatives to continuous, adaptive modernization. This momentum is driving unprecedented growth, with IDC forecasting the public cloud services market will reach USD1.9 trillion by 2029.

Discover how AI accelerates modernization

While migration frameworks may be horizontal, their real-world impact is industry-specific. Healthcare, financial services, and manufacturing each face unique constraints shaped by regulation, operational risk, and mission-critical systems.

In this blog, we explore the key migration and modernization challenges across these three industries—healthcare, manufacturing, and financial services—through real customer stories that highlight the tangible impact cloud adoption is delivering today.

Healthcare: Modernizing securely while powering next-generation clinical experiences

Healthcare faces the toughest modernization headwinds: strict regulations (HIPAA/HITECH, HITRUST), fragmented clinical data across electronic health records (EHRs) and imaging systems, aging on-premises infrastructure resulting in high Capex, and heightened exposure to ransomware.1 Clinical environments also demand extremely low latency and high reliability.

The IDC study notes that these constraints slow modernization—but accelerate the need for it, as organizations push to scale telehealth, imaging workloads, genomics pipelines, and AI-powered clinical workflows.1

What healthcare organizations need, according to the IDC study:

- Secure, compliant integration across EHRs, picture archiving and communication systems (PACS), genomics systems, and Internet of Things (IoT) medical devices.1
- Elastic compute for high-throughput imaging and genomics.
- Stronger disaster recovery and recovery time performance.1
- Ambient documentation and AI-supported diagnostics.
- Secure clinician collaboration and modern patient digital front doors.

Customer spotlight: Franciscan Health

Facing aging infrastructure and disaster recovery risks, Franciscan adopted a pragmatic workload placement strategy—moving its Epic EHR to Microsoft Azure.

The results included:

- $45 million in savings over five years after migrating Epic to Azure.
- 90% faster disaster recovery compared to the prior environment.
- Around a 30-minute failover, reduced from hours.
- $10–$12 million per day in potential downtime risk avoided.

Learn more about Franciscan Health’s journey to migrate its Epic EHR to Azure here.

Healthcare’s modernization mandate is clear: reduce operational risk, meet regulatory demands, and harness cloud AI to improve patient outcomes.

Turn modernization into AI impact: Explore the AI for Better Health e‑book

Financial services: Enabling real-time intelligence and automated compliance

Financial institutions operate in one of the most regulated environments, including the payment card industry data security standard (PCI DSS), the Sarbanes-Oxley Act (SOX), the Gramm-Leach-Bliley Act (GLBA), Basel capital frameworks, and know your customer (KYC) and anti-money laundering (AML) requirements, and rely heavily on legacy mainframes that are difficult to modernize. Today, regulatory pressure is intensifying further as new frameworks such as the EU’s Digital Operational Resilience Act (DORA) and the EU AI Act raise the bar for operational resilience, third-party risk management, model transparency, and ongoing compliance monitoring. Under DORA, financial services firms must demonstrate continuous information and communication technology (ICT) risk management, advanced incident reporting, and resilience testing across critical systems and cloud service providers. Meanwhile, the EU AI Act introduces governance requirements for high-risk AI systems, including explainability, data lineage, human oversight, and auditability—with direct implications for fraud models, credit scoring, and customer decisioning platforms.

IDC interviews highlight accelerating demand for real-time risk analytics, fraud detection, digital onboarding, and infrastructure elasticity to support peak activity—capabilities that are increasingly mandated, not optional.1

Key challenges the IDC study identifies:

- Strict data residency, model risk governance, explainability, and eDiscovery requirements.1
- Heightened expectations for operational resilience, cyber defense, and third-party risk oversight.
- Legacy systems and common business-oriented language (COBOL)-based batch processes resistant to change.
- Rapidly evolving regulatory mandates requiring continuous compliance rather than point-in-time audits.

Cloud—especially platform as a service (PaaS) and managed services—helps financial institutions shift from static, batch-driven compliance to continuous controls and real-time observability. By reducing batch windows from hours to minutes, modern cloud platforms enable real-time insights, automated evidence collection, resilient architectures, and policy-driven compliance workflows aligned with DORA and AI governance requirements.1 Learn more about how Microsoft can help financial institutions navigate these requirements in this e-book.

Reimagine financial services with AI, data, and cloud innovation

Customer spotlight: Crediclub

To accelerate product innovation and meet expectations from Mexico’s national banking and securities commission (CNBV), Mexican fintech Crediclub modernized its databases to a serverless platform as a service (PaaS) architecture and adopted microservices.1

The impact:

- Uptime improved from around 80% to 99.5%.
- 90% reduction in network latency through Multiprotocol Label Switching (MPLS) and dark fiber.
- Rapid deployment of new financial products via Kubernetes and DevSecOps.

For financial institutions, modernization is no longer just about efficiency—it is foundational to resilience, trustworthy AI, and regulatory compliance at scale.

Manufacturing: Unifying IT and OT for predictive, data-driven industrial operations

Manufacturers operate in one of the most complex operating environments—defined by legacy and proprietary operational technology (OT) protocols, historically air-gapped manufacturing execution systems (MES) and supervisory control and data acquisition (SCADA) systems, and globally distributed supply chains. Stringent low-latency requirements for safety-critical systems, intermittent connectivity at the edge, and the need to protect intellectual property further compound the challenge. The ability to modernize and unify these environments—without compromising safety, reliability, or performance—represents a critical inflection point for industrial transformation.

Unique modernization challenges according to the IDC study:

- Ultra-low latency requirements for safety-critical operations.
- Massive telemetry ingestion and time-series analytics at scale.
- Operational complexity across global, distributed supply chains.
- Secure protection of intellectual property across edge and cloud environments.

Opportunities unlocked by cloud:

- Predictive maintenance with IoT ingestion.1
- Reduced unplanned downtime and improved overall equipment effectiveness (OEE).
- Digital twins for plants, lines, and products.
- Computer vision for real-time quality and safety.
- High-performance computing (HPC) simulations for engineering and design.
- Standardized, global data models.

Get the e-book: Modernizing Manufacturing for Transformative Growth

Customer spotlight: ASTEC Industries

ASTEC unified fragmented systems across its rock-to-road value chain—from aggregate processing through asphalt production and paving—by adopting Azure, modernizing to time-series databases, and building a universal connectivity platform using Azure IoT Hub, Azure Event Hubs, and Power BI.1

The results:

- Real-time operational visibility across fleets.
- Predictive maintenance for reducing downtime.
- New digital services supported by connected equipment.

Manufacturing’s modernization imperative: unify OT and IT, scale real-time intelligence, and enable global efficiency.

Microsoft’s approach: Continuous, intelligent, collaborative modernization

Microsoft’s strategy is grounded in a simple principle: modernization should be continuous, intelligent, and collaborative. The IDC study emphasizes that successful enterprises adopt a balanced, multipath migration strategy, blending rehost, replatform, refactor, and software as a service (SaaS) substitution based on workload criticality.1

Microsoft enables this approach through a comprehensive set of tools and offerings, including Azure Copilot and GitHub Copilot. Agentic automation enables:

- Discovery and dependency mapping.
- Security assessment and 6R recommendations.
- Application refactoring, code remediation, and modernization.

Azure Migrate provides unified discovery, assessment, migration execution, and modernization services. Azure Accelerate complements this with a coordinated framework that includes:

- Guided deployments through Cloud Accelerate Factory.1
- Funding and Azure credits for planning, pilot, and rollout.
- Expert partners and tailored skilling programs.

The IDC study concludes that organizations using Microsoft Azure for migration and modernization achieve lower operational costs, improved resiliency, faster modernization timelines, and stronger security postures—especially in regulated industries.1

Looking ahead: Agentic modernization as the foundation for AI-ready enterprises

Across all industries, IDC’s findings are consistent: agentic AI is emerging as the new force multiplier for modernization, enabling organizations to keep pace with rising complexity, regulatory demands, and competitive pressure.

Healthcare, financial services, and manufacturing each face unique constraints—but cloud modernization remains the foundation for innovation, operational excellence, and enterprise AI.

Microsoft’s approach gives organizations the unified automation, intelligence, and tooling they need to modernize securely and at scale.

Explore how Azure Copilot accelerates cloud migration
Discover GitHub Copilot for app modernization
See how Azure Accelerate supports transformation
Discover how AI accelerates modernization

1 IDC White Paper, Cloud Migration and Modernization Strategies for Healthcare, Financial Services, and Manufacturing, February 2026.
The post Modernizing regulated industries with cloud and agentic AI appeared first on Microsoft Azure Blog.
Source: Azure

From legacy to leadership: How PostgreSQL on Azure powers enterprise agility and innovation

In today’s digital economy, business leaders face a relentless challenge: how to deliver innovation, scale, and resilience without spiraling costs or compromising performance. At the heart of this challenge lies data infrastructure, which is often one of the most critical and most constrained layers of the enterprise stack.

At Microsoft, we’ve seen firsthand how legacy systems, particularly on-premises databases like Oracle, can become a bottleneck to progress. These systems are expensive to maintain, difficult to scale, and increasingly out of step with the agility modern organizations require. But we also understand that migration is not a trivial decision. Concerns about downtime, compatibility, security, and retraining are real.

That’s why we’ve spent the last several years investing in PostgreSQL. Our mission is to make PostgreSQL the most performant, scalable, and enterprise-ready open database platform available. With Azure Database for PostgreSQL and the newly introduced Azure HorizonDB, we’re delivering on that vision.

Explore Azure Database for PostgreSQL

The cost of standing still

Staying on legacy infrastructure might feel like a safe choice, but it’s rarely the best one. The costs of maintaining aging on-premises databases are rising. Hardware refresh cycles, escalating licensing fees, and the need for niche expertise all add up. Organizations can spend most of their IT budgets and time just maintaining existing systems, leaving little room for innovation.

Some Oracle customers have cited rising licensing costs, performance bottlenecks, and scalability limits as major pain points. Others have reported high support costs and a need for advanced AI capabilities as primary reasons for considering a move away from Oracle databases.

But migration comes with its own set of challenges. What if your applications aren’t compatible with a new platform? What if your team lacks the skills to manage a new system? What if performance suffers or what if something breaks? These are valid concerns, and they are precisely the challenges we’ve engineered PostgreSQL on Azure to solve.

Apollo Hospitals: A case study in transformation

Apollo Hospitals, one of Asia’s largest healthcare providers, faced these very questions. With more than 74 hospitals and over 10,000 beds, Apollo’s digital infrastructure is mission critical. Their in-house hospital information system, built on Oracle, was becoming increasingly difficult to maintain. Performance bottlenecks were impacting care delivery, and the cost of scaling was unsustainable.

Apollo Hospitals made the bold, strategic decision to migrate their databases to Azure Database for PostgreSQL. Their IT and development teams worked closely with Microsoft and their cloud partner to ensure a seamless transition. The results were transformative. Since the migration, Apollo has seen:

90% of transactions complete within five seconds, a significant leap in responsiveness for clinical systems.

Uptime has improved to 99.95%, ensuring that critical hospital operations remain uninterrupted.

Deployment timelines have dropped by 40%, allowing the organization to roll out new features and updates faster than ever before.

Perhaps most importantly, Apollo has achieved a 60% reduction in operational costs and a 3x improvement in overall system performance. Apollo’s story is a powerful example of what’s possible when you pair the right technology with the right migration strategy.

Read how system performance at Apollo Hospitals improved with Azure

Smarter Oracle to PostgreSQL migrations with AI-assisted tooling

One of the biggest barriers to migration is the complexity of converting Oracle schemas, stored procedures, and application code. Enterprise applications often rely on thousands of stored procedures, functions, and application-side code (Java, .NET, etc.) built around Oracle-specific syntax. Manually rewriting and validating this code is time-consuming, error-prone, and expensive.

To address this, we introduced the AI-assisted Oracle-to-PostgreSQL migration tool, now available in preview as part of the PostgreSQL extension for Visual Studio Code. This tool is powered by GitHub Copilot and a multi-agent AI system that automates the end-to-end conversion process.

Oracle to PostgreSQL AI-assisted migration tool in action

The tool begins by analyzing Oracle schemas and stored procedures, converting them into PostgreSQL-compatible formats using intelligent pattern recognition and transformation logic. It doesn’t stop at the database layer. It also scans application code, such as Java or .NET, and updates database drivers, rewrites SQL queries, and modifies stored procedure calls to align with PostgreSQL syntax. The tool generates automated unit tests to validate the converted logic and runs post-conversion validation in a scratch PostgreSQL environment to check for functional parity.

The tool uses a hybrid AI architecture with specialized agents for migration, validation, and documentation. It reduces manual effort and minimizes human error. The tool also produces side-by-side comparisons and detailed reports, giving teams the transparency and control they may need to trust the process. By embedding AI-assisted conversion directly into the PostgreSQL extension for VS Code, we’re meeting developers where they work. With GitHub Copilot integration, schema conversion, code refactoring, and validation become part of the same inner loop as code editing and CI/CD. The result is a streamlined, intelligent workflow that reduces friction and accelerates delivery.
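To illustrate the kind of syntax gap the tool bridges, here is a deliberately naive, pattern-based sketch of a few mechanical Oracle-to-PostgreSQL rewrites. The real tool uses AI-driven analysis and validation rather than regex substitution; this example only shows why translation is needed at all.

```python
# Illustrative only: a few mechanical, pattern-level rewrites involved in
# Oracle-to-PostgreSQL conversion. Real conversions also cover packages,
# sequences, data types, and procedural code well beyond this sketch.
import re

REWRITES = [
    (r"\bNVL\s*\(", "COALESCE("),           # Oracle NVL -> standard COALESCE
    (r"\bSYSDATE\b", "CURRENT_TIMESTAMP"),  # Oracle SYSDATE -> SQL standard
    (r"\bVARCHAR2\b", "VARCHAR"),           # Oracle VARCHAR2 -> VARCHAR
]

def naive_convert(oracle_sql: str) -> str:
    """Apply simple textual rewrites to a single Oracle SQL statement."""
    for pattern, replacement in REWRITES:
        oracle_sql = re.sub(pattern, replacement, oracle_sql, flags=re.IGNORECASE)
    return oracle_sql

print(naive_convert("SELECT NVL(name, 'n/a'), SYSDATE FROM employees"))
# -> SELECT COALESCE(name, 'n/a'), CURRENT_TIMESTAMP FROM employees
```

Even these trivial rewrites need validation against real data, which is why the tool's generated unit tests and scratch-environment checks matter.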

Post-migration enterprise-grade performance, scale, and security

PostgreSQL on Azure is more than a cost-effective alternative to legacy systems. With Azure Database for PostgreSQL and the new Azure HorizonDB service, a move to Azure provides high performance, scale, and security built and optimized for your most business-critical enterprise workloads.

Azure Database for PostgreSQL continuing innovation

With the introduction of v6-series compute SKUs, customers can now scale vertically up to 192 vCores. This is ideal for high-throughput transactional workloads and complex analytical queries. For workloads that require horizontal scaling, elastic clusters powered by the open-source Citus extension enable distributed PostgreSQL deployments across multiple nodes. This architecture supports multi-tenant SaaS applications, IoT platforms, and large-scale analytics with ease.
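As a sketch of what horizontal scaling with elastic clusters looks like in practice, the Citus extension distributes a table by a chosen column so related rows stay co-located on one node. The table and column names below are hypothetical; `create_distributed_table` is the Citus function that performs the sharding.

```python
# Hypothetical multi-tenant table distributed with the Citus extension.
# These are SQL statements a team would run against an elastic cluster;
# they are printed rather than executed, since no server is assumed here.
DDL = """
CREATE TABLE events (
    tenant_id bigint NOT NULL,
    event_id  bigserial,
    payload   jsonb,
    PRIMARY KEY (tenant_id, event_id)
);
"""

# Shard by tenant_id so each tenant's rows land on a single node,
# keeping per-tenant queries and joins local to that node.
DISTRIBUTE = "SELECT create_distributed_table('events', 'tenant_id');"

print(DDL.strip())
print(DISTRIBUTE)
```

Choosing the distribution column is the key design decision: a tenant or device identifier keeps related data together, which is what makes multi-tenant SaaS and IoT workloads scale well on this architecture.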

Storage performance in Azure Database for PostgreSQL has also taken a leap forward. SSD v2 storage delivers high IOPS and low latency, ensuring that even the most demanding workloads run smoothly. Integrated monitoring and tuning tools like Azure Monitor provide real-time insights and automated optimization, helping teams maintain peak performance without manual intervention.

As always, security remains a top priority. Azure Database for PostgreSQL includes enterprise-grade protections such as Microsoft Defender for Cloud, Entra ID integration, private endpoints, confidential compute SKUs, and end-to-end encryption. These features help organizations meet compliance requirements and safeguard sensitive data.

And because PostgreSQL is open source, there are no licensing fees. It’s one of the most widely used databases in the world, with a vibrant community and deep Microsoft support.

Azure HorizonDB: The future of PostgreSQL at scale

For organizations with extreme performance and scale requirements, we’ve introduced Azure HorizonDB, which is a new, cloud-native PostgreSQL service built for the most demanding workloads. Currently in private preview, Azure HorizonDB supports up to 3,072 vCores and 128 TB of auto-scaling storage. It delivers sub-millisecond multi-zone commit latencies and up to 3x higher throughput than self-managed PostgreSQL. Azure HorizonDB also builds on the AI and agentic capabilities of Azure Database for PostgreSQL with built-in AI model management and DiskANN advanced filtering capabilities, making it ideal for next-generation applications that require real-time analytics and intelligent data processing.

Because Azure HorizonDB is PostgreSQL-compatible, organizations can start with Azure Database for PostgreSQL today and move to Azure HorizonDB if the need arises. This allows for a smooth transition path without the need for replatforming or rewriting applications.

Build and scale mission-critical applications with Azure HorizonDB

Open source, engineered for the enterprise

Microsoft is proud to be one of the top corporate contributors to the PostgreSQL project. Our engineering teams have upstreamed key innovations, and we’re committed to continuing this work so that PostgreSQL remains the most capable and trusted open-source database for the cloud era.

We believe that open-source data platforms like PostgreSQL are foundational to the next generation of intelligent applications. Our goal is to make PostgreSQL not only accessible but exceptional for enterprise workloads. That means investing in performance, security, developer experience, and ecosystem integration.

The payoff: Innovation, agility, and confidence

Migrating to PostgreSQL on Azure isn’t just about fixing what’s broken. It’s also about unlocking what’s next. Apollo Hospitals, as an example, is now exploring AI-powered clinical dashboards, real-time analytics with Microsoft Fabric, and containerized workloads with Azure Kubernetes Service. Their teams are more agile, their systems are more resilient, and their foundation is ready for the future. As Sridhar Yadla, Apollo’s General Manager, put it:

We’re no longer stuck reacting to problems. Now we’re thinking proactively and looking at how we can evolve.

That’s the power of PostgreSQL on Azure.

Ready to modernize?

If you’re considering a move from Oracle to PostgreSQL, we’ve built the tools, the platform, and the partner network to help you succeed.

Download our latest e-book and explore the Azure Database for PostgreSQL documentation to learn how to plan, execute, and accelerate your journey.

Azure Database for PostgreSQL
Innovate with a fully managed, AI-ready PostgreSQL database.

Download the e-book

The post From legacy to leadership: How PostgreSQL on Azure powers enterprise agility and innovation appeared first on Microsoft Azure Blog.
Source: Azure

Microsoft at NVIDIA GTC: New solutions for Microsoft Foundry, Azure AI infrastructure and Physical AI

Microsoft combines accelerated computing with cloud scale engineering to bring advanced AI capabilities to our customers. For years, we’ve worked with NVIDIA to integrate hardware, software and infrastructure to power many of today’s most important AI breakthroughs.

What’s new at NVIDIA GTC

- Expanded Microsoft Foundry capabilities to build, deploy and operate production-ready AI agents on NVIDIA accelerators and open NVIDIA Nemotron models
- New Azure AI infrastructure optimized for inference-heavy, reasoning-based workloads, including the first hyperscale cloud to power on next-generation NVIDIA Vera Rubin NVL72 systems
- Deeper integration across Microsoft Foundry, Microsoft Fabric and NVIDIA Omniverse libraries and open frameworks to support Physical AI systems from simulation to real‑world operations

From frontier models to production-ready agents

At the foundation of this system is Microsoft Foundry: serving as the operating system for building, deploying and operating AI at enterprise scale. Foundry builds on Azure to bring together models, tools, data and observability into a single system designed for production agents. Today we’re expanding those capabilities across Foundry Agent Service and NVIDIA Nemotron models.

The next-generation Foundry Agent Service and Observability in Foundry Control Plane are now generally available, enabling organizations to build and operate AI agents at production scale. Foundry Agent Service allows teams to quickly develop agents that reason, plan and act across tools, data and workflows. Once agents are created, Foundry Control Plane gives developers end-to-end visibility into agent behavior, unlocking both developer productivity and enterprise trust. Companies such as Corvus Energy are already using Foundry to replace manual inspection workflows with agent-driven operational intelligence across their global fleet.

We are further simplifying the path from prototype to production with the availability of Voice Live API integration with Foundry Agent Service, in public preview, which enables developers to build voice-first, multimodal, real-time agentic experiences. This pairs with the general availability of a refreshed Microsoft Foundry portal and expanded integrations for Palo Alto Networks’ Prisma AIRS and Zenity, delivering deeper builder experiences and runtime security across the entire agent lifecycle.

NVIDIA Nemotron models are also now available through Microsoft Foundry, joining the widest selection of models on any cloud, including the latest reasoning, frontier and open models. This bolsters our recent partnership announcement bringing Fireworks AI to Microsoft Foundry, enabling customers to fine-tune open-weight models like NVIDIA Nemotron into low-latency assets that can be distributed to the edge.

Scaling AI infrastructure for the world’s most demanding workloads

AI inference workloads are reshaping cost, performance and system design requirements. To operationalize agentic AI at scale, customers need purpose-built infrastructure for inference‑heavy, reasoning‑based workloads that can be deployed and operated consistently across global and regulated environments.

Microsoft’s AI infrastructure approach is engineered to seamlessly bring next-generation NVIDIA systems into Azure datacenters that are designed for power, cooling, networking and rapid generational upgrades. This allows our customers to move with speed and agility and stay at the leading edge from generation to generation.

In less than a year, we’ve deployed hundreds of thousands of liquid-cooled Grace Blackwell GPUs across our global datacenter footprint, and now we are excited to be the first hyperscale cloud to power on NVIDIA’s newest Vera Rubin NVL72 in our labs. Over the next few months, Vera Rubin NVL72 will be rolled out into our modern, liquid-cooled Azure datacenters.

Microsoft’s infrastructure innovation with NVIDIA also extends to sovereign and regulated environments to give customers control of both where AI runs and how it evolves over time. Recently, we announced Foundry Local support for modern infrastructure and large AI models, and today we now have initial support for NVIDIA Vera Rubin platform on Azure Local, extending accelerated AI capabilities to customer-controlled environments. This approach allows organizations to plan for next-generation AI workloads, including reasoning-based and agentic systems, while maintaining Azure-consistent operations, governance and security through our unified software layer with Azure Arc and Foundry Local.


Bringing AI into the physical world

As AI moves beyond digital experiences, Microsoft and NVIDIA are collaborating to support the next wave of Physical AI. At GTC, this work centers on NVIDIA Physical AI Data Factory Blueprint, with Microsoft Foundry as the platform for hosting and operating Physical AI systems on Azure at cloud scale.

By integrating this blueprint with Azure services as part of a Physical AI Toolchain, Microsoft enables developers to build, train and operate physical AI and robotics workflows that connect physical assets, simulation and cloud training environments into repeatable, enterprise-grade pipelines. To support this, we are introducing a public Azure Physical AI Toolchain GitHub repository integrated with the NVIDIA Physical AI Data Factory and with core Azure services.

To further the impact of AI in real‑world, physical environments, today Microsoft and NVIDIA are deepening the integration between Microsoft Fabric and NVIDIA Omniverse libraries, connecting live operational data with physically accurate digital twins and simulation. This allows organizations to see what’s happening across their physical systems, understand it in real time and use AI to decide what to do next. In practice, customers in manufacturing and operations and beyond are using this approach to move beyond dashboards and alerts to coordinated, AI‑driven action across machines, facilities and workflows.

From innovation to impact

Microsoft is delivering reliable, production‑scale AI by bringing together its global AI infrastructure, platforms, and real‑world systems with the latest innovation from NVIDIA. For customers, this means the ability to operate intelligence continuously, running inference-heavy, reasoning-based, and physical AI workloads with the performance, security, and governance required for real businesses and regulated industries.

Whether powering always-on agents, scaling next-generation AI infrastructure, or deploying intelligent systems in factories, energy facilities, and sovereign environments, Microsoft and NVIDIA are helping customers move faster from insight to action.

Yina Arenas leads product strategy and execution for Microsoft Foundry, overseeing the end-to-end AI product portfolio, infrastructure, developer experiences, and foundation model integration across OpenAI, Anthropic, Mistral, DeepSeek, and others. She delivers an enterprise-ready, production-grade AI platform trusted by global customers for secure, reliable, and scalable AI.
The post Microsoft at NVIDIA GTC: New solutions for Microsoft Foundry, Azure AI infrastructure and Physical AI appeared first on Microsoft Azure Blog.
Quelle: Azure

FabCon and SQLCon 2026: Unifying databases and Fabric on a single data platform

In this article

Introducing the Database Hub in Microsoft Fabric
Getting your data estate ready for AI with Fabric
Unifying your data estate with Microsoft OneLake
Processing and harmonizing data with Fabric analytics
Creating semantic meaning with Fabric IQ
Empowering agents to act with Fabric data and operations agents
Building mission-critical applications with developer experiences in Fabric
Migrating your existing Azure service to Fabric
See more Fabric innovation

Welcome to the third annual FabCon and our first ever SQLCon here in Atlanta, Georgia. With nearly 300 workshops and sessions, this joint event will highlight how we are bringing the power of Microsoft SQL and Microsoft Fabric together to create a single, unified platform. But FabCon 2026 and SQLCon 2026 are about more than product innovation. They’re about providing space for our 8,000 attendees to come together and share real experiences, learn from each other, and solve challenges side-by-side. Only together can we move beyond the hype and into meaningful results.

Learn more about FabCon and SQLCon 2026

The excitement surrounding this event reflects the same momentum we’re seeing across our data portfolio. Just two and a half years after Microsoft Fabric reached general availability, it’s already serving more than 31,000 customers and remains the fastest-growing data platform in Microsoft’s history. Fortune 500 companies like The Coca-Cola Company are already using Fabric at scale across their organizations.

Microsoft Fabric is helping us evolve our data foundation into a more unified, AI-ready platform. Combined with Power BI and capabilities like Fabric IQ, it enables the enterprise to turn data into intelligence and act on it faster.
Shekhar Gowda, Vice President of Global Marketing Technologies at The Coca-Cola Company

Our databases are accelerating just as quickly, with SQL Server 2025 growing more than twice as fast as the previous version.

Today, we’re thrilled to share how we are bringing the power of databases and Fabric together to form a truly converged data platform—one that unifies transactional, operational, and analytical data under a single, consistent architecture. I’ll also highlight how we’ve enhanced Fabric to help you transform data into the semantic knowledge AI needs to understand your business, powered by Fabric IQ and Power BI’s industry-leading semantic model technology.

Introducing the Database Hub in Microsoft Fabric

Databases sit at the heart of the enterprise data estate—a system of record powering applications, transactions, and mission‑critical insights. Yet as organizations scale across cloud, on‑premises, and edge environments, database estates have become increasingly fragmented and isolated. As AI places even greater demands on data estates, unifying databases under a single access point and control plane has become essential.

To address this challenge, Fabric is expanding its role as the central access point for enterprise data with the Database Hub in Fabric, now available in early access. Database Hub in Fabric provides a unified database management experience that brings together databases across edge, cloud, and Fabric into a single, coherent view. Teams now have one place to explore, observe, govern, and optimize their entire database estate—including Azure SQL, Azure Cosmos DB, Azure Database for PostgreSQL, SQL Server (enabled by Azure Arc), Azure Database for MySQL, and Fabric Databases—without changing how each service is deployed.

Built for scale, the Database Hub in Fabric introduces an agent‑assisted, human-in-the-loop approach to database management. With built-in observability, delegated governance, and Microsoft Copilot-powered insights, teams can deploy intelligent agents to continuously reason over estate‑wide signals and surface what changed, explain why it matters, and guide teams toward what to do next. The result is a simpler, more confident way to manage databases at scale. Over time, this model enables database estates to become more proactive, resilient, and intelligent, laying the foundation for greater autonomy, while keeping humans firmly in control of goals, boundaries, and trust.


Learn more about Database Hub in Fabric and what’s new across Databases

Beyond unified database management, we’re also introducing savings plan for databases, a new way to save up to 35% compared to pay-as-you-go pricing on select services.*

Bringing databases together under a single management layer is a critical step as you prepare your estates for AI at scale. But it’s not the end of the journey. The challenge shifts from where data lives to how data is understood, connected, and activated across the enterprise.

Getting your data estate ready for AI with Fabric

As organizations move from traditional applications to AI‑powered, multi‑agent systems, the advantage is shifting away from the specific model you deploy. It now lies in the intelligence and context that allow agents to understand how your business runs, its current state, and its institutional knowledge, so they can take meaningful action.

This is the challenge Microsoft IQ is designed to address. Unlike point solutions on the market today, Microsoft IQ provides an intelligence layer that delivers shared, enterprise-grade business context to every agent. That context is built from three complementary sources: productivity signals from Work IQ, institutional knowledge from Foundry IQ, and live business data from Fabric IQ.

However, like the database layer, the IQ context layer is a critical part of a successful and healthy AI foundation, but it is not the full story. Building a complete AI-ready data foundation requires investing in four core steps:

Unifying your data estate to eliminate silos and reduce architectural complexity.

Processing and harmonizing data so it becomes AI-ready, clean, connected, and structured for both operational and analytical use.

Curating semantic meaning to give agents contextual understanding, enabling them to interpret data the way your teams already do. This is where Microsoft IQ comes into play.

Empowering AI agents to act, applying that context to automate workflows, accelerate decisions, and transform operations end‑to‑end.

Unifying your data estate with Microsoft OneLake

Every AI initiative starts with the same fundamental challenge: understanding where your data lives and how to bring it together. Microsoft OneLake was built to solve that problem by unifying data across clouds, on-premises environments, and third-party platforms into a single logical data lake, without unnecessary extract-transform-load (ETL) pipelines, fragmentation, or duplicate copies.
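As an illustration of what this virtualization looks like in practice, the sketch below builds the request for creating a OneLake shortcut that points at external ADLS Gen2 data via the Fabric REST API. The endpoint path and payload field names reflect my understanding of that API and should be checked against the current Fabric REST reference; the workspace, item, and connection identifiers are placeholders.

```python
# Sketch: virtualizing external ADLS Gen2 data into a lakehouse via a
# OneLake shortcut, instead of copying it with an ETL pipeline.
# Endpoint and field names are illustrative -- verify against the
# Microsoft Fabric REST API reference before relying on them.
import json

FABRIC_API = "https://api.fabric.microsoft.com/v1"

def shortcut_payload(name: str, target_url: str, subpath: str,
                     connection_id: str) -> dict:
    """Build the request body for a shortcut targeting ADLS Gen2."""
    return {
        "path": "Tables",  # where the shortcut surfaces inside the lakehouse
        "name": name,
        "target": {
            "adlsGen2": {
                "location": target_url,
                "subpath": subpath,
                "connectionId": connection_id,
            }
        },
    }

def create_shortcut(session, workspace_id: str, item_id: str, payload: dict):
    """POST the shortcut definition; `session` is an authenticated
    requests.Session carrying a Microsoft Entra bearer token."""
    url = f"{FABRIC_API}/workspaces/{workspace_id}/items/{item_id}/shortcuts"
    return session.post(url, data=json.dumps(payload),
                        headers={"Content-Type": "application/json"})

# Placeholder values for illustration only.
payload = shortcut_payload(
    "sales_raw",
    "https://contoso.dfs.core.windows.net",
    "/landing/sales",
    "00000000-0000-0000-0000-000000000000",
)
```

Once created, the shortcut appears as a regular table path in the lakehouse, and downstream engines read the external data in place.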

Are my agents hunting for data?

Watch the podcast

Connecting to more sources than ever before

Today, we’re expanding Mirroring in Fabric to support even more systems our customers rely on. Mirroring for SharePoint lists and Dremio is now in preview, with Azure Monitor coming soon, while mirroring for Oracle and SAP Datasphere is generally available—all of which are available as part of the core mirroring capabilities. We are also introducing extended capabilities in mirroring designed to help you operationalize mirrored sources at scale, including Change Data Feed (CDF) and the ability to create views on top of mirrored data, starting with Snowflake. Extended capabilities for mirroring will be offered as a paid option.


Shortcut transformations are also now generally available, allowing data to be shaped automatically as it connects to or moves within OneLake. You can convert formats such as Excel to Delta tables, now in preview, and apply AI-powered transformations.

Additionally, we are continuing to invest in open interoperability, ensuring OneLake works seamlessly with the platforms organizations already use. We are excited to announce the ability to natively read from OneLake through Azure Databricks Unity Catalog is now in public preview. We also recently announced the general availability of our interoperability with Snowflake.

I’m also excited to share that Auger, a rapidly growing supply chain platform designed to bring intelligence and automation to global operations, has built its platform on Fabric, with all data stored natively in OneLake. This architecture enables Auger customers to seamlessly access their operations data through OneLake shortcuts within their own Fabric environments and use the full power of the platform including Power BI, Fabric data agents, and more. Learn more in my blog, co-authored with Auger Chief Executive Officer Dave Clark.

Protect your data with OneLake security, now generally available

Security and governance remain foundational to OneLake. I’m thrilled to announce OneLake security will be generally available in the coming weeks, enabling data owners to define roles, enforce row- and column-level controls, and manage permissions through a single unified model that follows the data.

To learn more about these announcements, read the OneLake blog and the Fabric Data Factory blog.

Processing and harmonizing data with Fabric analytics

AI agents are only as reliable as the data you feed them. Before data can train or ground an agent, it must be integrated, cleaned, and structured, so the agent operates from consistent, trusted information. With industry-leading engines in Fabric like Spark, T-SQL, KQL, and Analysis Services, we can equip data teams to do exactly that.

Now, we are expanding these capabilities with the introduction of Runtime 2.0 in preview, purpose-built for large-scale data computation. It incorporates Apache Spark 4.x, Delta Lake 4.x, Scala 2.13, and Azure Linux Mariner 3.0 to power advanced enterprise workloads. Materialized lake views are also now generally available, simplifying medallion architecture implementation in Spark SQL and PySpark and enabling always up-to-date pipelines with no manual orchestration. In addition, a new agentic Copilot experience in notebooks delivers deeper context awareness, reasons over your workspace, and generates code with greater speed and precision.
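To make the materialized-lake-view idea concrete, here is a minimal sketch of a medallion-style silver table defined declaratively on top of a bronze table. The `CREATE MATERIALIZED LAKE VIEW` syntax reflects Fabric’s Spark SQL surface as announced, but the schema, table, and column names are hypothetical; in a Fabric notebook the composed statement would be run with `spark.sql(...)`.

```python
# Sketch: composing the Spark SQL DDL for a Fabric materialized lake view
# that keeps a cleaned "silver" table continuously derived from "bronze".
# Table and column names are hypothetical.

def materialized_lake_view(name: str, select_sql: str) -> str:
    """Compose the DDL Fabric's Spark engine would run to maintain `name`."""
    return f"CREATE MATERIALIZED LAKE VIEW IF NOT EXISTS {name} AS {select_sql}"

ddl = materialized_lake_view(
    "silver.orders_clean",
    "SELECT order_id, CAST(amount AS DECIMAL(10,2)) AS amount "
    "FROM bronze.orders WHERE order_id IS NOT NULL",
)
# In a Fabric notebook: spark.sql(ddl)
```

Because the view is declarative, the engine rather than a hand-built pipeline is responsible for keeping `silver.orders_clean` in sync with its source.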

For real-time scenarios, Maps in Fabric is now generally available. Maps adds geospatial context to your agents and operations by turning large volumes of location-based data into interactive, real-time visual insights.

For a comprehensive overview of these announcements and much more, read the Fabric Analytics blog and the Fabric Real-Time Intelligence blog.

Creating semantic meaning with Fabric IQ

Preparing raw data for AI is essential. The next step is transforming that data into meaningful, unified business context. That is where Fabric IQ comes in.

Fabric IQ unifies analytical data and operational data, including telemetry, time series, graph, and geospatial data, within a shared semantic framework of business entities, relationships, properties, rules, and actions. Instead of thinking in terms of tables and schemas, your teams and agents can operate on this framework, or ontology, aligned to how the business actually runs.

Fabric IQ ontologies will soon become accessible through an MCP server in preview, enabling agents to discover, understand, and act on this semantic layer. Ontologies can also serve as context sources for maps and, soon, for operations agents in Fabric, extending shared business context directly into operational decision-making and execution.

We are also excited to announce planning in Fabric IQ, a new enterprise planning capability that enables organizations to create plans, budgets, forecasts, and scenario models directly on top of Fabric’s semantic models. By complementing Fabric IQ’s ontologies with integrated planning, you get a complete, contextual view of your historical, real-time, and forward planning data. This allows users and agents to quickly answer what has happened, what is happening, and what should happen, all from a single source.


Finally, we recently announced a strategic partnership with NVIDIA to power the next generation of Physical AI by integrating Real-Time Intelligence and Fabric IQ with NVIDIA Omniverse libraries. The combined platform unifies real‑time operational data, business semantics, and physical simulation to enable organizations to optimize their physical operations in scenarios like intelligent digital twins, predictive maintenance, autonomous logistics, and energy optimization.

To learn more about all of our partner announcements, read the Fabric ISV blog and the planning in Fabric IQ blog.

Enhancing the underlying Fabric IQ technology

Powering much of Fabric IQ’s rich experience is a combination of Power BI’s industry-leading semantic model technology and graph in Fabric, our highly scalable graph database. Already delivering insights to more than 35 million active users, semantic models provide the ideal foundation for training agents through Fabric IQ. Now, with the general availability of Direct Lake on OneLake, your tables can be read directly from OneLake with native security enforcement, richer cross-item modeling, and import-class performance without data movement or refresh.

I’m also excited to share that graph in Fabric will be generally available in the coming weeks, enabling teams to visualize and query complex relationships across customers, partners, and supply chains.

To learn more, check out the Fabric IQ blog and the Power BI blog.

Empowering agents to act with Fabric data and operations agents

Frontier organizations are moving beyond general-purpose assistants and instead adopting multi-agent systems composed of specialized agents. These agents are each grounded on specific data and reusable across different systems, allowing you to deliver more accurate, accelerated, and scalable outcomes.

To support your multi-agent systems, Fabric comes with built-in agent creation capabilities with Fabric data agents and operations agents. I’m excited to share that Fabric data agents are now generally available. Fabric data agents can be thought of as virtual analysts, aligned to specific domain data to support deeper analysis and deliver insights. Operations agents complement them by monitoring real-time data, detecting patterns, and taking proactive action.


These agents can be used across Fabric or as foundational knowledge sources in leading AI tools like Microsoft Foundry, Copilot Studio, or even Microsoft 365 Copilot. To learn more about our AI announcements, check out the Fabric analytics blog covering data agents and the Fabric IQ blog covering operations agents.

Building mission-critical applications with developer experiences in Fabric

Developers building the next generation of AI applications need a comprehensive, cost-effective data platform that’s already integrated with their existing tools and workflows. Today, we are expanding Fabric’s developer tooling to meet that demand.

First, Fabric Model Context Protocol (MCP) is advancing with two major milestones. Fabric local MCP is now generally available, providing an open-source local server that connects AI coding assistants such as GitHub Copilot directly to Fabric. Alongside this, we’re introducing the public preview of Fabric remote MCP, a secure, cloud‑hosted execution engine that enables AI agents and automation tools to perform authenticated actions in Fabric.
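To show how a local MCP server typically plugs into a coding assistant, the sketch below renders an mcp.json-style configuration entry. The server name and launch command (`npx -y @microsoft/fabric-mcp`) are hypothetical placeholders; the actual command for the Fabric local MCP server should be taken from its open-source repository’s README.

```python
# Sketch: registering a local MCP server with an MCP-capable client such as
# GitHub Copilot via an mcp.json-style config file. The server name and
# launch command below are hypothetical -- consult the Fabric local MCP
# project for the real values.
import json

def mcp_config(server_name: str, command: str, args: list[str]) -> str:
    """Render the JSON an MCP-capable client reads at startup to know
    which local server process to spawn and talk to over stdio."""
    return json.dumps(
        {"servers": {server_name: {"command": command, "args": args}}},
        indent=2,
    )

config = mcp_config("fabric-local", "npx", ["-y", "@microsoft/fabric-mcp"])
```

With an entry like this in place, the assistant launches the server locally and can call its Fabric-aware tools directly from the editor.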

We’re also enhancing our Git integration with selective branching, allowing developers to branch out for a specific feature and pull only the items they need. You also get improved change comparisons to more easily review recent updates, and new folder relationships which show how feature workspaces connect to source workspaces.


We’re also launching two open-source projects to help teams move faster with Fabric: Agent Skills for Fabric and Fabric Jumpstart. Agent Skills for Fabric is an open-source set of purpose-built plugins that let you use natural language in the GitHub Copilot terminal to harness the full power of Microsoft Fabric. Additionally, Fabric Jumpstart is designed to help you get off the ground with detailed guidance, reference architectures, and single‑click deployments for sample datasets, notebooks, pipelines, and reports.

Finally, we are announcing that the Fabric Extensibility Toolkit (FET), an evolution of the Workload Development Kit (WDK), is now generally available. Along with this release, we are enabling support for full CI/CD, variable library, and a new management experience in the Admin portal.

Read the Fabric Platform blog

Migrating your existing Azure service to Fabric

As Fabric continues to grow in functionality, we are also simplifying the migration from other Azure services. In addition to our existing Synapse tooling, we are bringing new migration assistants for Azure Data Factory, Azure Synapse Analytics, and Azure SQL into public preview.

The new Fabric migration assistant for Azure Data Factory and Synapse Analytics helps move your existing pipelines and artifacts like Spark pools and notebooks into Fabric with minimal disruption. It’s designed to support incremental modernization, allowing teams to evaluate, convert, and optimize pipelines as they transition to Fabric. The migration assistant for SQL databases helps move SQL Server into Fabric by importing schemas through DACPACs, identifying and resolving compatibility issues with AI assistance, and guiding teams through assessment and data copy workflows for a smoother cutover.

See more Fabric innovation

Explore the AI shift with The Shift podcast

In addition to the announcements above, we are also rolling out a broad set of Fabric innovations across the platform. For a deeper look at the updates and what’s new this month, visit the Fabric March 2026 Feature summary blog, the Power BI March 2026 feature summary blog, and the latest posts on the Fabric Updates channel.

Explore additional resources for Microsoft Fabric

Sign up for the Fabric free trial.
View the updated Fabric Roadmap.
Try the Microsoft Fabric SKU Estimator.
Visit the Fabric website.
Join the Fabric community.
Read other in-depth, technical blogs on the Microsoft Fabric Updates Blog.

Read additional blogs by industry-leading partners

Sonata Software: Building an AI-ready data platform with data agents, ontology, and governance in Microsoft Fabric
Quadrant Technologies LLC: Real-Time Operational Intelligence in Microsoft Fabric: Deep Dive into RTI Capabilities, Anomaly Detection and Activator Alerting
Inspark: Why switch from Azure Synapse to Microsoft Fabric?
Esri: Unlock the power of location intelligence with ArcGIS for Microsoft Fabric
Dream IT Consulting Services: 8 Real-World Use Cases of Data Agents in Microsoft Fabric
UB Technology Innovations Inc.: From Data Platform to Decision Platform: How Microsoft Fabric and Copilot are Redefining Enterprise Analytics
Simpson Associates: Fabric Data Warehouse: Bringing Structure to Modern Data Strategies
Synapx Ltd.: Migrating Power BI to Microsoft Fabric Lakehouse with Medallion Architecture: A Strategic Imperative for Modern Construction Enterprises
Cloud Services: Real-Time Intelligence in Action: How Microsoft Fabric Helped Delfi Transform Its Newsroom
Cloud Services: Microsoft Fabric Data Agents: A New Reality
iLink Digital: Detect to Act in Seconds: How Real-Time Intelligence Is Rewriting the Rules of Emissions Management
Valorem Reply: How Nonprofits Are Rethinking Data with Microsoft Fabric

*Customers may see savings estimated to be between 0% and 35%. The 35% savings estimate is based on one Azure SQL Database serverless running for 12 months at a pay-as-you-go rate vs. a reduced rate for a 1-year savings plan. Based on Azure pricing as of March 2026. Prices are subject to change. Actual savings may vary based on location, database service, and/or usage. 
The post FabCon and SQLCon 2026: Unifying databases and Fabric on a single data platform appeared first on Microsoft Azure Blog.
Quelle: Azure