Unlock AI innovation with new joint capabilities from Microsoft and SAP

Microsoft and SAP have been partners and customers of each other for over 30 years, collaborating on innovative business solutions and helping thousands of joint customers accelerate their business transformation. Microsoft Cloud is the market leader for running SAP workloads in the cloud, including RISE with SAP, and today at SAP Sapphire 2024, we are very excited to bring more amazing innovation to the market for our joint customers. 

In this blog, we explore the most recent exciting developments from the Microsoft and SAP partnership and how they help customers accelerate their business transformation. 

SAP on Microsoft Cloud

Innovate with the most trusted cloud for SAP

Discover solutions

Announcing new joint AI integration between Microsoft 365 Copilot and SAP Joule

Today at SAP Sapphire, Microsoft and SAP are expanding our partnership by bringing Joule together with Microsoft Copilot into a unified experience, allowing employees to get more done in the flow of their work through seamless access to information from business applications in SAP as well as Microsoft 365.  

Customers want AI assistants to carry out requests regardless of data location or the system that needs to be accessed. By integrating Joule and Copilot for Microsoft 365, the generative AI solutions will interact intuitively so users can find information faster and execute tasks without leaving the platform they are already in. 

A user in Copilot for Microsoft 365 will be able to leverage SAP Joule to access information stored in SAP—for example S/4HANA Cloud, SAP SuccessFactors, or SAP Concur. Similarly, someone using SAP Joule in a SAP application will be able to use Copilot for Microsoft 365 capabilities without context switching.

To see this in action, tune into the “Innovation: The key to bringing out your best” keynote at SAP Sapphire which will also be available on-demand and read the announcement blog. 

Unlock business transformation with Microsoft AI and RISE with SAP

SAP systems host mission-critical data powering core business processes and can greatly benefit from the insights, automation, and efficiencies unlocked by AI. Microsoft Cloud—the most trusted, comprehensive and integrated cloud—is best positioned to help you achieve these benefits. You can extend RISE with SAP by using a broad set of Microsoft AI services to maximize business outcomes, catered to your business needs: 

1: Unlocking joint AI innovation with SAP Business Technology Platform (BTP) and Microsoft Azure: SAP BTP is a platform offering from SAP that maximizes the value of the RISE with SAP offering. We recently partnered with SAP to announce the availability of SAP AI Core, an integral part of BTP, including Generative AI Hub and Joule on Microsoft Azure in West Europe, US East, and Sydney. Customers that use BTP and want to embed more intelligence into their finance, supply chain, and order to cash business processes can now do so easily, on Azure. SAP customers can also take advantage of the most popular large language models like GPT-4o available in SAP Generative AI Hub offered only on Azure through the Azure OpenAI Service.

2: Unlocking end-user productivity with Microsoft Copilot: SAP customers can use Copilot for Microsoft 365, available in Microsoft Word, PowerPoint, Excel, and PowerBI so that end-users can unlock insights, and perform tasks faster. You can go one step further by tailoring Copilot to work the way you need, with your data, processes and policies using the SAP plugin in Microsoft Copilot Studio. You can now customize Copilot to connect to your SAP systems and retrieve the information you need such as expense information, inventory, and so on. This is made possible with new Copilot connectors and new agent capabilities that we announced at Microsoft Build last week. 

3: Enabling AI transformation using Azure AI services: SAP customers running their workloads on Azure can build their own AI capabilities using the Azure OpenAI Service with their SAP data to quickly develop generative AI applications. We offer over 1,600 frontier and open models in Azure AI, including the latest from OpenAI, Meta, and others—providing you with the choice and flexibility to choose the model suited for your use case. More than 50,000 customers use Azure AI today, signaling the amazing momentum in this space. 

A powerful productivity enhancement use case here is the combination of OpenAI, along with business process automation tools like the Microsoft Power Platform which has SAP connectors available to help a user interact with SAP data easily and complete tasks faster. 

For example, a sales assistant can access SAP data to place product orders, directly from Microsoft Teams, leveraging the power of Azure OpenAI. Watch this video to see this scenario in action. 

RISE with SAP customers can also consume OpenAI services through Cloud Application programming model (CAP) and SAP BTP, AI Core. 

4: Automatic attack disruption for SAP, powered by AI: Microsoft Security Copilot empowers all security and IT roles to detect and address cyberthreats at machine speed, including those arising from SAP systems. For customers using Microsoft Sentinel for SAP, which is certified for RISE with SAP, attack disruption will automatically detect financial fraud techniques and disable the native SAP and connected Microsoft Entra account to prevent the cyberattacker from transferring any funds–with no additional intervention. See this video to learn more and read more about automatic attack disruption for SAP. 

More choice and flexibility with expanded availability of SAP BTP on Azure 

Together with SAP, we recently announced the expanded availability of SAP BTP to six new Azure regions (Brazil, Canada, India, the United Kingdom, Germany, and one to be announced) as well as additional BTP services in the existing seven public Microsoft Azure regions (Australia, Netherlands, Japan, Singapore, Switzerland, the United States East-VA, and the United States West-WA.) Upon completion, SAP Business Technology Platform will run in 13 Microsoft Azure regions, and all the key BTP services that most customers are asking for, will be available on Azure.

The expanded service availability, based on customer demand, unlocks AI innovation and business transformation for customers more easily, as you can now integrate SAP Cloud enterprise resource planning (ERP) and SAP BTP services with Microsoft Azure services, including AI services from both companies, within the same data center region.  

New and powerful infrastructure options for running SAP HANA on Azure

Last year we announced the Azure M-Series Mv3 family—the next generation of memory optimized virtual machines, giving customers faster insights, more uptime, a lower total cost of ownership, and improved price-performance for running SAP HANA workloads with Azure IaaS deployments and SAP RISE on Azure. These VMs, supporting up to 32 TB of memory, are powered by the 4th generation Intel® Xeon® Scalable processors and Azure Boost, one of Azure’s latest infrastructure innovations.  

Today, we are pleased to build on this investment and share that the Mv3 Very High Memory (up to 32TB of memory) offering is generally available for customers and the Mv3 High Memory offering (upto 16TB) is in preview.

Microsoft and SAP Signavio: Teaming up to accelerate transformation

Microsoft and SAP continue to collaborate to help customers in their journey to S/4HANA and RISE on Azure. SAP Signavio Process Insights is an integral part of SAP’s cloud-based process transformation suite and gives companies the ability to rapidly discover areas for improvement and automation within your SAP business processes. SAP Signavio Process Insights is now available on Microsoft Azure and provides ECC customers an accelerated path to S/4HANA, allowing customers to unlock the value of innovation through the Microsoft platform. 

Simplifying business collaboration with new integrations in Microsoft Teams 

Microsoft and SAP have been working together for several years enabling organizations and their employees improve productivity through collaborative experiences that combine mission critical data from SAP with Microsoft Teams and Microsoft 365.

Today, we are excited to build on this and announce new joint capabilities with exciting updates to Microsoft Teams apps for SAP S/4HANA, SAP SuccessFactors and, SAP Concur:

Upcoming features in S/4HANA app for Microsoft Teams

S/4HANA Copilot plugin: Users can access S/4HANA sales details, order status, sales quotes, and more using natural language with Copilot for Microsoft 365. See a sample query below: 

Adaptive Card Loop components: Share intelligent cards in Teams and Outlook (available for pre-release users only). 

Teams search-based message extension: Users can quickly search for and insert information from S/4HANA without leaving the Teams environment.  

SAP Community search: Search SAP Community and share content with co-workers in Microsoft Teams—without having to leave the app. While chatting with a colleague or a group, click on the three dots below the text field, open the SAP S/4HANA app, and enter your search term in the popup window. From the results list, pick the topic you want to share and directly send it to your colleagues. 

Share to Microsoft Teams as Card: Communicate better with your co-workers using Microsoft Teams by providing a collaborative view that shows application content in a new window and enables you to have a meaningful conversation. 

Access S/4HANA within Microsoft 365 Home and Outlook too:

New release: SuccessFactors for Microsoft Teams

HR Task Reminders through Teams Chatbot: Receive a private message from the SuccessFactors Teams chatbot, which can help you complete HR tasks directly in Teams or guide you to SuccessFactors online for more complex workloads. 

Trigger Quick Actions through Commands: Request and provide feedback to your colleagues, manage time entries, view learning assignments and approvals, access employee and manager self-services, and much more!  

Coming soon: New SuccessFactors Dashboard in Teams tab. 

Coming soon: Concur Travel and Expense

Users will be to able to share travel itineraries and expense reports with colleagues in Microsoft Teams. This app will be released later this summer. 

Microsoft and SAP collaborate to modernize identity for SAP customers 

Earlier this year, we announced that we are collaborating with SAP to develop a solution that enables customers to migrate their identity management scenarios from SAP Identity Management (IDM) to Microsoft Entra ID. We’re excited to announce that guidance for this migration will be available soon.

Driving joint customer success 

It’s super exciting to see all the product innovation that will ultimately drive success and business outcomes for customers. Microsoft was one of the early adopters of RISE with SAP internally, and is proud to have helped thousands of customers accelerate their business transformation to RISE with SAP with the power of the Microsoft Cloud. 

Construction industry supplier Hilti Group migrated its massive SAP landscape to RISE with SAP on Microsoft Azure, accelerating their continuous innovation roadmap. In parallel, it upgraded from a 12-terabyte to a 24-terabyte SAP S/4HANA ERP application and is about to shut down its on-premises datacenter to make Azure its sole platform. Hilti wanted to be one of the first adopters of the RISE with SAP offering, which brings project management, technical migration, and premium engagement services together in a single contract. The accelerated, on-demand business transformation solution was the perfect match to help evolve the company’s massive SAP landscape, which serves as the backbone of the company’s transactional business.

“RISE with SAP on Azure helped us move our experts and resources into areas where they can add the most value, which was a game-changer.” 
Dr. Christoph Baeck, Head of IT Platforms, Hilti Group

Tokyo-based steel manufacturer, JFE Steel Corporation wanted to upgrade it’s on-premises SAP systems to a hybrid cloud strategy to pursue better digital experiences. The company chose S/4 HANA Cloud private edition—which provides the SAP SaaS solution, RISE, in a private cloud environment, and Microsoft Azure was chosen as the foundation for this. They achieved the migration of their SAP system to the cloud in just seven months and are also driving further innovation with the Microsoft Power Platform. 

“We considered on-premises and various cloud services based on the three axes of quality, cost, and turnaround time. Azure was the first choice because we had confidence in its quality, and we had accumulated know-how in the company. We also actively use Microsoft Power Platform and other products, and we appreciated the high degree of affinity and integration between the products.” 
Mr. Etsuo Kasuya, JFE Systems, Inc. Tokyo Office Business Management System Development Department Accounting Group

Australian mining and metals company South32 set its goal of transitioning its more than 100 terabyte data landscape into a fit-for-purpose ERP system. South32 seamlessly completed phase one of its SAP migration to Azure, achieving consolidation and simplification of its estate and building a scalable, easy to manage system environment using SAP S/4HANA with RISE on Azure by working with SAP, Microsoft and key partner TCS. 

“Now that we’ve moved our SAP landscape to Azure, we have more breadth of coverage. Our environments are standardized, which provides our infrastructure team with much better tools to manage consumption and give us transparency around costs.” 
Stuart Munday, Group Manager, ERP, South32 

Learn more 

Microsoft and SAP are committed to continuing our partnership to serve our joint customers and enable their growth and transformation as well as unlock innovation for them in the era of AI. There are several ways you can learn more and engage with us: 

Visit us at SAP Sapphire this week: If you are at SAP Sapphire this week in Orlando or the next week in Barcelona, visit the Microsoft Booth to learn about the exciting announcements. Also, for the latest product updates from Sapphire, check out our engineering blog.   

Learn more on our website: To learn more about why the Microsoft Cloud is the leading cloud platform for SAP workloads, including RISE with SAP, visit our website. 

Read more about how customers are unlocking AI innovation and business transformation with SAP and the Microsoft Cloud. 

Migration offers and incentives: Beyond the announcements we are making today, we also offer programs, and incentives so you can make your migration decisions with confidence The Azure Migrate and Modernize offering gives you guidance, expert help, and funding to streamline your move to Azure for SAP workloads, including RISE with SAP.   

Skilling: Business leaders often share with us that skilling for their teams is top of mind so that the organization can better prepare itself for the cloud journey. We offer several online learning paths as well as instructor-led offerings so you can maximize the value of your migration to the cloud—learn more. 

The post Unlock AI innovation with new joint capabilities from Microsoft and SAP appeared first on Azure Blog.
Quelle: Azure

Raise the bar on AI-powered app development with Azure Database for PostgreSQL

Known for its reliability and versatility, PostgreSQL is a popular and powerful open-source database system with a wide array of features. By harnessing the might of PostgreSQL in the cloud—with all the scalability and convenience you expect—comes Microsoft Azure Database for PostgreSQL. This fully managed service takes the hassle out of managing your PostgreSQL instances, allowing you to focus on what really matters: building amazing, AI-powered applications.  

What is postgresql?

Learn more

Azure Database for PostgreSQL

Innovate with a fully managed, AI-ready PostgreSQL database

Explore our features

To better get you acquainted with how Azure Database for PostgreSQL empowers users to migrate their PostgreSQL databases and build intelligent apps, this blog will introduce a roster of new learning paths and events, including a pair of Cloud Skills Challenges. As if that’s not exciting enough, completing one of the challenges automatically enters you in a drawing for a great prize. So, let’s get going!   

Seamless database migration and app creation   

Say goodbye to tedious maintenance tasks and hello to seamless deployments, automated patching, and built-in high availability. Azure Database for PostgreSQL is a fully managed service that simplifies the migration of existing PostgreSQL databases to the cloud. We handle the burdens of patching, backups, and scaling—allowing you to focus on your applications.  

Seamless compatibility with PostgreSQL minimizes code changes during the transition and caters to diverse needs and budgets. With migration tooling in Azure Database for PostgreSQL, transferring data and schemas to the cloud becomes a breeze. 

Beyond migration, Azure Database for PostgreSQL empowers the development of AI-powered applications. Its native support for the pgvector extension allows for efficient storage and querying of vector embeddings, essential for AI and machine learning tasks. The service seamlessly integrates with other Azure AI services, such as Azure Machine Learning, Azure OpenAI Service, Microsoft Azure AI Language, and Microsoft Azure AI Translator, providing developers with a rich toolkit for building intelligent applications.  

Additionally, the service’s scalability ensures optimal performance as AI workloads grow, maintaining cost efficiency throughout the development process. Overall, Azure Database for PostgreSQL provides a comprehensive solution for both migrating to the cloud and building powerful AI applications. 

Here are some key features: 

High availability: Up to 99.99% uptime guaranteed with zone-redundant high availability, automated maintenance, patching, and updates.

Performance automation: Get analysis of your database workloads to identify opportunities to improve query performance with query store and index recommendations.

Security: Includes Microsoft Defender for open-source relational databases to protect your data, and Azure IP Advantage, which is designed to protect businesses and developers who build on Azure from intellectual property risks.

Azure AI extension: Generate and store vector embeddings, call Azure AI services, and build AI-powered apps directly within the database.

Migration support: Tools to migrate Oracle Database to Azure Database for PostgreSQL are available, making the transition smoother.

Cost-effective: Provides operational savings—up to 62% compared with on-premises—with comprehensive database monitoring and optimization tools, which can lead to a lower total cost of ownership. 

Learn at your own pace with curated lessons 

Now that you’ve gotten a primer on Azure Database for PostgreSQL, the next step is engaging with our curated learning paths. The collected modules in these two courses include readings, exercises, and knowledge checks.

Build AI Apps with Azure Database for PostgreSQLDesigned for developers interested in harnessing AI within their PostgreSQL applications on Azure, this learning path explores how the Azure AI extension for Azure Database for PostgreSQL can be leveraged to incorporate AI capabilities into your apps.By completing this learning path, you’ll gain a solid understanding of the Azure AI extension and its various functionalities. Discover how to evaluate different summarization techniques available through Azure AI services and the azure_ai extension, explore the differences between extractive, abstractive, and query-focused summarization, and apply generative AI summarization techniques to data within a PostgreSQL database. This hands-on experience will empower you to leverage Azure AI services and the azure_ai extension to build intelligent applications that can summarize complex content into concise and informative summaries. 

Configure and migrate to Azure Database for PostgreSQLThis learning path supplies you with the essential skills needed to effectively work with Azure Database for PostgreSQL. It begins with a foundational understanding of PostgreSQL architecture and core concepts, before delving into practical aspects such as connecting to the database, executing queries, and ensuring robust security measures.You’ll also learn how to create and manage databases, schemas, and tables, and how to leverage stored procedures and functions for code reusability. With insights into how Azure Database for PostgreSQL implements ACID transactions and write-ahead logging for data integrity and durability, you’ll gain confidence in configuring, managing, and migrating existing PostgreSQL databases to Azure.

Complete timed challenges to win Azure prizes 

To go along with these learning paths, we’ve also assembled a pair of corresponding Azure Database for PostgreSQL Cloud Skills Challenges. While learning paths are usually self-paced, solitary activities, Cloud Skills Challenges are part interactive learning sprint, part good-natured tournament between you and thousands of your peers around the globe. They’re immersive, gamified learning experiences blending hands-on exercises, tutorials, and assessments to ensure a well-rounded learning experience. 

Complete at least one of these challenges before time runs out and you’ll be automatically entered into a drawing to win one of 20 awesome Azure prizes. Sign up when these challenges kick off on June 11, 2024, and start competing! 

Cloud Skills Challenge: Configure and migrate to Azure Database for PostgreSQL

Cloud Skills Challenge: Build AI Apps with Azure Database for PostgreSQL 

Connect with PostgreSQL experts at POSETTE 2024 conference 

Hosted by Microsoft, POSETTE 2024 (formerly Citus Con), is an exciting developer event dedicated to all things PostgreSQL. The event is a unique opportunity to learn from experts, network with fellow Postgres enthusiasts, and delve into the latest innovations in database technology. 

As a key player in the PostgreSQL community, we’ll be showcasing our commitment to the open-source database system. Attendees can look forward to a session on the future of Azure Database for PostgreSQL, where our experts will share our vision for the service and its integration with other Azure offerings.  

Running June 11 to 13, 2024, POSETTE—which stands for Postgres Open Source Ecosystem Talks, Training, and Education—is a free, virtual event featuring four unique livestreams. Registration is optional, and all scheduled talks will be online to watch immediately after the event ends. Don’t miss out on this chance to connect with the Microsoft team and learn how we’re advancing PostgreSQL in the cloud.

Take the next step on your Azure Database for PostgreSQL journey 

Whether you’re a seasoned developer or just starting out, PostgreSQL and Azure Database for PostgreSQL is a dream team for building modern, scalable, and AI-powered apps. By offering robust migration tools and seamless integration with AI and machine learning services, Azure Database for PostgreSQL helps users efficiently migrate to the cloud and build sophisticated AI applications.  

Get started today with our pair of learning paths and their respective Cloud Skills Challenges to be entered into a drawing for cool Azure prizes, then check out the POSETTE 2024 livestreams to learn more about everything you can do with the world’s most advanced open-source database.  
The post Raise the bar on AI-powered app development with Azure Database for PostgreSQL appeared first on Azure Blog.
Quelle: Azure

Announcing Advanced Container Networking Services for your Azure Kubernetes Service clusters

Following the successful open sourcing of Retina: A Cloud-Native Container Networking Observability Platform, Microsoft’s Azure Container Networking team is excited to announce a new offering called Advanced Container Networking Services. It’s a suite of services built on top of existing networking solutions for Azure Kubernetes Services (AKS) to address complex challenges around observability, security, and compliance. The first feature in this suite, Advanced Network Observability, is now available in Public Preview.

What is Advanced Container Networking Services?

Advanced Container Networking Services is a suite of services built to significantly enhance the operational capabilities of your Azure Kubernetes Service (AKS) clusters. The suite is comprehensive and is designed to address the multifaceted and intricate needs of modern containerized applications. With capabilities specifically tailored for observability, security, and compliance, customers can unlock a new approach to managing container networking.

Advanced Container Networking Services focuses on delivering a seamless and integrated experience that empowers you to maintain robust security postures, ensure comprehensive compliance and gain deep insights into your network traffic and application performance. This ensures that your containerized applications are not only secure and compliant but also meet or exceed your performance and reliability goals, allowing you to confidently manage and scale your infrastructure.

What is Advanced Network Observability?

Advanced Network Observability is the inaugural feature of the Advanced Container Networking Services suite bringing the power of Hubble’s control plane to both Cilium and Non-Cilium Linux data planes. It unlocks Hubble metrics, Hubble’s command line interface (CLI) and the Hubble user interface (UI) on your AKS clusters providing deep insights into your containerized workloads. Advanced Network Observability empowers customers to precisely detect and root-cause network related issues in a Kubernetes cluster.

This capability provides network flow information in the form of metrics or flow logs at pod-level granularity by collecting data in real time from Linux Kernel leveraging extended Berkeley Packet Filter (eBPF) technology. Along with network traffic flows, volumetric data and dropped packets, it now brings domain name service (DNS) metrics and flow information with deep request and response insights.

eBPF based observability powered by either Cilium or Retina.

Container Network Interface (CNI) agnostic experience.

Monitor network traffic in real time to identify bottlenecks and performance issues with Hubble metrics.

Trace packet flows across your cluster to understand and debug complex networking behaviors with on-demand Hubble command line interface (CLI) network flows.

Visualize network dependencies and interactions between services to ensure optimal configuration and performance with an unmanaged Hubble UI.

Generate detailed metrics and logs to meet compliance requirements and enhance security postures.

Architecture diagram of Hubble interfacing with Cilium/Retina.

Container Network Interface (CNI) agnostic Hubble

Advanced Network Observability extends the Hubble control plane beyond Cilium. In Cilium based clusters, Cilium provides the eBPF events to Hubble. In non-Cilium based clusters, Microsoft Retina serves as the dataplane surfacing deep insights to Hubble, providing a seamless interactive experience for customers.

Visualizing Hubble metrics with Grafana

Advanced Network Observability supports two integration modes for visualization:

Azure Managed Prometheus and Grafana.

Bring your own (BYO) Prometheus and Grafana for advanced users comfortable with increased management overhead.

With the Azure-managed Prometheus and Grafana approach, Azure offers integrated services that simplify the setup and management of monitoring and visualization. Azure Monitor provides a managed instance of Prometheus, which collects and stores metrics from various sources including Hubble.

Querying network flows with Hubble CLI

With Advanced Network Observability, customers can use the Hubble command line interface (CLI) to query for all or filtered network flows across all nodes.

Customers will be able to identify dropped or forwarded flows from all nodes via a single pane of glass.

Service dependency graph with Hubble UI

Customers can deploy Hubble UI on to clusters with Advanced Network Observability enabled to visualize service dependencies. Hubble UI provides on-demand view of flows across the whole cluster and allows customers to select a given namespace and view network flows between different pods within the cluster surfacing in-depth information about each flow.

Benefits

Advanced network visibility

Advanced Network Observability offers unparalleled network visibility by providing granular insights into network traffic at the pod level. This detailed visibility enables administrators to monitor traffic flows, detect anomalies, and gain a comprehensive understanding of network behavior within their Azure Kubernetes Service (AKS) clusters. By leveraging eBPF-based data collection from the Linux Kernel, Advanced Network Observability provides real-time metrics and logs that surface traffic volume, packet drops, and DNS metrics. This enhanced visibility ensures that network administrators can swiftly identify and address potential issues, thereby maintaining optimal network performance and security.

Cross node network flow tracking

With Advanced Network Observability, customers can track network flows across multiple nodes within their Kubernetes clusters. This allows precise tracing of packet flows, making it possible to understand complex networking behaviors and interactions between different nodes. Hubble CLI can query network flows enabling users to filter and analyze specific traffic patterns. This cross-node tracking capability is invaluable for debugging network issues, as it surfaces the entire network flow within a single pane of glass identifying both dropped and forwarded packets across all nodes.

Real-time performance monitoring

Advanced Network Observability provides customers real-time performance monitoring capabilities. By integrating Hubble metrics powered by either Cilium or Retina, users can monitor network traffic in real time, identifying bottlenecks and performance issues as they occur. This immediate feedback loop is critical for maintaining high-performance and ensuring that any degradation in network performance is promptly surfaced and addressed. The managed Hubble metrics and flow logs offer continuous, detailed insights into network operations, allowing for proactive management and rapid troubleshooting.

Multi-Cluster historical analysis

Advanced Network Observability coupled with Azure Managed Prometheus and Grafana extends its benefits to multi-cluster environments, providing historical analysis capabilities that are essential for long-term network management and optimization. By storing and analyzing historical data across multiple clusters, administrators can identify trends, patterns, and recurring issues that may impact network performance and reliability going forward. This historical perspective is crucial for capacity planning, performance benchmarking, and compliance reporting. The ability to review and analyze past network data helps in understanding the evolution of network performance over time and informs future network design and configuration decisions.

Learn More about Advanced Container Networking Services in Azure

Read more in the Advanced Container Networking Services documentation and try it out on your clusters today.

Learn more about Azure Kubernetes Service.

Explore Microsoft Retina to gain more insight.

Discover more about Azure Monitor.

We would love to hear from you.  Please take a minute and give us some feedback.

Azure Kubernetes Service (AKS)

Deploy and scale containers on managed Kubernetes

Try for free

The post Announcing Advanced Container Networking Services for your Azure Kubernetes Service clusters appeared first on Azure Blog.
Quelle: Azure

Azure Maps: Reimagining location services with cloud and AI innovation

I’m thrilled to share news about the evolution of our enterprise mapping software development kit (SDK) and API offerings. Today, we’re announcing the unification of our enterprise maps offerings under Microsoft Azure Maps.1 We are combining the technologies and data in Bing Maps for Enterprise with Azure Maps and retiring Bing Maps for Enterprise. This marks a significant milestone as we transform our enterprise maps offerings to focus on creating a platform that empowers our customers to innovate and bring their ideas to life using cutting-edge innovations in AI and cloud—all on a trusted platform.  

This unification enables our customers to accelerate innovation by leveraging other Microsoft Azure cloud services while retaining many familiar features from Bing Maps for Enterprise within Azure Maps. Moreover, Azure Maps introduces several features not found in Bing Maps for Enterprise, including advanced service authentication methods, data residency compliance, geolocation, weather information, and custom indoor maps. Unifying our enterprise maps offerings under Azure Maps will greatly simplify our enterprise mapping offerings making it easier for customers to buy and use our services in a way that is consistent with other services at Microsoft.

Microsoft Azure Maps

Transform your business operations with location data

Explore more

The concept of “where” has become integral to the decision-making and how organizations navigate through the complex business environment. While the demand for geospatial information continues to grow, the context of “when” and “how” we utilize this information is evolving. At Microsoft, we have observed how technologies in cloud and AI are reshaping how our customers engage with the services we offer. 

Making the transition to Azure Maps

Bing Maps for Enterprise customers will have ample time under their existing and renewed contracts to transition to Azure Maps. Bing Maps for Enterprise will be retired on June 30, 2028, existing Bing Maps for Enterprise customers can continue to license Bing Maps for Enterprise APIs and services until then. We will however no longer be accepting new Bing Maps for Enterprise customers from June 30, 2024. Customers with an enterprise license have until June 30, 2028, to transition to Azure Maps, while customers on the free and basic license for Bing Maps for Enterprise have until June 30, 2025. We are committed to facilitating this transition and look forward to having you join us on this journey.  

Simplifying and enhancing through Microsoft products

Currently, millions of developers and businesses have access to geospatial information through a range of Microsoft products—Excel, Power BI in Microsoft Fabric, and Dynamics 365. Customers use our geospatial services to manage wildfires, respond to public emergencies, deliver products to consumers doorsteps, and so much more.  

IT managers and developers are increasingly seeking tools that help them adapt to the growing demands for more data and greater agility in the business-critical solutions they build. These demands extend beyond the solutions themselves; they also apply to the building blocks used to construct those solutions. As application complexity rises, our customers are increasingly requesting ways to simplify development, enhance collaboration, and create robust solutions with fewer integration points. By unifying our offerings, we can prioritize investments in supporting the ways that developers are building enterprise applications today.  

Integrating with Azure solutions

Our customers seek to deliver business impact by leveraging AI services and tools within their solutions. The inclusion of Azure Maps in the Azure platform brings geospatial data closer to other Azure services, facilitating seamless integration. Azure Maps customers already benefit from AI deeply embedded in industry-leading functions like geocoding. Being part of the Azure ecosystem also streamlines access to complementary services from the Azure Marketplace. At Microsoft Build 2024, we announced the availability of NVIDIA’s cuOpt service within the Azure Marketplace. NVIDIA cuOpt, a world-record holding graphics processing unit (GPU)-accelerated optimization AI microservice, assists teams in solving complex routing problems with multiple constraints. By combining cuOpt with Azure Maps, customers can achieve dynamic routing and real-time re-optimization at scale, driving efficiencies and revenue growth. Azure Maps, alongside the Azure Marketplace, opens up new avenues for innovation, enabling businesses to tackle diverse challenges.    

The aggregation of geospatial information with additional data unlocks new insights for businesses, enabling them to reduce operating costs, generate new revenue, and deliver better services to their customers. Azure Maps provides a holistic view, allowing you to understand not only what happened but also where it happened. By leveraging Azure Maps, you can make intelligent decisions and pivot faster by correlating the location of your business elements with your overall business goals. This orientation empowers your team to comprehend the situation and chart a path forward.

Start enhancing your location data today

Understanding “where” translates to knowing how to improve and move ahead. With Azure Maps, your organization gains easier access to, utilization of, and insights from location data. By combining our industry-leading cloud infrastructure with accurate, reliable location data and Microsoft applications like Power BI, you can generate predictive insights for your business. 

Explore our new Azure Maps solutions. 

1 Azure Maps is the next generation geospatial cloud service offered by Microsoft Maps. It has been created specifically, with Microsoft Developers and Enterprise Customers in mind, to provide scalable location capabilities and relevant geographical data. 

Microsoft (Nasdaq “MSFT” @microsoft) enables digital transformation for the era of an intelligent cloud and an intelligent edge. Its mission is to empower every person and every organization on the planet to achieve more.
The post Azure Maps: Reimagining location services with cloud and AI innovation appeared first on Azure Blog.
Quelle: Azure

IT trends show customers need computing power to take advantage of AI 

This is part of a larger series on the new infrastructure of the era of AI, highlighting emerging technology and trends in large-scale compute. This month, we’re sharing the Tech Pulse: March 2024 to help businesses harness the power of AI now. 

IT professionals today are focused on generative AI and its potential benefits, but one of the biggest challenges they face is securing the necessary computing power. In a recent study, Microsoft surveyed more than 2,000 IT professionals across 10 countries on their tech readiness for and adoption of AI. The report highlights their concerns and challenges along the way and is now available to help inform your business’s AI strategy. Stay ahead of the game with the latest insights from Microsoft.

Read the Tech Pulse: March 2024 study

Learn what the rise of AI means for IT professionals

Download now

Leading the charge in AI adoption and use  

IT professionals are at the forefront of AI adoption and use, with 79% of professionals using it multiple times a week. They are positive AI will have a positive impact on their company and role, make their jobs easier, and enable them to be strategic. The positivity crosses both the professional and personal spheres.  

IT professionals’ expectations of AI

AI in the professional sphere  

68% of IT professionals surveyed have already implemented AI in their work, a testament to the field’s eagerness to embrace this cutting-edge technology. This widespread adoption is not just a trend; it reflects the strategic importance placed on AI as a tool for enhancing productivity, driving innovation, and maintaining a competitive edge in a rapidly changing digital economy. 

AI in personal lives  

The influence of AI extends beyond the confines of the workplace, with 66% of those surveyed incorporating AI into their daily routines. This integration of AI into personal life underscores the technology’s versatility and its capacity to improve efficiency and decision-making in various aspects of life. It shows that people are not being pulled along in the newest IT trend—they are ready and excited to find additional uses outside of work to enhance their own productivity. 

Rethinking their tech stack requirements  

The emergence of AI is already changing many aspects of cloud computing. Even companies that started in the cloud have questions to consider when it comes to their future cloud strategy. While IT professionals remain positive about the cloud amid the changing landscape of AI, they increasingly cite the need for changes to the infrastructure they support. 72% of respondents agreed that AI will fundamentally change their tech stacks.

Embrace the future of ai

Learn about Azure innovation insights

With AI integration and a changing tech stack, proactively addressing uncertainties and changes that impact cloud computing is crucial. This includes adapting to new security paradigms, managing evolving skill requirements for AI-driven tasks, and optimizing cloud resources amid dynamic shifts in workload patterns. Additionally, IT professionals will need to ensure compliance with emerging regulations to effectively navigate the evolving landscape of AI-optimized cloud services.

Building AI confidence

As the IT landscape rapidly evolves with the integration of AI, professionals in the field are facing a mix of emotions and questions about the future. There’s a general optimism about the potential of AI to enhance productivity and innovation, yet IT experts are also mindful of skilling requirements and the pace of technological advancement. 

IT professionals are confident in their skills and the positive changes AI can bring to their roles and organizations. A significant 78% of IT pros surveyed have AI deployed or in pilot stages within their companies, indicating a strong move towards embracing AI technologies. However, this rapid adoption comes with its own set of challenges and questions around the pace of AI evolution, costs, and governance.

As organizations continue to invest in AI, it’s now more important than ever to focus on these concerns and drive the appropriate opportunities and messaging internally. Skilling will continue to be front of mind as IT Pros grapple with the rapid changes and investments that are happening in the space. The good news is there are a lot of tools and resources to help as people start to ramp up their AI skilling.  Great places to find training are online through providers and partners, hands-on labs, and events like Microsoft Build (developers), Microsoft Ignite (all audiences), and NVIDIA GTC (developers).

Looking for AI model accuracy, increased privacy and security

IT professionals are prioritizing AI model accuracy, privacy, and security when selecting a technology partner. AI model accuracy ranks highest, followed by a commitment to privacy and security, and a strong reputation as a technology leader. 

5 most important AI provider characteristics 

Accuracy remains the key to success

The march towards AI integration in IT is marked by a meticulous search for precision and a shield of privacy and security. Alliances with the right partners can guarantee the highest degree of AI model accuracy while accelerating usage of AI. Precision is paramount, as it forms the foundation upon which reliable and effective AI solutions are built. As IT professionals continue to grapple with skilling up in an accelerated timeframe, utilizing partners help relieve some of the gaps and accelerate the timeline for organizations wanting to integrate AI as quickly as possible without sacrificing quality. 

A commitment to privacy and security  

Beyond accuracy, IT professionals demand a steadfast commitment to privacy and security. In an era where data breaches are all too common, a technology partner’s ability to protect sensitive information is a critical criterion. IT professionals understand that robust security measures are not just a feature but a necessity, ensuring that AI solutions enhance rather than endanger their operations. 

Learn more about the future of AI

Many organizations will need to make significant tech and infrastructure changes to fully leverage AI’s benefits. You can read more about our findings in the Tech Pulse: March 2024 report. To better help your teams be AI-ready, consider attending one of our events this year or reading our State of AI Infrastructure report. Whether you’re a CEO, working hand-in-hand with the infrastructure, a developer, or end user, you can see firsthand how to design infrastructure that has the necessary computing power to support all your workloads and AI solutions.

More from the series:

Tech Pulse: What the rise of AI means for IT Professionals

New infrastructure for the era of AI: Emerging technology and trends in 2024

A year in review for AI Infrastructure

The post IT trends show customers need computing power to take advantage of AI  appeared first on Azure Blog.
Quelle: Azure

Celebrating customers’ journeys to AI innovation at Microsoft Build 2024

Ever since I started at Microsoft in August 2023, I was more than excited for Microsoft Build 2024, which wrapped up last week. Why? The achievements of our customers leveraging Azure AI to drive innovation across all industries are astounding, and I’ll happily take every opportunity to showcase and celebrate them. From enhancing productivity and creativity to revolutionizing customer interactions with custom copilots, our customers demonstrate the transformative power of generative AI and truly, brought Build 2024 to life. So, how’d they do it?  

Fostering creativity and innovation 

By creating Muse Chat, a copilot for coding and documentation, Unity Technologies showcases how AI can foster creativity in the gaming industry. This tool, developed with Azure OpenAI Service and Azure AI Content Safety, allows developers to create and perfect their own video games. At Build, Unity gave a live demo on Muse Chat during the Multimodel, Multimodal and Multiagent Innovation with Azure AI, which resulted in a video game prototype based on prompts, without typing a single line of code. Unity also provided a gaming station at our event celebration and gave out Unity trial subscriptions to developers.  

Figure 1: Live demo of Unity Muse Chat, powered on Azure OpenAI Service, to create a space-inspired video game.

Pushing the boundaries of creative expression and efficiency, WPP explores the use of video, images, and speech to accelerate content creation with Azure OpenAI Service (GPT-4 with Vision). Generative AI and multimodality are central to their innovation. Additionally, WPP is a leader when it comes to accessibility. Featured in the Day 1 keynote and the Accessibility in the era of generative AI, WPP showcased their solutions with Seeing AI. 

Azure AI

Where innovators are creating the future

Learn more

The New York City Department of Education is leveraging generative AI through Azure OpenAI Service to create a specialized teaching assistant capable of providing instant feedback and addressing student inquiries. This solution is developed using the Azure platform, ensuring seamless integration and enhanced educational support. During the Build session, safeguard your copilot with Azure AI, event-goers learned how NYC built this solution and how they are keeping the experience safe for teachers and students alike. 

Figure 2: New York City Department of Education uses Azure OpenAI Service and Azure AI Content Safety to safeguard their educational generative AI tools.

Enhancing productivity and efficiency 

Utilizing AI Tax Assist with Azure OpenAI Service and Azure AI Studio, H&R Block is helping tax professionals and filers reduce the time and effort needed to file taxes. The integration of Azure AI Search for RAG exemplifies how AI can streamline complex processes to enhance productivity. 

Featured in the Azure AI Studio—creating and scaling your custom copilots session, Sweco focused on freeing up time for more creative solutions in client projects by developing SwecoGPT using Azure AI Studio. This digital assistant automates document creation and analysis, allowing consultants to quickly find critical project information and deliver more personalized services. As a result, consultants report increased productivity and added client value.  

By leveraging the full Microsoft Azure stack, Sapiens International enhances developer creativity and streamlines the development of automated customer solutions. The use of Azure Kubernetes Service and Azure-managed databases significantly improves tasks like underwriting, claims processing, and fraud detection. 

For PwC, the creation of ChatPwC, a scalable and secure GenAI solution using Azure OpenAI Service, Azure AI Search, and Azure AI Document Intelligence, has been a game-changer. This tool helps employees summarize, assess, and identify themes in client data, benefiting hundreds of thousands of employees daily. 

Transforming customer interactions 

Revolutionizing customer interactions, Vodafone introduced TOBi, a public-facing virtual assistant which efficiently handles calls and empowers agents through context-aware conversations and call transcriptions. SuperAgent, a company-wide internal conversational AI search copilot, enhances agent efficiency, helping customer journeys. 

To transform their MBUX Voice Assistant and dashcams, Mercedes-Benz integrates GPT-4 Turbo with Vision via Azure OpenAI Service. This technology enables the car to understand its surroundings and provide context for speech assistance with the ‘Hey Mercedes’ cue. In the new multimodal vision AI models and their practical applications breakout session, Mercedes showcased the power of multimodality by demonstrating the capabilities of ‘Hey Mercedes’ including the ability to add DALL-E generated images to vehicles’ dashboards by a simple voice command. 

Figure 3: Mercedes-Benz utilizes Dall-E to display an image of Olympic National park on the car’s dashboard using a simple voice command.

Offering an immersive in-car infotainment system, TomTom utilizes Azure OpenAI Service, Azure Kubernetes Service, and CosmosDB. Their Digital Cockpit demonstrates the impact of advanced technologies in creating a seamless and enriched driving environment. In the TomTom brings AI-powered, talking cars to life with Azure session at Build, staff data scientist Massimiliano Ungheretti showcased how he led the development of the Digital Cockpit.  

Detecting suspicious activity by analyzing millions of daily transactions, Kinectify uses its AI-powered anti-money laundering (AML) risk management platform. Built with Azure Cosmos DB, Azure AI Services, and Azure Kubernetes Service, the platform is scalable and robust. 

Elevating business operations 

Featured in the Build sophisticated custom copilots with Azure OpenAI Assistants API breakout session, Coca-Cola aims to improve the productivity of its 30,000 associates by leveraging the Assistants API with GPT-4 Turbo for KO Assist, their genAI copilot. These standardized assistants help with enhanced business intelligence, data synthesis, strategic planning, and risk management across all departments. 

Figure 4: Cola-Cola demonstrates the power of their KO Assist chatbot, powered by Azure OpenAI Assistants API.

Increasing productivity and enabling more intuitive app development, Freshworks uses Freddy Copilot to provide conversational assistance and informed insights to employees and customers. This tool is supported by the Assistants API. Freshworks showcased Freddy Copilot in Build sophisticated custom copilots with Azure OpenAI Assistants API. 

Featured in the Day 2 keynote, the fastest growing consumer experience in history, OpenAI, now offers enhanced features like GPTs for customizing ChatGPT with external knowledge and Assistants API for developers to create AI-powered assistants with scaled-up external knowledge integration using retrieval augmented generation (RAG). Leveraging Azure AI Search, OpenAI API users can now upload 500 times more files, transforming it into a powerful retrieval system, solving significant present and future challenges. 

Thank you 

We are proud to celebrate the remarkable achievements of our customers at Microsoft Build 2024 and beyond. These examples underscore the transformative impact of Azure AI across various industries, showcasing how innovative solutions can revolutionize the way we act, think, and work.

As we continue to develop and support cutting-edge AI technologies, we are excited for even more groundbreaking advancements to come from our customers, driving the future of digital transformation and setting new standards for excellence. Together, we are shaping a smarter, more efficient, and more connected world.

Curious how your enterprise can unleash their potential with generative AI? With Azure, the opportunities are endless.
The post Celebrating customers’ journeys to AI innovation at Microsoft Build 2024 appeared first on Azure Blog.
Quelle: Azure

Microsoft and Broadcom to support license portability for VMware Cloud Foundation on Azure VMware Solution

Co-authored by Ahmar Mohammad, Vice President, Partners, Managed Services, and Solutions GTM, VCF Division, Broadcom

Microsoft and Broadcom have partnered closely for many years to support our mutual customers, and we continue to build and innovate together as customer needs change. Today, we are pleased to share that Microsoft and Broadcom are expanding our partnership with plans to support VMware Cloud Foundation subscriptions on Azure VMware Solution. Customers that own or purchase licenses for VMware Cloud Foundation will be able to use those licenses on Azure VMware Solution, as well as their own datacenters, giving them flexibility to meet changing business needs. 

This provides an additional purchase option for Azure VMware Solution, which has been sold and operated by Microsoft since 2019. Customers can currently purchase the solution with VMware licenses included, and this option will continue to be available for customers that prefer to purchase their VMware licenses as part of their solution from Microsoft.   

Azure VMware Solution delivers a fully managed VMware environment that is operated and supported by Microsoft. Customers can move VMware workloads to Azure “as is” with minimal to no refactoring. This streamlines migration and allows customers to continue using familiar skills while learning new Azure skills.  

By migrating to Azure VMware Solution, now available in 33 regions around the world, organizations can take advantage of Azure’s scalable and high-performance cloud infrastructure. Customers can deploy with business-critical capabilities such as backup, high availability, threat protection, and performance monitoring. Moreover, workloads running on Azure VMware Solution can be integrated with Azure’s portfolio of more than 200 cloud services to accelerate innovation, gain deeper insights from data with advanced AI services, and to modernize business applications. 

VMware Cloud Foundation delivers a private cloud platform that is ubiquitous, flexible, and integrated across cloud endpoints. By deploying VMware Cloud Foundation ​on​​ ​Azure, customers benefit from a highly optimized cloud operating model that provides the scale and agility of public cloud with the security and performance of private cloud. Running VMware Cloud Foundation ​on ​Azure enables organizations to modernize IT infrastructure with demonstrable TCO, provide developers a self-service private cloud experience resulting in greater productivity, and achieve better cyber resiliency and security.

With​​​​​​ improved license portability for customers with eligible VMware Cloud Foundation entitlements, customers will be able to purchase subscriptions of the new VMware Cloud Foundation software and have complete mobility to and from their on-premises environment to Azure VMware Solution. VMware customers that have already purchased and begun deploying the new VMware Cloud Foundation will be able to transfer the remaining value of an existing subscription to Azure VMware Solution. Additionally, customers will be able to move their VMware Cloud Foundation subscription between on-premises and Azure VMware Solution as their needs and requirements evolve over time. ​​​​​​Customers will retain the rights to their software subscription when moving their VMware Cloud Foundation subscription to Azure VMware Solution. 

VMware Rapid Migration Plan: Reduce your migration time and cost

In addition to the new VMware license portability benefit, the VMware Rapid Migration Plan provides an additional and comprehensive set of licensing benefits and programs to reduce the cost and time it takes for organizations to migrate to Azure VMware Solution.  

The plan includes: 

Price protection: With reserved instances customers can lock in pricing for one, three, or five years.  

Savings for Windows Server and SQL Server: Windows Server and SQL Server are common workloads on VMware environments. With Software Assurance for on-premises Windows Server and SQL Server licenses, organizations can qualify for the Azure Hybrid Benefit discount to use existing Windows Server and SQL Server licenses in Azure VMware Solution. Free Extended Security Updates are available for older versions that face end of support.

Migration support: Use Azure Migrate and Modernize to get resources, expert help, and funding from Microsoft and its partner ecosystem. 

Azure credits: Customers that purchase a new reserved instance for Azure VMware Solution can get additional Azure credits valid for Azure VMware Solution or other Azure services. 

Supporting your cloud journey with Microsoft and Broadcom

We are committed to continued partnership and innovation to support our mutual customers as they adapt to changing business needs. VMware Cloud Foundation license portability to Azure VMware Solution will be available later this year, so now is a great time to contact your account team or Microsoft partner to start planning your move.

Here are a few resources to help you get started:

Learn more about Azure VMware Solution.

Take advantage of the VMware Rapid Migration Plan from Microsoft. 

Azure VMware Solution

Move or extend on-premises VMware environments to Azure without refactoring

Learn more

The post Microsoft and Broadcom to support license portability for VMware Cloud Foundation on Azure VMware Solution appeared first on Azure Blog.
Quelle: Azure

New models added to the Phi-3 family, available on Microsoft Azure

Read more announcements from Azure at Microsoft Build 2024: New ways Azure helps you build transformational AI experiences and The new era of compute powering Azure AI solutions.

At Microsoft Build 2024, we are excited to add new models to the Phi-3 family of small, open models developed by Microsoft. We are introducing Phi-3-vision, a multimodal model that brings together language and vision capabilities. You can try Phi-3-vision today.

Phi-3-small and Phi-3-medium, announced earlier, are now available on Microsoft Azure, empowering developers with models for generative AI applications that require strong reasoning, limited compute, and latency bound scenarios. Lastly, previously available Phi-3-mini, as well as Phi-3-medium, are now also available through Azure AI’s models as a service offering, allowing users to get started quickly and easily.

The Phi-3 family

Phi-3 models are the most capable and cost-effective small language models (SLMs) available, outperforming models of the same size and next size up across a variety of language, reasoning, coding, and math benchmarks. They are trained using high quality training data, as explained in Tiny but mighty: The Phi-3 small language models with big potential. The availability of Phi-3 models expands the selection of high-quality models for Azure customers, offering more practical choices as they compose and build generative AI applications.

Phi-3-vision

Bringing together language and vision capabilities

Try it today

There are four models in the Phi-3 model family; each model is instruction-tuned and developed in accordance with Microsoft’s responsible AI, safety, and security standards to ensure it’s ready to use off-the-shelf.

Phi-3-vision is a 4.2B parameter multimodal model with language and vision capabilities.

Phi-3-mini is a 3.8B parameter language model, available in two context lengths (128K and 4K).

Phi-3-small is a 7B parameter language model, available in two context lengths (128K and 8K).

Phi-3-medium is a 14B parameter language model, available in two context lengths (128K and 4K).

Find all Phi-3 models on Azure AI and Hugging Face.

Phi-3 models have been optimized to run across a variety of hardware. Optimized variants are available with ONNX Runtime and DirectML providing developers with support across a wide range of devices and platforms including mobile and web deployments. Phi-3 models are also available as NVIDIA NIM inference microservices with a standard API interface that can be deployed anywhere and have been optimized for inference on NVIDIA GPUs and Intel accelerators.

It’s inspiring to see how developers are using Phi-3 to do incredible things—from ITC, an Indian conglomerate, which has built a copilot for Indian farmers to ask questions about their crops in their own vernacular, to the Khan Academy, who is currently leveraging Azure OpenAI Service to power their Khanmigo for teachers pilot and experimenting with Phi-3 to improve math tutoring in an affordable, scalable, and adaptable manner. Healthcare software company Epic is looking to also use Phi-3 to summarize complex patient histories more efficiently. Seth Hain, senior vice president of R&D at Epic explains, “AI is embedded directly into Epic workflows to help solve important issues like clinician burnout, staffing shortages, and organizational financial challenges. Small language models, like Phi-3, have robust yet efficient reasoning capabilities that enable us to offer high-quality generative AI at a lower cost across our applications that help with challenges like summarizing complex patient histories and responding faster to patients.”

Digital Green, used by more than 6 million farmers, is introducing video to their AI assistant, Farmer.Chat, adding to their multimodal conversational interface. “We’re excited about leveraging Phi-3 to increase the efficiency of Farmer.Chat and to enable rural communities to leverage the power of AI to uplift themselves,” said Rikin Gandhi, CEO, Digital Green.

Bringing multimodality to Phi-3

Phi-3-vision is the first multimodal model in the Phi-3 family, bringing together text and images, and the ability to reason over real-world images and extract and reason over text from images. It has also been optimized for chart and diagram understanding and can be used to generate insights and answer questions. Phi-3-vision builds on the language capabilities of the Phi-3-mini, continuing to pack strong language and image reasoning quality in a small model.

Phi-3-vision can generate insights from charts and diagrams:

Groundbreaking performance at a small size

As previously shared, Phi-3-small and Phi-3-medium outperform language models of the same size as well as those that are much larger.

Phi-3-small with only 7B parameters beats GPT-3.5T across a variety of language, reasoning, coding, and math benchmarks.1

The Phi-3-medium with 14B parameters continues the trend and outperforms Gemini 1.0 Pro.2

Phi-3-vision with just 4.2B parameters continues that trend and outperforms larger models such as Claude-3 Haiku and Gemini 1.0 Pro V across general visual reasoning tasks, OCR, table, and chart understanding tasks.3

All reported numbers are produced with the same pipeline to ensure that the numbers are comparable. As a result, these numbers may differ from other published numbers due to slight differences in the evaluation methodology. More details on benchmarks are provided in our technical paper.

See detailed benchmarks in the footnotes of this post.

Prioritizing safety

Phi-3 models were developed in accordance with the Microsoft Responsible AI Standard and underwent rigorous safety measurement and evaluation, red-teaming, sensitive use review, and adherence to security guidance to help ensure that these models are responsibly developed, tested, and deployed in alignment with Microsoft’s standards and best practices.

Phi-3 models are also trained using high-quality data and were further improved with safety post-training, including reinforcement learning from human feedback (RLHF), automated testing and evaluations across dozens of harm categories, and manual red-teaming. Our approach to safety training and evaluations are detailed in our technical paper, and we outline recommended uses and limitations in the model cards.

Finally, developers using the Phi-3 model family can also take advantage of a suite of tools available in Azure AI to help them build safer and more trustworthy applications.

Choosing the right model

With the evolving landscape of available models, customers are increasingly looking to leverage multiple models in their applications depending on use case and business needs. Choosing the right model depends on the needs of a specific use case.

Small language models are designed to perform well for simpler tasks, are more accessible and easier to use for organizations with limited resources, and they can be more easily fine-tuned to meet specific needs. They are well suited for applications that need to run locally on a device, where a task doesn’t require extensive reasoning and a quick response is needed.

The choice between using Phi-3-mini, Phi-3-small, and Phi-3-medium depends on the complexity of the task and available computational resources. They can be employed across a variety of language understanding and generation tasks such as content authoring, summarization, question-answering, and sentiment analysis. Beyond traditional language tasks these models have strong reasoning and logic capabilities, making them good candidates for analytical tasks. The longer context window available across all models enables taking in and reasoning over large text content—documents, web pages, code, and more.

Phi-3-vision is great for tasks that require reasoning over image and text together. It is especially good for OCR tasks including reasoning and Q&A over extracted text, as well as chart, diagram, and table understanding tasks.

Get started today

To experience Phi-3 for yourself, start with playing with the model on Azure AI Playground. Learn more about building with and customizing Phi-3 for your scenarios using the Azure AI Studio.

Footnotes

1Table 1: Phi-3-small with only 7B parameters

2Table 2: Phi-3-medium with 14B parameters

3Table 3: Phi-3-vision with 4.2B parameters

The post New models added to the Phi-3 family, available on Microsoft Azure appeared first on Azure Blog.
Quelle: Azure

From code to production: New ways Azure helps you build transformational AI experiences

We’re witnessing a critical turning point in the market as AI moves from the drawing boards of innovation into the concrete realities of everyday life. The leap from potential to practical application marks a pivotal chapter, and you, as developers, are key to bringing it to bear.

The news at Build is focused on the top demands we’ve heard from all of you as we’ve worked together to turn this promise of AI into reality:

Empowering every developer to move with greater speed and efficiency, using the tools you already know and love.

Expanding and simplifying access to the AI, data—application platform services you need to be successful so you can focus on building transformational AI experiences.

And, helping you focus on what you do best—building incredible applications—with responsibility, safety, security, and reliability features, built right into the platform. 

I’ve been building software products for more than two decades now, and I can honestly say there’s never been a more exciting time to be a developer. What was once a distant promise is now manifesting—and not only through the type of apps that are possible, but how you can build them.

With Microsoft Azure, we’re meeting you where you are today—and paving the way to where you’re going. So let’s jump right into some of what you’ll learn over the next few days. Welcome to Microsoft Build 2024!

Create the future with Azure AI: offering you tools, model choice, and flexibility  

The number of companies turning to Azure AI continues to grow as the list of what’s possible expands. We’re helping more than 50,000 companies around the globe achieve real business impact using it—organizations like Mercedes-Benz, Unity, Vodafone, H&R Block, PwC, SWECO, and so many others.  

To make it even more valuable, we continue to expand the range of models available to you and simplify the process for you to find the right models for the apps you’re building. You can learn more about all Azure AI updates we’re announcing this week over on the Tech Community blog. 

Azure AI Studio, a key component of the copilot stack, is now generally available. The pro-code platform empowers responsible generative AI development, including the development of your own custom copilot applications. The seamless development approach includes a friendly user interface (UI) and code-first capabilities, including Azure Developer CLI (AZD) and AI Toolkit for VS Code, enabling developers to choose the most accessible workflow for their projects.

Developers can use Azure AI Studio to explore AI tools, orchestrate multiple interoperating APIs and models; ground models using their data using retrieval augmented generation (RAG) techniques; test and evaluate models for performance and safety; and deploy at scale and with continuous monitoring in production.

Empowering you with a broad selection of small and large language models  

Our model catalog is the heart of Azure AI Studio. With more than 1,600 models available, we continue to innovate and partner broadly to bring you the best selection of frontier and open large language models (LLMs) and small language models (SLMs) so you have flexibility to compare benchmarks and select models based on what your business needs. And, we’re making it easier for you to find the best model for your use case by comparing model benchmarks, like accuracy and relevance.

I’m excited to announce OpenAI’s latest flagship model, GPT-4o, is now generally available in Azure OpenAI Service. This groundbreaking multimodal model integrates text, image, and audio processing in a single model and sets a new standard for generative and conversational AI experiences. Pricing for GPT-4o is $5/1M Tokens for input and $15/1M Tokens for output.

Earlier this month, we enabled GPT-4 Turbo with Vision through Azure OpenAI Service. With these new models developers can build apps with inputs and outputs that span across text, images, and more, for a richer user experience. 

We’re announcing new models through Models-as-a-Service (MaaS) in Azure AI Studio leading Arabic language model Core42 JAIS and TimeGen-1 from Nixtla are now available in preview. Models from AI21, Bria AI, Gretel Labs, NTT DATA, Stability AI as well as Cohere Rerank are coming soon.  

Phi-3: Redefining what’s possible with SLMs

At Build we’re announcing Phi-3-small, Phi-3-medium, and Phi-3-vision, a new multimodal model, in the Phi-3 family of AI small language models (SLMs), developed by Microsoft. Phi-3 models are powerful, cost-effective and optimized for resource constrained environments including on-device, edge, offline inference, and latency bound scenarios where fast response times are critical. 

Introducing Phi-3: Groundbreaking performance at a small size

Sized at 4.2 billion parameters, Phi-3-vision supports general visual reasoning tasks and chart/graph/table reasoning. The model offers the ability to input images and text, and to output text responses. For example, users can ask questions about a chart or ask an open-ended question about specific images. Phi-3-mini and Phi-3-medium are also now generally available as part of Azure AI’s MaaS offering.

In addition to new models, we are adding new capabilities across APIs to enable multimodal experiences. Azure AI Speech has several new features in preview including Speech analytics and Video translation to help developers build high-quality, voice-enabled apps. Azure AI Search now has dramatically increased storage capacity and up to 12X increase in vector index size at no additional cost to run RAG workloads at scale.

Azure AI Studio

Get everything you need to develop generative AI applications and custom copilots in one platform

Try now

Bring your intelligent apps and ideas to life with Visual Studio, GitHub, and the Azure platform

The tools you choose to build with should make it easy to go from idea to code to production. They should adapt to where and how you work, not the other way around. We’re sharing several updates to our developer and app platforms that do just that, making it easier for all developers to build on Azure. 

Access Azure services within your favorite tools for faster app development

By extending Azure services natively into the tools and environments you’re already familiar with, you can more easily build and be confident in the performance, scale, and security of your apps.  

How to choose the right approach for your AI transformation

Learn more

We’re also making it incredibly easy for you to interact with Azure services from where you’re most comfortable: a favorite dev tool like VS Code, or even directly on GitHub, regardless of previous Azure experience or knowledge. Today, we’re announcing the preview of GitHub Copilot for Azure, extending GitHub Copilot to increase its usefulness for all developers. You’ll see other examples of this from Microsoft and some of the most innovative ISVs at Build, so be sure to explore our sessions.  

Also in preview today is the AI Toolkit for Visual Studio Code, an extension that provides development tools and models to help developers acquire and run models, fine-tune them locally, and deploy to Azure AI Studio, all from VS Code.  

Updates that make cloud native development faster and easier

.NET Aspire has arrived! This new cloud-native stack simplifies development by automating configurations and integrating resilient patterns. With .NET Aspire, you can focus more on coding and less on setup while still using your preferred tools. This stack includes a developer dashboard for enhanced observability and diagnostics right from the start for faster and more reliable app development. Explore more about the general availability of .NET Aspire on the DevBlogs post.   

We’re also raising the bar on ease of use in our application platform services, introducing Azure Kubernetes Services (AKS) Automatic, the easiest managed Kubernetes experience to take AI apps to production. In preview now, AKS Automatic builds on our expertise running some of the largest and most advanced Kubernetes applications in the world, from Microsoft Teams to Bing, XBox online services, Microsoft 365 and GitHub Copilot to create best practices that automate everything from cluster set up and management to performance and security safeguards and policies.

As a developer you now have access to a self-service app platform that can move from container image to deployed app in minutes while still giving you the power of accessing the Kubernetes API. With AKS Automatic you can focus on building great code, knowing that your app will be running securely with the scale, performance and reliability it needs to support your business.

Data solutions built for the era of AI

Developers are at the forefront of a pivotal shift in application strategy which necessitates optimizations at every tier of an application—including databases—since AI apps require fast and frequent iterations to keep pace with AI model innovation. 

We’re excited to unveil new data and analytics features this week designed to assist you in the critical aspects of crafting intelligent applications and empowering you to create the transformative apps of today and tomorrow.

Enabling developers to build faster with AI built into Azure databases 

Vector search is core to any AI application so we’re adding native capabilities to Azure Cosmos DB with Azure Cosmos DB for NoSQL. Powered by DiskANN, a powerful algorithm library, this makes Azure Cosmos DB the first cloud database to offer lower latency vector search at cloud scale without the need to manage servers. 

Azure Cosmos DB

The database for the era of AI

Learn more

We’re also announcing the availability of Azure Database for PostgreSQL extension for Azure AI to make bringing AI capabilities to data in PostgreSQL data even easier. Now generally available, this enables developers who prefer PostgreSQL to plug data directly into Azure AI for a simplified path to leverage LLMs and build rich PostgreSQL generative AI experiences.   

Embeddings enable AI models to better understand relationships and similarities between data, which is key for intelligent apps. Azure Database for PostgreSQL in-database embedding generation is now in preview so embeddings can be generated right within the database—offering single-digit millisecond latency, predictable costs, and the confidence that data will remain compliant for confidential workloads. 

Making developer life easier through in-database Copilot capabilities

These databases are not only helping you build your own AI experiences. We’re also applying AI directly in the user experience so it’s easier than ever to explore what’s included in a database. Now in preview, Microsoft Copilot capabilities in Azure SQL DB convert queries into SQL language so developers can use natural language to interact with data. And, Copilot capabilities are coming to Azure Database for MySQL to provide summaries of technical documentation in response to user questions—creating an all-around easier and more enjoyable management experience.

Microsoft Copilot capabilities in the database user experience

Microsoft Fabric updates: Build powerful solutions securely and with ease

We have several Fabric updates this week, including the introduction of Real-Time Intelligence. This completely redesigned workload enables you to analyze, explore, and act on your data in real time. Also coming at Build: the Workload Development Kit in preview, making it even easier to design and build apps in Fabric. And our Snowflake partnership expands with support for Iceberg data format and bi-directional read and write between Snowflake and Fabric’s OneLake. Get the details and more in Arun Ulag’s blog: Fuel your business with continuous insights and generative AI. And for an overview of Fabric data security, download the Microsoft Fabric Microsoft Fabric security whitepaper.

Spend a day in the life of a piece of data and learn exactly how it moves from its database home to do more than ever before with the insights of Microsoft Fabric, real-time assistance by Microsoft Copilot, and the innovative power of Azure AI.  

Build on a foundation of safe and responsible AI

What began with our principles and a firm belief that AI must be used responsibly and safely has become an integral part of the tooling, APIs, and software you use to scale AI responsibly. Within Azure AI, we have 20 Responsible AI tools with more than 90 features. And there’s more to come, starting with updates at Build.

New Azure AI Content Safety capabilities

We’re equipping you with advanced guardrails that help protect AI applications and users from harmful content and security risks and this week, we’re announcing new  feature for Azure AI Content Safety. Custom Categories are coming soon so you can create custom filters for specific content filtering needs. This feature also includes a rapid option, enabling you to deploy new custom filters within an hour to protect against emerging threats and incidents.  

Prompt Shields and Groundedness Detection are both available in preview now in Azure OpenAI Service and Azure AI Studio help fortify AI safety. Prompt shields mitigate both indirect and jailbreak prompt injection attacks on LLMs, while Groundedness Detection enables detection of ungrounded materials or hallucinations in generated responses.  

Features to help secure and govern your apps and data

Microsoft Defender for Cloud now extends its cloud-native application protection to AI applications from code to cloud. And, AI security posture management capabilities enable security teams to discover their AI services and tools, identify vulnerabilities, and proactively remediate risks. Threat protection for AI workloads in Defender for Cloud leverages a native integration with Azure AI Content Safety to enable security teams to monitor their Azure OpenAl applications for direct and in-direct prompt injection attacks, sensitive data leaks and other threats so they can quickly investigate and respond.

With easy-to-use APIs, app developers can easily integrate Microsoft Purview into line of business apps to get industry-leading data security and compliance for custom-built AI apps. You can empower your app customers and respective end users to discover data risks in AI interactions, protect sensitive data with encryption, and govern AI activities. These capabilities are available for Copilot Studio in public preview and soon (coming in July) will be available in public preview for Azure AI Studio, and via the Purview SDK, so developers can benefit from the data security and compliance controls for their AI apps built on Azure AI.  Read more here.  

Two final security notes. We’re also announcing a partnership with HiddenLayer to scan open models that we onboard to the catalog, so you can verify that the models are free from malicious code and signs of tampering before you deploy them. We are the first major AI development platform to provide this type of verification to help you feel more confident in your model choice. 

Second, Facial Liveness, a feature of the Azure AI Vision Face API which has been used by Windows Hello for Business for nearly a decade, is now available in preview for browser. Facial Liveness is a key element in multi-factor authentication (MFA) to prevent spoofing attacks, for example, when someone holds a picture up to the camera to thwart facial recognition systems. Developers can now easily add liveness and optional verification to web applications using Face Liveness, with the Azure AI Vision SDK, in preview.

Our belief in the safe and responsible use of AI is unwavering. You can read our recently published Responsible AI Transparency Report for a detailed look at Microsoft’s approach to developing AI responsibly. We’ll continue to deliver more innovation here and our approach will remain firmly rooted in principles and put into action with built-in features.

Move your ideas from a spark to production with Azure

Organizations are rapidly moving beyond AI ideation and into production. We see and hear fresh examples every day of how our customers are unlocking business challenges that have plagued industries for decades, jump-starting the creative process, making it easier to serve their own customers, or even securing a new competitive edge. We’re curating an industry-leading set of developer tools and AI capabilities to help you, as developers, create and deliver the transformational experiences that make this all possible.

Learn more at Microsoft Build 2024

Join us at Microsoft Build 2024 to experience the keynotes and learn more about how AI could shape your future.

Enhance your AI skills.

Discover innovative AI solutions through the Microsoft commercial marketplace.

Read more about Microsoft Fabric updates: Fuel your business with continuous insights and generative AI.

Read more about Azure Infrastructure: Unleashing innovation: How Microsoft Azure powers AI solutions.

Try Microsoft Azure for free

The post From code to production: New ways Azure helps you build transformational AI experiences appeared first on Azure Blog.
Quelle: Azure

Unleashing innovation: The new era of compute powering Azure AI solutions

As AI continues to transform industries, Microsoft is expanding its global cloud infrastructure to meet the needs of developers and customers everywhere. At Microsoft Build 2024, we’re unveiling our latest progress in developing tools and services optimized for powering your AI solutions. Microsoft’s cloud infrastructure is unique in how it provides choice and flexibility in performance and power for customers’ unique AI needs, whether that’s doubling deployment speeds or lowering operating costs.

That’s why we’ve enhanced our adaptive, powerful, and trusted platform with the performance and resilience you’ll need to build intelligent AI applications. We’re delivering on our promise to support our customers by providing them with exceptional cost-performance in compute and advanced generative AI capabilities.

Try Microsoft Azure for free >

Powerful compute for general purpose and AI workloads

Microsoft has the expertise and scale to run the AI supercomputers that power some of the world’s biggest AI services, such as Microsoft Azure OpenAI Service, ChatGPT, Bing, and more. Our focus as we continue to expand our AI infrastructure is on optimizing performance, scalability, and power efficiency.

Microsoft takes a systems approach to cloud infrastructure, optimizing both hardware and software to efficiently handle workloads at scale. In November 2023, Microsoft introduced its first in-house designed cloud compute processor, Azure Cobalt 100, which enables general-purpose workloads on the Microsoft Cloud. We are announcing the preview of Azure virtual machines built to run on Cobalt 100 processors. Cobalt 100-based virtual machines (VMs) are Azure’s most power efficient compute offering, and deliver up to 40% better performance than our previous generation of Arm-based VMs. And we’re delivering that same Arm-based performance and efficiency to customers like Elastic, MongoDB, Siemens, Snowflake, and Teradata. The new Cobalt 100-based VMS are expected to enhance efficiency and performance for both Azure customers and Microsoft products. Additionally, IC3, the platform that powers billions of customer conversations in Microsoft Teams, is adopting Cobalt 100 to serve its growing customer base more efficiently, achieving up to 45% better performance on Cobalt 100 VMs.

We’re combining the best of industry and the best of Microsoft in our AI infrastructure. Alongside our custom Azure Cobalt 100 and Maia series and silicon industry partnerships, we’re also announcing the general availability of the ND MI300X VM series, where Microsoft is the first cloud provider to bring AMD’s most powerful Instinct MI300X Accelerator to Azure. With the addition of the ND MI300X VM combining eight AMD MI300X Instinct accelerators, Azure is delivering customers unprecedented cost-performance for inferencing scenarios of frontier models like GPT-4. Our infrastructure supports different scenarios for AI supercomputing, such as building large models from scratch, running inference on pre-trained models, using model as a service providers, and fine-tuning models for specific domains.

Azure Migrate and Modernize

Curated resources and expert help to migrate or modernize your on-premises infrastructure

Get started

One of Microsoft’s advantages in AI is our ability to combine thousands of virtual machines with tens of thousands of GPUs with the best of InfiniBand and Ethernet based networking topologies for supercomputers in the cloud that can run large scale AI workloads to lower costs. With a diversity of silicon across AMD, NVIDIA, and Microsoft’s Maia AI accelerators, Azure’s AI infrastructure delivers the most complete compute platform for AI workloads. It is this combination of advanced AI accelerators, datacenter designs, and optimized compute and networking topology that drive cost efficiency per workload. That means whether you use Microsoft Copilot or build your own copilot apps, the Azure platform ensures you get the best AI performance with optimized cost.

Microsoft is further extending our cloud infrastructure with the Azure Compute Fleet, a new service that simplifies provisioning of Azure compute capacity across different VM types, availability zones, and pricing models to more easily achieve desired scale, performance, and cost by enabling users to control VM group behaviors automatically and programmatically. As a result, Compute Fleet has the potential to greatly optimize your operational efficiency and increase your core compute flexibility and reliability for both AI and general-purpose workloads together at scale.

AI-enhanced central management and security

As businesses continue to expand their computing estate, managing and governing the entire infrastructure can become overwhelming. We keep hearing from developers and customers that they spend more time searching for information and are less productive. Microsoft is focused on simplifying this process through AI-enhanced central management and security. Our adaptive cloud approach takes innovation to the next level with a single, intelligent control plane that spans from cloud to edge, making it easier for customers to manage their entire computing estate in a consistent way. We’re also aiming to improve your experience with managing these distributed environments through Microsoft Copilot in Azure.

We created Microsoft Copilot in Azure to act as an AI companion, helping your teams manage operations seamlessly across both cloud and edge environments. By using natural language, you can ask Copilot questions and receive personalized recommendations related to Azure services. Simply ask, “Why is my app slow?” or “How do I fix this error?” and Copilot will navigate a customer through potential causes and fixes.

Microsoft Copilot in Azure

Manage operations from cloud to edge with an AI assistant

Learn more

Starting today, we will be opening the preview of Copilot in Azure to all customers over the next couple of weeks. With this update, customers can choose to have all their users access Copilot or grant access to specific users or groups within a tenant. With this flexibility to manage Copilot, you can tailor your approach and control which groups of users or departments within your organization have access to it. You can feel secure knowing you can deploy and use the tool in a controlled manner, ensuring it aligns with your organization’s operational standards and security policies.

We’re continually enhancing Copilot and making the product better with every release to help developers be more productive. One of the ways we’ve simplified the developer’s experience is by making databases and analytics services easier to configure, manage, and optimize through AI-enhanced management. Several new skills are available for Azure Kubernetes Service (AKS) in Copilot for Azure that simplify common management tasks, including the ability to configure AKS backups, change tiers, locate YAML files for editing, and construct kubectl commands.

We’ve also added natural language to SQL conversion and self-help for database administration to support your Azure SQL database-driven applications. Developers can ask questions about their data in plain text, and Copilot generates the corresponding T-SQL query. Database administrators can independently manage databases, resolve issues, and learn more about performance and capabilities. Developers benefit from detailed explanations of the generated queries, helping them write code faster.

Lastly, you’ll notice a few new security enhancements to the tool. Copilot now includes Microsoft Defender for Cloud prompting capabilities to streamline risk exploration, remediation, and code fixes. Defender External Attack Surface Management (EASM) leverages Copilot to help surface risk-related insights and convert natural language to corresponding inventory queries across data discovered by Defender EASM. These features make database queries more user-friendly, enabling our customers to use natural language for any related queries. We’ll continue to expand Copilot capabilities in Azure so you can be more productive and focused on writing code.

Cloud infrastructure built for limitless innovation

Microsoft is committed to helping you stay ahead in this new era by giving you the power, flexibility, and performance you need to achieve your AI ambitions. Our unique approach to cloud and AI infrastructure helps us and developers like you meet the challenges of the ever-changing technological landscape head-on so you can continue working efficiently while innovating at scale.

Discover new ways to transform with AI

Learn how Azure helps build AI experiences

Read more about AI-powered analytics

Key Microsoft Build sessions

BRK126: Adaptive cloud approach: Build and scale apps from cloud to edge

BRK124: Building AI applications that leverage your data in object storage

BRK 129: Building applications at hyper scale with the latest Azure innovations

BRK133: Unlock potential on Azure with Microsoft Copilot

BRK127: Azure Monitor: Observability from code to cloud

The post Unleashing innovation: The new era of compute powering Azure AI solutions appeared first on Azure Blog.
Quelle: Azure