Introducing GPT-4 in Azure OpenAI Service

At Microsoft, we are constantly discovering new ways to unleash creativity, unlock productivity, and uplevel skills so that more people can benefit from using AI. This allows our customers to build the future faster and more responsibly by powering their apps with large-scale AI models. Our collaboration with OpenAI, along with the power of Azure, has been core to our journey.

Today, we are excited to announce that GPT-4 is available in preview in Azure OpenAI Service. Customers and partners already using Azure OpenAI Service can apply for access to GPT-4 and start building with OpenAI’s most advanced model yet. With this milestone, we are proud to bring the world’s most advanced AI models—including GPT-3.5, ChatGPT, and DALL•E 2—to Azure customers, backed by Azure AI-optimized infrastructure, enterprise-readiness, compliance, data security, and privacy controls, along with many integrations with other Azure services.

Customers can begin applying for access to GPT-4 today. Billing for all GPT-4 usage begins April 1, 2023, at the following prices:

GPT-4 pricing (per 1,000 tokens):

Context        Prompt    Completion
8K context     $0.03     $0.06
32K context    $0.06     $0.12
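To see how these rates translate into a bill, here is a minimal sketch in Python that estimates the cost of a single 8K-context GPT-4 request at the per-1,000-token prices in the table above; the token counts are hypothetical.

```python
# Hypothetical example: estimate the charge for one GPT-4 (8K context) request
# using the preview prices listed above (USD per 1,000 tokens).
PROMPT_PRICE_PER_1K = 0.03      # 8K context, prompt tokens
COMPLETION_PRICE_PER_1K = 0.06  # 8K context, completion tokens

def estimate_cost(prompt_tokens: int, completion_tokens: int) -> float:
    """Return the estimated charge in USD for a single request."""
    return (prompt_tokens / 1000) * PROMPT_PRICE_PER_1K + (
        completion_tokens / 1000
    ) * COMPLETION_PRICE_PER_1K

# Example: a 1,500-token prompt that yields a 500-token completion.
print(f"${estimate_cost(1500, 500):.4f}")  # -> $0.0750
```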

GPT-4 for every business

While the recently announced new Bing and Microsoft 365 Copilot products are already powered by GPT-4, today’s announcement allows businesses to build their own applications on the same underlying advanced model using Azure OpenAI Service.

With generative AI technologies, we are unlocking new efficiencies for businesses in every industry. For instance, see how Azure OpenAI Service can allow bot developers to create virtual assistants in minutes using natural language with Copilot in Power Virtual Agents.

GPT-4 has the potential to take this experience to a whole new level using its broader knowledge, problem-solving abilities, and domain expertise. With GPT-4 in Azure OpenAI Service, businesses can streamline communications internally as well as with their customers, using a model with additional safety investments to reduce harmful outputs.

Companies of all sizes are putting Azure AI to work for them, many deploying language models into production using Azure OpenAI Service, and knowing that the service is backed by the unique supercomputing and enterprise capabilities of Azure. Solutions include improving customer experiences end-to-end, summarizing long-form content, helping write software, and even reducing risk by predicting the right tax data.

Customers are accelerating the adoption of language models

We are just scratching the surface with generative AI technologies and are working to enable our customers to responsibly adopt Azure OpenAI Service to bring real impact. With GPT-4, Epic, Coursera, and Coca-Cola plan to use this advancement in unique ways:

"Our investigation of GPT-4 has shown tremendous potential for its use in healthcare. We'll use it to help physicians and nurses spend less time at the keyboard and to help them investigate data in more conversational, easy-to-use ways."—Seth Hain, Senior Vice President of Research and Development at Epic

"Coursera is using Azure OpenAI Service to create a new AI-powered learning experience on its platform, enabling learners to get high-quality and personalized support throughout their learning journeys. Together, Azure OpenAI Service and the new GPT-4 model will help millions around the world learn even more effectively on Coursera."—Mustafa Furniturewala, Senior Vice President of Engineering at Coursera

"Words cannot express the excitement and gratitude we feel as a consumer package goods company for the boundless opportunities that Azure OpenAI has presented us. With Azure Cognitive Services at the heart of our digital services framework, we have harnessed the transformative power of OpenAI's text and image generation models to solve business problems and build a knowledge hub. But it is the sheer potential of OpenAI's upcoming GPT-4 multimodal capabilities that truly fills us with awe and wonder. The possibilities for marketing, advertising, public relations, and customer relations are endless, and we cannot wait to be at the forefront of this revolutionary technology. We know that our success is not just about technology but also about having the right enterprise features in place. That's why we're proud to have a long-standing partnership with Microsoft Azure, ensuring that we have all the tools we need to deliver exceptional experiences to our customers. Azure OpenAI is more than just cutting-edge technology—it's a true game-changer, and we're honored to be a part of this incredible journey."—Lokesh Reddy Vangala, Senior Director of Engineering, Data and AI, The Coca-Cola Company

Our commitment to responsible AI

As described in a previous blog, Microsoft has a layered approach for generative models, guided by Microsoft’s Responsible AI Principles. In Azure OpenAI, an integrated safety system provides protection from undesirable inputs and outputs and monitors for misuse. On top of that, we provide guidance and best practices for customers to responsibly build applications using these models, and we expect customers to comply with the Azure OpenAI Code of Conduct. With GPT-4, new research advances from OpenAI have enabled an additional layer of protection. Guided by human feedback, safety is built directly into the GPT-4 model, which enables the model to be more effective at handling harmful inputs, thereby reducing the likelihood that the model will generate a harmful response.

Getting started with GPT-4 in Azure OpenAI Service

Apply for access to GPT-4 by completing this form.
Learn more about Azure OpenAI Service and all the latest enhancements.
Get started with GPT-4 in Azure OpenAI Service in Microsoft Learn.
Read our Partner announcement blog, Empowering partners to develop AI-powered apps and experiences with ChatGPT in Azure OpenAI Service.
Learn how to use the new Chat Completions API (preview) and model versions for ChatGPT and GPT-4 models in Azure OpenAI Service.
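As a quick illustration of the Chat Completions API mentioned above, here is a minimal sketch using the openai Python package (the 0.27-era API); the endpoint, API key, API version, and deployment name are placeholders for your own resource's values.

```python
import openai

# Placeholders: replace with your Azure OpenAI resource's endpoint, key,
# and the name you gave your GPT-4 deployment.
openai.api_type = "azure"
openai.api_base = "https://YOUR-RESOURCE-NAME.openai.azure.com/"
openai.api_version = "2023-03-15-preview"  # assumed preview API version
openai.api_key = "YOUR-API-KEY"

response = openai.ChatCompletion.create(
    engine="gpt-4",  # your GPT-4 deployment name
    messages=[
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Summarize Azure OpenAI Service in two sentences."},
    ],
    max_tokens=200,
)
print(response["choices"][0]["message"]["content"])
```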

Source: Azure

Azure Data Manager for Energy: Achieve interoperability with Petrel

Microsoft Azure Data Manager for Energy is the first fully managed OSDU™ Data Platform built for the energy industry. This solution is the first step in unraveling the challenge of data—moving from disparate systems and disconnected applications to a holistic approach. The product’s ideation directly reflects the partnership between Microsoft and SLB, capitalizing on each organization’s unique expertise.

As the energy industry works to achieve a sustainable low carbon future, organizations are taking advantage of the cloud to optimize existing assets and de-risk new ventures. Universally, data is at the core of their digital transformation strategies—yet only a small fraction of energy company data is properly tagged and labeled to be searchable. This leads engineers and geoscientists to spend significant time outside of their expertise trying to discover and analyze data. Azure Data Manager for Energy customers can seamlessly connect to an open ecosystem of interoperable applications from other Independent Software Vendors (ISVs) and the Microsoft ecosystem of productivity tools. Ultimately, the open Microsoft platform enables developers, data managers, and geoscientists alike to innovate the next generation of digital solutions for the energy industry.

Enhanced data openness and liberation in Petrel

“We all benefit from making the world more open. As an industry, our shared goal is that openness in data will enable a fully liberated and connected data landscape. This is the natural next step towards data-driven workflows that integrate technologies seamlessly and leverage AI for diverse and creative solutions that take business performance to the next level.”—Trygve Randen, Director, Data & Subsurface at Digital & Integration, SLB.

The co-build partnership between Microsoft and SLB improves customers’ journeys and performance by unlocking data through interoperable applications. The Delfi™ digital platform from SLB on Azure features a broad portfolio of applications, including the Petrel E&P Software Platform. Enhanced with AI, the Petrel E&P Software Platform enables workflows in Petrel to run with significantly faster compute times and includes access to new tools, increasing the flexibility and productivity of geoscientists and engineers.

Microsoft and SLB rearchitected Petrel Data Services to allow Petrel Projects and data to be permanently stored in the customer’s instance. Petrel Data Services leverages core services found in OSDU™, such as partition and entitlement services. This change further aligns Petrel Data Services with the OSDU™ Technical Standard schemas and directly integrates with storage as the system of record. Now when geoscientists or engineers create new Petrel projects or domain data, each is liberated from Petrel into its respective Domain Data Management Service (DDMS) provided by OSDU™, like seismic or wellbore, in Azure Data Manager for Energy. These Petrel liberated projects or data become immediately discoverable in Petrel on Delfi™ Digital Platform or any third-party application developed in alignment with the emerging requirements of the OSDU™ Technical Standard such as INT’s IVAAP.

By splitting the Petrel and Project Explorer software as a service (SaaS) applications from the data infrastructure, data resides in Azure Data Manager for Energy without any dependencies on an external app to access that data. Users can access and manage Petrel-liberated projects and data in Azure Data Manager for Energy independent of any prerequisite application or license. Microsoft provides a secure, scalable infrastructure that governs data safely in the customer tenant, while SLB focuses on delivering continuous updates to Petrel and Project Explorer on Delfi™ Digital Platform, which expedites feature delivery.

Petrel and Project Explorer on Azure Data Manager for Energy

1.    Search for and discover Petrel Projects: Petrel Project Explorer shows all Petrel Project versions liberated from all users and allows the viewing of data associated with each project based on corresponding data entitlements. This includes images of the windows that are created in the project, metadata (coordinate reference systems, time zone, and more), and all data stored in the project. Project Explorer preserves every change throughout the lifetime of a Petrel project, as well as every critical milestone required by regulations or for historical purposes. Data and decisions can be easily shared and connected to other cloud-native solutions on Delfi™ Digital Platform, and automatic, full data lineage and project versioning is always available.

2.    Connect Petrel to domain data: Petrel users can consume seismic and wellbore OSDU™ domain data directly from Azure Data Manager for Energy. Furthermore, Petrel Data Services enables the development of diverse and creative solutions for the exploration and production value chain which includes liberated data consumption in other applications like Microsoft Power BI for data analytics.

3.    Data liberation: Petrel Data Services empowers Petrel users to liberate Petrel Project data into Azure Data Manager for Energy where data and project content can be accessed without opening Petrel, providing simpler data overview and in-context access. Data liberation allows for direct consumption into other data analytics applications, generating new data insights into Petrel projects, breaking down data silos, and improving user and corporate data-driven workflows. It relieves users from Petrel project management and improves the focus on domain workflows for value creation.

Figure 1: Project Explorer on Azure Data Manager for Energy: View all Petrel projects within an organization in one place through an intuitive and performant User Interface (UI).

Interoperable workflows that de-risk and drive efficiency

Both traditional and new energy technical workflows are optimized when data and application interoperability are delivered. Geoscientists and engineers, therefore, want to incorporate as much diverse domain data as possible. Customers want to run more scenarios in different applications, compare results with their colleagues, and ultimately liberate the best data and the knowledge associated with it to a data platform for others to discover and consume. With Petrel and Petrel Data Services powered by Azure Data Manager for Energy, customers achieve this interoperability.

Companies can liberate wellbore and seismic data for discovery in any application developed in alignment with the emerging requirements of the OSDU™ Technical Standard. As Petrel and Petrel Data Services use the standard schemas, all data is automatically preserved and indexed for search, discovery, and consumption. This extensibility model enables geoscientists and engineers as well as data managers to seamlessly access data in their own internal applications. SLB apps on Delfi™ Digital Platform such as Techlog, as well as Microsoft productivity tools including Power BI and an extensive ecosystem of partner apps are all available in this model. Additionally, developers can refocus their efforts on innovating and building new apps—taking advantage of Microsoft Power Platform to build low-code or no-code solutions. This creates the full data-driven loop and ultimately enables integrated workflows for any interoperable apps.

Figure 2: Azure Data Manager for Energy Data Flow connects seamlessly to a broad ecosystem of interoperable applications across Delfi™, Azure Synapse, Microsoft Power Platform, and the ISV ecosystem.

Get started today

Azure Data Manager for Energy helps energy companies gain actionable insights, improve operational efficiency, and accelerate time to market on the enterprise-grade, cloud-based OSDU™ Data Platform. Visit the website to get started.
Source: Azure

Protect against cyberattacks with the new Azure Firewall Basic

Cyberattacks continue to rise across businesses of all sizes as attackers are adapting their techniques and increasing the complexity of their operations.1 The risk of these attacks is significant for small and medium businesses (SMBs) as they usually don’t have the specialized knowledge or resources to protect against emerging threats and face more challenges when recovering from an attack. In a recent Microsoft survey,2 70 percent of SMBs think cyberthreats are becoming more of a business risk and nearly one in four SMBs stated that they had a security breach in the last year.

SMBs need solutions that are tailored to their unique needs and challenges. Microsoft is committed to delivering security solutions to meet the needs of all our customers. We are excited to announce the general availability of Azure Firewall Basic, a new SKU of Azure Firewall built for SMBs.

Since public preview, we have seen wide adoption of Azure Firewall Basic. Customers cited the simplicity and ease of use of Azure Firewall as one of the key benefits for choosing Azure Firewall Basic. We have also added the capability to deploy Azure Firewall inside a virtual hub in addition to a virtual network. This gives businesses the flexibility to choose the deployment option that best meets their needs.

Deploying Azure Firewall in a virtual network is recommended for customers who plan to use a traditional hub-and-spoke network topology with a firewall in the hub, whereas deploying in a virtual hub is recommended for customers with large or global network deployments in Azure where global transit connectivity across Azure regions and on-premises locations is needed.

Providing SMBs with a highly available Firewall at an affordable price point

Azure Firewall Basic brings the simplicity and security of Azure Firewall to SMBs at a cost-effective price point.

It offers Layer 3 to Layer 7 filtering and alerts on malicious traffic using built-in threat intelligence from Microsoft. As a cloud-native service, Azure Firewall Basic is simple to deploy with a few clicks and seamlessly integrates with other Azure services, including Microsoft Azure Firewall Manager, Azure Monitor, Azure Event Hubs, Microsoft Sentinel, and Microsoft Defender for Cloud.

Key features of Azure Firewall Basic

Comprehensive, cloud-native network firewall security

Network and application traffic filtering—Centrally create, allow, or deny network filtering rules by source and destination IP address, port, and protocol. Azure Firewall is fully stateful, so it can distinguish legitimate packets for different types of connections. Rules are enforced and logged across multiple subscriptions and virtual networks.
Threat intelligence to alert on malicious traffic—Enable threat intelligence-based filtering to alert on traffic from or to known malicious IP addresses and domains. The IP addresses and domains are sourced from the Microsoft threat intelligence feed.
Built-in high availability—Azure Firewall Basic provides built-in high availability to ensure that your network traffic is always protected. Azure Firewall Basic can replicate your firewall instance across two availability zones, ensuring that your traffic is always filtered even if one of the zones goes down.

Simple setup and easy to use

Set up in just a few minutes—Use the Quickstart deployment Azure Resource Manager (ARM) templates to easily deploy Azure Firewall Basic directly to your Azure environment.
Automate deployment (deploy as code)—Azure Firewall Basic provides native support for Infrastructure as Code (IaC). Teams can define declarative ARM templates that specify the infrastructure required to deploy solutions, and third-party platforms like Terraform also support IaC to manage automated infrastructure. (A minimal SDK sketch of automating a deployment follows this list.)
Zero maintenance with automatic updates—Azure Firewall is automatically updated with the latest threat intelligence and security updates to ensure that it stays up-to-date and protected against the latest threats.
Centralized management via Azure Firewall Manager—Azure Firewall Manager is a central management solution that allows you to manage multiple Azure Firewall instances and policies across your organization from a single location, ensuring that your security policies are consistent and up to date across your organization.
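To make the deploy-as-code path concrete, here is a minimal sketch using the azure-mgmt-network Python SDK rather than an ARM template. The subscription, resource group, firewall name, and resource IDs are placeholders, and it assumes a virtual network that already contains the AzureFirewallSubnet and AzureFirewallManagementSubnet (plus public IPs) that the Basic SKU requires.

```python
from azure.identity import DefaultAzureCredential
from azure.mgmt.network import NetworkManagementClient

# Placeholders: substitute your own subscription, resource group, and resource IDs.
subscription_id = "00000000-0000-0000-0000-000000000000"
client = NetworkManagementClient(DefaultAzureCredential(), subscription_id)

poller = client.azure_firewalls.begin_create_or_update(
    resource_group_name="my-rg",
    azure_firewall_name="my-basic-firewall",
    parameters={
        "location": "eastus",
        "sku": {"name": "AZFW_VNet", "tier": "Basic"},
        # Data-path IP configuration on the AzureFirewallSubnet.
        "ip_configurations": [
            {
                "name": "fw-ipconfig",
                "subnet": {"id": "/subscriptions/.../subnets/AzureFirewallSubnet"},
                "public_ip_address": {"id": "/subscriptions/.../publicIPAddresses/fw-pip"},
            }
        ],
        # The Basic SKU also requires a management IP configuration.
        "management_ip_configuration": {
            "name": "fw-mgmt-ipconfig",
            "subnet": {"id": "/subscriptions/.../subnets/AzureFirewallManagementSubnet"},
            "public_ip_address": {"id": "/subscriptions/.../publicIPAddresses/fw-mgmt-pip"},
        },
    },
)
firewall = poller.result()
print(firewall.provisioning_state)
```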

Cost-effective

Designed to deliver essential, cost-effective protection of your Azure resources within your virtual networks.

Choose the right Azure Firewall SKU for your business

Azure Firewall is offered in three SKUs to meet a wide range of use cases and needs:

Azure Firewall Premium is recommended for customers looking to secure highly sensitive applications, such as payment processing. In addition to all features of Azure Firewall Standard, it also supports advanced threat protection capabilities like malware and Transport Layer Security (TLS) inspection.
Azure Firewall Standard is recommended for customers who need a Layer 3 to Layer 7 firewall with auto-scaling to handle peak traffic periods of up to 30 gigabits per second (Gbps). It supports enterprise features like threat intelligence, Domain Name System (DNS) proxy, custom DNS, and web categories.
Azure Firewall Basic is recommended for SMB customers with throughput needs of less than 250 megabits per second (Mbps).

Let’s take a closer look at the features across the three Azure Firewall SKUs.

Azure Firewall Basic pricing

 

Azure Firewall Basic pricing includes both deployment and data processing charges for both virtual network and virtual hub scenarios. Pricing and billing for Azure Firewall Basic with virtual hub will be effective starting May 1, 2023.

For more details, visit the Azure Firewall pricing page.

Next steps

For more information on everything we covered in this blog post, see the following resources:

Azure Firewall documentation.
Azure Firewall Manager documentation.
Deploy and configure Azure Firewall Basic.

1Microsoft Digital Defense Report 2022

2April 2022: Microsoft Small and Medium Business quantitative survey research: Security in the new environment
Source: Azure

Announcing Microsoft Azure Data Manager for Agriculture: Accelerating innovation across the agriculture value chain

The agriculture industry is at a turning point. While food may seem plentiful in many global regions, the number of people going hungry has continued to increase over the last nine years. To feed growing populations sustainably and efficiently, the way we produce food must change. From the soil, where improvements to farming practices can help mitigate climate change, to the shelf, where customers look for products with a minimal carbon footprint, the agriculture and food value chain is primed for innovation.

That's why we’re thrilled to announce that Microsoft Azure Data Manager for Agriculture is now available in preview. At Microsoft, we’ve long recognized that scaling innovation across the industry starts with data. What began with Project FarmBeats, an ambitious initiative to collect and transform agricultural data, has now evolved into a timely commercial solution.

Azure Data Manager for Agriculture extends the Microsoft Intelligent Data Platform with industry-specific data connectors and capabilities to connect farm data from disparate sources, enabling organizations to leverage high-quality datasets and accelerate the development of digital agriculture solutions. Instead of devoting resources to managing unstructured data, customers and partners can focus on product innovation with the ability to reason over readily available and abundant data. Furthermore, organizations can use in-house, third-party, or Intelligent Data Platform services to speed the path to analytics and business intelligence solutions. With a connected ecosystem of partners building solutions on top of Azure Data Manager for Agriculture, this is another step towards a connected and collaborative agriculture industry.

Accelerate innovation through data

With so much agriculture-relevant data generated across the farm—from sensors in the soil to satellites orbiting the earth—many organizations don’t have the resources to harness it effectively. Azure Data Manager for Agriculture helps break down data silos, allowing organizations to build solutions that provide predictive and prescriptive insights on soil health, changing weather patterns, waste tracking, carbon sequestration, and more.

For example, Bayer used Azure Data Manager for Agriculture to shift from a self-managed data estate to a managed model with Microsoft. Bayer's FieldView platform harnesses data from Azure Data Manager for Agriculture’s satellite and weather pipelines to enable insights on potential yield-limiting factors in growers’ fields. In addition, Bayer is making their industry-leading expertise available to enterprise customers in the form of AgPowered Services—a set of solutions that ingest data from Azure Data Manager for Agriculture to provide timely insights on crop health, weather forecast, crop growth tracking, and more.

Not only is Bayer running their solutions on Azure Data Manager for Agriculture, but they’re also bringing decades of agriculture expertise to help develop a robust Azure Data Manager. Thanks to our strategic partnership, we’re leveraging Bayer’s industry knowledge—as well as data connectors, models, transformations, and workflows—to inform how we strengthen Azure Data Manager for Agriculture and empower organizations to address the challenges in agriculture today.

“Azure Data Manager for Agriculture is an important step towards accelerating the impact of big data and agriculture. With high-quality data fueling insights, we expect to see a value chain that is more predictable, more transparent, and importantly, where the value is shared all the way back to producers.”—Jeremy Williams, Head of Climate and Digital Farming, Bayer Crop Science.

Enable a more sustainable future

Feeding a growing global population and building a healthier world is only possible if sustainability is practiced from farm to fork. With Azure Data Manager for Agriculture, agriculture input providers can accelerate solutions that empower farmers to adopt more sustainable practices. For example, Azure Data Manager for Agriculture is a foundational component for Land O’Lakes’ digital offerings, including the Truterra sustainability tool. Truterra provides insight into how different agricultural practices impact water, nitrogen, and carbon on a farm, and it enables farmers to track their soil’s carbon sequestration and participate in carbon markets.

“Through our collaboration [with Microsoft], we are enabling farmers with new services to improve their operations and to quantify data for their customers that tells a story of sustainability.”—Teddy Bekele, Chief Technology Officer, Land O’Lakes, Inc.

Cultivate trust rooted in transparency

Consumers and investors alike are putting pressure on companies to be transparent about their agricultural and sustainability practices. With a clearer view into farm operations, companies can take an important step to establish trust with investors and customers. Azure Data Manager for Agriculture enables organizations to replace self-reported data with high-quality farm data to provide more accurate information to stakeholders—helping companies build brand trust through ethically and sustainably produced products. And as security and privacy concerns grow, organizations can rest assured knowing their data is stored in the most trusted cloud, built to meet stringent security and compliance requirements.

The Microsoft commitment to sustainability

Azure Data Manager for Agriculture is only one part of the Microsoft commitment to accelerating progress toward a more sustainable planet. With their next initiative, Project FarmVibes, Microsoft Research is building toolkits and AI models that are available in Microsoft Open Source to advance agriculture innovation in the scientific community across academia and business.

Microsoft has also launched Microsoft Cloud for Sustainability, which empowers organizations to accelerate sustainability progress and business growth by bringing together a growing set of environmental, social, and governance (ESG) capabilities from Microsoft and our global ecosystem of partners. Together, we are keeping sustainability at the core of our offerings, and we seek to empower organizations to adopt more regenerative and sustainable farming practices.

Learn more and transform your operations today

If you’re interested in using Azure Data Manager for Agriculture for your business, you can sign up here. Azure Data Manager for Agriculture requires registration and is currently only available to approved customers and partners during the preview period.
Source: Azure

Microsoft to showcase purpose-built AI infrastructure at NVIDIA GTC

Join Microsoft at NVIDIA GTC, a free online global technology conference (GTC), March 20 to 23 to learn how organizations of any size can power AI innovation with purpose-built cloud infrastructure from Microsoft.

Microsoft's Azure AI supercomputing infrastructure is uniquely designed for AI workloads and helps build and train some of the industry’s most advanced AI solutions. From data preparation to model and infrastructure performance management, Azure’s comprehensive portfolio of powerful and massively scalable GPU-accelerated virtual machines (VMs) and seamless integration with services like Azure Batch and open-source solutions helps streamline management and automation of large AI models and infrastructure.

Attend NVIDIA GTC to discover how Azure AI infrastructure optimized for AI performance can deliver speed and scale in the cloud and help you reduce the complexity of building, training, and bringing AI models into production.

Don’t miss session S52469 featuring Nidhi Chappell, named one of HPCwire’s 2023 People to Watch and recognized as a high-performance computing (HPC) luminary. Nidhi plays a leading role in driving HPC and AI innovation, accelerating the development of science and the adoption of AI by enabling access to the best infrastructure and services for Microsoft customers.

Register for NVIDIA GTC today. 

Microsoft sessions at NVIDIA GTC

Add the Microsoft sessions below to your conference schedule to learn about the latest Azure AI infrastructure and dive deep into a variety of use cases and technologies.

Featured sessions

Accelerate AI Innovation with Unmatched Cloud Scale and Performance

Thursday, March 23, 2023 | 7:00 to 7:50 AM PT

Nidhi Chappell, General Manager, Azure HPC, AI, SAP and Confidential Computing, Microsoft

Kathleen Mitford, Corporate Vice President, Azure Marketing, Microsoft

Manuvir Das, Vice President of Enterprise Computing, NVIDIA

Alex Kendall, CEO and Co-Founder, Wayve

Azure’s purpose-built AI infrastructure is enabling leading organizations in AI to build a new era of innovative applications and services. The convergence of cloud flexibility and economics, with advances in cloud performance, is paving the way to accelerate AI initiatives across simulations, science, and industry. Whether you need to scale to 80,000 cores for MPI workloads, or you're looking for AI supercomputing capabilities, Azure can support your needs. Learn more about Azure's AI platform and our latest updates, and hear about customer experiences.

Azure's Purpose-Built AI Infrastructure Using the Latest NVIDIA GPU Accelerators

On-demand

Matt Vegas, Principal Product Manager, Microsoft

Microsoft offers some of the most powerful and massively scalable virtual machines, optimized for AI workloads. Join us for an in-depth look at the latest updates to Azure’s ND-series based on NVIDIA GPUs, engineered to deliver high-performance, interconnected GPUs working in parallel that can help you reduce complexity, minimize operational bottlenecks, and deliver reliability at scale.

Talks and panel sessions

S51226: Accelerating Large Language Models via Low-Bit Quantization (Deep Learning – Inference)
Speakers: Young Jin Kim, Principal Researcher, Microsoft; Rawn Henry, Senior AI Developer Technology Engineer, NVIDIA

S51204: Transforming Clouds to Cloud-Native Supercomputing: Best Practices with Microsoft Azure (HPC – Supercomputing)
Speakers: Jithin Jose, Principal Software Engineer, Microsoft Azure; Gilad Shainer, SVP Networking, NVIDIA

S51756: Accelerating AI in Federal Cloud Environments (Data Center/Cloud – Business Strategy)
Speakers: Bill Chappel, Vice President of Mission Systems in Strategic Missions and Technology, Microsoft; Steven H. Walker, Chief Technology Officer, Lockheed Martin; Matthew Benigni, Chief Data Officer, Army Futures Command; Christi DeCuir, Director, Cloud Go-to-Market, NVIDIA

S51703: Accelerating Disentangled Attention Mechanism in Language Models (Conversational AI/NLP)
Speakers: Pengcheng He, Principal Researcher, Microsoft; Haohang Huang, Senior AI Engineer, NVIDIA

S51422: SwinTransformer and its Training Acceleration (Deep Learning – Training)
Speakers: Han Hu, Principal Research Manager, Microsoft Research Asia; Li Tao, Tech Software Engineer, NVIDIA

S51260: Multimodal Deep Learning for Protein Engineering (Healthcare – Drug Discovery)
Speaker: Kevin Yang, Senior Researcher, Microsoft Research New England

S51945: Improving Dense Text Retrieval Accuracy with Approximate Nearest Neighbor Search (Data Science)
Speakers: Menghao Li, Software Engineer, Microsoft; Akira Naruse, Senior Developer Technology Engineer, NVIDIA

S51709: Hopper Confidential Computing: How it Works under the Hood (Data Center/Cloud Infrastructure – Technical)
Speakers: Antoine Delignat-Lavaud, Principal Researcher, Microsoft Research; Phil Rogers, VP of System Software, NVIDIA

S51447: Data-Driven Approaches to Language Diversity (Conversational AI/NLP)
Speakers: Kalika Bali, Principal Researcher, Microsoft Research India; Caroline Gottlieb, Product Manager, Data Strategy, NVIDIA; Damian Blasi, Harvard Data Science Fellow, Department of Human Evolutionary Biology, Harvard University; Bonaventure Dossou, Ph.D. Student, McGill University and Mila Quebec AI Institute; EM Lewis-Jong, Common Voice Product Lead, Mozilla Foundation

S51756a: Accelerating AI in Federal Cloud Environments, with Q&A from EMEA Region (Data Center/Cloud – Business Strategy)
Speakers: Bill Chappel, Vice President of Mission Systems in Strategic Missions and Technology, Microsoft; Steven H. Walker, Chief Technology Officer, Lockheed Martin; Larry Brown, SA Manager, NVIDIA; Christi DeCuir, Director, Cloud Go-to-Market, NVIDIA

S51589: Accelerating Wind Energy Forecasts with AceCast (HPC – Climate/Weather/Ocean Modeling)
Speakers: Amirreza Rastegari, Senior Program Manager, Azure Specialized Compute, Microsoft; Gene Pache, TempoQuest

S51278: Next-Generation AI for Improving Building Security and Safety (Computer Vision – AI Video Analytics)
Speaker: Adina Trufinescu, Senior Program Manager, Azure Specialized Compute, Microsoft

Deep Learning Institute workshops and labs

We are proud to host NVIDIA’s Deep Learning Institute (DLI) training at NVIDIA GTC. Attend full-day, hands-on, instructor-led workshops or two-hour free training labs to get up to speed on the latest technology and breakthroughs. Hosted on Microsoft Azure, these sessions enable and empower you to leverage NVIDIA GPUs on the Azure platform to solve the world’s most interesting and relevant problems. 

Register for a Deep Learning Institute workshop or lab today.

Learn more about Azure AI infrastructure

Whether your project is big or small, local or global, Microsoft Azure is empowering companies worldwide to push the boundaries of AI innovation. Learn how you can make AI your reality by exploring the following resources. 

Azure AI Infrastructure
Azure AI Solutions 
Accelerating AI and HPC in the Cloud
AI-first Infrastructure and Toolchain at Any Scale
The case for AI in the Azure Cloud
AI Infrastructure for Smart Manufacturing
AI Infrastructure for Smart Retail

Source: Azure

Azure previews powerful and scalable virtual machine series to accelerate generative AI

Delivering on the promise of advanced AI for our customers requires supercomputing infrastructure, services, and expertise to address the exponentially increasing size and complexity of the latest models. At Microsoft, we are meeting this challenge by applying a decade of experience in supercomputing and supporting the largest AI training workloads to create AI infrastructure capable of massive performance at scale. The Microsoft Azure cloud, and specifically our graphics processing unit (GPU) accelerated virtual machines (VMs), provide the foundation for many generative AI advancements from both Microsoft and our customers.

“Co-designing supercomputers with Azure has been crucial for scaling our demanding AI training needs, making our research and alignment work on systems like ChatGPT possible.”—Greg Brockman, President and Co-Founder of OpenAI. 

Azure's most powerful and massively scalable AI virtual machine series

Today, Microsoft is introducing the ND H100 v5 VM, available on demand in sizes ranging from eight to thousands of NVIDIA H100 GPUs interconnected by NVIDIA Quantum-2 InfiniBand networking. Customers will see significantly faster performance for AI models than with our last-generation ND A100 v4 VMs, thanks to innovative technologies like:

8x NVIDIA H100 Tensor Core GPUs interconnected via next-gen NVSwitch and NVLink 4.0
400 Gb/s NVIDIA Quantum-2 CX7 InfiniBand per GPU, with 3.2 Tb/s per VM in a non-blocking fat-tree network
NVSwitch and NVLink 4.0 with 3.6 TB/s bisectional bandwidth between the 8 local GPUs within each VM
4th Gen Intel Xeon Scalable processors
PCIe Gen5 host-to-GPU interconnect with 64 GB/s bandwidth per GPU
16 channels of 4,800 MHz DDR5 DIMMs
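The per-VM InfiniBand figure in the list above is simply the eight per-GPU links aggregated; a quick sanity check of the quoted numbers:

```python
# Sanity-check the aggregate InfiniBand bandwidth quoted above.
gpus_per_vm = 8
infiniband_per_gpu_gbps = 400  # Gb/s of Quantum-2 CX7 InfiniBand per GPU
total_gbps = gpus_per_vm * infiniband_per_gpu_gbps
print(f"{total_gbps} Gb/s = {total_gbps / 1000} Tb/s per VM")  # 3200 Gb/s = 3.2 Tb/s
```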

Delivering exascale AI supercomputers to the cloud

Generative AI applications are rapidly evolving and adding unique value across nearly every industry. From reinventing search with a new AI-powered Microsoft Bing and Edge to AI-powered assistance in Microsoft Dynamics 365, AI is rapidly becoming a pervasive component of software and how we interact with it, and our AI Infrastructure will be there to pave the way. With our experience of delivering multiple-ExaOP supercomputers to Azure customers around the world, customers can trust that they can achieve true supercomputer performance with our infrastructure. For Microsoft and organizations like Inflection, NVIDIA, and OpenAI that have committed to large-scale deployments, this offering will enable a new class of large-scale AI models.

"Our focus on conversational AI requires us to develop and train some of the most complex large language models. Azure's AI infrastructure provides us with the necessary performance to efficiently process these models reliably at a huge scale. We are thrilled about the new VMs on Azure and the increased performance they will bring to our AI development efforts."—Mustafa Suleyman, CEO, Inflection.

AI at scale is built into Azure’s DNA. Our initial investments in large language model research, like Turing, and engineering milestones such as building the first AI supercomputer in the cloud prepared us for the moment when generative artificial intelligence became possible. Azure services like Azure Machine Learning make our AI supercomputer accessible to customers for model training and Azure OpenAI Service enables customers to tap into the power of large-scale generative AI models. Scale has always been our north star to optimize Azure for AI. We’re now bringing supercomputing capabilities to startups and companies of all sizes, without requiring the capital for massive physical hardware or software investments.

“NVIDIA and Microsoft Azure have collaborated through multiple generations of products to bring leading AI innovations to enterprises around the world. The NDv5 H100 virtual machines will help power a new era of generative AI applications and services.”—Ian Buck, Vice President of hyperscale and high-performance computing at NVIDIA. 

Today we are announcing that ND H100 v5 is available for preview and will become a standard offering in the Azure portfolio, allowing anyone to unlock the potential of AI at Scale in the cloud. Sign up to request access to the new VMs.

Learn more about AI at Microsoft

Reinventing search with a new AI-powered Microsoft Bing and Edge, your copilot for the web.
Introducing GitHub Copilot: your AI pair programmer.
Introducing Microsoft Dynamics 365 Copilot, bringing next-generation AI to every line of business.
Azure OpenAI Service
Azure Machine Learning Service
Microsoft AI at Scale
NVIDIA Teams With Microsoft to Build Massive Cloud AI Computer

Source: Azure

ChatGPT is now available in Azure OpenAI Service

Today, we are thrilled to announce that ChatGPT is available in preview in Azure OpenAI Service. With Azure OpenAI Service, over 1,000 customers are applying the most advanced AI models—including DALL·E 2, GPT-3.5, Codex, and other large language models backed by the unique supercomputing and enterprise capabilities of Azure—to innovate in new ways.

Since ChatGPT was introduced late last year, we’ve seen a variety of scenarios it can be used for, such as summarizing content, generating suggested email copy, and even helping with software programming questions. Now with ChatGPT in preview in Azure OpenAI Service, developers can integrate custom AI-powered experiences directly into their own applications, including enhancing existing bots to handle unexpected questions, recapping call center conversations to enable faster customer support resolutions, creating new ad copy with personalized offers, automating claims processing, and more. Azure Cognitive Services can be combined with Azure OpenAI Service to create compelling use cases for enterprises. For example, see how Azure OpenAI and Azure Cognitive Search can be combined to use conversational language for knowledge base retrieval on enterprise data.

Customers can begin using ChatGPT today. It is priced at $0.002 per 1,000 tokens, and billing for all ChatGPT usage begins March 13, 2023.

Real business value

Customers across industries are seeing business value from using Azure OpenAI Service, and we’re excited to see how organizations such as The ODP Corporation, Singapore’s Smart Nation Digital Government Office, and Icertis will continue to harness the power of Azure OpenAI and the ChatGPT model to achieve more:

“The ODP Corporation is excited to leverage the powerful AI technology of ChatGPT from Azure OpenAI Service, made possible through our collaboration with Microsoft. This technology will help [The ODP Corporation] drive continued transformation in our business, more effectively explore new possibilities, and design innovative solutions to deliver even greater value to our customers, partners, and associates. [The ODP Corporation] is building a ChatGPT-powered chatbot to support our internal business units, specifically HR. The chatbot has been successful in improving HR's document review process, generating new job descriptions, and enhancing associate communication. By utilizing ChatGPT's natural language processing and machine learning capabilities, [The ODP Corporation] aims to streamline its internal operations and drive business success. Embracing this cutting-edge technology will help increase our competitive edge in the market and enhance our customer experience.”—Carl Brisco, Vice President Product and Technology, The ODP Corporation

“Singapore's Smart Nation Digital Government Office is constantly looking to empower our public officers with technology to deliver better services to Singaporeans and better ideas for Singapore. ChatGPT and large language models more generally, hold the promise of accelerating many kinds of knowledge work in the public sector, and the alignment techniques embedded in ChatGPT help officers interact with these powerful models in more natural and intuitive ways. Azure OpenAI Service’s enterprise controls have been key to enabling exploration of these technologies across policy, operations, and communication use cases.”—Feng-ji Sim, Deputy Secretary, Smart Nation Digital Government Office, under the Prime Minister’s Office, Singapore

“Contracts are the foundation of commerce, governing every dollar in and out of an enterprise. At Icertis, we are applying AI to contracts so businesses globally can drive revenue, reduce costs, ensure compliance, and mitigate risk. The availability of ChatGPT on Microsoft's Azure OpenAI service offers a powerful tool to enable these outcomes when leveraged with our data lake of more than two billion metadata and transactional elements—one of the largest curated repositories of contract data in the world. Generative AI will help businesses fully realize the intent of their commercial agreements by acting as an intelligent assistant that surfaces and unlocks insights throughout the contract lifecycle. Delivering this capability at an enterprise scale, backed by inherent strengths in the security and reliability of Azure, aligns with our tenets of ethical AI and creates incredible new opportunities for innovation with the Icertis contract intelligence platform.”—Monish Darda, Chief Technology Officer at Icertis

In addition to all the ways organizations—large and small—are using Azure OpenAI Service to achieve business value, we’ve also been working internally at Microsoft to blend the power of large language models from OpenAI and the AI-optimized infrastructure of Azure to introduce new experiences across our consumer and enterprise products. For example:

GitHub Copilot leverages AI models in Azure OpenAI Service to help developers accelerate code development with its AI pair programmer.
Microsoft Teams Premium includes intelligent recap and AI-generated chapters to help individuals, teams, and organizations be more productive.
Microsoft Viva Sales’ new AI-powered seller experience offers suggested email content and data-driven insights to help sales teams focus on strategic selling motions to customers.
Microsoft Bing introduced an AI-powered chat option to enhance consumers’ search experience in completely new ways.

These are just a few examples of how Microsoft is helping organizations leverage generative AI models to drive AI transformation.

Customers and partners can also create new intelligent apps and solutions to stand out from the competition using a no-code approach in Azure OpenAI Studio. Azure OpenAI Studio, in addition to offering customizability for every model offered through the service, also offers a unique interface to customize ChatGPT and configure response behavior that aligns with your organization.

Watch how you can customize ChatGPT using System message right within Azure OpenAI Studio.
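If you prefer code to the Studio UI, here is a minimal sketch of supplying a system message with the openai Python package (the 0.27-era API) against an Azure OpenAI ChatGPT deployment; the endpoint, API key, API version, deployment name, and the retail scenario are all placeholders.

```python
import openai

# Placeholders: replace with your Azure OpenAI resource's endpoint, key,
# and the name you gave your ChatGPT deployment.
openai.api_type = "azure"
openai.api_base = "https://YOUR-RESOURCE-NAME.openai.azure.com/"
openai.api_version = "2023-03-15-preview"  # assumed preview API version
openai.api_key = "YOUR-API-KEY"

response = openai.ChatCompletion.create(
    engine="gpt-35-turbo",  # your ChatGPT deployment name
    messages=[
        # The system message shapes the assistant's behavior for the conversation.
        {
            "role": "system",
            "content": (
                "You are a support agent for a retail site. Answer only questions "
                "about orders and returns, and keep responses under three sentences."
            ),
        },
        {"role": "user", "content": "How do I return an item I bought last week?"},
    ],
    temperature=0.2,
)
print(response["choices"][0]["message"]["content"])
```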

A responsible approach to AI

We’re already seeing the impact AI can have on people and companies, helping improve productivity, amplify creativity, and augment everyday tasks. We’re committed to making sure AI systems are developed responsibly, that they work as intended, and that they are used in ways people can trust. Generative models, such as ChatGPT or the DALL·E image generation model, create new artifacts, and this brings new challenges; for instance, they could be used to create convincing but incorrect text or to generate realistic images of events that never happened.

Microsoft employs a layered set of mitigations at four levels, designed to address these challenges. These are aligned with Microsoft's Responsible AI Standard. First, application-level protections that put the customer in charge, for instance, explaining that text output was generated by AI and making the user approve it. Second, technical protections like input and output content filtering. Third, process and policy protections that range from systems to report abuse to service level agreements. And fourth, documentation such as design guidelines and transparency notes to explain the benefits of a model and what we have tested.

We believe AI will profoundly change how we work, and how organizations operate in the coming months. To meet this moment, we will continue to take a principled approach to ensure our AI systems are used responsibly while listening, learning, and improving to help guide AI in a way that ultimately benefits humanity.

Getting started with Azure OpenAI Service

Learn more about Azure OpenAI Service and all the latest enhancements.
Get started with ChatGPT using Azure OpenAI Service.
Get started with the “Introduction to Azure OpenAI Service” course on Microsoft Learn.
Read the Partner announcement blog, Empowering partners to develop AI-powered apps and experiences with ChatGPT in Azure OpenAI Service.

Seth Juarez, Principal Program Manager and co-host of The AI Show, shares top use cases for Azure OpenAI Service and an example chatbot for retail using ChatGPT.
Source: Azure

Monitor Azure Virtual Network Manager changes with event logging

Today, our customers establish and manage their Azure virtual networks at scale. As their number of network resources grows, the question of how to maintain connectivity and security among their scale of resources arises. This is where Microsoft Azure Virtual Network Manager comes in—your one-stop shop for managing the connectivity and security of your network resources at scale (currently in preview). And when customers use Azure Virtual Network Manager, they also need visibility into what kind of changes were made so that they can audit those events, analyze those changes over time, and debug issues along the way. This capability is now a reality—Azure Virtual Network Manager event logging is now in preview.

Azure Virtual Network Manager (AVNM) uses Azure Monitor for telemetry collection and analysis like many other Azure services. AVNM now provides event logs that you can interact with through Azure Monitor’s Log Analytics tool in the Azure Portal, as well as through a storage account. You can also send these logs to an event hub or partner solution.

With this preview announcement, Azure Virtual Network Manager will provide a log category for network group membership change. In the context of AVNM, network groups are defined by the user to contain virtual networks. The membership of a network group can be manually provided (such as by selecting VNetA, VNetB, and VNetC to be a part of this network group) as well as conditionally set through Azure Policy (such as by defining that any virtual network within a certain subscription that contains some string in its name will be added to this network group). The network group membership change log category tracks when a particular virtual network is added to or removed from a network group. This can be used to track network group membership changes over time, to capture a snapshot of a particular virtual network’s network group membership, and more.

What attributes are part of this event log category?

This network group membership change category emits one log per network group membership change. So, when a virtual network is added to or removed from a network group, a log is emitted correlating to that single addition or removal for that particular virtual network. If you’re looking at one of these logs from your storage account, you’ll see several attributes:

time: Datetime when the event was logged.
resourceId: Resource ID of the network manager.
location: Location of the virtual network resource.
operationName: Operation that resulted in the virtual network being added or removed. Always the “Microsoft.Network/virtualNetworks/networkGroupMembership/write” operation.
category: Category of this log. Always “NetworkGroupMembershipChange.”
resultType: Indicates a successful or failed operation.
correlationId: GUID that can help relate or debug logs.
level: Always “Info.”
properties: Collection of properties of the log.

Within the properties attribute are several nested attributes:

Message: Basic success or failure message.
MembershipId: Default membership ID of the virtual network.
GroupMemberships: Collection of the network groups the virtual network belongs to. There may be multiple “NetworkGroupId” and “Sources” entries within this property since a virtual network can belong to multiple network groups simultaneously.
MemberResourceId: Resource ID of the virtual network that was added to or removed from a network group.

Within the GroupMemberships attribute are several nested attributes:

NetworkGroupId: ID of a network group the virtual network belongs to.
Sources: Collection of how the virtual network is a member of the network group.

Within the Sources attribute are several nested attributes:

Type: Denotes whether the virtual network was added manually (“StaticMembership”) or conditionally via Azure Policy (“Policy”).
StaticMemberId: Appears if the “Type” value is “StaticMembership.”
PolicyAssignmentId: Appears if the “Type” value is “Policy.” ID of the Azure Policy assignment that associates the Azure Policy definition to the network group.
PolicyDefinitionId: Appears if the “Type” value is “Policy.” ID of the Azure Policy definition that contains the conditions for the network group’s membership.
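Putting the schema above together, here is a minimal sketch that reads one downloaded log entry and prints which network groups a virtual network was placed in and how. The record itself is hypothetical, shaped after the attributes just described, with placeholder resource IDs.

```python
import json

# Hypothetical record, shaped after the attributes described above.
raw = """
{
  "time": "2023-03-15T10:00:00Z",
  "operationName": "Microsoft.Network/virtualNetworks/networkGroupMembership/write",
  "category": "NetworkGroupMembershipChange",
  "resultType": "Succeeded",
  "level": "Info",
  "properties": {
    "Message": "Membership updated successfully.",
    "MemberResourceId": "/subscriptions/.../virtualNetworks/VNetA",
    "GroupMemberships": [
      {
        "NetworkGroupId": "/subscriptions/.../networkGroups/prod-group",
        "Sources": [
          {"Type": "Policy", "PolicyAssignmentId": "...", "PolicyDefinitionId": "..."}
        ]
      }
    ]
  }
}
"""

log = json.loads(raw)
vnet = log["properties"]["MemberResourceId"]
for membership in log["properties"]["GroupMemberships"]:
    how = ", ".join(source["Type"] for source in membership["Sources"])
    print(f"{vnet} -> {membership['NetworkGroupId']} (membership source: {how})")
```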

How do I get started?

The first step you’ll need to take is to set up your Log Analytics workspace or your storage account, depending on how you want to consume these event logs. Note that if you’re using a storage account or event hub, it needs to be in the same region as the network manager you’re accessing logs from; a Log Analytics workspace can be in any region. The network manager you’re accessing the logs of doesn’t need to belong to the same subscription as your Log Analytics workspace or storage account, but permissions may restrict your ability to access logs cross-subscription.

Note that at least one virtual network must be added to or removed from a network group in order to generate logs. A log for this event will appear a couple of minutes later.

Accessing Azure Virtual Network Manager’s event logs with Log Analytics

The first step is to navigate to your desired network manager and select the Diagnostic settings blade under the Monitoring section. Then you can select Add diagnostic setting and select the option to send the logs to your Log Analytics workspace.

Then you can navigate to your Log Analytics workspace directly through your network manager by selecting the Logs blade under the Monitoring section.

Alternatively, you can also navigate to your Log Analytics workspace in the Azure Portal and select the Logs blade.

From either place, you can run your own queries on your network manager’s emitted logs for network group membership changes, or you can also run our preloaded queries. Our preloaded queries can fetch the most recent network group membership changes and failed network group membership changes.
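For programmatic access outside the portal, here is a minimal sketch using the azure-monitor-query package. The workspace ID is a placeholder, and the query assumes the diagnostic setting routes these resource logs to the AzureDiagnostics table, so the table and column names may differ in your workspace.

```python
from datetime import timedelta

from azure.identity import DefaultAzureCredential
from azure.monitor.query import LogsQueryClient

client = LogsQueryClient(DefaultAzureCredential())

# Assumption: the diagnostic setting routes these resource logs to the
# AzureDiagnostics table; adjust the table/column names for your workspace.
query = """
AzureDiagnostics
| where Category == "NetworkGroupMembershipChange"
| sort by TimeGenerated desc
| take 20
"""

response = client.query_workspace(
    workspace_id="YOUR-WORKSPACE-ID",  # placeholder
    query=query,
    timespan=timedelta(days=7),
)

for table in response.tables:
    for row in table.rows:
        print(row)
```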

Accessing Azure Virtual Network Manager’s event logs with a storage account

The first step is to again navigate to your desired network manager and select the Diagnostic settings blade under the Monitoring section. Then you can select Add diagnostic setting and select the option to archive the logs to your storage account.

Then you can navigate to your storage account and select the Storage browser blade.

Select Blob containers. A blob container will be automatically generated once network group membership changes occur.

Navigate down the blob container’s file path until you reach a JSON file for the datetime specified by that file path.

Download the JSON file to view the raw logs for the file path’s datetime.

Learn more about Azure Virtual Network Manager event logging

In just a few clicks, you’ve set up your network manager to route event logs to your Log Analytics workspace or your storage account. Now, you can get visibility into each occurrence of a virtual network entering or leaving a network group. Additional log categories are in the works, and in the meantime, feel free to check out our public documentation for more on Azure Virtual Network Manager.
Source: Azure

4 best practices to keep your Windows Server estate secure and optimized

Windows Server customers often share with us the challenges of navigating rapid changes in recent years. Many of their IT estates have expanded to support growth, while teams are often changing, with talent coming and going. You may find your organization in a similar situation, with a sprawled IT estate that includes a mix of legacy and new applications and hardware. This can leave room for potential security vulnerabilities and compliance gaps, but also opportunities to optimize.

We are committed to supporting you through the next stages of optimization and growth in your organization, which starts with a secure IT foundation. Here are four best practices to keep your Windows Server estate secure and up-to-date:

1. Watch for update notifications and have a strategy to apply the latest security patches

A critical but often overlooked best practice is having a strategy to apply the latest security patches that are released. Our team continuously monitors and listens to customer feedback on any issues they have encountered and creates patches to address these. These are released on the second Tuesday of every month (known as Patch Tuesday). Keeping your various systems up-to-date with the latest patches will secure workloads and optimize day-to-day performance and operations. Learn more about best practices for software updates.

However, we know that patching also usually means rebooting and ultimately downtime for your workloads. If you are in Microsoft Azure, you can take advantage of Hotpatch, which allows you to keep your Windows Server virtual machines on Azure up-to-date without rebooting, enabling higher availability with faster and more secure delivery of updates.

2. Get deeper visibility and management capabilities at no additional cost

Many Windows Server customers are familiar with the native Windows Server Microsoft Management Consoles (MMC). Windows Admin Center is the modern evolution of “in-box” management tools such as Server Manager and MMC. It has become the solution for managing Windows Server infrastructure, giving you deep management, troubleshooting, configuration, and maintenance capabilities over your server clusters.

It can be locally deployed with no cloud dependency or can be used within the Azure portal through direct integration, enabling you to carry over the simple and familiar UI when you decide to start adopting the cloud. Learn more in the Windows Admin Center documentation or download it today for free.

3. Check for end of support versions and prepare to modernize

Most organizations are likely to have a mix of Windows Server versions supporting a variety of applications. Each version of Windows Server is backed by 10 years of support (5 years of mainstream support and 5 years of extended support), including regular security updates, per the Microsoft lifecycle policy. After the end of support date, a version and its workloads become vulnerable because they no longer receive regular security updates. Windows Server 2012/R2 is the next version to reach end of support, on October 10, 2023.

With this in mind, a critical step toward optimizing performance and tightening security is to check for Windows Server 2012/R2 versions, which will reach end of support soon. This can be done with built-in tools such as Server Manager or PowerShell, or at scale with Azure tools such as Azure Migrate or Azure Arc (see the Resource Graph sketch after the list below). Additionally, map out application and hardware dependencies on Windows Server to determine the next best step:

Upgrading to the latest version such as Windows Server 2022 will provide the latest security, performance, and application modernization innovation. Learn more about how to perform in-place upgrades.
If you are unable to upgrade by the end-of-support date, you can continue to stay secure on current versions by getting extended security updates1 for up to three years free in Azure or purchasing them for deployment on-premises.
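
For the at-scale check mentioned above, one option is an Azure Resource Graph query across your Azure Arc-enabled machines. The sketch below assumes the azure-identity and azure-mgmt-resourcegraph Python packages; the query, and in particular the properties.osSku field it filters on, is an assumption to validate against your own inventory.

```python
# Minimal sketch: find Arc-enabled servers still reporting a Windows Server 2012/R2 SKU.
# The KQL property names are assumptions to verify against your resources;
# Azure VMs can be included with a similar filter on their OS image properties.
from azure.identity import DefaultAzureCredential
from azure.mgmt.resourcegraph import ResourceGraphClient
from azure.mgmt.resourcegraph.models import QueryRequest

subscription_id = "<subscription-id>"
query = """
resources
| where type =~ 'microsoft.hybridcompute/machines'
| where tostring(properties.osSku) contains '2012'
| project name, resourceGroup, osSku = tostring(properties.osSku)
"""

client = ResourceGraphClient(DefaultAzureCredential())
result = client.resources(QueryRequest(subscriptions=[subscription_id], query=query))

# Each row is a machine that may need an upgrade plan or Extended Security Updates.
for row in result.data:
    print(row["name"], row["resourceGroup"], row["osSku"])
```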

4. Utilize cloud-native services for enhanced security and compliance anywhere

Whether your organization has migrated to Azure or is just starting to consider the cloud, here are some steps you can take now to enhance your security with Microsoft:

Already in Azure: To maximize your security coverage in Azure, be sure to check your secure score and improve it by enabling services such as Microsoft Defender for Cloud, Microsoft Sentinel (cloud-native SIEM), and Azure Network Security.
Have workloads on-premises: Extend Microsoft Defender for Servers to your on-premises Windows Servers by connecting them to Azure Arc.
Ready to migrate to Azure: When you are ready to migrate workloads to Azure, your first step can be an assessment with Azure Migrate or getting expert help and support through the Azure Migration and Modernization Program.

Learn more

We hope these best practices serve as starting points to help you increase security and optimize the performance of your IT platform, so you can focus on supporting business growth. Be sure to explore the resources below for further information:

Learn more about capabilities and offers for Windows Server on Azure.
Watch our recent webinar on-demand titled “Optimizing Windows and SQL Server Security in Azure.”
Register for our upcoming webinar titled “Cloud Migration Stories: Windows and SQL Server with Azure” on March 29, 2023, at 10 AM Pacific Time.
Take the recently released Windows Server Hybrid Administrator Certification to validate your existing Windows Server knowledge and learn how to apply it in hybrid cloud environments.
Learn more about your options for Windows Server 2012/R2 end of support.
Join the Windows Server Tech Community for regular Ask Me Anything (AMA) sessions.

1 In alignment with the servicing model for Windows 7 and Windows 8.1, the Windows Server 2012 and 2012 R2 ESU program will only include Monthly Rollup packages; Security Only update packages will not be provided.
Source: Azure

Announcing a renaissance in computer vision AI with Microsoft's Florence foundation model

Extract robust insights from image and video content with Azure Cognitive Service for Vision

We are pleased to announce the public preview of Microsoft’s Florence foundation model, trained with billions of text-image pairs and integrated as cost-effective, production-ready computer vision services in Azure Cognitive Service for Vision. The improved Vision Services enable developers to create cutting-edge, market-ready, responsible computer vision applications across various industries. Customers can now seamlessly digitize, analyze, and connect their data to natural language interactions, unlocking powerful insights from their image and video content to support accessibility, drive acquisition through SEO, protect users from harmful content, enhance security, and improve incident response times.

Microsoft was recently named a Leader in the IDC MarketScape: Worldwide General-Purpose Computer Vision AI Software Platforms 2022 Vendor Assessment (doc #US49776422, November 2022). The new Vision Services improve content discoverability with automatic captioning, smart cropping, classification, background removal, and image search. Furthermore, users can track movements, analyze environments, and receive real-time alerts with responsible AI controls.

Reddit will be using Vision Services to generate captions for hundreds of millions of images on its platform. Tiffany Ong, Reddit Product Manager of Consumer Product, said:

“With Microsoft’s Vision technology, we are making it easier for users to discover and understand our content. The newly created image captions make Reddit more accessible for everyone and give redditors more opportunities to explore our images, engage in conversations, and ultimately build connections and a sense of community."

Microsoft is harnessing the power of the new Vision Services in Microsoft 365 apps like Teams, PowerPoint, Outlook, Word, Designer, and OneDrive, as well as in Microsoft datacenters. Microsoft Teams is driving innovation in the digital space with the help of segmentation capabilities, taking virtual meetings to the next level. PowerPoint, Outlook, and Word leverage image captioning for automatic alt-text to improve accessibility. Microsoft Designer and OneDrive are using improved image tagging, image search, and background generation to simplify image discoverability and editing. Microsoft datacenters are leveraging Vision Services to enhance security and infrastructure reliability.

At this week's Microsoft Ability Summit, companies will learn how they can improve the accessibility of their visual content. We'll share the future of our Seeing AI app, and LinkedIn will share the benefits of using Vision Services to deliver automatic alt-text descriptions for image analysis. As a preview, Jennison Asuncion, LinkedIn's Head of Accessibility Engineering Evangelism, said:

“More than 40 percent of LinkedIn’s feed posts include at least one image. We want every member to have equal access to opportunity and are committed to ensuring that we make images accessible to our members who are blind or who have low vision so they can be a part of the online conversation. With Azure Cognitive Service for Vision, we can provide auto-captioning to edit and support alt. text descriptions. I'm excited about this new experience because now, not only will I know my colleague shared a picture from an event they attended, but that my CEO Ryan Roslansky is also in the picture.”

Try out the new out-of-the-box features our customers are using in Vision Studio (sample API calls follow the list below):

Dense captions: Automatically deliver rich captions, design suggestions, accessible alt-text, SEO optimization, and intelligent photo curation to support digital content.
Image retrieval: Improve search recommendations and advertisements with natural language queries that seamlessly measure the similarity between images and text.
Background removal: Transform the look and feel of images by easily segmenting people and objects from their original background, replacing them with a preferred background scene.
Model customization: Lower costs and time to deliver custom models that match unique business demands at high precision, and with just a handful of images.
Video summarization (Video TL;DR): Search and interact with video content in the same intuitive way you think and write. Locate relevant content without the need for additional metadata.
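
To make the dense captions item above concrete, here is a minimal sketch of a REST call using the Python requests library. It assumes the Image Analysis 4.0 preview path (imageanalysis:analyze), the 2023-02-01-preview API version, and a denseCaptionsResult response shape; the endpoint, key, and image URL are placeholders.

```python
# Minimal sketch: request dense captions for an image over REST.
# The API path, version, and response field names are assumptions to verify
# against the current Azure Cognitive Service for Vision documentation.
import requests

endpoint = "https://<your-resource>.cognitiveservices.azure.com"
key = "<your-key>"

response = requests.post(
    f"{endpoint}/computervision/imageanalysis:analyze",
    params={"api-version": "2023-02-01-preview", "features": "denseCaptions"},
    headers={"Ocp-Apim-Subscription-Key": key, "Content-Type": "application/json"},
    json={"url": "https://example.com/sample.jpg"},
)
response.raise_for_status()

# Each dense caption describes a region of the image with a confidence score.
for caption in response.json()["denseCaptionsResult"]["values"]:
    print(caption["text"], caption["confidence"])
```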

Innovate responsibly

Review the responsible AI principles to learn how we are committed to developing AI systems that help make the world more accessible. We are focused on helping organizations take full advantage of AI, and we are investing heavily in programs that provide technology, resources, and expertise to empower those working to create a more sustainable, safe, and accessible world.

Get started today with Azure Cognitive Service for Vision

Revolutionize your computer vision applications with improved efficiency, accuracy, and accessibility in image and video processing, at the same low price. Visit Vision Studio to try out our latest demos.
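
As a second example, the image retrieval feature listed earlier works by comparing image and text embeddings. The sketch below assumes preview endpoints named retrieval:vectorizeText and retrieval:vectorizeImage and a vector field in the response; treat those names, along with the API version, as assumptions to check against the current documentation.

```python
# Minimal sketch: rank an image against a natural-language query using embeddings.
# Endpoint paths, API version, and the "vector" response field are assumptions.
import math
import requests

endpoint = "https://<your-resource>.cognitiveservices.azure.com"
key = "<your-key>"
params = {"api-version": "2023-02-01-preview"}
headers = {"Ocp-Apim-Subscription-Key": key, "Content-Type": "application/json"}

def embed_text(text: str) -> list[float]:
    r = requests.post(f"{endpoint}/computervision/retrieval:vectorizeText",
                      params=params, headers=headers, json={"text": text})
    r.raise_for_status()
    return r.json()["vector"]

def embed_image(url: str) -> list[float]:
    r = requests.post(f"{endpoint}/computervision/retrieval:vectorizeImage",
                      params=params, headers=headers, json={"url": url})
    r.raise_for_status()
    return r.json()["vector"]

def cosine(a: list[float], b: list[float]) -> float:
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b)))

query_vec = embed_text("a dog playing in the snow")
image_vec = embed_image("https://example.com/sample.jpg")
print(cosine(query_vec, image_vec))
```

Cosine similarity is a common way to rank candidates from embeddings; for a large image library you would typically precompute and index the image vectors rather than embed them per query.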

Learn more about Azure Cognitive Service for Vision:

Get started with Microsoft Learn to build skills.
Watch the Florence showcase shared at the 2022 CVPR conference.

Source: Azure